Mirror of https://github.com/ynput/ayon-core.git (synced 2025-12-24 21:04:40 +01:00)

Commit 4ec96b8d42: Merge branch 'develop' of https://github.com/moonyuet/OpenPype_fork into bugfix/OP-3356_Maya-Review-Image-plane-attribute

187 changed files with 6537 additions and 2628 deletions
.gitmodules (vendored, new file): +7
@@ -0,0 +1,7 @@
+[submodule "tools/modules/powershell/BurntToast"]
+	path = tools/modules/powershell/BurntToast
+	url = https://github.com/Windos/BurntToast.git
+
+[submodule "tools/modules/powershell/PSWriteColor"]
+	path = tools/modules/powershell/PSWriteColor
+	url = https://github.com/EvotecIT/PSWriteColor.git
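The two new submodules pin PowerShell helper modules used by the Windows tooling. As a side note, `.gitmodules` is plain INI text, so the pinned entries can be inspected from Python with the standard library alone; a minimal sketch (the file path is assumed relative to the repository root):

    import configparser

    # .gitmodules is INI-formatted; each section holds a path and url key.
    parser = configparser.ConfigParser()
    parser.read(".gitmodules")

    for section in parser.sections():
        # e.g. section == 'submodule "tools/modules/powershell/BurntToast"'
        print(section, "->", parser[section].get("url"))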
CHANGELOG.md: 162 lines changed
@@ -1,8 +1,59 @@
 # Changelog
 
-## [3.12.1-nightly.3](https://github.com/pypeclub/OpenPype/tree/HEAD)
+## [3.12.2-nightly.3](https://github.com/pypeclub/OpenPype/tree/HEAD)
 
-[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.12.0...HEAD)
+[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.12.1...HEAD)
 
+### 📖 Documentation
+
+- Update website with more studios [\#3554](https://github.com/pypeclub/OpenPype/pull/3554)
+- Documentation: Update publishing dev docs [\#3549](https://github.com/pypeclub/OpenPype/pull/3549)
+
+**🚀 Enhancements**
+
+- Maya: add additional validators to Settings [\#3540](https://github.com/pypeclub/OpenPype/pull/3540)
+- General: Interactive console in cli [\#3526](https://github.com/pypeclub/OpenPype/pull/3526)
+- Ftrack: Automatic daily review session creation can define trigger hour [\#3516](https://github.com/pypeclub/OpenPype/pull/3516)
+- Ftrack: add source into Note [\#3509](https://github.com/pypeclub/OpenPype/pull/3509)
+- Ftrack: Trigger custom ftrack topic of project structure creation [\#3506](https://github.com/pypeclub/OpenPype/pull/3506)
+- Settings UI: Add extract to file action on project view [\#3505](https://github.com/pypeclub/OpenPype/pull/3505)
+- Add pack and unpack convenience scripts [\#3502](https://github.com/pypeclub/OpenPype/pull/3502)
+- General: Event system [\#3499](https://github.com/pypeclub/OpenPype/pull/3499)
+- NewPublisher: Keep plugins with mismatch target in report [\#3498](https://github.com/pypeclub/OpenPype/pull/3498)
+- Nuke: load clip with options from settings [\#3497](https://github.com/pypeclub/OpenPype/pull/3497)
+- TrayPublisher: implemented render\_mov\_batch [\#3486](https://github.com/pypeclub/OpenPype/pull/3486)
+- Migrate basic families to the new Tray Publisher [\#3469](https://github.com/pypeclub/OpenPype/pull/3469)
+
+**🐛 Bug fixes**
+
+- Remove invalid submodules from `/vendor` [\#3557](https://github.com/pypeclub/OpenPype/pull/3557)
+- General: Remove hosts filter on integrator plugins [\#3556](https://github.com/pypeclub/OpenPype/pull/3556)
+- Settings: Clean default values of environments [\#3550](https://github.com/pypeclub/OpenPype/pull/3550)
+- Module interfaces: Fix import error [\#3547](https://github.com/pypeclub/OpenPype/pull/3547)
+- Workfiles tool: Show of tool and its flags [\#3539](https://github.com/pypeclub/OpenPype/pull/3539)
+- General: Create workfile documents works again [\#3538](https://github.com/pypeclub/OpenPype/pull/3538)
+- Additional fixes for powershell scripts [\#3525](https://github.com/pypeclub/OpenPype/pull/3525)
+- Maya: Added wrapper around cmds.setAttr [\#3523](https://github.com/pypeclub/OpenPype/pull/3523)
+- Nuke: double slate [\#3521](https://github.com/pypeclub/OpenPype/pull/3521)
+- General: Fix hash of centos oiio archive [\#3519](https://github.com/pypeclub/OpenPype/pull/3519)
+- Maya: Renderman display output fix [\#3514](https://github.com/pypeclub/OpenPype/pull/3514)
+- TrayPublisher: Simple creation enhancements and fixes [\#3513](https://github.com/pypeclub/OpenPype/pull/3513)
+- NewPublisher: Publish attributes are properly collected [\#3510](https://github.com/pypeclub/OpenPype/pull/3510)
+- TrayPublisher: Make sure host name is filled [\#3504](https://github.com/pypeclub/OpenPype/pull/3504)
+- NewPublisher: Groups work and enum multivalue [\#3501](https://github.com/pypeclub/OpenPype/pull/3501)
+
+**🔀 Refactored code**
+
+- Refactor Integrate Asset [\#3530](https://github.com/pypeclub/OpenPype/pull/3530)
+- General: Client docstrings cleanup [\#3529](https://github.com/pypeclub/OpenPype/pull/3529)
+- General: Get current context document functions [\#3522](https://github.com/pypeclub/OpenPype/pull/3522)
+- Kitsu: Use query function from client [\#3496](https://github.com/pypeclub/OpenPype/pull/3496)
+- TimersManager: Use query functions [\#3495](https://github.com/pypeclub/OpenPype/pull/3495)
+- Deadline: Use query functions [\#3466](https://github.com/pypeclub/OpenPype/pull/3466)
+
+## [3.12.1](https://github.com/pypeclub/OpenPype/tree/3.12.1) (2022-07-13)
+
+[Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.12.1-nightly.6...3.12.1)
+
 ### 📖 Documentation
 
@@ -14,135 +65,74 @@
 **🚀 Enhancements**
 
 - TrayPublisher: Added more options for grouping of instances [\#3494](https://github.com/pypeclub/OpenPype/pull/3494)
 - NewPublisher: Align creator attributes from top to bottom [\#3487](https://github.com/pypeclub/OpenPype/pull/3487)
 - NewPublisher: Added ability to use label of instance [\#3484](https://github.com/pypeclub/OpenPype/pull/3484)
 - General: Creator Plugins have access to project [\#3476](https://github.com/pypeclub/OpenPype/pull/3476)
 - General: Better arguments order in creator init [\#3475](https://github.com/pypeclub/OpenPype/pull/3475)
 - Ftrack: Trigger custom ftrack events on project creation and preparation [\#3465](https://github.com/pypeclub/OpenPype/pull/3465)
 - Windows installer: Clean old files and add version subfolder [\#3445](https://github.com/pypeclub/OpenPype/pull/3445)
 - Blender: Bugfix - Set fps properly on open [\#3426](https://github.com/pypeclub/OpenPype/pull/3426)
 - Blender: pre pyside install for all platforms [\#3400](https://github.com/pypeclub/OpenPype/pull/3400)
 - Maya: Ability to set resolution for playblasts from asset, and override through review instance. [\#3360](https://github.com/pypeclub/OpenPype/pull/3360)
 - Hiero: Add custom scripts menu [\#3425](https://github.com/pypeclub/OpenPype/pull/3425)
 
 **🐛 Bug fixes**
 
 - TrayPublisher: Keep use instance label in list view [\#3493](https://github.com/pypeclub/OpenPype/pull/3493)
 - General: Extract review use first frame of input sequence [\#3491](https://github.com/pypeclub/OpenPype/pull/3491)
 - General: Fix Plist loading for application launch [\#3485](https://github.com/pypeclub/OpenPype/pull/3485)
 - Nuke: Workfile tools open on start [\#3479](https://github.com/pypeclub/OpenPype/pull/3479)
 - New Publisher: Disabled context change allows creation [\#3478](https://github.com/pypeclub/OpenPype/pull/3478)
 - General: thumbnail extractor fix [\#3474](https://github.com/pypeclub/OpenPype/pull/3474)
 - Kitsu: bugfix with sync-service and publish plugins [\#3473](https://github.com/pypeclub/OpenPype/pull/3473)
 - Flame: solved problem with multi-selected loading [\#3470](https://github.com/pypeclub/OpenPype/pull/3470)
 - General: Fix query function in update logic [\#3468](https://github.com/pypeclub/OpenPype/pull/3468)
 - Resolve: removed few bugs [\#3464](https://github.com/pypeclub/OpenPype/pull/3464)
 - General: Delete old versions is safer when ftrack is disabled [\#3462](https://github.com/pypeclub/OpenPype/pull/3462)
 - Nuke: fixing metadata slate TC difference [\#3455](https://github.com/pypeclub/OpenPype/pull/3455)
 - Nuke: prerender reviewable fails [\#3450](https://github.com/pypeclub/OpenPype/pull/3450)
 - Maya: fix hashing in Python 3 for tile rendering [\#3447](https://github.com/pypeclub/OpenPype/pull/3447)
 - LogViewer: Escape html characters in log message [\#3443](https://github.com/pypeclub/OpenPype/pull/3443)
 - Nuke: Slate frame is integrated [\#3427](https://github.com/pypeclub/OpenPype/pull/3427)
 - Maya: Camera extra data - additional fix for \#3304 [\#3386](https://github.com/pypeclub/OpenPype/pull/3386)
 - Harmony: added unc path to zifile command in Harmony [\#3372](https://github.com/pypeclub/OpenPype/pull/3372)
 - Maya: Handle excluding `model` family from frame range validator. [\#3370](https://github.com/pypeclub/OpenPype/pull/3370)
 
 **🔀 Refactored code**
 
 - Maya: Merge animation + pointcache extractor logic [\#3461](https://github.com/pypeclub/OpenPype/pull/3461)
 - Maya: Re-use `maintained\_time` from lib [\#3460](https://github.com/pypeclub/OpenPype/pull/3460)
 - General: Use query functions in global plugins [\#3459](https://github.com/pypeclub/OpenPype/pull/3459)
 - Clockify: Use query functions in clockify actions [\#3458](https://github.com/pypeclub/OpenPype/pull/3458)
 - General: Use query functions in rest api calls [\#3457](https://github.com/pypeclub/OpenPype/pull/3457)
 - General: Use query functions in openpype lib functions [\#3454](https://github.com/pypeclub/OpenPype/pull/3454)
 - General: Use query functions in load utils [\#3446](https://github.com/pypeclub/OpenPype/pull/3446)
 - General: Move publish plugin and publish render abstractions [\#3442](https://github.com/pypeclub/OpenPype/pull/3442)
 - General: Use Anatomy after move to pipeline [\#3436](https://github.com/pypeclub/OpenPype/pull/3436)
 - General: Anatomy moved to pipeline [\#3435](https://github.com/pypeclub/OpenPype/pull/3435)
 - Resolve: Use client query functions [\#3379](https://github.com/pypeclub/OpenPype/pull/3379)
 - General: Host implementation defined with class [\#3337](https://github.com/pypeclub/OpenPype/pull/3337)
 
 ## [3.12.0](https://github.com/pypeclub/OpenPype/tree/3.12.0) (2022-06-28)
 
 [Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.12.0-nightly.3...3.12.0)
 
 ### 📖 Documentation
 
 - Fix typo in documentation: pyenv on mac [\#3417](https://github.com/pypeclub/OpenPype/pull/3417)
 - Linux: update OIIO package [\#3401](https://github.com/pypeclub/OpenPype/pull/3401)
 
 **🚀 Enhancements**
 
 - Webserver: Added CORS middleware [\#3422](https://github.com/pypeclub/OpenPype/pull/3422)
 - Attribute Defs UI: Files widget show what is allowed to drop in [\#3411](https://github.com/pypeclub/OpenPype/pull/3411)
 - Hosts: More options for in-host callbacks [\#3357](https://github.com/pypeclub/OpenPype/pull/3357)
 - Multiverse: expose some settings to GUI [\#3350](https://github.com/pypeclub/OpenPype/pull/3350)
 
 **🐛 Bug fixes**
 
 - NewPublisher: Fix subset name change on change of creator plugin [\#3420](https://github.com/pypeclub/OpenPype/pull/3420)
 - Bug: fix invalid avalon import [\#3418](https://github.com/pypeclub/OpenPype/pull/3418)
 - Nuke: Fix keyword argument in query function [\#3414](https://github.com/pypeclub/OpenPype/pull/3414)
 - Houdini: fix loading and updating vbd/bgeo sequences [\#3408](https://github.com/pypeclub/OpenPype/pull/3408)
 - Nuke: Collect representation files based on Write [\#3407](https://github.com/pypeclub/OpenPype/pull/3407)
 - General: Filter representations before integration start [\#3398](https://github.com/pypeclub/OpenPype/pull/3398)
 - Maya: look collector typo [\#3392](https://github.com/pypeclub/OpenPype/pull/3392)
 - TVPaint: Make sure exit code is set to not None [\#3382](https://github.com/pypeclub/OpenPype/pull/3382)
 - Maya: vray device aspect ratio fix [\#3381](https://github.com/pypeclub/OpenPype/pull/3381)
 - Flame: bunch of publishing issues [\#3377](https://github.com/pypeclub/OpenPype/pull/3377)
 - Standalone: settings improvements [\#3355](https://github.com/pypeclub/OpenPype/pull/3355)
 - Nuke: Load full model hierarchy by default [\#3328](https://github.com/pypeclub/OpenPype/pull/3328)
 
 **🔀 Refactored code**
 
 - Unreal: Use client query functions [\#3421](https://github.com/pypeclub/OpenPype/pull/3421)
 - General: Move editorial lib to pipeline [\#3419](https://github.com/pypeclub/OpenPype/pull/3419)
 - Kitsu: renaming to plural func sync\_all\_projects [\#3397](https://github.com/pypeclub/OpenPype/pull/3397)
 - Houdini: Use client query functions [\#3395](https://github.com/pypeclub/OpenPype/pull/3395)
 - Nuke: Use client query functions [\#3391](https://github.com/pypeclub/OpenPype/pull/3391)
 - Maya: Use client query functions [\#3385](https://github.com/pypeclub/OpenPype/pull/3385)
 - Fusion: Use client query functions [\#3380](https://github.com/pypeclub/OpenPype/pull/3380)
 - Harmony: Use client query functions [\#3378](https://github.com/pypeclub/OpenPype/pull/3378)
 - Celaction: Use client query functions [\#3376](https://github.com/pypeclub/OpenPype/pull/3376)
 - Photoshop: Use client query functions [\#3375](https://github.com/pypeclub/OpenPype/pull/3375)
 - AfterEffects: Use client query functions [\#3374](https://github.com/pypeclub/OpenPype/pull/3374)
 - TVPaint: Use client query functions [\#3340](https://github.com/pypeclub/OpenPype/pull/3340)
 - Ftrack: Use client query functions [\#3339](https://github.com/pypeclub/OpenPype/pull/3339)
 - Standalone Publisher: Use client query functions [\#3330](https://github.com/pypeclub/OpenPype/pull/3330)
 
 **Merged pull requests:**
 
 - Sync Queue: Added far future value for null values for dates [\#3371](https://github.com/pypeclub/OpenPype/pull/3371)
 - Maya - added support for single frame playblast review [\#3369](https://github.com/pypeclub/OpenPype/pull/3369)
 
 ## [3.11.1](https://github.com/pypeclub/OpenPype/tree/3.11.1) (2022-06-20)
 
 [Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.11.1-nightly.1...3.11.1)
 
 **🆕 New features**
 
 - Flame: custom export temp folder [\#3346](https://github.com/pypeclub/OpenPype/pull/3346)
 - Nuke: removing third-party plugins [\#3344](https://github.com/pypeclub/OpenPype/pull/3344)
 
 **🚀 Enhancements**
 
 - Pyblish Pype: Hiding/Close issues [\#3367](https://github.com/pypeclub/OpenPype/pull/3367)
 - General: Add ability to change user value for templates [\#3366](https://github.com/pypeclub/OpenPype/pull/3366)
 - Ftrack: Removed requirement of pypeclub role from default settings [\#3354](https://github.com/pypeclub/OpenPype/pull/3354)
 - Kitsu: Prevent crash on missing frames information [\#3352](https://github.com/pypeclub/OpenPype/pull/3352)
 
 **🐛 Bug fixes**
 
 - Nuke: bake streams with slate on farm [\#3368](https://github.com/pypeclub/OpenPype/pull/3368)
 - Harmony: audio validator has wrong logic [\#3364](https://github.com/pypeclub/OpenPype/pull/3364)
 - Nuke: Fix missing variable in extract thumbnail [\#3363](https://github.com/pypeclub/OpenPype/pull/3363)
 - Nuke: Fix precollect writes [\#3361](https://github.com/pypeclub/OpenPype/pull/3361)
 - AE- fix validate\_scene\_settings and renderLocal [\#3358](https://github.com/pypeclub/OpenPype/pull/3358)
 - deadline: fixing misidentification of reviewables [\#3356](https://github.com/pypeclub/OpenPype/pull/3356)
 - General: Create only one thumbnail per instance [\#3351](https://github.com/pypeclub/OpenPype/pull/3351)
 - nuke: adding extract thumbnail settings 3.10 [\#3347](https://github.com/pypeclub/OpenPype/pull/3347)
 - General: Fix last version function [\#3345](https://github.com/pypeclub/OpenPype/pull/3345)
 - Deadline: added OPENPYPE\_MONGO to filter [\#3336](https://github.com/pypeclub/OpenPype/pull/3336)
 
 **🔀 Refactored code**
 
 - Webpublisher: Use client query functions [\#3333](https://github.com/pypeclub/OpenPype/pull/3333)
 
 ## [3.11.0](https://github.com/pypeclub/OpenPype/tree/3.11.0) (2022-06-17)
 
 [Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.11.0-nightly.4...3.11.0)
 
 **🚀 Enhancements**
 
 - Settings: Settings can be extracted from UI [\#3323](https://github.com/pypeclub/OpenPype/pull/3323)
 
 **🐛 Bug fixes**
 
 - General: Handle empty source key on instance [\#3342](https://github.com/pypeclub/OpenPype/pull/3342)
 - Houdini: Fix Houdini VDB manage update wrong file attribute name [\#3322](https://github.com/pypeclub/OpenPype/pull/3322)
 
 **🔀 Refactored code**
 
 - Blender: Use client query functions [\#3331](https://github.com/pypeclub/OpenPype/pull/3331)
 
 ## [3.10.0](https://github.com/pypeclub/OpenPype/tree/3.10.0) (2022-05-26)
 
 [Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.10.0-nightly.6...3.10.0)
@@ -15,7 +15,6 @@ from .lib import (
     run_subprocess,
     version_up,
-    get_asset,
     get_hierarchy,
     get_workdir_data,
     get_version_from_path,
     get_last_version_from_path,
@@ -101,7 +100,6 @@ __all__ = [
     # get contextual data
     "version_up",
-    "get_asset",
    "get_hierarchy",
    "get_workdir_data",
    "get_version_from_path",
    "get_last_version_from_path",
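These removals are part of a commit-wide migration: the deprecated `get_asset` helper exposed through `openpype.lib` / `openpype.api` is replaced by client query functions. The new call pattern, exactly as it appears in the hunks below:

    from openpype.pipeline.context_tools import get_current_project_asset

    # Whole asset document of the current context...
    asset_doc = get_current_project_asset()
    fps = asset_doc["data"]["fps"]

    # ...or only the fields that are actually needed (used in maya/api/lib.py):
    asset_doc = get_current_project_asset(fields=["data.fps"])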
@@ -2,7 +2,7 @@
 """Package for handling pype command line arguments."""
 import os
 import sys
 
+import code
 import click
 
-# import sys
@@ -424,3 +424,22 @@ def pack_project(project, dirpath):
 def unpack_project(zipfile, root):
     """Create a package of project with all files and database dump."""
     PypeCommands().unpack_project(zipfile, root)
+
+
+@main.command()
+def interactive():
+    """Interactive (Python like) console.
+
+    Helpful command not only for development to directly work with python
+    interpreter.
+
+    Warning:
+        Executable 'openpype_gui' on windows won't work.
+    """
+    from openpype.version import __version__
+
+    banner = "OpenPype {}\nPython {} on {}".format(
+        __version__, sys.version, sys.platform
+    )
+    code.interact(banner)
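The new command simply hands the process over to the standard-library REPL. A standalone sketch of the same pattern outside the CLI (the `local` mapping is an illustrative extra, not part of the commit):

    import code
    import sys

    banner = "Python {} on {}".format(sys.version, sys.platform)

    # Opens an in-process interactive console; `local` seeds the session
    # namespace with preloaded objects.
    code.interact(banner=banner, local={"greeting": "hello"})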
@@ -1,6 +1,7 @@
 from .entities import (
     get_projects,
     get_project,
     get_whole_project,
+
     get_asset_by_id,
     get_asset_by_name,
@@ -24,20 +25,26 @@ from .entities import (
     get_last_version_by_subset_name,
     get_output_link_versions,
+
+    version_is_latest,
+
     get_representation_by_id,
     get_representation_by_name,
     get_representations,
     get_representation_parents,
     get_representations_parents,
     get_archived_representations,
+
     get_thumbnail,
     get_thumbnails,
     get_thumbnail_id_from_source,
+
     get_workfile_info,
 )
 
 __all__ = (
     "get_projects",
     "get_project",
     "get_whole_project",
+
     "get_asset_by_id",
     "get_asset_by_name",
@@ -61,13 +68,18 @@ __all__ = (
     "get_last_version_by_subset_name",
     "get_output_link_versions",
+
+    "version_is_latest",
+
     "get_representation_by_id",
     "get_representation_by_name",
     "get_representations",
     "get_representation_parents",
     "get_representations_parents",
     "get_archived_representations",
+
     "get_thumbnail",
     "get_thumbnails",
     "get_thumbnail_id_from_source",
+
     "get_workfile_info",
 )
(File diff suppressed because it is too large.)
@@ -1,5 +1,4 @@
 import os
-import sys
 
 from Qt import QtWidgets
 
@@ -15,6 +14,7 @@ from openpype.pipeline import (
     AVALON_CONTAINER_ID,
     legacy_io,
 )
+from openpype.pipeline.load import any_outdated_containers
 import openpype.hosts.aftereffects
 from openpype.lib import register_event_callback
@@ -136,7 +136,7 @@ def ls():
 
 def check_inventory():
     """Checks loaded containers if they are of highest version"""
-    if not lib.any_outdated():
+    if not any_outdated_containers():
         return
 
     # Warn about outdated containers.
@@ -17,11 +17,8 @@ class RenderCreator(Creator):
 
     create_allow_context_change = True
 
-    def __init__(
-        self, create_context, system_settings, project_settings, headless=False
-    ):
-        super(RenderCreator, self).__init__(create_context, system_settings,
-                                            project_settings, headless)
+    def __init__(self, project_settings, *args, **kwargs):
+        super(RenderCreator, self).__init__(project_settings, *args, **kwargs)
         self._default_variants = (project_settings["aftereffects"]
                                   ["create"]
                                   ["RenderCreator"]
@@ -6,8 +6,8 @@ import attr
 import pyblish.api
 
 from openpype.settings import get_project_settings
-from openpype.lib import abstract_collect_render
-from openpype.lib.abstract_collect_render import RenderInstance
+from openpype.pipeline import publish
+from openpype.pipeline.publish import RenderInstance
 
 from openpype.hosts.aftereffects.api import get_stub
@@ -25,7 +25,7 @@ class AERenderInstance(RenderInstance):
     file_name = attr.ib(default=None)
 
 
-class CollectAERender(abstract_collect_render.AbstractCollectRender):
+class CollectAERender(publish.AbstractCollectRender):
 
     order = pyblish.api.CollectorOrder + 0.405
     label = "Collect After Effects Render Layers"
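The collector above is one of several plugins in this commit moved off the deprecated `openpype.lib.abstract_collect_render` module; the render abstractions now live under `openpype.pipeline.publish`. The import migration in one view (a summary sketch, combining lines from the hunks):

    # Before (removed by this commit):
    #   from openpype.lib import abstract_collect_render
    #   from openpype.lib.abstract_collect_render import RenderInstance

    # After (used by CollectAERender above and CollectFarmRender below):
    from openpype.pipeline import publish
    from openpype.pipeline.publish import RenderInstance

    class CollectAERender(publish.AbstractCollectRender):
        ...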
@@ -2,7 +2,7 @@ import os
 import flame
 from pprint import pformat
 import openpype.hosts.flame.api as opfapi
+from openpype.lib import StringTemplate
 
 
 class LoadClip(opfapi.ClipLoader):
     """Load a subset to timeline as clip
@@ -22,7 +22,7 @@ class LoadClip(opfapi.ClipLoader):
     # settings
     reel_group_name = "OpenPype_Reels"
     reel_name = "Loaded"
-    clip_name_template = "{asset}_{subset}_{output}"
+    clip_name_template = "{asset}_{subset}<_{output}>"
 
     def load(self, context, name, namespace, options):
 
@@ -36,8 +36,8 @@ class LoadClip(opfapi.ClipLoader):
         version_data = version.get("data", {})
         version_name = version.get("name", None)
         colorspace = version_data.get("colorspace", None)
-        clip_name = self.clip_name_template.format(
-            **context["representation"]["context"])
+        clip_name = StringTemplate(self.clip_name_template).format(
+            context["representation"]["context"])
 
         # TODO: settings in imageio
         # convert colorspace with ocio to flame mapping
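The template gains an optional section, `<_{output}>`, and formatting switches from `str.format` to OpenPype's `StringTemplate`, which can drop such optional parts when a key is missing. A self-contained sketch of that behavior under those assumptions (the `format_optional` helper below is illustrative, not the library implementation):

    import re

    def format_optional(template, data):
        """Format a template where <...> marks parts dropped on missing keys."""
        def _sub(match):
            part = match.group(1)
            try:
                return part.format(**data)
            except KeyError:
                return ""  # optional key missing: drop the whole <...> section
        # resolve optional sections first, then the mandatory keys
        return re.sub(r"<([^<>]*)>", _sub, template).format(**data)

    print(format_optional("{asset}_{subset}<_{output}>",
                          {"asset": "sh010", "subset": "plateMain"}))
    # -> sh010_plateMain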
@@ -2,6 +2,7 @@ import os
 import flame
 from pprint import pformat
 import openpype.hosts.flame.api as opfapi
+from openpype.lib import StringTemplate
 
 
 class LoadClipBatch(opfapi.ClipLoader):
@@ -21,7 +22,7 @@ class LoadClipBatch(opfapi.ClipLoader):
 
     # settings
     reel_name = "OP_LoadedReel"
-    clip_name_template = "{asset}_{subset}_{output}"
+    clip_name_template = "{asset}_{subset}<_{output}>"
 
     def load(self, context, name, namespace, options):
 
@@ -39,8 +40,8 @@ class LoadClipBatch(opfapi.ClipLoader):
         if not context["representation"]["context"].get("output"):
             self.clip_name_template.replace("output", "representation")
 
-        clip_name = self.clip_name_template.format(
-            **context["representation"]["context"])
+        clip_name = StringTemplate(self.clip_name_template).format(
+            context["representation"]["context"])
 
         # TODO: settings in imageio
         # convert colorspace with ocio to flame mapping
@@ -323,6 +323,8 @@ class IntegrateBatchGroup(pyblish.api.InstancePlugin):
     def _get_shot_task_dir_path(self, instance, task_data):
         project_doc = instance.data["projectEntity"]
         asset_entity = instance.data["assetEntity"]
+        anatomy = instance.context.data["anatomy"]
 
         return get_workdir(
-            project_doc, asset_entity, task_data["name"], "flame")
+            project_doc, asset_entity, task_data["name"], "flame", anatomy
+        )
@@ -4,17 +4,16 @@ import logging
 
 import pyblish.api
 
-from openpype import lib
-from openpype.client import get_representation_by_id
 from openpype.lib import register_event_callback
 from openpype.pipeline import (
     legacy_io,
     register_loader_plugin_path,
     register_creator_plugin_path,
     deregister_loader_plugin_path,
     deregister_creator_plugin_path,
     AVALON_CONTAINER_ID,
 )
+from openpype.pipeline.load import get_outdated_containers
+from openpype.pipeline.context_tools import get_current_project_asset
 import openpype.hosts.harmony
 import openpype.hosts.harmony.api as harmony
@@ -50,7 +49,9 @@ def get_asset_settings():
         dict: Scene data.
 
     """
-    asset_data = lib.get_asset()["data"]
+    asset_doc = get_current_project_asset()
+    asset_data = asset_doc["data"]
+
     fps = asset_data.get("fps")
     frame_start = asset_data.get("frameStart")
     frame_end = asset_data.get("frameEnd")
@@ -105,16 +106,7 @@ def check_inventory():
     in Harmony.
     """
 
-    project_name = legacy_io.active_project()
-    outdated_containers = []
-    for container in ls():
-        representation_id = container['representation']
-        representation_doc = get_representation_by_id(
-            project_name, representation_id, fields=["parent"]
-        )
-        if representation_doc and not lib.is_latest(representation_doc):
-            outdated_containers.append(container)
-
+    outdated_containers = get_outdated_containers()
     if not outdated_containers:
         return
 
@@ -5,8 +5,8 @@ from openpype.pipeline import (
     load,
     get_representation_path,
 )
+from openpype.pipeline.context_tools import is_representation_from_latest
 import openpype.hosts.harmony.api as harmony
-import openpype.lib
 
 
 copy_files = """function copyFile(srcFilename, dstFilename)
@@ -280,9 +280,7 @@ class BackgroundLoader(load.LoaderPlugin):
         )
 
     def update(self, container, representation):
-
         path = get_representation_path(representation)
-
         with open(path) as json_file:
             data = json.load(json_file)
@@ -300,10 +298,9 @@ class BackgroundLoader(load.LoaderPlugin):
 
         bg_folder = os.path.dirname(path)
 
-        path = get_representation_path(representation)
-
-        print(container)
+        is_latest = is_representation_from_latest(representation)
 
         for layer in sorted(layers):
             file_to_import = [
                 os.path.join(bg_folder, layer).replace("\\", "/")
@@ -347,7 +344,7 @@ class BackgroundLoader(load.LoaderPlugin):
             }
             %s
         """ % (sig, sig)
-        if openpype.lib.is_latest(representation):
+        if is_latest:
             harmony.send({"function": func, "args": [node, "green"]})
         else:
             harmony.send({"function": func, "args": [node, "red"]})
@@ -10,8 +10,8 @@ from openpype.pipeline import (
     load,
     get_representation_path,
 )
+from openpype.pipeline.context_tools import is_representation_from_latest
 import openpype.hosts.harmony.api as harmony
-import openpype.lib
 
 
 class ImageSequenceLoader(load.LoaderPlugin):
@@ -109,7 +109,7 @@ class ImageSequenceLoader(load.LoaderPlugin):
         )
 
         # Colour node.
-        if openpype.lib.is_latest(representation):
+        if is_representation_from_latest(representation):
             harmony.send(
                 {
                     "function": "PypeHarmony.setColor",
@@ -10,8 +10,8 @@ from openpype.pipeline import (
     load,
     get_representation_path,
 )
+from openpype.pipeline.context_tools import is_representation_from_latest
 import openpype.hosts.harmony.api as harmony
-import openpype.lib
 
 
 class TemplateLoader(load.LoaderPlugin):
@@ -83,7 +83,7 @@ class TemplateLoader(load.LoaderPlugin):
         self_name = self.__class__.__name__
 
         update_and_replace = False
-        if openpype.lib.is_latest(representation):
+        if is_representation_from_latest(representation):
             self._set_green(node)
         else:
             self._set_red(node)
@@ -4,11 +4,10 @@ from pathlib import Path
 
 import attr
 
-import openpype.lib
-import openpype.lib.abstract_collect_render
-from openpype.lib.abstract_collect_render import RenderInstance
 from openpype.lib import get_formatted_current_time
 from openpype.pipeline import legacy_io
+from openpype.pipeline import publish
+from openpype.pipeline.publish import RenderInstance
 import openpype.hosts.harmony.api as harmony
@@ -20,8 +19,7 @@ class HarmonyRenderInstance(RenderInstance):
     leadingZeros = attr.ib(default=3)
 
 
-class CollectFarmRender(openpype.lib.abstract_collect_render.
-                        AbstractCollectRender):
+class CollectFarmRender(publish.AbstractCollectRender):
     """Gather all publishable renders."""
 
     # https://docs.toonboom.com/help/harmony-17/premium/reference/node/output/write-node-image-formats.html
@@ -55,6 +55,10 @@ class ValidateSceneSettings(pyblish.api.InstancePlugin):
 
     def process(self, instance):
         """Plugin entry point."""
+
+        # TODO 'get_asset_settings' could expect asset document as argument
+        # which is available on 'context.data["assetEntity"]'
+        # - the same approach can be used in 'ValidateSceneSettingsRepair'
         expected_settings = harmony.get_asset_settings()
         self.log.info("scene settings from DB:".format(expected_settings))
openpype/hosts/hiero/api/launchforhiero.py (new file): +85
@@ -0,0 +1,85 @@
+import logging
+
+from scriptsmenu import scriptsmenu
+from Qt import QtWidgets
+
+
+log = logging.getLogger(__name__)
+
+
+def _hiero_main_window():
+    """Return Hiero's main window"""
+    for obj in QtWidgets.QApplication.topLevelWidgets():
+        if (obj.inherits('QMainWindow') and
+                obj.metaObject().className() == 'Foundry::UI::DockMainWindow'):
+            return obj
+    raise RuntimeError('Could not find HieroWindow instance')
+
+
+def _hiero_main_menubar():
+    """Retrieve the main menubar of the Hiero window"""
+    hiero_window = _hiero_main_window()
+    menubar = [i for i in hiero_window.children() if isinstance(
+        i,
+        QtWidgets.QMenuBar
+    )]
+
+    assert len(menubar) == 1, "Error, could not find menu bar!"
+    return menubar[0]
+
+
+def find_scripts_menu(title, parent):
+    """
+    Check if the menu exists with the given title in the parent
+
+    Args:
+        title (str): the title name of the scripts menu
+
+        parent (QtWidgets.QMenuBar): the menubar to check
+
+    Returns:
+        QtWidgets.QMenu or None
+
+    """
+    menu = None
+    search = [i for i in parent.children() if
+              isinstance(i, scriptsmenu.ScriptsMenu)
+              and i.title() == title]
+    if search:
+        assert len(search) < 2, ("Multiple instances of menu '{}' "
+                                 "in menu bar".format(title))
+        menu = search[0]
+
+    return menu
+
+
+def main(title="Scripts", parent=None, objectName=None):
+    """Build the main scripts menu in Hiero
+
+    Args:
+        title (str): name of the menu in the application
+
+        parent (QtWidgets.QtMenuBar): the parent object for the menu
+
+        objectName (str): custom objectName for scripts menu
+
+    Returns:
+        scriptsmenu.ScriptsMenu instance
+
+    """
+    hieromainbar = parent or _hiero_main_menubar()
+    try:
+        # check menu already exists
+        menu = find_scripts_menu(title, hieromainbar)
+        if not menu:
+            log.info("Attempting to build menu ...")
+            object_name = objectName or title.lower()
+            menu = scriptsmenu.ScriptsMenu(title=title,
+                                           parent=hieromainbar,
+                                           objectName=object_name)
+    except Exception as e:
+        log.error(e)
+        return
+
+    return menu
@@ -9,6 +9,7 @@ from openpype.pipeline import legacy_io
 from openpype.tools.utils import host_tools
 
 from . import tags
+from openpype.settings import get_project_settings
 
 log = Logger.get_logger(__name__)
 
@@ -41,6 +42,7 @@ def menu_install():
     Installing menu into Hiero
 
     """
+
     from Qt import QtGui
     from . import (
         publish, launch_workfiles_app, reload_config,
@@ -138,3 +140,30 @@ def menu_install():
     exeprimental_action.triggered.connect(
         lambda: host_tools.show_experimental_tools_dialog(parent=main_window)
     )
+
+
+def add_scripts_menu():
+    try:
+        from . import launchforhiero
+    except ImportError:
+        log.warning(
+            "Skipping studio.menu install, because "
+            "'scriptsmenu' module seems unavailable."
+        )
+        return
+
+    # load configuration of custom menu
+    project_settings = get_project_settings(os.getenv("AVALON_PROJECT"))
+    config = project_settings["hiero"]["scriptsmenu"]["definition"]
+    _menu = project_settings["hiero"]["scriptsmenu"]["name"]
+
+    if not config:
+        log.warning("Skipping studio menu, no definition found.")
+        return
+
+    # run the launcher for Hiero menu
+    studio_menu = launchforhiero.main(title=_menu.title())
+
+    # apply configuration
+    studio_menu.build_from_configuration(studio_menu, config)
@@ -48,6 +48,7 @@ def install():
 
     # install menu
     menu.menu_install()
+    menu.add_scripts_menu()
 
     # register hiero events
     events.register_hiero_events()
@@ -10,6 +10,7 @@ import qargparse
 
 import openpype.api as openpype
 from openpype.pipeline import LoaderPlugin, LegacyCreator
+from openpype.pipeline.context_tools import get_current_project_asset
 from . import lib
 
 log = openpype.Logger().get_logger(__name__)
@@ -484,7 +485,7 @@ class ClipLoader:
 
         """
         asset_name = self.context["representation"]["context"]["asset"]
-        asset_doc = openpype.get_asset(asset_name)
+        asset_doc = get_current_project_asset(asset_name)
         log.debug("__ asset_doc: {}".format(pformat(asset_doc)))
         self.data["assetData"] = asset_doc["data"]
 
@@ -5,8 +5,8 @@ from contextlib import contextmanager
 import six
 
 from openpype.client import get_asset_by_name
-from openpype.api import get_asset
 from openpype.pipeline import legacy_io
+from openpype.pipeline.context_tools import get_current_project_asset
 
 
 import hou
@@ -16,7 +16,7 @@ log = logging.getLogger(__name__)
 
 def get_asset_fps():
     """Return current asset fps."""
-    return get_asset()["data"].get("fps")
+    return get_current_project_asset()["data"].get("fps")
 
 
 def set_id(node, unique_id, overwrite=False):
@@ -12,13 +12,13 @@ from openpype.pipeline import (
     register_loader_plugin_path,
     AVALON_CONTAINER_ID,
 )
+from openpype.pipeline.load import any_outdated_containers
 import openpype.hosts.houdini
 from openpype.hosts.houdini.api import lib
 
 from openpype.lib import (
     register_event_callback,
     emit_event,
-    any_outdated,
 )
 
 from .lib import get_asset_fps
@@ -245,7 +245,7 @@ def on_open():
     # ensure it is using correct FPS for the asset
     lib.validate_fps()
 
-    if any_outdated():
+    if any_outdated_containers():
         from openpype.widgets import popup
 
         log.warning("Scene has outdated content.")
@@ -23,7 +23,6 @@ from openpype.client import (
     get_last_versions,
     get_representation_by_name
 )
-from openpype import lib
 from openpype.api import get_anatomy_settings
 from openpype.pipeline import (
     legacy_io,
@@ -33,6 +32,7 @@ from openpype.pipeline import (
     load_container,
     registered_host,
 )
+from openpype.pipeline.context_tools import get_current_project_asset
 from .commands import reset_frame_range
 
 
@@ -2174,7 +2174,7 @@ def reset_scene_resolution():
     project_name = legacy_io.active_project()
     project_doc = get_project(project_name)
     project_data = project_doc["data"]
-    asset_data = lib.get_asset()["data"]
+    asset_data = get_current_project_asset()["data"]
 
     # Set project resolution
     width_key = "resolutionWidth"
@@ -2208,7 +2208,8 @@ def set_context_settings():
     project_name = legacy_io.active_project()
     project_doc = get_project(project_name)
     project_data = project_doc["data"]
-    asset_data = lib.get_asset()["data"]
+    asset_doc = get_current_project_asset(fields=["data.fps"])
+    asset_data = asset_doc.get("data", {})
 
     # Set project fps
     fps = asset_data.get("fps", project_data.get("fps", 25))
@@ -2233,7 +2234,7 @@ def validate_fps():
 
     """
 
-    fps = lib.get_asset()["data"]["fps"]
+    fps = get_current_project_asset(fields=["data.fps"])["data"]["fps"]
     # TODO(antirotor): This is hack as for framerates having multiple
     # decimal places. FTrack is ceiling decimal values on
     # fps to two decimal places but Maya 2019+ is reporting those fps
@@ -2522,12 +2523,30 @@ def load_capture_preset(data=None):
                 temp_options2['multiSampleEnable'] = False
                 temp_options2['multiSampleCount'] = preset[id][key]
 
+            if key == 'renderDepthOfField':
+                temp_options2['renderDepthOfField'] = preset[id][key]
+
             if key == 'ssaoEnable':
                 if preset[id][key] is True:
                     temp_options2['ssaoEnable'] = True
                 else:
                     temp_options2['ssaoEnable'] = False
 
+            if key == 'ssaoSamples':
+                temp_options2['ssaoSamples'] = preset[id][key]
+
+            if key == 'ssaoAmount':
+                temp_options2['ssaoAmount'] = preset[id][key]
+
+            if key == 'ssaoRadius':
+                temp_options2['ssaoRadius'] = preset[id][key]
+
+            if key == 'hwFogDensity':
+                temp_options2['hwFogDensity'] = preset[id][key]
+
+            if key == 'ssaoFilterRadius':
+                temp_options2['ssaoFilterRadius'] = preset[id][key]
+
             if key == 'alphaCut':
                 temp_options2['transparencyAlgorithm'] = 5
                 temp_options2['transparencyQuality'] = 1
@@ -2535,6 +2554,48 @@ def load_capture_preset(data=None):
             if key == 'headsUpDisplay':
                 temp_options['headsUpDisplay'] = True
 
+            if key == 'fogging':
+                temp_options['fogging'] = preset[id][key] or False
+
+            if key == 'hwFogStart':
+                temp_options2['hwFogStart'] = preset[id][key]
+
+            if key == 'hwFogEnd':
+                temp_options2['hwFogEnd'] = preset[id][key]
+
+            if key == 'hwFogAlpha':
+                temp_options2['hwFogAlpha'] = preset[id][key]
+
+            if key == 'hwFogFalloff':
+                temp_options2['hwFogFalloff'] = int(preset[id][key])
+
+            if key == 'hwFogColorR':
+                temp_options2['hwFogColorR'] = preset[id][key]
+
+            if key == 'hwFogColorG':
+                temp_options2['hwFogColorG'] = preset[id][key]
+
+            if key == 'hwFogColorB':
+                temp_options2['hwFogColorB'] = preset[id][key]
+
+            if key == 'motionBlurEnable':
+                if preset[id][key] is True:
+                    temp_options2['motionBlurEnable'] = True
+                else:
+                    temp_options2['motionBlurEnable'] = False
+
+            if key == 'motionBlurSampleCount':
+                temp_options2['motionBlurSampleCount'] = preset[id][key]
+
+            if key == 'motionBlurShutterOpenFraction':
+                temp_options2['motionBlurShutterOpenFraction'] = preset[id][key]
+
+            if key == 'lineAAEnable':
+                if preset[id][key] is True:
+                    temp_options2['lineAAEnable'] = True
+                else:
+                    temp_options2['lineAAEnable'] = False
+
             else:
                 temp_options[str(key)] = preset[id][key]
 
@@ -2544,7 +2605,24 @@ def load_capture_preset(data=None):
             'gpuCacheDisplayFilter',
             'multiSample',
             'ssaoEnable',
-            'textureMaxResolution'
+            'ssaoSamples',
+            'ssaoAmount',
+            'ssaoFilterRadius',
+            'ssaoRadius',
+            'hwFogStart',
+            'hwFogEnd',
+            'hwFogAlpha',
+            'hwFogFalloff',
+            'hwFogColorR',
+            'hwFogColorG',
+            'hwFogColorB',
+            'hwFogDensity',
+            'textureMaxResolution',
+            'motionBlurEnable',
+            'motionBlurSampleCount',
+            'motionBlurShutterOpenFraction',
+            'lineAAEnable',
+            'renderDepthOfField'
         ]:
             temp_options.pop(key, None)
 
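The per-key if-chains above copy selected preset values into the Viewport 2.0 options dict and then pop those keys from the generic options. The same split can be sketched data-driven (helper name and structure below are illustrative, not part of the commit; the key set is taken from the hunk):

    VIEWPORT2_KEYS = {
        "ssaoEnable", "ssaoSamples", "ssaoAmount", "ssaoRadius",
        "ssaoFilterRadius", "hwFogStart", "hwFogEnd", "hwFogAlpha",
        "hwFogFalloff", "hwFogColorR", "hwFogColorG", "hwFogColorB",
        "hwFogDensity", "motionBlurEnable", "motionBlurSampleCount",
        "motionBlurShutterOpenFraction", "lineAAEnable", "renderDepthOfField",
    }

    def split_options(preset_section):
        """Route each preset key to the generic or the Viewport 2.0 bucket."""
        temp_options, temp_options2 = {}, {}
        for key, value in preset_section.items():
            target = temp_options2 if key in VIEWPORT2_KEYS else temp_options
            target[key] = value
        return temp_options, temp_options2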
@@ -2974,8 +3052,9 @@ def update_content_on_context_change():
     This will update scene content to match new asset on context change
     """
     scene_sets = cmds.listSets(allSets=True)
-    new_asset = legacy_io.Session["AVALON_ASSET"]
-    new_data = lib.get_asset()["data"]
+    asset_doc = get_current_project_asset()
+    new_asset = asset_doc["name"]
+    new_data = asset_doc["data"]
     for s in scene_sets:
         try:
             if cmds.getAttr("{}.id".format(s)) == "pyblish.avalon.instance":
@@ -1087,7 +1087,7 @@ class RenderProductsRenderman(ARenderProducts):
             "d_tiff": "tif"
         }
 
-        displays = get_displays()["displays"]
+        displays = get_displays(override_dst="render")["displays"]
         for name, display in displays.items():
             enabled = display["params"]["enable"]["value"]
             if not enabled:
@@ -1106,9 +1106,33 @@ class RenderProductsRenderman(ARenderProducts):
                 display["driverNode"]["type"], "exr")
 
             for camera in cameras:
-                product = RenderProduct(productName=aov_name,
-                                        ext=extensions,
-                                        camera=camera)
+                # Create render product and set it as multipart only on
+                # display types supporting it. In all other cases, Renderman
+                # will create separate output per channel.
+                if display["driverNode"]["type"] in ["d_openexr", "d_deepexr", "d_tiff"]:  # noqa
+                    product = RenderProduct(
+                        productName=aov_name,
+                        ext=extensions,
+                        camera=camera,
+                        multipart=True
+                    )
+                else:
+                    # this code should handle the case where no multipart
+                    # capable format is selected. But since it involves
+                    # shady logic to determine what channel become what
+                    # lets not do that as all productions will use exr anyway.
+                    """
+                    for channel in display['params']['displayChannels']['value']:  # noqa
+                        product = RenderProduct(
+                            productName="{}_{}".format(aov_name, channel),
+                            ext=extensions,
+                            camera=camera,
+                            multipart=False
+                        )
+                    """
+                    raise UnsupportedImageFormatException(
+                        "Only exr, deep exr and tiff formats are supported.")
 
                 products.append(product)
 
         return products
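The branch above keys the multipart flag purely off the display driver type. The rule in isolation (driver names taken from the hunk; the helper is an illustrative sketch):

    MULTIPART_DRIVERS = {"d_openexr", "d_deepexr", "d_tiff"}

    def is_multipart(driver_type):
        """True when the display driver can hold all channels in one file."""
        return driver_type in MULTIPART_DRIVERS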
@@ -1201,3 +1225,7 @@ class UnsupportedRendererException(Exception):
 
     Raised when requesting data from unsupported renderer.
     """
+
+
+class UnsupportedImageFormatException(Exception):
+    """Custom exception to report unsupported output image format."""
@@ -13,7 +13,6 @@ from openpype.host import HostBase, IWorkfileHost, ILoadHost
 import openpype.hosts.maya
 from openpype.tools.utils import host_tools
 from openpype.lib import (
-    any_outdated,
     register_event_callback,
     emit_event
 )
@@ -28,6 +27,7 @@ from openpype.pipeline import (
     deregister_creator_plugin_path,
     AVALON_CONTAINER_ID,
 )
+from openpype.pipeline.load import any_outdated_containers
 from openpype.hosts.maya.lib import copy_workspace_mel
 from . import menu, lib
 from .workio import (
@@ -470,7 +470,7 @@ def on_open():
     lib.validate_fps()
     lib.fix_incompatible_containers()
 
-    if any_outdated():
+    if any_outdated_containers():
         log.warning("Scene has outdated content.")
 
         # Find maya main window
@@ -15,13 +15,13 @@ from openpype.hosts.maya.api import (
 from openpype.lib import requests_get
 from openpype.api import (
     get_system_settings,
-    get_project_settings,
-    get_asset)
+    get_project_settings)
 from openpype.modules import ModulesManager
 from openpype.pipeline import (
     CreatorError,
     legacy_io,
 )
+from openpype.pipeline.context_tools import get_current_project_asset
 
 
 class CreateRender(plugin.Creator):
@@ -413,7 +413,7 @@ class CreateRender(plugin.Creator):
             prefix,
             type="string")
 
-        asset = get_asset()
+        asset = get_current_project_asset()
 
         if renderer == "arnold":
             # set format to exr
@@ -2,8 +2,8 @@ import maya.cmds as cmds
 
 import pyblish.api
 import openpype.api
-from openpype import lib
 import openpype.hosts.maya.api.lib as mayalib
+from openpype.pipeline.context_tools import get_current_project_asset
 from math import ceil
 
 
@@ -41,7 +41,9 @@ class ValidateMayaUnits(pyblish.api.ContextPlugin):
         # now flooring the value?
         fps = float_round(context.data.get('fps'), 2, ceil)
 
-        asset_fps = lib.get_asset()["data"]["fps"]
+        # TODO replace query with using 'context.data["assetEntity"]'
+        asset_doc = get_current_project_asset()
+        asset_fps = asset_doc["data"]["fps"]
 
         self.log.info('Units (linear): {0}'.format(linearunits))
         self.log.info('Units (angular): {0}'.format(angularunits))
@@ -91,5 +93,7 @@ class ValidateMayaUnits(pyblish.api.ContextPlugin):
         cls.log.debug(current_linear)
 
         cls.log.info("Setting time unit to match project")
-        asset_fps = lib.get_asset()["data"]["fps"]
+        # TODO replace query with using 'context.data["assetEntity"]'
+        asset_doc = get_current_project_asset()
+        asset_fps = asset_doc["data"]["fps"]
         mayalib.set_scene_fps(asset_fps)
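The validator compares Maya's fps to the asset fps only after ceiling-rounding to two decimals, because ftrack stores 23.976... as 23.98 (see the TODO note in the earlier hunk). A quick check of that rounding; the `float_round` stand-in below is an assumption mirroring the call `float_round(fps, 2, ceil)` from the plugin:

    from math import ceil

    def float_round(value, ndigits, rounding=round):
        # Stand-in for openpype.lib's float_round (assumed behavior).
        factor = 10 ** ndigits
        return rounding(value * factor) / float(factor)

    # Maya reports NTSC film rate with full precision; ftrack stores 23.98:
    print(float_round(23.976023976, 2, ceil))  # -> 23.98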
@@ -6,7 +6,7 @@ from openpype.pipeline import PublishXmlValidationError
 
 
 class ValidateReviewSubsetUniqueness(pyblish.api.ContextPlugin):
-    """Validates that nodes has common root."""
+    """Validates that review subset has unique name."""
 
     order = openpype.api.ValidateContentsOrder
     hosts = ["maya"]
@@ -17,7 +17,7 @@ class ValidateReviewSubsetUniqueness(pyblish.api.ContextPlugin):
         subset_names = []
 
         for instance in context:
-            self.log.info("instance:: {}".format(instance.data))
+            self.log.debug("Instance: {}".format(instance.data))
             if instance.data.get('publish'):
                 subset_names.append(instance.data.get('subset'))
 
@@ -4,8 +4,7 @@ import openpype.api
 
 
 class ValidateSetdressRoot(pyblish.api.InstancePlugin):
-    """
-    """
+    """Validate if set dress top root node is published."""
 
     order = openpype.api.ValidateContentsOrder
     label = "SetDress Root"
@@ -10,6 +10,7 @@ from collections import OrderedDict
 import clique
 
 import nuke
+from Qt import QtCore, QtWidgets
 
 from openpype.client import (
     get_project,
@@ -23,10 +24,10 @@ from openpype.api import (
     BuildWorkfile,
     get_version_from_path,
     get_workdir_data,
-    get_asset,
     get_current_project_settings,
 )
+from openpype.tools.utils import host_tools
 from openpype.lib import env_value_to_bool
 from openpype.lib.path_tools import HostDirmap
 from openpype.settings import (
     get_project_settings,
@@ -38,6 +39,7 @@ from openpype.pipeline import (
     legacy_io,
     Anatomy,
 )
+from openpype.pipeline.context_tools import get_current_project_asset
 
 from . import gizmo_menu
 
@@ -63,7 +65,10 @@ class Context:
     main_window = None
     context_label = None
     project_name = os.getenv("AVALON_PROJECT")
+    # Workfile related code
+    workfiles_launched = False
+    workfiles_tool_timer = None
 
     # Seems unused
     _project_doc = None
@@ -1761,7 +1766,7 @@ class WorkfileSettings(object):
             kwargs.get("asset_name")
             or legacy_io.Session["AVALON_ASSET"]
         )
-        self._asset_entity = get_asset(self._asset)
+        self._asset_entity = get_current_project_asset(self._asset)
         self._root_node = root_node or nuke.root()
         self._nodes = self.get_nodes(nodes=nodes)
 
@@ -2384,12 +2389,19 @@ def select_nodes(nodes):
 
 
 def launch_workfiles_app():
-    '''Function letting start workfiles after start of host
-    '''
-    from openpype.lib import (
-        env_value_to_bool
-    )
-    from .pipeline import get_main_window
+    """Show workfiles tool on nuke launch.
+
+    Trigger to show workfiles tool on application launch. Can be executed
+    only once; all other calls are ignored.
+
+    Workfiles tool show is deferred after application initialization using
+    QTimer.
+    """
+
+    if Context.workfiles_launched:
+        return
+
+    Context.workfiles_launched = True
 
     # get all important settings
     open_at_start = env_value_to_bool(
@@ -2400,10 +2412,40 @@ def launch_workfiles_app():
     if not open_at_start:
         return
 
-    if not Context.workfiles_launched:
-        Context.workfiles_launched = True
-        main_window = get_main_window()
-        host_tools.show_workfiles(parent=main_window)
+    # Show workfiles tool using timer
+    # - this will probably be triggered during initialization; in that case
+    #   the application is not able to show uis so it must be
+    #   deferred using timer
+    # - timer should be processed when initialization ends
+    #   (when application starts to process events)
+    timer = QtCore.QTimer()
+    timer.timeout.connect(_launch_workfile_app)
+    timer.setInterval(100)
+    Context.workfiles_tool_timer = timer
+    timer.start()
+
+
+def _launch_workfile_app():
+    # Safeguard to not show window when application is still starting up
+    # or is already closing down.
+    closing_down = QtWidgets.QApplication.closingDown()
+    starting_up = QtWidgets.QApplication.startingUp()
+
+    # Stop the timer if application finished start up or is closing down
+    if closing_down or not starting_up:
+        Context.workfiles_tool_timer.stop()
+        Context.workfiles_tool_timer = None
+
+    # Skip if application is starting up or closing down
+    if starting_up or closing_down:
+        return
+
+    # Make sure on top is enabled on first show so the window is not hidden
+    # under main nuke window
+    # - this happened on Centos 7 and it is because the focus of nuke
+    #   changes to the main window after showing because of initialization
+    #   which moves workfiles tool under it
+    host_tools.show_workfiles(parent=None, on_top=True)
 
 
 def process_workfile_builder():
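The same deferred-show trick, reduced to its core: poll with a QTimer until the Qt application has finished starting up, then show the tool once. A standalone sketch (function names here are illustrative, not from the commit):

    from Qt import QtCore, QtWidgets  # Qt.py wrapper, as used in the diff

    def defer_show(show_callback, interval_ms=100):
        """Poll until the host app finished starting up, then show a tool."""
        timer = QtCore.QTimer()

        def _on_timeout():
            starting_up = QtWidgets.QApplication.startingUp()
            closing_down = QtWidgets.QApplication.closingDown()
            if closing_down or not starting_up:
                timer.stop()   # startup finished (or the app is quitting)
            if starting_up or closing_down:
                return         # not safe to show a window yet
            show_callback()

        timer.timeout.connect(_on_timeout)
        timer.start(interval_ms)
        return timer           # keep a reference so the timer survives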
@@ -120,8 +120,9 @@ def install():
     nuke.addOnCreate(workfile_settings.set_context_settings, nodeClass="Root")
     nuke.addOnCreate(workfile_settings.set_favorites, nodeClass="Root")
     nuke.addOnCreate(process_workfile_builder, nodeClass="Root")
+    nuke.addOnCreate(launch_workfiles_app, nodeClass="Root")
 
     _install_menu()
-    launch_workfiles_app()
 
 
 def uninstall():
@@ -141,6 +142,14 @@ def uninstall():
     _uninstall_menu()
 
 
+def _show_workfiles():
+    # Make sure parent is not set
+    # - this makes the Workfiles tool a separate window which
+    #   avoids issues with reopening
+    # - it is possible to explicitly change on top flag of the tool
+    host_tools.show_workfiles(parent=None, on_top=False)
+
+
 def _install_menu():
     # uninstall original avalon menu
     main_window = get_main_window()
@@ -157,7 +166,7 @@ def _install_menu():
     menu.addSeparator()
     menu.addCommand(
         "Work Files...",
-        lambda: host_tools.show_workfiles(parent=main_window)
+        _show_workfiles
     )
 
     menu.addSeparator()
@@ -54,20 +54,28 @@ class LoadClip(plugin.NukeLoader):
     script_start = int(nuke.root()["first_frame"].value())
 
     # option gui
-    defaults = {
-        "start_at_workfile": True
-    }
-
-    options = [
-        qargparse.Boolean(
-            "start_at_workfile",
-            help="Load at workfile start frame",
-            default=True
-        )
-    ]
+    options_defaults = {
+        "start_at_workfile": True,
+        "add_retime": True
+    }
 
     node_name_template = "{class_name}_{ext}"
 
     @classmethod
+    def get_options(cls, *args):
+        return [
+            qargparse.Boolean(
+                "start_at_workfile",
+                help="Load at workfile start frame",
+                default=cls.options_defaults["start_at_workfile"]
+            ),
+            qargparse.Boolean(
+                "add_retime",
+                help="Load with retime",
+                default=cls.options_defaults["add_retime"]
+            )
+        ]
+
+    @classmethod
     def get_representations(cls):
         return (
@@ -86,7 +94,10 @@ class LoadClip(plugin.NukeLoader):
         file = self.fname.replace("\\", "/")
 
         start_at_workfile = options.get(
-            "start_at_workfile", self.defaults["start_at_workfile"])
+            "start_at_workfile", self.options_defaults["start_at_workfile"])
+
+        add_retime = options.get(
+            "add_retime", self.options_defaults["add_retime"])
 
         version = context['version']
         version_data = version.get("data", {})
@@ -151,7 +162,7 @@ class LoadClip(plugin.NukeLoader):
         data_imprint = {}
         for k in add_keys:
             if k == 'version':
-                data_imprint.update({k: context["version"]['name']})
+                data_imprint[k] = context["version"]['name']
             elif k == 'colorspace':
                 colorspace = repre["data"].get(k)
                 colorspace = colorspace or version_data.get(k)
@@ -159,10 +170,13 @@ class LoadClip(plugin.NukeLoader):
                 if used_colorspace:
                     data_imprint["used_colorspace"] = used_colorspace
             else:
-                data_imprint.update(
-                    {k: context["version"]['data'].get(k, str(None))})
+                data_imprint[k] = context["version"]['data'].get(
+                    k, str(None))
 
-        data_imprint.update({"objectName": read_name})
+        data_imprint["objectName"] = read_name
+
+        if add_retime and version_data.get("retime", None):
+            data_imprint["addRetime"] = True
 
         read_node["tile_color"].setValue(int("0x4ecd25ff", 16))
 
@ -174,7 +188,7 @@ class LoadClip(plugin.NukeLoader):
|
|||
loader=self.__class__.__name__,
|
||||
data=data_imprint)
|
||||
|
||||
if version_data.get("retime", None):
|
||||
if add_retime and version_data.get("retime", None):
|
||||
self._make_retimes(read_node, version_data)
|
||||
|
||||
self.set_as_member(read_node)
|
||||
|
|
@ -198,7 +212,12 @@ class LoadClip(plugin.NukeLoader):
|
|||
read_node = nuke.toNode(container['objectName'])
|
||||
file = get_representation_path(representation).replace("\\", "/")
|
||||
|
||||
start_at_workfile = bool("start at" in read_node['frame_mode'].value())
|
||||
start_at_workfile = "start at" in read_node['frame_mode'].value()
|
||||
|
||||
add_retime = [
|
||||
key for key in read_node.knobs().keys()
|
||||
if "addRetime" in key
|
||||
]
|
||||
|
||||
project_name = legacy_io.active_project()
|
||||
version_doc = get_version_by_id(project_name, representation["parent"])
|
||||
|
|
@ -286,7 +305,7 @@ class LoadClip(plugin.NukeLoader):
|
|||
"updated to version: {}".format(version_doc.get("name"))
|
||||
)
|
||||
|
||||
if version_data.get("retime", None):
|
||||
if add_retime and version_data.get("retime", None):
|
||||
self._make_retimes(read_node, version_data)
|
||||
else:
|
||||
self.clear_members(read_node)
|
||||
|
|
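Whether retiming was requested at load time is recovered on update from the imprinted `addRetime` knob rather than any in-memory state. A sketch of the round trip, assuming Nuke's knob API as used in the hunks above:

    # load(): imprinting stores the flag as a knob on the Read node
    data_imprint["addRetime"] = True

    # update(): any knob whose name contains 'addRetime' re-enables retimes
    add_retime = [
        key for key in read_node.knobs().keys()
        if "addRetime" in key
    ]
    if add_retime and version_data.get("retime", None):
        self._make_retimes(read_node, version_data)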
|
|||
|
|
@ -152,6 +152,7 @@ class ExtractSlateFrame(openpype.api.Extractor):
|
|||
self.log.debug("__ first_frame: {}".format(first_frame))
|
||||
self.log.debug("__ slate_first_frame: {}".format(slate_first_frame))
|
||||
|
||||
above_slate_node = slate_node.dependencies().pop()
|
||||
# fallback if files does not exists
|
||||
if self._check_frames_exists(instance):
|
||||
# Read node
|
||||
|
|
@ -164,8 +165,16 @@ class ExtractSlateFrame(openpype.api.Extractor):
|
|||
r_node["colorspace"].setValue(instance.data["colorspace"])
|
||||
previous_node = r_node
|
||||
temporary_nodes = [previous_node]
|
||||
|
||||
# adding copy metadata node for correct frame metadata
|
||||
cm_node = nuke.createNode("CopyMetaData")
|
||||
cm_node.setInput(0, previous_node)
|
||||
cm_node.setInput(1, above_slate_node)
|
||||
previous_node = cm_node
|
||||
temporary_nodes.append(cm_node)
|
||||
|
||||
else:
|
||||
previous_node = slate_node.dependencies().pop()
|
||||
previous_node = above_slate_node
|
||||
temporary_nodes = []
|
||||
|
||||
# only create colorspace baking if toggled on
|
||||
|
|
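The slate extractor now splices a CopyMetaData node between the fallback Read node and the rest of the chain, so baked slate frames carry the metadata of the node above the slate. A sketch of the node graph being built, assuming the Nuke calls used above:

    # Read (pixels) --0--> CopyMetaData <--1-- node above the slate (metadata)
    cm_node = nuke.createNode("CopyMetaData")
    cm_node.setInput(0, r_node)            # pixels come from the Read node
    cm_node.setInput(1, above_slate_node)  # metadata copied from upstream
    previous_node = cm_node
    temporary_nodes.append(cm_node)        # cleaned up after extraction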
|
|||
|
|
@ -1,7 +1,6 @@
|
|||
import pyblish.api
|
||||
|
||||
from openpype.client import get_project, get_asset_by_id
|
||||
from openpype import lib
|
||||
from openpype.client import get_project, get_asset_by_id, get_asset_by_name
|
||||
from openpype.pipeline import legacy_io
|
||||
|
||||
|
||||
|
|
@ -17,10 +16,11 @@ class ValidateScript(pyblish.api.InstancePlugin):
|
|||
|
||||
def process(self, instance):
|
||||
ctx_data = instance.context.data
|
||||
asset_name = ctx_data["asset"]
|
||||
asset = lib.get_asset(asset_name)
|
||||
asset_data = asset["data"]
|
||||
project_name = legacy_io.active_project()
|
||||
asset_name = ctx_data["asset"]
|
||||
# TODO replace query with 'instance.data["assetEntity"]'
|
||||
asset = get_asset_by_name(project_name, asset_name)
|
||||
asset_data = asset["data"]
|
||||
|
||||
# These attributes will be checked
|
||||
attributes = [
|
||||
|
|
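The validator drops the deprecated `openpype.lib.get_asset` helper for an explicit, project-scoped query from `openpype.client`. A sketch of the new call shape, assuming the imports in the hunk above ('sh010' is an illustrative asset name):

    from openpype.client import get_asset_by_name
    from openpype.pipeline import legacy_io

    project_name = legacy_io.active_project()
    asset = get_asset_by_name(project_name, "sh010")
    asset_data = asset["data"]  # frameStart, fps, handles, ...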
|
|||
|
|
@ -1,6 +1,5 @@
|
|||
import os
|
||||
from Qt import QtWidgets
|
||||
from bson.objectid import ObjectId
|
||||
|
||||
import pyblish.api
|
||||
|
||||
|
|
@ -13,8 +12,8 @@ from openpype.pipeline import (
|
|||
deregister_loader_plugin_path,
|
||||
deregister_creator_plugin_path,
|
||||
AVALON_CONTAINER_ID,
|
||||
registered_host,
|
||||
)
|
||||
from openpype.pipeline.load import any_outdated_containers
|
||||
import openpype.hosts.photoshop
|
||||
|
||||
from . import lib
|
||||
|
|
@ -30,7 +29,7 @@ INVENTORY_PATH = os.path.join(PLUGINS_DIR, "inventory")
|
|||
|
||||
|
||||
def check_inventory():
|
||||
if not lib.any_outdated():
|
||||
if not any_outdated_containers():
|
||||
return
|
||||
|
||||
# Warn about outdated containers.
|
||||
|
|
|
|||
|
|
@ -319,14 +319,13 @@ def get_current_timeline_items(
|
|||
selected_track_count = timeline.GetTrackCount(track_type)
|
||||
|
||||
# loop all tracks and get items
|
||||
_clips = dict()
|
||||
_clips = {}
|
||||
for track_index in range(1, (int(selected_track_count) + 1)):
|
||||
_track_name = timeline.GetTrackName(track_type, track_index)
|
||||
|
||||
# filter out all unmatched track names
|
||||
if track_name:
|
||||
if _track_name not in track_name:
|
||||
continue
|
||||
if track_name and _track_name not in track_name:
|
||||
continue
|
||||
|
||||
timeline_items = timeline.GetItemListInTrack(
|
||||
track_type, track_index)
|
||||
|
|
@ -348,12 +347,8 @@ def get_current_timeline_items(
|
|||
"index": clip_index
|
||||
}
|
||||
ti_color = ti.GetClipColor()
|
||||
if filter is True:
|
||||
if selecting_color in ti_color:
|
||||
selected_clips.append(data)
|
||||
else:
|
||||
if filter and selecting_color in ti_color or not filter:
|
||||
selected_clips.append(data)
|
||||
|
||||
return selected_clips
|
||||
|
||||
|
||||
|
|
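The collapsed condition `if filter and selecting_color in ti_color or not filter` works because `and` binds tighter than `or`. An equivalent, fully parenthesized sketch of the same check:

    # keep the clip when filtering is off, or when its color matches
    if (filter and selecting_color in ti_color) or (not filter):
        selected_clips.append(data)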
|
|||
|
|
@ -4,11 +4,11 @@ import uuid
|
|||
import qargparse
|
||||
from Qt import QtWidgets, QtCore
|
||||
|
||||
import openpype.api as pype
|
||||
from openpype.pipeline import (
|
||||
LegacyCreator,
|
||||
LoaderPlugin,
|
||||
)
|
||||
from openpype.pipeline.context_tools import get_current_project_asset
|
||||
from openpype.hosts import resolve
|
||||
from . import lib
|
||||
|
||||
|
|
@ -375,7 +375,7 @@ class ClipLoader:
|
|||
|
||||
"""
|
||||
asset_name = self.context["representation"]["context"]["asset"]
|
||||
self.data["assetData"] = pype.get_asset(asset_name)["data"]
|
||||
self.data["assetData"] = get_current_project_asset(asset_name)["data"]
|
||||
|
||||
def load(self):
|
||||
# create project bin for the media to be imported into
|
||||
|
|
@ -506,7 +506,7 @@ class Creator(LegacyCreator):
|
|||
super(Creator, self).__init__(*args, **kwargs)
|
||||
from openpype.api import get_current_project_settings
|
||||
resolve_p_settings = get_current_project_settings().get("resolve")
|
||||
self.presets = dict()
|
||||
self.presets = {}
|
||||
if resolve_p_settings:
|
||||
self.presets = resolve_p_settings["create"].get(
|
||||
self.__class__.__name__, {})
|
||||
|
|
|
|||
|
|
@ -116,12 +116,13 @@ class CreateShotClip(resolve.Creator):
|
|||
"order": 0},
|
||||
"vSyncTrack": {
|
||||
"value": gui_tracks, # noqa
|
||||
"type": "QComboBox",
|
||||
"label": "Hero track",
|
||||
"target": "ui",
|
||||
"toolTip": "Select driving track name which should be mastering all others", # noqa
|
||||
"order": 1}
|
||||
"type": "QComboBox",
|
||||
"label": "Hero track",
|
||||
"target": "ui",
|
||||
"toolTip": "Select driving track name which should be mastering all others", # noqa
|
||||
"order": 1
|
||||
}
|
||||
}
|
||||
},
|
||||
"publishSettings": {
|
||||
"type": "section",
|
||||
|
|
@ -172,28 +173,31 @@ class CreateShotClip(resolve.Creator):
|
|||
"target": "ui",
|
||||
"order": 4,
|
||||
"value": {
|
||||
"workfileFrameStart": {
|
||||
"value": 1001,
|
||||
"type": "QSpinBox",
|
||||
"label": "Workfiles Start Frame",
|
||||
"target": "tag",
|
||||
"toolTip": "Set workfile starting frame number", # noqa
|
||||
"order": 0},
|
||||
"handleStart": {
|
||||
"value": 0,
|
||||
"type": "QSpinBox",
|
||||
"label": "Handle start (head)",
|
||||
"target": "tag",
|
||||
"toolTip": "Handle at start of clip", # noqa
|
||||
"order": 1},
|
||||
"handleEnd": {
|
||||
"value": 0,
|
||||
"type": "QSpinBox",
|
||||
"label": "Handle end (tail)",
|
||||
"target": "tag",
|
||||
"toolTip": "Handle at end of clip", # noqa
|
||||
"order": 2},
|
||||
}
|
||||
"workfileFrameStart": {
|
||||
"value": 1001,
|
||||
"type": "QSpinBox",
|
||||
"label": "Workfiles Start Frame",
|
||||
"target": "tag",
|
||||
"toolTip": "Set workfile starting frame number", # noqa
|
||||
"order": 0
|
||||
},
|
||||
"handleStart": {
|
||||
"value": 0,
|
||||
"type": "QSpinBox",
|
||||
"label": "Handle start (head)",
|
||||
"target": "tag",
|
||||
"toolTip": "Handle at start of clip", # noqa
|
||||
"order": 1
|
||||
},
|
||||
"handleEnd": {
|
||||
"value": 0,
|
||||
"type": "QSpinBox",
|
||||
"label": "Handle end (tail)",
|
||||
"target": "tag",
|
||||
"toolTip": "Handle at end of clip", # noqa
|
||||
"order": 2
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
|
|
@ -229,8 +233,10 @@ class CreateShotClip(resolve.Creator):
|
|||
v_sync_track = widget.result["vSyncTrack"]["value"]
|
||||
|
||||
# sort selected trackItems by
|
||||
sorted_selected_track_items = list()
|
||||
unsorted_selected_track_items = list()
|
||||
sorted_selected_track_items = []
|
||||
unsorted_selected_track_items = []
|
||||
print("_____ selected ______")
|
||||
print(self.selected)
|
||||
for track_item_data in self.selected:
|
||||
if track_item_data["track"]["name"] in v_sync_track:
|
||||
sorted_selected_track_items.append(track_item_data)
|
||||
|
|
@ -253,10 +259,10 @@ class CreateShotClip(resolve.Creator):
|
|||
"sq_frame_start": sq_frame_start,
|
||||
"sq_markers": sq_markers
|
||||
}
|
||||
|
||||
print(kwargs)
|
||||
for i, track_item_data in enumerate(sorted_selected_track_items):
|
||||
self.rename_index = i
|
||||
|
||||
self.log.info(track_item_data)
|
||||
# convert track item to timeline media pool item
|
||||
track_item = resolve.PublishClip(
|
||||
self, track_item_data, **kwargs).convert()
|
||||
|
|
|
|||
|
|
@ -23,7 +23,7 @@ class LoadClip(resolve.TimelineItemLoader):
|
|||
"""
|
||||
|
||||
families = ["render2d", "source", "plate", "render", "review"]
|
||||
representations = ["exr", "dpx", "jpg", "jpeg", "png", "h264", ".mov"]
|
||||
representations = ["exr", "dpx", "jpg", "jpeg", "png", "h264", "mov"]
|
||||
|
||||
label = "Load as clip"
|
||||
order = -10
|
||||
|
|
|
|||
|
|
@ -30,7 +30,8 @@ class PrecollectWorkfile(pyblish.api.ContextPlugin):
|
|||
"asset": asset,
|
||||
"subset": "{}{}".format(asset, subset.capitalize()),
|
||||
"item": project,
|
||||
"family": "workfile"
|
||||
"family": "workfile",
|
||||
"families": []
|
||||
}
|
||||
|
||||
# create instance with workfile
|
||||
|
|
|
|||
|
|
@ -19,6 +19,7 @@ import os
|
|||
import opentimelineio as otio
|
||||
import pyblish.api
|
||||
from openpype import lib as plib
|
||||
from openpype.pipeline.context_tools import get_current_project_asset
|
||||
|
||||
|
||||
class OTIO_View(pyblish.api.Action):
|
||||
|
|
@ -116,7 +117,7 @@ class CollectEditorial(pyblish.api.InstancePlugin):
|
|||
if extension == ".edl":
|
||||
# EDL has no frame rate embedded so needs explicit
|
||||
# frame rate, else 24 is assumed.
|
||||
kwargs["rate"] = plib.get_asset()["data"]["fps"]
|
||||
kwargs["rate"] = get_current_project_asset()["data"]["fps"]
|
||||
|
||||
instance.data["otio_timeline"] = otio.adapters.read_from_file(
|
||||
file_path, **kwargs)
|
||||
|
|
|
|||
|
|
@ -1,8 +1,12 @@
|
|||
import os
|
||||
from copy import deepcopy
|
||||
|
||||
import opentimelineio as otio
|
||||
import pyblish.api
|
||||
|
||||
from openpype import lib as plib
|
||||
from copy import deepcopy
|
||||
from openpype.pipeline.context_tools import get_current_project_asset
|
||||
|
||||
|
||||
class CollectInstances(pyblish.api.InstancePlugin):
|
||||
"""Collect instances from editorial's OTIO sequence"""
|
||||
|
|
@ -48,7 +52,7 @@ class CollectInstances(pyblish.api.InstancePlugin):
|
|||
|
||||
# get timeline otio data
|
||||
timeline = instance.data["otio_timeline"]
|
||||
fps = plib.get_asset()["data"]["fps"]
|
||||
fps = get_current_project_asset()["data"]["fps"]
|
||||
|
||||
tracks = timeline.each_child(
|
||||
descended_from_type=otio.schema.Track
|
||||
|
|
|
|||
|
|
@ -3,8 +3,8 @@ import re
|
|||
import pyblish.api
|
||||
|
||||
import openpype.api
|
||||
from openpype import lib
|
||||
from openpype.pipeline import PublishXmlValidationError
|
||||
from openpype.pipeline.context_tools import get_current_project_asset
|
||||
|
||||
|
||||
class ValidateFrameRange(pyblish.api.InstancePlugin):
|
||||
|
|
@ -27,7 +27,8 @@ class ValidateFrameRange(pyblish.api.InstancePlugin):
|
|||
for pattern in self.skip_timelines_check):
|
||||
self.log.info("Skipping for {} task".format(instance.data["task"]))
|
||||
|
||||
asset_data = lib.get_asset(instance.data["asset"])["data"]
|
||||
# TODO replace query with 'instance.data["assetEntity"]'
|
||||
asset_data = get_current_project_asset(instance.data["asset"])["data"]
|
||||
frame_start = asset_data["frameStart"]
|
||||
frame_end = asset_data["frameEnd"]
|
||||
handle_start = asset_data["handleStart"]
|
||||
|
|
|
|||
|
|
@ -1,20 +1,8 @@
|
|||
from .pipeline import (
|
||||
install,
|
||||
ls,
|
||||
|
||||
set_project_name,
|
||||
get_context_title,
|
||||
get_context_data,
|
||||
update_context_data,
|
||||
TrayPublisherHost,
|
||||
)
|
||||
|
||||
|
||||
__all__ = (
|
||||
"install",
|
||||
"ls",
|
||||
|
||||
"set_project_name",
|
||||
"get_context_title",
|
||||
"get_context_data",
|
||||
"update_context_data",
|
||||
"TrayPublisherHost",
|
||||
)
|
||||
|
|
|
|||
|
|
@ -9,6 +9,8 @@ from openpype.pipeline import (
|
|||
register_creator_plugin_path,
|
||||
legacy_io,
|
||||
)
|
||||
from openpype.host import HostBase, INewPublisher
|
||||
|
||||
|
||||
ROOT_DIR = os.path.dirname(os.path.dirname(
|
||||
os.path.abspath(__file__)
|
||||
|
|
@ -17,6 +19,35 @@ PUBLISH_PATH = os.path.join(ROOT_DIR, "plugins", "publish")
|
|||
CREATE_PATH = os.path.join(ROOT_DIR, "plugins", "create")
|
||||
|
||||
|
||||
class TrayPublisherHost(HostBase, INewPublisher):
|
||||
name = "traypublisher"
|
||||
|
||||
def install(self):
|
||||
os.environ["AVALON_APP"] = self.name
|
||||
legacy_io.Session["AVALON_APP"] = self.name
|
||||
|
||||
pyblish.api.register_host("traypublisher")
|
||||
pyblish.api.register_plugin_path(PUBLISH_PATH)
|
||||
register_creator_plugin_path(CREATE_PATH)
|
||||
|
||||
def get_context_title(self):
|
||||
return HostContext.get_project_name()
|
||||
|
||||
def get_context_data(self):
|
||||
return HostContext.get_context_data()
|
||||
|
||||
def update_context_data(self, data, changes):
|
||||
HostContext.save_context_data(data, changes)
|
||||
|
||||
def set_project_name(self, project_name):
|
||||
# TODO Deregister project specific plugins and register new project
|
||||
# plugins
|
||||
os.environ["AVALON_PROJECT"] = project_name
|
||||
legacy_io.Session["AVALON_PROJECT"] = project_name
|
||||
legacy_io.install()
|
||||
HostContext.set_project_name(project_name)
|
||||
|
||||
|
||||
class HostContext:
|
||||
_context_json_path = None
|
||||
|
||||
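The module-level `install`/`ls`/`set_project_name` functions below are folded into this `TrayPublisherHost` class. A condensed sketch of the same host pattern for a hypothetical host, assuming the `openpype.host` interfaces imported above:

    import pyblish.api
    from openpype.host import HostBase, INewPublisher

    class MyHost(HostBase, INewPublisher):  # hypothetical host name
        name = "myhost"

        def install(self):
            pyblish.api.register_host(self.name)

        def get_context_title(self):
            return "My project"

        def get_context_data(self):
            return {}

        def update_context_data(self, data, changes):
            pass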
|
|
@ -150,32 +181,3 @@ def get_context_data():
|
|||
|
||||
def update_context_data(data, changes):
|
||||
HostContext.save_context_data(data)
|
||||
|
||||
|
||||
def get_context_title():
|
||||
return HostContext.get_project_name()
|
||||
|
||||
|
||||
def ls():
|
||||
"""Probably will never return loaded containers."""
|
||||
return []
|
||||
|
||||
|
||||
def install():
|
||||
"""This is called before a project is known.
|
||||
|
||||
Project is defined with 'set_project_name'.
|
||||
"""
|
||||
os.environ["AVALON_APP"] = "traypublisher"
|
||||
|
||||
pyblish.api.register_host("traypublisher")
|
||||
pyblish.api.register_plugin_path(PUBLISH_PATH)
|
||||
register_creator_plugin_path(CREATE_PATH)
|
||||
|
||||
|
||||
def set_project_name(project_name):
|
||||
# TODO Deregister project specific plugins and register new project plugins
|
||||
os.environ["AVALON_PROJECT"] = project_name
|
||||
legacy_io.Session["AVALON_PROJECT"] = project_name
|
||||
legacy_io.install()
|
||||
HostContext.set_project_name(project_name)
|
||||
|
|
|
|||
|
|
@ -1,8 +1,8 @@
|
|||
from openpype.lib.attribute_definitions import FileDef
|
||||
from openpype.pipeline import (
|
||||
Creator,
|
||||
CreatedInstance
|
||||
)
|
||||
from openpype.lib import FileDef
|
||||
|
||||
from .pipeline import (
|
||||
list_instances,
|
||||
|
|
@ -12,6 +12,29 @@ from .pipeline import (
|
|||
)
|
||||
|
||||
|
||||
IMAGE_EXTENSIONS = [
|
||||
".ani", ".anim", ".apng", ".art", ".bmp", ".bpg", ".bsave", ".cal",
|
||||
".cin", ".cpc", ".cpt", ".dds", ".dpx", ".ecw", ".exr", ".fits",
|
||||
".flic", ".flif", ".fpx", ".gif", ".hdri", ".hevc", ".icer",
|
||||
".icns", ".ico", ".cur", ".ics", ".ilbm", ".jbig", ".jbig2",
|
||||
".jng", ".jpeg", ".jpeg-ls", ".jpeg", ".2000", ".jpg", ".xr",
|
||||
".jpeg", ".xt", ".jpeg-hdr", ".kra", ".mng", ".miff", ".nrrd",
|
||||
".ora", ".pam", ".pbm", ".pgm", ".ppm", ".pnm", ".pcx", ".pgf",
|
||||
".pictor", ".png", ".psb", ".psp", ".qtvr", ".ras",
|
||||
".rgbe", ".logluv", ".tiff", ".sgi", ".tga", ".tiff", ".tiff/ep",
|
||||
".tiff/it", ".ufo", ".ufp", ".wbmp", ".webp", ".xbm", ".xcf",
|
||||
".xpm", ".xwd"
|
||||
]
|
||||
VIDEO_EXTENSIONS = [
|
||||
".3g2", ".3gp", ".amv", ".asf", ".avi", ".drc", ".f4a", ".f4b",
|
||||
".f4p", ".f4v", ".flv", ".gif", ".gifv", ".m2v", ".m4p", ".m4v",
|
||||
".mkv", ".mng", ".mov", ".mp2", ".mp4", ".mpe", ".mpeg", ".mpg",
|
||||
".mpv", ".mxf", ".nsv", ".ogg", ".ogv", ".qt", ".rm", ".rmvb",
|
||||
".roq", ".svi", ".vob", ".webm", ".wmv", ".yuv"
|
||||
]
|
||||
REVIEW_EXTENSIONS = IMAGE_EXTENSIONS + VIDEO_EXTENSIONS
|
||||
|
||||
|
||||
class TrayPublishCreator(Creator):
|
||||
create_allow_context_change = True
|
||||
host_name = "traypublisher"
|
||||
|
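With `REVIEW_EXTENSIONS` as the union of the image and video lists, deciding whether a file is reviewable is a lowercase suffix check. A small sketch, assuming the constants above:

    import os

    def is_reviewable(filename):
        # entries are stored with a leading dot, lowercase
        ext = os.path.splitext(filename)[1].lower()
        return ext in REVIEW_EXTENSIONS

    is_reviewable("plate.MOV")   # True, ".mov" is in VIDEO_EXTENSIONS
    is_reviewable("notes.txt")   # False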
|
@ -37,6 +60,21 @@ class TrayPublishCreator(Creator):
|
|||
# Use same attributes as for instance attributes
|
||||
return self.get_instance_attr_defs()
|
||||
|
||||
def _store_new_instance(self, new_instance):
|
||||
"""Tray publisher specific method to store instance.
|
||||
|
||||
Instance is stored into "workfile" of traypublisher and also added
|
||||
to CreateContext.
|
||||
|
||||
Args:
|
||||
new_instance (CreatedInstance): Instance that should be stored.
|
||||
"""
|
||||
|
||||
# Host implementation of storing metadata about instance
|
||||
HostContext.add_instance(new_instance.data_to_store())
|
||||
# Add instance to current context
|
||||
self._add_instance_to_context(new_instance)
|
||||
|
||||
|
||||
class SettingsCreator(TrayPublishCreator):
|
||||
create_allow_context_change = True
|
||||
|
|
@ -58,19 +96,27 @@ class SettingsCreator(TrayPublishCreator):
|
|||
data["settings_creator"] = True
|
||||
# Create new instance
|
||||
new_instance = CreatedInstance(self.family, subset_name, data, self)
|
||||
# Host implementation of storing metadata about instance
|
||||
HostContext.add_instance(new_instance.data_to_store())
|
||||
# Add instance to current context
|
||||
self._add_instance_to_context(new_instance)
|
||||
|
||||
self._store_new_instance(new_instance)
|
||||
|
||||
def get_instance_attr_defs(self):
|
||||
return [
|
||||
FileDef(
|
||||
"filepath",
|
||||
"representation_files",
|
||||
folders=False,
|
||||
extensions=self.extensions,
|
||||
allow_sequences=self.allow_sequences,
|
||||
label="Filepath",
|
||||
single_item=not self.allow_multiple_items,
|
||||
label="Representations",
|
||||
),
|
||||
FileDef(
|
||||
"reviewable",
|
||||
folders=False,
|
||||
extensions=REVIEW_EXTENSIONS,
|
||||
allow_sequences=True,
|
||||
single_item=True,
|
||||
label="Reviewable representations",
|
||||
extensions_label="Single reviewable item"
|
||||
)
|
||||
]
|
||||
|
||||
|
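The creator now asks for two inputs: the multi-item `representation_files` and a single `reviewable` item. A hedged sketch of reading those values back, assuming each stored item carries 'directory' and 'filenames' keys as the collector below expects:

    import os

    attrs = instance.data["creator_attributes"]  # illustrative access
    for item in attrs["representation_files"]:
        for filename in item["filenames"]:
            filepath = os.path.join(item["directory"], filename)
            # one representation is created per item from these paths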
|
@ -92,6 +138,7 @@ class SettingsCreator(TrayPublishCreator):
|
|||
"detailed_description": item_data["detailed_description"],
|
||||
"extensions": item_data["extensions"],
|
||||
"allow_sequences": item_data["allow_sequences"],
|
||||
"allow_multiple_items": item_data["allow_multiple_items"],
|
||||
"default_variants": item_data["default_variants"]
|
||||
}
|
||||
)
|
||||
|
|
|
|||
|
|
@ -0,0 +1,216 @@
|
|||
import copy
|
||||
import os
|
||||
import re
|
||||
|
||||
from openpype.client import get_assets, get_asset_by_name
|
||||
from openpype.lib import (
|
||||
FileDef,
|
||||
BoolDef,
|
||||
get_subset_name_with_asset_doc,
|
||||
TaskNotSetError,
|
||||
)
|
||||
from openpype.pipeline import (
|
||||
CreatedInstance,
|
||||
CreatorError
|
||||
)
|
||||
|
||||
from openpype.hosts.traypublisher.api.plugin import TrayPublishCreator
|
||||
|
||||
|
||||
class BatchMovieCreator(TrayPublishCreator):
|
||||
"""Creates instances from movie file(s).
|
||||
|
||||
Intended for .mov files, but should work for any video file.
|
||||
Doesn't handle image sequences though.
|
||||
"""
|
||||
identifier = "render_movie_batch"
|
||||
label = "Batch Movies"
|
||||
family = "render"
|
||||
description = "Publish batch of video files"
|
||||
|
||||
create_allow_context_change = False
|
||||
version_regex = re.compile(r"^(.+)_v([0-9]+)$")
|
||||
|
||||
def __init__(self, project_settings, *args, **kwargs):
|
||||
super(BatchMovieCreator, self).__init__(project_settings,
|
||||
*args, **kwargs)
|
||||
creator_settings = (
|
||||
project_settings["traypublisher"]["BatchMovieCreator"]
|
||||
)
|
||||
self.default_variants = creator_settings["default_variants"]
|
||||
self.default_tasks = creator_settings["default_tasks"]
|
||||
self.extensions = creator_settings["extensions"]
|
||||
|
||||
def get_icon(self):
|
||||
return "fa.file"
|
||||
|
||||
def create(self, subset_name, data, pre_create_data):
|
||||
file_paths = pre_create_data.get("filepath")
|
||||
if not file_paths:
|
||||
return
|
||||
|
||||
for file_info in file_paths:
|
||||
instance_data = copy.deepcopy(data)
|
||||
file_name = file_info["filenames"][0]
|
||||
filepath = os.path.join(file_info["directory"], file_name)
|
||||
instance_data["creator_attributes"] = {"filepath": filepath}
|
||||
|
||||
asset_doc, version = self.get_asset_doc_from_file_name(
|
||||
file_name, self.project_name)
|
||||
|
||||
subset_name, task_name = self._get_subset_and_task(
|
||||
asset_doc, data["variant"], self.project_name)
|
||||
|
||||
instance_data["task"] = task_name
|
||||
instance_data["asset"] = asset_doc["name"]
|
||||
|
||||
# Create new instance
|
||||
new_instance = CreatedInstance(self.family, subset_name,
|
||||
instance_data, self)
|
||||
self._store_new_instance(new_instance)
|
||||
|
||||
def get_asset_doc_from_file_name(self, source_filename, project_name):
|
||||
"""Try to parse out asset name from file name provided.
|
||||
|
||||
Artists might provide various file name formats.
|
||||
Currently handled:
|
||||
- chair.mov
|
||||
- chair_v001.mov
|
||||
- my_chair_to_upload.mov
|
||||
"""
|
||||
version = None
|
||||
asset_name = os.path.splitext(source_filename)[0]
|
||||
# Always first check if source filename is in assets
|
||||
matching_asset_doc = self._get_asset_by_name_case_not_sensitive(
|
||||
project_name, asset_name)
|
||||
|
||||
if matching_asset_doc is None:
|
||||
matching_asset_doc, version = (
|
||||
self._parse_with_version(project_name, asset_name))
|
||||
|
||||
if matching_asset_doc is None:
|
||||
matching_asset_doc = self._parse_containing(project_name,
|
||||
asset_name)
|
||||
|
||||
if matching_asset_doc is None:
|
||||
raise CreatorError(
|
||||
"Cannot guess asset name from {}".format(source_filename))
|
||||
|
||||
return matching_asset_doc, version
|
||||
|
||||
def _parse_with_version(self, project_name, asset_name):
|
||||
"""Try to parse asset name from a file name containing version too
|
||||
|
||||
Eg. 'chair_v001.mov' >> 'chair', 1
|
||||
"""
|
||||
self.log.debug((
|
||||
"Asset doc by \"{}\" was not found, trying version regex."
|
||||
).format(asset_name))
|
||||
|
||||
matching_asset_doc = version_number = None
|
||||
|
||||
regex_result = self.version_regex.findall(asset_name)
|
||||
if regex_result:
|
||||
_asset_name, _version_number = regex_result[0]
|
||||
matching_asset_doc = self._get_asset_by_name_case_not_sensitive(
|
||||
project_name, _asset_name)
|
||||
if matching_asset_doc:
|
||||
version_number = int(_version_number)
|
||||
|
||||
return matching_asset_doc, version_number
|
||||
|
||||
def _parse_containing(self, project_name, asset_name):
|
||||
"""Look if file name contains any existing asset name"""
|
||||
for asset_doc in get_assets(project_name, fields=["name"]):
|
||||
if asset_doc["name"].lower() in asset_name.lower():
|
||||
return get_asset_by_name(project_name, asset_doc["name"])
|
||||
|
||||
def _get_subset_and_task(self, asset_doc, variant, project_name):
|
||||
"""Create subset name according to standard template process"""
|
||||
task_name = self._get_task_name(asset_doc)
|
||||
|
||||
try:
|
||||
subset_name = get_subset_name_with_asset_doc(
|
||||
self.family,
|
||||
variant,
|
||||
task_name,
|
||||
asset_doc,
|
||||
project_name
|
||||
)
|
||||
except TaskNotSetError:
|
||||
# Create instance with fake task
|
||||
# - instance will be marked as invalid so it can't be published
|
||||
# but the user has the ability to change it
|
||||
# NOTE: This expects that there is no task 'Undefined' on the asset
|
||||
task_name = "Undefined"
|
||||
subset_name = get_subset_name_with_asset_doc(
|
||||
self.family,
|
||||
variant,
|
||||
task_name,
|
||||
asset_doc,
|
||||
project_name
|
||||
)
|
||||
|
||||
return subset_name, task_name
|
||||
|
||||
def _get_task_name(self, asset_doc):
|
||||
"""Get applicable task from 'asset_doc' """
|
||||
available_task_names = {}
|
||||
asset_tasks = asset_doc.get("data", {}).get("tasks") or {}
|
||||
for task_name in asset_tasks.keys():
|
||||
available_task_names[task_name.lower()] = task_name
|
||||
|
||||
task_name = None
|
||||
for _task_name in self.default_tasks:
|
||||
_task_name_low = _task_name.lower()
|
||||
if _task_name_low in available_task_names:
|
||||
task_name = available_task_names[_task_name_low]
|
||||
break
|
||||
|
||||
return task_name
|
||||
|
||||
def get_instance_attr_defs(self):
|
||||
return [
|
||||
BoolDef(
|
||||
"add_review_family",
|
||||
default=True,
|
||||
label="Review"
|
||||
)
|
||||
]
|
||||
|
||||
def get_pre_create_attr_defs(self):
|
||||
# Use same attributes as for instance attributes
|
||||
return [
|
||||
FileDef(
|
||||
"filepath",
|
||||
folders=False,
|
||||
single_item=False,
|
||||
extensions=self.extensions,
|
||||
label="Filepath"
|
||||
),
|
||||
BoolDef(
|
||||
"add_review_family",
|
||||
default=True,
|
||||
label="Review"
|
||||
)
|
||||
]
|
||||
|
||||
def get_detail_description(self):
|
||||
return """# Publish batch of .mov to multiple assets.
|
||||
|
||||
File names must then contain only asset name, or asset name + version.
|
||||
(eg. 'chair.mov', 'chair_v001.mov'; 'my_chair_v001.mov' is not really safe)
|
||||
"""
|
||||
|
||||
def _get_asset_by_name_case_not_sensitive(self, project_name, asset_name):
|
||||
"""Handle more cases in file names"""
|
||||
asset_name = re.compile(asset_name, re.IGNORECASE)
|
||||
|
||||
assets = list(get_assets(project_name, asset_names=[asset_name]))
|
||||
if assets:
|
||||
if len(assets) > 1:
|
||||
self.log.warning("Too many records found for {}".format(
|
||||
asset_name))
|
||||
return
|
||||
|
||||
return assets.pop()
|
||||
|
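The `version_regex` peels a trailing `_v<digits>` suffix off the file name before the asset lookup. A worked example of what each handled pattern yields:

    import re

    version_regex = re.compile(r"^(.+)_v([0-9]+)$")
    version_regex.findall("chair_v001")          # [('chair', '001')] -> version 1
    version_regex.findall("chair")               # [] -> plain name lookup
    version_regex.findall("my_chair_to_upload")  # [] -> 'contains' fallback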
|
@ -0,0 +1,47 @@
|
|||
import os
|
||||
|
||||
import pyblish.api
|
||||
from openpype.pipeline import OpenPypePyblishPluginMixin
|
||||
|
||||
|
||||
class CollectMovieBatch(
|
||||
pyblish.api.InstancePlugin, OpenPypePyblishPluginMixin
|
||||
):
|
||||
"""Collect file url for batch movies and create representation.
|
||||
|
||||
Adds review on instance and to repre.tags based on value of toggle button
|
||||
on creator.
|
||||
"""
|
||||
|
||||
label = "Collect Movie Batch Files"
|
||||
order = pyblish.api.CollectorOrder
|
||||
|
||||
hosts = ["traypublisher"]
|
||||
|
||||
def process(self, instance):
|
||||
if instance.data.get("creator_identifier") != "render_movie_batch":
|
||||
return
|
||||
|
||||
creator_attributes = instance.data["creator_attributes"]
|
||||
|
||||
file_url = creator_attributes["filepath"]
|
||||
file_name = os.path.basename(file_url)
|
||||
_, ext = os.path.splitext(file_name)
|
||||
|
||||
repre = {
|
||||
"name": ext[1:],
|
||||
"ext": ext[1:],
|
||||
"files": file_name,
|
||||
"stagingDir": os.path.dirname(file_url),
|
||||
"tags": []
|
||||
}
|
||||
|
||||
if creator_attributes["add_review_family"]:
|
||||
repre["tags"].append("review")
|
||||
instance.data["families"].append("review")
|
||||
|
||||
instance.data["representations"].append(repre)
|
||||
|
||||
instance.data["source"] = file_url
|
||||
|
||||
self.log.debug("instance.data {}".format(instance.data))
|
||||
|
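Each collected movie ends up as a single representation whose name and ext are the extension without the leading dot. A sketch of the resulting structure for a hypothetical `/shots/chair_v001.mov`:

    repre = {
        "name": "mov",
        "ext": "mov",
        "files": "chair_v001.mov",
        "stagingDir": "/shots",
        "tags": ["review"],  # only when 'add_review_family' is enabled
    }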
|
@ -1,31 +0,0 @@
|
|||
import pyblish.api
|
||||
from openpype.lib import BoolDef
|
||||
from openpype.pipeline import OpenPypePyblishPluginMixin
|
||||
|
||||
|
||||
class CollectReviewFamily(
|
||||
pyblish.api.InstancePlugin, OpenPypePyblishPluginMixin
|
||||
):
|
||||
"""Add review family."""
|
||||
|
||||
label = "Collect Review Family"
|
||||
order = pyblish.api.CollectorOrder - 0.49
|
||||
|
||||
hosts = ["traypublisher"]
|
||||
families = [
|
||||
"image",
|
||||
"render",
|
||||
"plate",
|
||||
"review"
|
||||
]
|
||||
|
||||
def process(self, instance):
|
||||
values = self.get_attr_values_from_data(instance.data)
|
||||
if values.get("add_review_family"):
|
||||
instance.data["families"].append("review")
|
||||
|
||||
@classmethod
|
||||
def get_attribute_defs(cls):
|
||||
return [
|
||||
BoolDef("add_review_family", label="Review", default=True)
|
||||
]
|
||||
|
|
@ -1,9 +1,31 @@
|
|||
import os
|
||||
import tempfile
|
||||
|
||||
import clique
|
||||
import pyblish.api
|
||||
|
||||
|
||||
class CollectSettingsSimpleInstances(pyblish.api.InstancePlugin):
|
||||
"""Collect data for instances created by settings creators."""
|
||||
"""Collect data for instances created by settings creators.
|
||||
|
||||
Plugin creates representations for simple instances based
|
||||
on the 'representation_files' attribute stored on instance data.
|
||||
|
||||
There is also a possibility to have a reviewable representation which can be
|
||||
stored under the 'reviewable' attribute on instance data. If a representation
|
||||
with the same files as 'reviewable' contains was already created, it is
|
||||
reused and only marked for review.
|
||||
|
||||
Representations can be marked for review and in that case the 'review'
|
||||
family is also added to instance families. Only one representation can be
|
||||
marked for review, so the **first** representation that has an extension
|
||||
available in '_review_extensions' is used for review.
|
||||
|
||||
Instance 'source' uses the path from the last representation created
|
||||
from 'representation_files'.
|
||||
|
||||
Sets the staging directory on the instance. That is probably never used
|
||||
because each created representation has its own staging dir.
|
||||
"""
|
||||
|
||||
label = "Collect Settings Simple Instances"
|
||||
order = pyblish.api.CollectorOrder - 0.49
|
||||
|
|
@ -14,37 +36,193 @@ class CollectSettingsSimpleInstances(pyblish.api.InstancePlugin):
|
|||
if not instance.data.get("settings_creator"):
|
||||
return
|
||||
|
||||
if "families" not in instance.data:
|
||||
instance.data["families"] = []
|
||||
instance_label = instance.data["name"]
|
||||
# Create instance's staging dir in temp
|
||||
tmp_folder = tempfile.mkdtemp(prefix="traypublisher_")
|
||||
instance.data["stagingDir"] = tmp_folder
|
||||
instance.context.data["cleanupFullPaths"].append(tmp_folder)
|
||||
|
||||
if "representations" not in instance.data:
|
||||
instance.data["representations"] = []
|
||||
repres = instance.data["representations"]
|
||||
self.log.debug((
|
||||
"Created temp staging directory for instance {}. {}"
|
||||
).format(instance_label, tmp_folder))
|
||||
|
||||
# Store filepaths for validation of their existence
|
||||
source_filepaths = []
|
||||
# Make sure there are no representations with same name
|
||||
repre_names_counter = {}
|
||||
# Store created names for logging
|
||||
repre_names = []
|
||||
# Store set of filepaths for each representation
|
||||
representation_files_mapping = []
|
||||
source = self._create_main_representations(
|
||||
instance,
|
||||
source_filepaths,
|
||||
repre_names_counter,
|
||||
repre_names,
|
||||
representation_files_mapping
|
||||
)
|
||||
|
||||
self._create_review_representation(
|
||||
instance,
|
||||
source_filepaths,
|
||||
repre_names_counter,
|
||||
repre_names,
|
||||
representation_files_mapping
|
||||
)
|
||||
|
||||
instance.data["source"] = source
|
||||
instance.data["sourceFilepaths"] = list(set(source_filepaths))
|
||||
|
||||
self.log.debug(
|
||||
(
|
||||
"Created Simple Settings instance \"{}\""
|
||||
" with {} representations: {}"
|
||||
).format(
|
||||
instance_label,
|
||||
len(instance.data["representations"]),
|
||||
", ".join(repre_names)
|
||||
)
|
||||
)
|
||||
|
||||
def _create_main_representations(
|
||||
self,
|
||||
instance,
|
||||
source_filepaths,
|
||||
repre_names_counter,
|
||||
repre_names,
|
||||
representation_files_mapping
|
||||
):
|
||||
creator_attributes = instance.data["creator_attributes"]
|
||||
filepath_items = creator_attributes["representation_files"]
|
||||
if not isinstance(filepath_items, list):
|
||||
filepath_items = [filepath_items]
|
||||
|
||||
source = None
|
||||
for filepath_item in filepath_items:
|
||||
# Skip if filepath item does not have filenames
|
||||
if not filepath_item["filenames"]:
|
||||
continue
|
||||
|
||||
filepaths = {
|
||||
os.path.join(filepath_item["directory"], filename)
|
||||
for filename in filepath_item["filenames"]
|
||||
}
|
||||
source_filepaths.extend(filepaths)
|
||||
|
||||
source = self._calculate_source(filepaths)
|
||||
representation = self._create_representation_data(
|
||||
filepath_item, repre_names_counter, repre_names
|
||||
)
|
||||
instance.data["representations"].append(representation)
|
||||
representation_files_mapping.append(
|
||||
(filepaths, representation, source)
|
||||
)
|
||||
return source
|
||||
|
||||
def _create_review_representation(
|
||||
self,
|
||||
instance,
|
||||
source_filepaths,
|
||||
repre_names_counter,
|
||||
repre_names,
|
||||
representation_files_mapping
|
||||
):
|
||||
# Skip review representation creation if there are no representations
|
||||
# created for "main" part
|
||||
# - review representation must not be created in that case so
|
||||
# validation can take care of it
|
||||
if not representation_files_mapping:
|
||||
self.log.warning((
|
||||
"There are missing source representations."
|
||||
" Creation of review representation was skipped."
|
||||
))
|
||||
return
|
||||
|
||||
creator_attributes = instance.data["creator_attributes"]
|
||||
filepath_item = creator_attributes["filepath"]
|
||||
self.log.info(filepath_item)
|
||||
filepaths = [
|
||||
os.path.join(filepath_item["directory"], filename)
|
||||
for filename in filepath_item["filenames"]
|
||||
]
|
||||
review_file_item = creator_attributes["reviewable"]
|
||||
filenames = review_file_item.get("filenames")
|
||||
if not filenames:
|
||||
self.log.debug((
|
||||
"Filepath for review is not defined."
|
||||
" Skipping review representation creation."
|
||||
))
|
||||
return
|
||||
|
||||
instance.data["sourceFilepaths"] = filepaths
|
||||
instance.data["stagingDir"] = filepath_item["directory"]
|
||||
filepaths = {
|
||||
os.path.join(review_file_item["directory"], filename)
|
||||
for filename in filenames
|
||||
}
|
||||
source_filepaths.extend(filepaths)
|
||||
# First try to find out representation with same filepaths
|
||||
# so it's not needed to create new representation just for review
|
||||
review_representation = None
|
||||
# Review path (only for logging)
|
||||
review_path = None
|
||||
for item in representation_files_mapping:
|
||||
_filepaths, representation, repre_path = item
|
||||
if _filepaths == filepaths:
|
||||
review_representation = representation
|
||||
review_path = repre_path
|
||||
break
|
||||
|
||||
if review_representation is None:
|
||||
self.log.debug("Creating new review representation")
|
||||
review_path = self._calculate_source(filepaths)
|
||||
review_representation = self._create_representation_data(
|
||||
review_file_item, repre_names_counter, repre_names
|
||||
)
|
||||
instance.data["representations"].append(review_representation)
|
||||
|
||||
if "review" not in instance.data["families"]:
|
||||
instance.data["families"].append("review")
|
||||
|
||||
review_representation["tags"].append("review")
|
||||
self.log.debug("Representation {} was marked for review. {}".format(
|
||||
review_representation["name"], review_path
|
||||
))
|
||||
|
||||
def _create_representation_data(
|
||||
self, filepath_item, repre_names_counter, repre_names
|
||||
):
|
||||
"""Create new representation data based on file item.
|
||||
|
||||
Args:
|
||||
filepath_item (Dict[str, Any]): Item with information about
|
||||
representation paths.
|
||||
repre_names_counter (Dict[str, int]): Store count of representation
|
||||
names.
|
||||
repre_names (List[str]): All used representation names. For
|
||||
logging purposes.
|
||||
|
||||
Returns:
|
||||
Dict: Prepared base representation data.
|
||||
"""
|
||||
|
||||
filenames = filepath_item["filenames"]
|
||||
_, ext = os.path.splitext(filenames[0])
|
||||
ext = ext[1:]
|
||||
if len(filenames) == 1:
|
||||
filenames = filenames[0]
|
||||
|
||||
repres.append({
|
||||
"ext": ext,
|
||||
"name": ext,
|
||||
repre_name = repre_ext = ext[1:]
|
||||
if repre_name not in repre_names_counter:
|
||||
repre_names_counter[repre_name] = 2
|
||||
else:
|
||||
counter = repre_names_counter[repre_name]
|
||||
repre_names_counter[repre_name] += 1
|
||||
repre_name = "{}_{}".format(repre_name, counter)
|
||||
repre_names.append(repre_name)
|
||||
return {
|
||||
"ext": repre_ext,
|
||||
"name": repre_name,
|
||||
"stagingDir": filepath_item["directory"],
|
||||
"files": filenames
|
||||
})
|
||||
"files": filenames,
|
||||
"tags": []
|
||||
}
|
||||
|
||||
self.log.debug("Created Simple Settings instance {}".format(
|
||||
instance.data
|
||||
))
|
||||
def _calculate_source(self, filepaths):
|
||||
cols, rems = clique.assemble(filepaths)
|
||||
if cols:
|
||||
source = cols[0].format("{head}{padding}{tail}")
|
||||
elif rems:
|
||||
source = rems[0]
|
||||
return source
|
||||
|
|
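`_calculate_source` collapses a frame sequence into one padded path via `clique.assemble` and falls back to the bare path for single files. A worked sketch, assuming the `clique` package imported above:

    import clique

    cols, rems = clique.assemble(
        ["/tmp/render.1001.exr", "/tmp/render.1002.exr"])
    cols[0].format("{head}{padding}{tail}")  # '/tmp/render.%04d.exr'

    cols, rems = clique.assemble(["/tmp/review.mov"])
    rems[0]  # '/tmp/review.mov' - no collection was assembled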
|
|||
|
|
@ -0,0 +1,15 @@
|
|||
<?xml version="1.0" encoding="UTF-8"?>
|
||||
<root>
|
||||
<error id="main">
|
||||
<title>Invalid frame range</title>
|
||||
<description>
|
||||
## Invalid frame range
|
||||
|
||||
Expected duration of '{duration}' frames set in database, workfile contains only '{found}' frames.
|
||||
|
||||
### How to repair?
|
||||
|
||||
Modify configuration in the database or tweak frame range in the workfile.
|
||||
</description>
|
||||
</error>
|
||||
</root>
|
||||
|
|
@ -3,8 +3,17 @@ import pyblish.api
|
|||
from openpype.pipeline import PublishValidationError
|
||||
|
||||
|
||||
class ValidateWorkfilePath(pyblish.api.InstancePlugin):
|
||||
"""Validate existence of workfile instance existence."""
|
||||
class ValidateFilePath(pyblish.api.InstancePlugin):
|
||||
"""Validate existence of source filepaths on instance.
|
||||
|
||||
Plugin looks into the 'sourceFilepaths' key and validates that the paths
|
||||
listed there actually exist on disk.
|
||||
|
||||
Also validates the case when the key is present but empty. That also
|
||||
raises an error, so do not fill the key if an empty value should not cause one.
|
||||
|
||||
This is primarily created for Simple Creator instances.
|
||||
"""
|
||||
|
||||
label = "Validate Workfile"
|
||||
order = pyblish.api.ValidatorOrder - 0.49
|
||||
|
|
@ -14,12 +23,28 @@ class ValidateWorkfilePath(pyblish.api.InstancePlugin):
|
|||
def process(self, instance):
|
||||
if "sourceFilepaths" not in instance.data:
|
||||
self.log.info((
|
||||
"Can't validate source filepaths existence."
|
||||
"Skipped validation of source filepaths existence."
|
||||
" Instance does not have collected 'sourceFilepaths'"
|
||||
))
|
||||
return
|
||||
|
||||
filepaths = instance.data.get("sourceFilepaths")
|
||||
family = instance.data["family"]
|
||||
label = instance.data["name"]
|
||||
filepaths = instance.data["sourceFilepaths"]
|
||||
if not filepaths:
|
||||
raise PublishValidationError(
|
||||
(
|
||||
"Source filepaths of '{}' instance \"{}\" are not filled"
|
||||
).format(family, label),
|
||||
"File not filled",
|
||||
(
|
||||
"## Files were not filled"
|
||||
"\nThis mean that you didn't enter any files into required"
|
||||
" file input."
|
||||
"\n- Please refresh publishing and check instance"
|
||||
" <b>{}</b>"
|
||||
).format(label)
|
||||
)
|
||||
|
||||
not_found_files = [
|
||||
filepath
|
||||
|
|
@ -34,11 +59,7 @@ class ValidateWorkfilePath(pyblish.api.InstancePlugin):
|
|||
raise PublishValidationError(
|
||||
(
|
||||
"Filepath of '{}' instance \"{}\" does not exist:\n{}"
|
||||
).format(
|
||||
instance.data["family"],
|
||||
instance.data["name"],
|
||||
joined_paths
|
||||
),
|
||||
).format(family, label, joined_paths),
|
||||
"File not found",
|
||||
(
|
||||
"## Files were not found\nFiles\n{}"
|
||||
|
|
|
|||
|
|
@ -0,0 +1,75 @@
|
|||
import re
|
||||
|
||||
import pyblish.api
|
||||
|
||||
import openpype.api
|
||||
from openpype.pipeline import (
|
||||
PublishXmlValidationError,
|
||||
OptionalPyblishPluginMixin
|
||||
)
|
||||
|
||||
|
||||
class ValidateFrameRange(OptionalPyblishPluginMixin,
|
||||
pyblish.api.InstancePlugin):
|
||||
"""Validating frame range of rendered files against state in DB."""
|
||||
|
||||
label = "Validate Frame Range"
|
||||
hosts = ["traypublisher"]
|
||||
families = ["render"]
|
||||
order = openpype.api.ValidateContentsOrder
|
||||
|
||||
optional = True
|
||||
# published data might be a sequence (.mov, .mp4); in that case counting
|
||||
# files doesn't make sense
|
||||
check_extensions = ["exr", "dpx", "jpg", "jpeg", "png", "tiff", "tga",
|
||||
"gif", "svg"]
|
||||
skip_timelines_check = [] # skip for specific task names (regex)
|
||||
|
||||
def process(self, instance):
|
||||
# Skip the instance if is not active by data on the instance
|
||||
if not self.is_active(instance.data):
|
||||
return
|
||||
|
||||
if (self.skip_timelines_check and
|
||||
any(re.search(pattern, instance.data["task"])
|
||||
for pattern in self.skip_timelines_check)):
|
||||
self.log.info("Skipping for {} task".format(instance.data["task"]))
|
||||
|
||||
asset_doc = instance.data["assetEntity"]
|
||||
asset_data = asset_doc["data"]
|
||||
frame_start = asset_data["frameStart"]
|
||||
frame_end = asset_data["frameEnd"]
|
||||
handle_start = asset_data["handleStart"]
|
||||
handle_end = asset_data["handleEnd"]
|
||||
duration = (frame_end - frame_start + 1) + handle_start + handle_end
|
||||
|
||||
repres = instance.data.get("representations")
|
||||
if not repres:
|
||||
self.log.info("No representations, skipping.")
|
||||
return
|
||||
|
||||
first_repre = repres[0]
|
||||
ext = first_repre['ext'].replace(".", '')
|
||||
|
||||
if not ext or ext.lower() not in self.check_extensions:
|
||||
self.log.warning("Cannot check for extension {}".format(ext))
|
||||
return
|
||||
|
||||
files = first_repre["files"]
|
||||
if isinstance(files, str):
|
||||
files = [files]
|
||||
frames = len(files)
|
||||
|
||||
msg = (
|
||||
"Frame duration from DB:'{}' doesn't match number of files:'{}'"
|
||||
" Please change frame range for Asset or limit no. of files"
|
||||
).format(int(duration), frames)
|
||||
|
||||
formatting_data = {"duration": duration,
|
||||
"found": frames}
|
||||
if frames != duration:
|
||||
raise PublishXmlValidationError(self, msg,
|
||||
formatting_data=formatting_data)
|
||||
|
||||
self.log.debug("Valid ranges expected '{}' - found '{}'".
|
||||
format(int(duration), frames))
|
||||
|
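The expected duration folds both handles into the inclusive frame range, so a 1001-1100 range with 10-frame handles on each side must yield exactly 120 files. The arithmetic, with the DB values as illustrative inputs:

    frame_start, frame_end = 1001, 1100
    handle_start, handle_end = 10, 10
    duration = (frame_end - frame_start + 1) + handle_start + handle_end
    assert duration == 120  # 100 frames in range plus 20 handle frames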
|
@ -8,13 +8,13 @@ from unreal import EditorAssetLibrary
|
|||
from unreal import MovieSceneSkeletalAnimationTrack
|
||||
from unreal import MovieSceneSkeletalAnimationSection
|
||||
|
||||
from openpype.pipeline.context_tools import get_current_project_asset
|
||||
from openpype.pipeline import (
|
||||
get_representation_path,
|
||||
AVALON_CONTAINER_ID
|
||||
)
|
||||
from openpype.hosts.unreal.api import plugin
|
||||
from openpype.hosts.unreal.api import pipeline as unreal_pipeline
|
||||
from openpype.api import get_asset
|
||||
|
||||
|
||||
class AnimationFBXLoader(plugin.Loader):
|
||||
|
|
@ -53,6 +53,8 @@ class AnimationFBXLoader(plugin.Loader):
|
|||
if not actor:
|
||||
return None
|
||||
|
||||
asset_doc = get_current_project_asset(fields=["data.fps"])
|
||||
|
||||
task.set_editor_property('filename', self.fname)
|
||||
task.set_editor_property('destination_path', asset_dir)
|
||||
task.set_editor_property('destination_name', asset_name)
|
||||
|
|
@ -80,7 +82,7 @@ class AnimationFBXLoader(plugin.Loader):
|
|||
task.options.anim_sequence_import_data.set_editor_property(
|
||||
'use_default_sample_rate', False)
|
||||
task.options.anim_sequence_import_data.set_editor_property(
|
||||
'custom_sample_rate', get_asset()["data"].get("fps"))
|
||||
'custom_sample_rate', asset_doc.get("data", {}).get("fps"))
|
||||
task.options.anim_sequence_import_data.set_editor_property(
|
||||
'import_custom_attribute', True)
|
||||
task.options.anim_sequence_import_data.set_editor_property(
|
||||
|
|
@ -246,6 +248,7 @@ class AnimationFBXLoader(plugin.Loader):
|
|||
def update(self, container, representation):
|
||||
name = container["asset_name"]
|
||||
source_path = get_representation_path(representation)
|
||||
asset_doc = get_current_project_asset(fields=["data.fps"])
|
||||
destination_path = container["namespace"]
|
||||
|
||||
task = unreal.AssetImportTask()
|
||||
|
|
@ -279,7 +282,7 @@ class AnimationFBXLoader(plugin.Loader):
|
|||
task.options.anim_sequence_import_data.set_editor_property(
|
||||
'use_default_sample_rate', False)
|
||||
task.options.anim_sequence_import_data.set_editor_property(
|
||||
'custom_sample_rate', get_asset()["data"].get("fps"))
|
||||
'custom_sample_rate', asset_doc.get("data", {}).get("fps"))
|
||||
task.options.anim_sequence_import_data.set_editor_property(
|
||||
'import_custom_attribute', True)
|
||||
task.options.anim_sequence_import_data.set_editor_property(
|
||||
|
|
|
|||
|
|
@ -20,7 +20,7 @@ from openpype.pipeline import (
|
|||
AVALON_CONTAINER_ID,
|
||||
legacy_io,
|
||||
)
|
||||
from openpype.api import get_asset
|
||||
from openpype.pipeline.context_tools import get_current_project_asset
|
||||
from openpype.hosts.unreal.api import plugin
|
||||
from openpype.hosts.unreal.api import pipeline as unreal_pipeline
|
||||
|
||||
|
|
@ -225,6 +225,7 @@ class LayoutLoader(plugin.Loader):
|
|||
|
||||
anim_path = f"{asset_dir}/animations/{anim_file_name}"
|
||||
|
||||
asset_doc = get_current_project_asset()
|
||||
# Import animation
|
||||
task = unreal.AssetImportTask()
|
||||
task.options = unreal.FbxImportUI()
|
||||
|
|
@ -259,7 +260,7 @@ class LayoutLoader(plugin.Loader):
|
|||
task.options.anim_sequence_import_data.set_editor_property(
|
||||
'use_default_sample_rate', False)
|
||||
task.options.anim_sequence_import_data.set_editor_property(
|
||||
'custom_sample_rate', get_asset()["data"].get("fps"))
|
||||
'custom_sample_rate', asset_doc.get("data", {}).get("fps"))
|
||||
task.options.anim_sequence_import_data.set_editor_property(
|
||||
'import_custom_attribute', True)
|
||||
task.options.anim_sequence_import_data.set_editor_property(
|
||||
|
|
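Both Unreal loaders now resolve the project fps from the context asset instead of the removed `openpype.api.get_asset`. A sketch of the call, assuming the `fields` projection used above:

    from openpype.pipeline.context_tools import get_current_project_asset

    asset_doc = get_current_project_asset(fields=["data.fps"])
    fps = asset_doc.get("data", {}).get("fps")  # None when fps is unset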
|
|||
|
|
@ -120,7 +120,6 @@ from .avalon_context import (
|
|||
is_latest,
|
||||
any_outdated,
|
||||
get_asset,
|
||||
get_hierarchy,
|
||||
get_linked_assets,
|
||||
get_latest_version,
|
||||
get_system_general_anatomy_data,
|
||||
|
|
@ -292,7 +291,6 @@ __all__ = [
|
|||
"is_latest",
|
||||
"any_outdated",
|
||||
"get_asset",
|
||||
"get_hierarchy",
|
||||
"get_linked_assets",
|
||||
"get_latest_version",
|
||||
"get_system_general_anatomy_data",
|
||||
|
|
|
|||
|
|
@ -1,269 +1,33 @@
|
|||
# -*- coding: utf-8 -*-
|
||||
"""Collect render template.
|
||||
"""Content was moved to 'openpype.pipeline.publish.abstract_collect_render'.
|
||||
|
||||
TODO: use @dataclass when times come.
|
||||
Please change your imports as soon as possible.
|
||||
|
||||
File will probably be removed in OpenPype 3.14.*
|
||||
"""
|
||||
from abc import abstractmethod
|
||||
|
||||
import attr
|
||||
import six
|
||||
|
||||
import pyblish.api
|
||||
|
||||
from openpype.pipeline import legacy_io
|
||||
|
||||
from .abstract_metaplugins import AbstractMetaContextPlugin
|
||||
import warnings
|
||||
from openpype.pipeline.publish import AbstractCollectRender, RenderInstance
|
||||
|
||||
|
||||
@attr.s
|
||||
class RenderInstance(object):
|
||||
"""Data collected by collectors.
|
||||
|
||||
This data class later on passed to collected instances.
|
||||
Those attributes are required later on.
|
||||
|
||||
"""
|
||||
|
||||
# metadata
|
||||
version = attr.ib() # instance version
|
||||
time = attr.ib() # time of instance creation (get_formatted_current_time)
|
||||
source = attr.ib() # path to source scene file
|
||||
label = attr.ib() # label to show in GUI
|
||||
subset = attr.ib() # subset name
|
||||
task = attr.ib() # task name
|
||||
asset = attr.ib() # asset name (AVALON_ASSET)
|
||||
attachTo = attr.ib() # subset name to attach render to
|
||||
setMembers = attr.ib() # list of nodes/members producing render output
|
||||
publish = attr.ib() # bool, True to publish instance
|
||||
name = attr.ib() # instance name
|
||||
|
||||
# format settings
|
||||
resolutionWidth = attr.ib() # resolution width (1920)
|
||||
resolutionHeight = attr.ib() # resolution height (1080)
|
||||
pixelAspect = attr.ib() # pixel aspect (1.0)
|
||||
|
||||
# time settings
|
||||
frameStart = attr.ib() # start frame
|
||||
frameEnd = attr.ib() # end frame
|
||||
frameStep = attr.ib() # frame step
|
||||
|
||||
handleStart = attr.ib(default=None) # start handle
|
||||
handleEnd = attr.ib(default=None) # end handle
|
||||
|
||||
# for software (like Harmony) where frame range cannot be set by DB
|
||||
# handles need to be propagated if exist
|
||||
ignoreFrameHandleCheck = attr.ib(default=False)
|
||||
|
||||
# --------------------
|
||||
# With default values
|
||||
# metadata
|
||||
renderer = attr.ib(default="") # renderer - can be used in Deadline
|
||||
review = attr.ib(default=False) # generate review from instance (bool)
|
||||
priority = attr.ib(default=50) # job priority on farm
|
||||
|
||||
family = attr.ib(default="renderlayer")
|
||||
families = attr.ib(default=["renderlayer"]) # list of families
|
||||
|
||||
# format settings
|
||||
multipartExr = attr.ib(default=False) # flag for multipart exrs
|
||||
convertToScanline = attr.ib(default=False) # flag for exr conversion
|
||||
|
||||
tileRendering = attr.ib(default=False) # bool: treat render as tiles
|
||||
tilesX = attr.ib(default=0) # number of tiles in X
|
||||
tilesY = attr.ib(default=0) # number of tiles in Y
|
||||
|
||||
# submit_publish_job
|
||||
toBeRenderedOn = attr.ib(default=None)
|
||||
deadlineSubmissionJob = attr.ib(default=None)
|
||||
anatomyData = attr.ib(default=None)
|
||||
outputDir = attr.ib(default=None)
|
||||
context = attr.ib(default=None)
|
||||
|
||||
@frameStart.validator
|
||||
def check_frame_start(self, _, value):
|
||||
"""Validate if frame start is not larger then end."""
|
||||
if value > self.frameEnd:
|
||||
raise ValueError("frameStart must be smaller "
|
||||
"or equal then frameEnd")
|
||||
|
||||
@frameEnd.validator
|
||||
def check_frame_end(self, _, value):
|
||||
"""Validate if frame end is not less then start."""
|
||||
if value < self.frameStart:
|
||||
raise ValueError("frameEnd must be smaller "
|
||||
"or equal then frameStart")
|
||||
|
||||
@tilesX.validator
|
||||
def check_tiles_x(self, _, value):
|
||||
"""Validate if tile x isn't less then 1."""
|
||||
if not self.tileRendering:
|
||||
return
|
||||
if value < 1:
|
||||
raise ValueError("tile X size cannot be less then 1")
|
||||
|
||||
if value == 1 and self.tilesY == 1:
|
||||
raise ValueError("both tiles X a Y sizes are set to 1")
|
||||
|
||||
@tilesY.validator
|
||||
def check_tiles_y(self, _, value):
|
||||
"""Validate if tile y isn't less then 1."""
|
||||
if not self.tileRendering:
|
||||
return
|
||||
if value < 1:
|
||||
raise ValueError("tile Y size cannot be less then 1")
|
||||
|
||||
if value == 1 and self.tilesX == 1:
|
||||
raise ValueError("both tiles X a Y sizes are set to 1")
|
||||
class CollectRenderDeprecated(DeprecationWarning):
|
||||
pass
|
||||
|
||||
|
||||
@six.add_metaclass(AbstractMetaContextPlugin)
|
||||
class AbstractCollectRender(pyblish.api.ContextPlugin):
|
||||
"""Gather all publishable render layers from renderSetup."""
|
||||
warnings.simplefilter("always", CollectRenderDeprecated)
|
||||
warnings.warn(
|
||||
(
|
||||
"Content of 'abstract_collect_render' was moved."
|
||||
"\nUsing deprecated source of 'abstract_collect_render'. Content was"
|
||||
" move to 'openpype.pipeline.publish.abstract_collect_render'."
|
||||
" Please change your imports as soon as possible."
|
||||
),
|
||||
category=CollectRenderDeprecated,
|
||||
stacklevel=4
|
||||
)
|
||||
|
||||
order = pyblish.api.CollectorOrder + 0.01
|
||||
label = "Collect Render"
|
||||
sync_workfile_version = False
|
||||
|
||||
def __init__(self, *args, **kwargs):
|
||||
"""Constructor."""
|
||||
super(AbstractCollectRender, self).__init__(*args, **kwargs)
|
||||
self._file_path = None
|
||||
self._asset = legacy_io.Session["AVALON_ASSET"]
|
||||
self._context = None
|
||||
|
||||
def process(self, context):
|
||||
"""Entry point to collector."""
|
||||
self._context = context
|
||||
for instance in context:
|
||||
# make sure workfile instance publishing is enabled
|
||||
try:
|
||||
if "workfile" in instance.data["families"]:
|
||||
instance.data["publish"] = True
|
||||
# TODO merge renderFarm and render.farm
|
||||
if ("renderFarm" in instance.data["families"] or
|
||||
"render.farm" in instance.data["families"]):
|
||||
instance.data["remove"] = True
|
||||
except KeyError:
|
||||
# be tolerant if 'families' is missing.
|
||||
pass
|
||||
|
||||
self._file_path = context.data["currentFile"].replace("\\", "/")
|
||||
|
||||
render_instances = self.get_instances(context)
|
||||
for render_instance in render_instances:
|
||||
exp_files = self.get_expected_files(render_instance)
|
||||
assert exp_files, "no file names were generated, this is a bug"
|
||||
|
||||
# if we want to attach render to subset, check if we have AOV's
|
||||
# in expectedFiles. If so, raise error as we cannot attach AOV
|
||||
# (considered to be subset on its own) to another subset
|
||||
if render_instance.attachTo:
|
||||
assert isinstance(exp_files, list), (
|
||||
"attaching multiple AOVs or renderable cameras to "
|
||||
"subset is not supported"
|
||||
)
|
||||
|
||||
frame_start_render = int(render_instance.frameStart)
|
||||
frame_end_render = int(render_instance.frameEnd)
|
||||
if (render_instance.ignoreFrameHandleCheck or
|
||||
int(context.data['frameStartHandle']) == frame_start_render
|
||||
and int(context.data['frameEndHandle']) == frame_end_render): # noqa: W503, E501
|
||||
|
||||
handle_start = context.data['handleStart']
|
||||
handle_end = context.data['handleEnd']
|
||||
frame_start = context.data['frameStart']
|
||||
frame_end = context.data['frameEnd']
|
||||
frame_start_handle = context.data['frameStartHandle']
|
||||
frame_end_handle = context.data['frameEndHandle']
|
||||
else:
|
||||
handle_start = 0
|
||||
handle_end = 0
|
||||
frame_start = frame_start_render
|
||||
frame_end = frame_end_render
|
||||
frame_start_handle = frame_start_render
|
||||
frame_end_handle = frame_end_render
|
||||
|
||||
data = {
|
||||
"handleStart": handle_start,
|
||||
"handleEnd": handle_end,
|
||||
"frameStart": frame_start,
|
||||
"frameEnd": frame_end,
|
||||
"frameStartHandle": frame_start_handle,
|
||||
"frameEndHandle": frame_end_handle,
|
||||
"byFrameStep": int(render_instance.frameStep),
|
||||
|
||||
"author": context.data["user"],
|
||||
# Add source to allow tracing back to the scene from
|
||||
# which was submitted originally
|
||||
"expectedFiles": exp_files,
|
||||
}
|
||||
if self.sync_workfile_version:
|
||||
data["version"] = context.data["version"]
|
||||
|
||||
# add additional data
|
||||
data = self.add_additional_data(data)
|
||||
render_instance_dict = attr.asdict(render_instance)
|
||||
|
||||
instance = context.create_instance(render_instance.name)
|
||||
instance.data["label"] = render_instance.label
|
||||
instance.data.update(render_instance_dict)
|
||||
instance.data.update(data)
|
||||
|
||||
self.post_collecting_action()
|
||||
|
||||
@abstractmethod
|
||||
def get_instances(self, context):
|
||||
"""Get all renderable instances and their data.
|
||||
|
||||
Args:
|
||||
context (pyblish.api.Context): Context object.
|
||||
|
||||
Returns:
|
||||
list of :class:`RenderInstance`: All collected renderable instances
|
||||
(like render layers, write nodes, etc.)
|
||||
|
||||
"""
|
||||
pass
|
||||
|
||||
@abstractmethod
|
||||
def get_expected_files(self, render_instance):
|
||||
"""Get list of expected files.
|
||||
|
||||
Returns:
|
||||
list: expected files. This can be either simple list of files with
|
||||
their paths, or list of dictionaries, where key is name of AOV
|
||||
for example and value is list of files for that AOV.
|
||||
|
||||
Example::
|
||||
|
||||
['/path/to/file.001.exr', '/path/to/file.002.exr']
|
||||
|
||||
or as dictionary:
|
||||
|
||||
[
|
||||
{
|
||||
"beauty": ['/path/to/beauty.001.exr', ...],
|
||||
"mask": ['/path/to/mask.001.exr']
|
||||
}
|
||||
]
|
||||
|
||||
"""
|
||||
pass
|
||||
|
||||
def add_additional_data(self, data):
|
||||
"""Add additional data to collected instance.
|
||||
|
||||
This can be overridden by host implementation to add custom
|
||||
additional data.
|
||||
|
||||
"""
|
||||
return data
|
||||
|
||||
def post_collecting_action(self):
|
||||
"""Execute some code after collection is done.
|
||||
|
||||
This is useful for example for restoring current render layer.
|
||||
|
||||
"""
|
||||
pass
|
||||
__all__ = (
|
||||
"AbstractCollectRender",
|
||||
"RenderInstance"
|
||||
)
|
||||
|
|
|
|||
|
|
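The handle branch above is easy to misread, so to restate it: handles from the publish context are only kept when the instance's render range matches the context's handle-inclusive range (or the check is explicitly ignored); any other range publishes with zero handles. A self-contained restatement of that branch, with a plain dict standing in for the pyblish context (all values illustrative):

```
def resolve_frame_data(context_data, frame_start_render, frame_end_render,
                       ignore_check=False):
    """Mirror of the handle branch above, using a plain dict for clarity."""
    if (ignore_check
            or (int(context_data["frameStartHandle"]) == frame_start_render
                and int(context_data["frameEndHandle"]) == frame_end_render)):
        # Ranges match: keep the context's handles and frame range.
        return {
            "handleStart": context_data["handleStart"],
            "handleEnd": context_data["handleEnd"],
            "frameStart": context_data["frameStart"],
            "frameEnd": context_data["frameEnd"],
        }
    # Ranges do not match: publish the render range with zero handles.
    return {
        "handleStart": 0,
        "handleEnd": 0,
        "frameStart": frame_start_render,
        "frameEnd": frame_end_render,
    }


context_data = {
    "frameStart": 1001, "frameEnd": 1100,
    "handleStart": 10, "handleEnd": 10,
    "frameStartHandle": 991, "frameEndHandle": 1110,
}
# A matching range keeps handles; a custom range drops them.
assert resolve_frame_data(context_data, 991, 1110)["handleStart"] == 10
assert resolve_frame_data(context_data, 1001, 1100)["handleStart"] == 0
```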
@@ -1,53 +1,32 @@
# -*- coding: utf-8 -*-
"""Abstract ExpectedFile class definition."""
from abc import ABCMeta, abstractmethod
import six
"""Content was moved to 'openpype.pipeline.publish.abstract_expected_files'.

Please change your imports as soon as possible.

File will be probably removed in OpenPype 3.14.*
"""

import warnings
from openpype.pipeline.publish import ExpectedFiles


@six.add_metaclass(ABCMeta)
class ExpectedFiles:
    """Class grouping functionality for all supported renderers.

    Attributes:
        multipart (bool): Flag if multipart exrs are used.

    """

    multipart = False

    @abstractmethod
    def get(self, render_instance):
        """Get expected files for given renderer and render layer.

        This method should return dictionary of all files we are expecting
        to be rendered from the host. Usually `render_instance` corresponds
        to *render layer*. Result can be either flat list with the file
        paths or it can be list of dictionaries. Each key corresponds to
        for example AOV name or channel, etc.

        Example::

            ['/path/to/file.001.exr', '/path/to/file.002.exr']

            or as dictionary:

            [
                {
                    "beauty": ['/path/to/beauty.001.exr', ...],
                    "mask": ['/path/to/mask.001.exr']
                }
            ]
class ExpectedFilesDeprecated(DeprecationWarning):
    pass


        Args:
            render_instance (:class:`RenderInstance`): Data passed from
                collector to determine files. This should be instance of
                :class:`abstract_collect_render.RenderInstance`
warnings.simplefilter("always", ExpectedFilesDeprecated)
warnings.warn(
    (
        "Content of 'abstract_expected_files' was moved."
        "\nUsing deprecated source of 'abstract_expected_files'. Content was"
        " move to 'openpype.pipeline.publish.abstract_expected_files'."
        " Please change your imports as soon as possible."
    ),
    category=ExpectedFilesDeprecated,
    stacklevel=4
)

        Returns:
            list: Full paths to expected rendered files.
            list of dict: Path to expected rendered files categorized by
                AOVs, etc.

        """
        raise NotImplementedError()
__all__ = (
    "ExpectedFiles",
)

@@ -1,10 +1,35 @@
from abc import ABCMeta
from pyblish.plugin import MetaPlugin, ExplicitMetaPlugin
"""Content was moved to 'openpype.pipeline.publish.publish_plugins'.

Please change your imports as soon as possible.

File will be probably removed in OpenPype 3.14.*
"""

import warnings
from openpype.pipeline.publish import (
    AbstractMetaInstancePlugin,
    AbstractMetaContextPlugin
)


class AbstractMetaInstancePlugin(ABCMeta, MetaPlugin):
class MetaPluginsDeprecated(DeprecationWarning):
    pass


class AbstractMetaContextPlugin(ABCMeta, ExplicitMetaPlugin):
    pass
warnings.simplefilter("always", MetaPluginsDeprecated)
warnings.warn(
    (
        "Content of 'abstract_metaplugins' was moved."
        "\nUsing deprecated source of 'abstract_metaplugins'. Content was"
        " moved to 'openpype.pipeline.publish.publish_plugins'."
        " Please change your imports as soon as possible."
    ),
    category=MetaPluginsDeprecated,
    stacklevel=4
)


__all__ = (
    "AbstractMetaInstancePlugin",
    "AbstractMetaContextPlugin",
)
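Both relocated modules use the same shim pattern: re-import the public names from their new location, then emit an always-shown custom `DeprecationWarning` subclass at import time. A minimal sketch of what a consumer of the old path observes (the old import path is taken from the plugin hunk further below; exact warning text may differ):

```
import warnings

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    # The old location still resolves, thanks to the re-import in the shim...
    from openpype.lib.abstract_metaplugins import AbstractMetaInstancePlugin

# ...but a MetaPluginsDeprecated warning points callers to the new module.
for warning in caught:
    print(warning.category.__name__, warning.message)
```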
@@ -11,6 +11,10 @@ from abc import ABCMeta, abstractmethod

import six

from openpype.client import (
    get_project,
    get_asset_by_name,
)
from openpype.settings import (
    get_system_settings,
    get_project_settings,

@@ -661,7 +665,11 @@ class ApplicationExecutable:
        if os.path.exists(plist_filepath):
            import plistlib

            parsed_plist = plistlib.readPlist(plist_filepath)
            if hasattr(plistlib, "load"):
                with open(plist_filepath, "rb") as stream:
                    parsed_plist = plistlib.load(stream)
            else:
                parsed_plist = plistlib.readPlist(plist_filepath)
            executable_filename = parsed_plist.get("CFBundleExecutable")

            if executable_filename:

@@ -1310,11 +1318,8 @@ def get_app_environments_for_context(
    dbcon.install()

    # Project document
    project_doc = dbcon.find_one({"type": "project"})
    asset_doc = dbcon.find_one({
        "type": "asset",
        "name": asset_name
    })
    project_doc = get_project(project_name)
    asset_doc = get_asset_by_name(project_name, asset_name)

    if modules_manager is None:
        from openpype.modules import ModulesManager

@@ -14,6 +14,7 @@ class AbstractAttrDefMeta(ABCMeta):

    Each object of `AbtractAttrDef` mus have defined 'key' attribute.
    """

    def __call__(self, *args, **kwargs):
        obj = super(AbstractAttrDefMeta, self).__call__(*args, **kwargs)
        init_class = getattr(obj, "__init__class__", None)

@@ -45,6 +46,7 @@ class AbtractAttrDef:
        is_label_horizontal(bool): UI specific argument. Specify if label is
            next to value input or ahead.
    """

    is_value_def = True

    def __init__(

@@ -77,6 +79,7 @@ class AbtractAttrDef:
        Convert passed value to a valid type. Use default if value can't be
        converted.
        """

        pass


@@ -113,6 +116,7 @@ class UnknownDef(AbtractAttrDef):
    This attribute can be used to keep existing data unchanged but does not
    have known definition of type.
    """

    def __init__(self, key, default=None, **kwargs):
        kwargs["default"] = default
        super(UnknownDef, self).__init__(key, **kwargs)

@@ -204,6 +208,7 @@ class TextDef(AbtractAttrDef):
        placeholder(str): UI placeholder for attribute.
        default(str, None): Default value. Empty string used when not defined.
    """

    def __init__(
        self, key, multiline=None, regex=None, placeholder=None, default=None,
        **kwargs

@@ -531,14 +536,15 @@ class FileDef(AbtractAttrDef):
    Args:
        single_item(bool): Allow only single path item.
        folders(bool): Allow folder paths.
        extensions(list<str>): Allow files with extensions. Empty list will
        extensions(List[str]): Allow files with extensions. Empty list will
            allow all extensions and None will disable files completely.
        default(str, list<str>): Defautl value.
        extensions_label(str): Custom label shown instead of extensions in UI.
        default(str, List[str]): Default value.
    """

    def __init__(
        self, key, single_item=True, folders=None, extensions=None,
        allow_sequences=True, default=None, **kwargs
        allow_sequences=True, extensions_label=None, default=None, **kwargs
    ):
        if folders is None and extensions is None:
            folders = True

@@ -578,6 +584,7 @@ class FileDef(AbtractAttrDef):
        self.folders = folders
        self.extensions = set(extensions)
        self.allow_sequences = allow_sequences
        self.extensions_label = extensions_label
        super(FileDef, self).__init__(key, default=default, **kwargs)

    def __eq__(self, other):
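The `FileDef` change adds an `extensions_label` argument so UIs can show a friendly name instead of the raw extension list. A hedged construction example using only arguments visible in the hunk above (the module path is assumed from the hunk context, and the key is arbitrary):

```
from openpype.lib.attribute_definitions import FileDef

# Single image file; a UI can display "Image file" instead of ".jpg, .png".
thumbnail_def = FileDef(
    "thumbnail",
    single_item=True,
    folders=False,
    extensions=[".jpg", ".png"],
    allow_sequences=False,
    extensions_label="Image file"
)
```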
@@ -7,9 +7,20 @@ import platform
import logging
import collections
import functools
import warnings

from bson.objectid import ObjectId

from openpype.client import (
    get_project,
    get_assets,
    get_asset_by_name,
    get_subset_by_name,
    get_subsets,
    get_last_versions,
    get_last_version_by_subset_id,
    get_last_version_by_subset_name,
    get_representations,
    get_workfile_info,
)
from openpype.settings import (
    get_project_settings,
    get_system_settings

@@ -35,6 +46,51 @@ PROJECT_NAME_REGEX = re.compile(
)


class AvalonContextDeprecatedWarning(DeprecationWarning):
    pass


def deprecated(new_destination):
    """Mark functions as deprecated.

    It will result in a warning being emitted when the function is used.
    """

    func = None
    if callable(new_destination):
        func = new_destination
        new_destination = None

    def _decorator(decorated_func):
        if new_destination is None:
            warning_message = (
                " Please check content of deprecated function to figure out"
                " possible replacement."
            )
        else:
            warning_message = " Please replace your usage with '{}'.".format(
                new_destination
            )

        @functools.wraps(decorated_func)
        def wrapper(*args, **kwargs):
            warnings.simplefilter("always", AvalonContextDeprecatedWarning)
            warnings.warn(
                (
                    "Call to deprecated function '{}'"
                    "\nFunction was moved or removed.{}"
                ).format(decorated_func.__name__, warning_message),
                category=AvalonContextDeprecatedWarning,
                stacklevel=4
            )
            return decorated_func(*args, **kwargs)
        return wrapper

    if func is None:
        return _decorator
    return _decorator(func)


def create_project(
    project_name, project_code, library_project=False, dbcon=None
):
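The decorator supports both bare and parametrized use, because it first checks whether its argument is already a callable. A short sketch, assuming `deprecated` from the hunk above is in scope (the decorated functions are illustrative, not real API):

```
# Parametrized: the emitted warning names the replacement explicitly.
@deprecated("openpype.client.get_asset_by_name")
def get_asset_document(asset_name):
    ...

# Bare: the warning asks the caller to inspect the function body instead.
@deprecated
def legacy_helper():
    ...

# Either call now emits AvalonContextDeprecatedWarning before delegating
# to the original function body.
legacy_helper()
```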
@@ -64,6 +120,11 @@ def create_project(
    from openpype.pipeline import AvalonMongoDB
    from openpype.pipeline.schema import validate

    if get_project(project_name, fields=["name"]):
        raise ValueError("Project with name \"{}\" already exists".format(
            project_name
        ))

    if dbcon is None:
        dbcon = AvalonMongoDB()

@@ -73,15 +134,6 @@
    ).format(project_name))

    database = dbcon.database
    project_doc = database[project_name].find_one(
        {"type": "project"},
        {"name": 1}
    )
    if project_doc:
        raise ValueError("Project with name \"{}\" already exists".format(
            project_name
        ))

    project_doc = {
        "type": "project",
        "name": project_name,

@@ -104,7 +156,7 @@
        database[project_name].delete_one({"type": "project"})
        raise

    project_doc = database[project_name].find_one({"type": "project"})
    project_doc = get_project(project_name)

    try:
        # Validate created project document

@@ -127,7 +179,7 @@ def with_pipeline_io(func):
    return wrapped


@with_pipeline_io
@deprecated("openpype.pipeline.context_tools.is_representation_from_latest")
def is_latest(representation):
    """Return whether the representation is from latest version

@@ -136,135 +188,43 @@ def is_latest(representation):

    Returns:
        bool: Whether the representation is of latest version.

    """

    version = legacy_io.find_one({"_id": representation['parent']})
    if version["type"] == "hero_version":
        return True
    from openpype.pipeline.context_tools import is_representation_from_latest

    # Get highest version under the parent
    highest_version = legacy_io.find_one({
        "type": "version",
        "parent": version["parent"]
    }, sort=[("name", -1)], projection={"name": True})

    if version['name'] == highest_version['name']:
        return True
    else:
        return False
    return is_representation_from_latest(representation)


@with_pipeline_io
@deprecated("openpype.pipeline.load.any_outdated_containers")
def any_outdated():
    """Return whether the current scene has any outdated content"""
    from openpype.pipeline import registered_host

    checked = set()
    host = registered_host()
    for container in host.ls():
        representation = container['representation']
        if representation in checked:
            continue
    from openpype.pipeline.load import any_outdated_containers

        representation_doc = legacy_io.find_one(
            {
                "_id": ObjectId(representation),
                "type": "representation"
            },
            projection={"parent": True}
        )
        if representation_doc and not is_latest(representation_doc):
            return True
        elif not representation_doc:
            log.debug("Container '{objectName}' has an invalid "
                      "representation, it is missing in the "
                      "database".format(**container))

        checked.add(representation)

    return False
    return any_outdated_containers()


@with_pipeline_io
@deprecated("openpype.pipeline.context_tools.get_current_project_asset")
def get_asset(asset_name=None):
    """ Returning asset document from database by its name.

        Doesn't count with duplicities on asset names!

    Args:
        asset_name (str)

    Returns:
        (MongoDB document)
    """
    if not asset_name:
        asset_name = legacy_io.Session["AVALON_ASSET"]

    asset_document = legacy_io.find_one({
        "name": asset_name,
        "type": "asset"
    })

    if not asset_document:
        raise TypeError("Entity \"{}\" was not found in DB".format(asset_name))

    return asset_document


@with_pipeline_io
def get_hierarchy(asset_name=None):
    """
    Obtain asset hierarchy path string from mongo db
    Doesn't count with duplicities on asset names!

    Args:
        asset_name (str)

    Returns:
        (string): asset hierarchy path

        (MongoDB document)
    """
    if not asset_name:
        asset_name = legacy_io.Session.get(
            "AVALON_ASSET",
            os.environ["AVALON_ASSET"]
        )

    asset_entity = legacy_io.find_one({
        "type": 'asset',
        "name": asset_name
    })
    from openpype.pipeline.context_tools import get_current_project_asset

    not_set = "PARENTS_NOT_SET"
    entity_parents = asset_entity.get("data", {}).get("parents", not_set)

    # If entity already have parents then just return joined
    if entity_parents != not_set:
        return "/".join(entity_parents)

    # Else query parents through visualParents and store result to entity
    hierarchy_items = []
    entity = asset_entity
    while True:
        parent_id = entity.get("data", {}).get("visualParent")
        if not parent_id:
            break
        entity = legacy_io.find_one({"_id": parent_id})
        hierarchy_items.append(entity["name"])

    # Add parents to entity data for next query
    entity_data = asset_entity.get("data", {})
    entity_data["parents"] = hierarchy_items
    legacy_io.update_many(
        {"_id": asset_entity["_id"]},
        {"$set": {"data": entity_data}}
    )

    return "/".join(hierarchy_items)
    return get_current_project_asset(asset_name=asset_name)


def get_system_general_anatomy_data():
    system_settings = get_system_settings()
def get_system_general_anatomy_data(system_settings=None):
    if not system_settings:
        system_settings = get_system_settings()
    studio_name = system_settings["general"]["studio_name"]
    studio_code = system_settings["general"]["studio_code"]
    return {
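The new optional parameter lets callers reuse already-fetched settings instead of triggering another settings query. A sketch, assuming the module path from the surrounding hunks:

```
from openpype.lib.avalon_context import get_system_general_anatomy_data
from openpype.settings import get_system_settings

# Settings fetched once can be shared between several calls.
system_settings = get_system_settings()
anatomy_data = get_system_general_anatomy_data(system_settings)

# Equivalent, with the function fetching settings itself:
anatomy_data = get_system_general_anatomy_data()
```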
@@ -312,14 +272,16 @@ def get_linked_assets(asset_doc):
    Returns:
        (list) Asset documents of input links for passed asset doc.
    """

    link_ids = get_linked_asset_ids(asset_doc)
    if not link_ids:
        return []

    return list(legacy_io.find({"_id": {"$in": link_ids}}))
    project_name = legacy_io.active_project()
    return list(get_assets(project_name, link_ids))


@with_pipeline_io
@deprecated("openpype.client.get_last_version_by_subset_name")
def get_latest_version(asset_name, subset_name, dbcon=None, project_name=None):
    """Retrieve latest version from `asset_name`, and `subset_name`.

@@ -338,57 +300,20 @@ def get_latest_version(asset_name, subset_name, dbcon=None, project_name=None):
        dict: Last version document for entered .
    """

    if not dbcon:
        log.debug("Using `legacy_io` for query.")
        dbcon = legacy_io
        # Make sure is installed
        dbcon.install()
    if not project_name:
        if not dbcon:
            from openpype.pipeline import legacy_io

    if project_name and project_name != dbcon.Session.get("AVALON_PROJECT"):
        # `legacy_io` has only `_database` attribute
        # but `AvalonMongoDB` has `database`
        database = getattr(dbcon, "database", dbcon._database)
        collection = database[project_name]
    else:
        project_name = dbcon.Session.get("AVALON_PROJECT")
        collection = dbcon
            log.debug("Using `legacy_io` for query.")
            dbcon = legacy_io
            # Make sure is installed
            dbcon.install()

    log.debug((
        "Getting latest version for Project: \"{}\" Asset: \"{}\""
        " and Subset: \"{}\""
    ).format(project_name, asset_name, subset_name))
        project_name = dbcon.active_project()

    # Query asset document id by asset name
    asset_doc = collection.find_one(
        {"type": "asset", "name": asset_name},
        {"_id": True}
    return get_last_version_by_subset_name(
        project_name, subset_name, asset_name=asset_name
    )
    if not asset_doc:
        log.info(
            "Asset \"{}\" was not found in Database.".format(asset_name)
        )
        return None

    subset_doc = collection.find_one(
        {"type": "subset", "name": subset_name, "parent": asset_doc["_id"]},
        {"_id": True}
    )
    if not subset_doc:
        log.info(
            "Subset \"{}\" was not found in Database.".format(subset_name)
        )
        return None

    version_doc = collection.find_one(
        {"type": "version", "parent": subset_doc["_id"]},
        sort=[("name", -1)],
    )
    if not version_doc:
        log.info(
            "Subset \"{}\" does not have any version yet.".format(subset_name)
        )
        return None
    return version_doc


def get_workfile_template_key_from_context(
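The replacement helper collapses the old asset → subset → version query chain into a single client call; a usage sketch with placeholder names:

```
from openpype.client import get_last_version_by_subset_name

version_doc = get_last_version_by_subset_name(
    "my_project",             # placeholder project name
    "modelMain",              # placeholder subset name
    asset_name="characterA"   # placeholder asset name
)
if version_doc:
    # The version document keeps the same shape as the old query result.
    print(version_doc["name"])
```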
@@ -419,28 +344,17 @@ def get_workfile_template_key_from_context(
        ValueError: When both 'dbcon' and 'project_name' were not
            passed.
    """
    if not dbcon:
        if not project_name:
    if not project_name:
        if not dbcon:
            raise ValueError((
                "`get_workfile_template_key_from_context` requires to pass"
                " one of 'dbcon' or 'project_name' arguments."
            ))
        from openpype.pipeline import AvalonMongoDB

        dbcon = AvalonMongoDB()
        dbcon.Session["AVALON_PROJECT"] = project_name
        project_name = dbcon.active_project()

    elif not project_name:
        project_name = dbcon.Session["AVALON_PROJECT"]

    asset_doc = dbcon.find_one(
        {
            "type": "asset",
            "name": asset_name
        },
        {
            "data.tasks": 1
        }
    asset_doc = get_asset_by_name(
        project_name, asset_name, fields=["data.tasks"]
    )
    asset_tasks = asset_doc.get("data", {}).get("tasks") or {}
    task_info = asset_tasks.get(task_name) or {}

@@ -604,10 +518,10 @@ def get_workdir_with_workdir_data(

    anatomy_filled = anatomy.format(workdir_data)
    # Output is TemplateResult object which contain useful data
    path = anatomy_filled[template_key]["folder"]
    if path:
        path = os.path.normpath(path)
    return path
    output = anatomy_filled[template_key]["folder"]
    if output:
        return output.normalized()
    return output


def get_workdir(

@@ -637,6 +551,7 @@ def get_workdir(
    Returns:
        TemplateResult: Workdir path.
    """

    if not anatomy:
        from openpype.pipeline import Anatomy
        anatomy = Anatomy(project_doc["name"])

@@ -665,15 +580,11 @@ def template_data_from_session(session=None):
        session = legacy_io.Session

    project_name = session["AVALON_PROJECT"]
    project_doc = legacy_io.database[project_name].find_one({
        "type": "project"
    })
    asset_doc = legacy_io.database[project_name].find_one({
        "type": "asset",
        "name": session["AVALON_ASSET"]
    })
    asset_name = session["AVALON_ASSET"]
    task_name = session["AVALON_TASK"]
    host_name = session["AVALON_APP"]
    project_doc = get_project(project_name)
    asset_doc = get_asset_by_name(project_name, asset_name)
    return get_workdir_data(project_doc, asset_doc, task_name, host_name)


@@ -698,8 +609,8 @@ def compute_session_changes(

    Returns:
        dict: The required changes in the Session dictionary.

    """

    changes = dict()

    # If no changes, return directly

@@ -717,12 +628,9 @@

    if not asset_document or not asset_tasks:
        # Assume asset name
        asset_document = legacy_io.find_one(
            {
                "name": asset,
                "type": "asset"
            },
            {"data.tasks": True}
        project_name = session["AVALON_PROJECT"]
        asset_document = get_asset_by_name(
            project_name, asset, fields=["data.tasks"]
        )
        assert asset_document, "Asset must exist"


@@ -819,6 +727,7 @@ def update_current_task(task=None, asset=None, app=None, template_key=None):


@with_pipeline_io
@deprecated("openpype.client.get_workfile_info")
def get_workfile_doc(asset_id, task_name, filename, dbcon=None):
    """Return workfile document for entered context.

@@ -835,16 +744,13 @@ def get_workfile_doc(asset_id, task_name, filename, dbcon=None):
    Returns:
        dict: Workfile document or None.
    """

    # Use legacy_io if dbcon is not entered
    if not dbcon:
        dbcon = legacy_io

    return dbcon.find_one({
        "type": "workfile",
        "parent": asset_id,
        "task_name": task_name,
        "filename": filename
    })
    project_name = dbcon.active_project()
    return get_workfile_info(project_name, asset_id, task_name, filename)


@with_pipeline_io

@@ -879,12 +785,13 @@ def create_workfile_doc(asset_doc, task_name, filename, workdir, dbcon=None):
    doc_data = copy.deepcopy(doc_filter)

    # Prepare project for workdir data
    project_doc = dbcon.find_one({"type": "project"})
    project_name = dbcon.active_project()
    project_doc = get_project(project_name)
    workdir_data = get_workdir_data(
        project_doc, asset_doc, task_name, dbcon.Session["AVALON_APP"]
    )
    # Prepare anatomy
    anatomy = Anatomy(project_doc["name"])
    anatomy = Anatomy(project_name)
    # Get workdir path (result is anatomy.TemplateResult)
    template_workdir = get_workdir_with_workdir_data(
        workdir_data, anatomy

@@ -999,12 +906,11 @@ class BuildWorkfile:
        from openpype.pipeline import discover_loader_plugins

        # Get current asset name and entity
        project_name = legacy_io.active_project()
        current_asset_name = legacy_io.Session["AVALON_ASSET"]
        current_asset_entity = legacy_io.find_one({
            "type": "asset",
            "name": current_asset_name
        })

        current_asset_entity = get_asset_by_name(
            project_name, current_asset_name
        )
        # Skip if asset was not found
        if not current_asset_entity:
            print("Asset entity with name `{}` was not found".format(
@@ -1509,7 +1415,7 @@ class BuildWorkfile:
        return loaded_containers

    @with_pipeline_io
    def _collect_last_version_repres(self, asset_entities):
    def _collect_last_version_repres(self, asset_docs):
        """Collect subsets, versions and representations for asset_entities.

        Args:

@@ -1542,64 +1448,56 @@ class BuildWorkfile:
        ```
        """

        if not asset_entities:
            return {}
        output = {}
        if not asset_docs:
            return output

        asset_entity_by_ids = {asset["_id"]: asset for asset in asset_entities}
        asset_docs_by_ids = {asset["_id"]: asset for asset in asset_docs}

        subsets = list(legacy_io.find({
            "type": "subset",
            "parent": {"$in": list(asset_entity_by_ids.keys())}
        }))
        project_name = legacy_io.active_project()
        subsets = list(get_subsets(
            project_name, asset_ids=asset_docs_by_ids.keys()
        ))
        subset_entity_by_ids = {subset["_id"]: subset for subset in subsets}

        sorted_versions = list(legacy_io.find({
            "type": "version",
            "parent": {"$in": list(subset_entity_by_ids.keys())}
        }).sort("name", -1))
        last_version_by_subset_id = get_last_versions(
            project_name, subset_entity_by_ids.keys()
        )
        last_version_docs_by_id = {
            version["_id"]: version
            for version in last_version_by_subset_id.values()
        }
        repre_docs = get_representations(
            project_name, version_ids=last_version_docs_by_id.keys()
        )

        subset_id_with_latest_version = []
        last_versions_by_id = {}
        for version in sorted_versions:
            subset_id = version["parent"]
            if subset_id in subset_id_with_latest_version:
                continue
            subset_id_with_latest_version.append(subset_id)
            last_versions_by_id[version["_id"]] = version
        for repre_doc in repre_docs:
            version_id = repre_doc["parent"]
            version_doc = last_version_docs_by_id[version_id]

        repres = legacy_io.find({
            "type": "representation",
            "parent": {"$in": list(last_versions_by_id.keys())}
        })
            subset_id = version_doc["parent"]
            subset_doc = subset_entity_by_ids[subset_id]

        output = {}
        for repre in repres:
            version_id = repre["parent"]
            version = last_versions_by_id[version_id]

            subset_id = version["parent"]
            subset = subset_entity_by_ids[subset_id]

            asset_id = subset["parent"]
            asset = asset_entity_by_ids[asset_id]
            asset_id = subset_doc["parent"]
            asset_doc = asset_docs_by_ids[asset_id]

            if asset_id not in output:
                output[asset_id] = {
                    "asset_entity": asset,
                    "asset_entity": asset_doc,
                    "subsets": {}
                }

            if subset_id not in output[asset_id]["subsets"]:
                output[asset_id]["subsets"][subset_id] = {
                    "subset_entity": subset,
                    "subset_entity": subset_doc,
                    "version": {
                        "version_entity": version,
                        "version_entity": version_doc,
                        "repres": []
                    }
                }

            output[asset_id]["subsets"][subset_id]["version"]["repres"].append(
                repre
                repre_doc
            )

        return output

@@ -1807,35 +1705,19 @@ def get_custom_workfile_template_by_string_context(
        context. (Existence of formatted path is not validated.)
    """

    if dbcon is None:
        from openpype.pipeline import AvalonMongoDB
    project_name = None
    if anatomy is not None:
        project_name = anatomy.project_name

        dbcon = AvalonMongoDB()
    if not project_name and dbcon is not None:
        project_name = dbcon.active_project()

    dbcon.install()
    if not project_name:
        raise ValueError("Can't determina project")

    if dbcon.Session["AVALON_PROJECT"] != project_name:
        dbcon.Session["AVALON_PROJECT"] = project_name

    project_doc = dbcon.find_one(
        {"type": "project"},
        # All we need is "name" and "data.code" keys
        {
            "name": 1,
            "data.code": 1
        }
    )
    asset_doc = dbcon.find_one(
        {
            "type": "asset",
            "name": asset_name
        },
        # All we need is "name" and "data.tasks" keys
        {
            "name": 1,
            "data.tasks": 1
        }
    )
    project_doc = get_project(project_name, fields=["name", "data.code"])
    asset_doc = get_asset_by_name(
        project_name, asset_name, fields=["name", "data.tasks"])

    return get_custom_workfile_template_by_context(
        template_profiles, project_doc, asset_doc, task_name, anatomy
@@ -11,6 +11,10 @@ except Exception:
    from openpype.lib.python_2_comp import WeakMethod


class MissingEventSystem(Exception):
    pass


class EventCallback(object):
    """Callback registered to a topic.

@@ -176,16 +180,20 @@ class Event(object):
        topic (str): Identifier of event.
        data (Any): Data specific for event. Dictionary is recommended.
        source (str): Identifier of source.
        event_system (EventSystem): Event system in which can be event
            triggered.
    """

    _data = {}

    def __init__(self, topic, data=None, source=None):
    def __init__(self, topic, data=None, source=None, event_system=None):
        self._id = str(uuid4())
        self._topic = topic
        if data is None:
            data = {}
        self._data = data
        self._source = source
        self._event_system = event_system

    def __getitem__(self, key):
        return self._data[key]

@@ -211,28 +219,118 @@ class Event(object):

    def emit(self):
        """Emit event and trigger callbacks."""
        StoredCallbacks.emit_event(self)
        if self._event_system is None:
            raise MissingEventSystem(
                "Can't emit event {}. Does not have set event system.".format(
                    str(repr(self))
                )
            )
        self._event_system.emit_event(self)


class StoredCallbacks:
    _registered_callbacks = []
class EventSystem(object):
    """Encapsulate event handling into an object.

    System wraps registered callbacks and triggered events into single object
    so it is possible to create mutltiple independent systems that have their
    topics and callbacks.

    """

    def __init__(self):
        self._registered_callbacks = []

    def add_callback(self, topic, callback):
        """Register callback in event system.

        Args:
            topic (str): Topic for EventCallback.
            callback (Callable): Function or method that will be called
                when topic is triggered.

        Returns:
            EventCallback: Created callback object which can be used to
                stop listening.
        """

    @classmethod
    def add_callback(cls, topic, callback):
        callback = EventCallback(topic, callback)
        cls._registered_callbacks.append(callback)
        self._registered_callbacks.append(callback)
        return callback

    @classmethod
    def emit_event(cls, event):
    def create_event(self, topic, data, source):
        """Create new event which is bound to event system.

        Args:
            topic (str): Event topic.
            data (dict): Data related to event.
            source (str): Source of event.

        Returns:
            Event: Object of event.
        """

        return Event(topic, data, source, self)

    def emit(self, topic, data, source):
        """Create event based on passed data and emit it.

        This is easiest way how to trigger event in an event system.

        Args:
            topic (str): Event topic.
            data (dict): Data related to event.
            source (str): Source of event.

        Returns:
            Event: Created and emitted event.
        """

        event = self.create_event(topic, data, source)
        event.emit()
        return event

    def emit_event(self, event):
        """Emit event object.

        Args:
            event (Event): Prepared event with topic and data.
        """

        invalid_callbacks = []
        for callback in cls._registered_callbacks:
        for callback in self._registered_callbacks:
            callback.process_event(event)
            if not callback.is_ref_valid:
                invalid_callbacks.append(callback)

        for callback in invalid_callbacks:
            cls._registered_callbacks.remove(callback)
            self._registered_callbacks.remove(callback)


class GlobalEventSystem:
    """Event system living in global scope of process.

    This is primarily used in host implementation to trigger events
    related to DCC changes or changes of context in the host implementation.
    """

    _global_event_system = None

    @classmethod
    def get_global_event_system(cls):
        if cls._global_event_system is None:
            cls._global_event_system = EventSystem()
        return cls._global_event_system

    @classmethod
    def add_callback(cls, topic, callback):
        event_system = cls.get_global_event_system()
        return event_system.add_callback(topic, callback)

    @classmethod
    def emit(cls, topic, data, source):
        event_system = cls.get_global_event_system()
        return event_system.emit(topic, data, source)


def register_event_callback(topic, callback):

@@ -249,7 +347,8 @@ def register_event_callback(topic, callback):
        enable/disable listening to a topic or remove the callback from
        the topic completely.
    """
    return StoredCallbacks.add_callback(topic, callback)

    return GlobalEventSystem.add_callback(topic, callback)


def emit_event(topic, data=None, source=None):

@@ -263,6 +362,5 @@ def emit_event(topic, data=None, source=None):
    Returns:
        Event: Object of event that was emitted.
    """
    event = Event(topic, data, source)
    event.emit()
    return event

    return GlobalEventSystem.emit(topic, data, source)
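Tying the pieces together: an `EventSystem` owns its callbacks, `emit()` builds and dispatches an `Event` bound to that system, and the module-level helpers now proxy to a lazily created global system. A minimal sketch (the import path is assumed from the hunk context):

```
from openpype.lib.events import EventSystem

event_system = EventSystem()

def on_workfile_saved(event):
    # Event data is available through item access.
    print("Workfile saved:", event["path"])

# Keep the returned EventCallback to be able to stop listening later.
callback = event_system.add_callback("workfile.saved", on_workfile_saved)

event_system.emit(
    "workfile.saved",
    {"path": "/projects/demo/scene_v001.ma"},  # illustrative data
    "example.source"
)
```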
openpype/lib/file_transaction.py (new file)
@@ -0,0 +1,171 @@
import os
import logging
import sys
import errno
import six

from openpype.lib import create_hard_link

# this is needed until speedcopy for linux is fixed
if sys.platform == "win32":
    from speedcopy import copyfile
else:
    from shutil import copyfile


class FileTransaction(object):
    """

    The file transaction is a three step process.

    1) Rename any existing files to a "temporary backup" during `process()`
    2) Copy the files to final destination during `process()`
    3) Remove any backed up files (*no rollback possible!) during `finalize()`

    Step 3 is done during `finalize()`. If not called the .bak files will
    remain on disk.

    These steps try to ensure that we don't overwrite half of any existing
    files e.g. if they are currently in use.

    Note:
        A regular filesystem is *not* a transactional file system and even
        though this implementation tries to produce a 'safe copy' with a
        potential rollback do keep in mind that it's inherently unsafe due
        to how filesystem works and a myriad of things could happen during
        the transaction that break the logic. A file storage could go down,
        permissions could be changed, other machines could be moving or writing
        files. A lot can happen.

    Warning:
        Any folders created during the transfer will not be removed.

    """

    MODE_COPY = 0
    MODE_HARDLINK = 1

    def __init__(self, log=None):

        if log is None:
            log = logging.getLogger("FileTransaction")

        self.log = log

        # The transfer queue
        # todo: make this an actual FIFO queue?
        self._transfers = {}

        # Destination file paths that a file was transferred to
        self._transferred = []

        # Backup file location mapping to original locations
        self._backup_to_original = {}

    def add(self, src, dst, mode=MODE_COPY):
        """Add a new file to transfer queue"""
        opts = {"mode": mode}

        src = os.path.abspath(src)
        dst = os.path.abspath(dst)

        if dst in self._transfers:
            queued_src = self._transfers[dst][0]
            if src == queued_src:
                self.log.debug("File transfer was already "
                               "in queue: {} -> {}".format(src, dst))
                return
            else:
                self.log.warning("File transfer in queue replaced..")
                self.log.debug("Removed from queue: "
                               "{} -> {}".format(queued_src, dst))
        self.log.debug("Added to queue: {} -> {}".format(src, dst))

        self._transfers[dst] = (src, opts)

    def process(self):

        # Backup any existing files
        for dst in self._transfers.keys():
            if os.path.exists(dst):
                # Backup original file
                # todo: add timestamp or uuid to ensure unique
                backup = dst + ".bak"
                self._backup_to_original[backup] = dst
                self.log.debug("Backup existing file: "
                               "{} -> {}".format(dst, backup))
                os.rename(dst, backup)

        # Copy the files to transfer
        for dst, (src, opts) in self._transfers.items():
            self._create_folder_for_file(dst)

            if opts["mode"] == self.MODE_COPY:
                self.log.debug("Copying file ... {} -> {}".format(src, dst))
                copyfile(src, dst)
            elif opts["mode"] == self.MODE_HARDLINK:
                self.log.debug("Hardlinking file ... {} -> {}".format(src,
                                                                      dst))
                create_hard_link(src, dst)

            self._transferred.append(dst)

    def finalize(self):
        # Delete any backed up files
        for backup in self._backup_to_original.keys():
            try:
                os.remove(backup)
            except OSError:
                self.log.error("Failed to remove backup file: "
                               "{}".format(backup),
                               exc_info=True)

    def rollback(self):

        errors = 0

        # Rollback any transferred files
        for path in self._transferred:
            try:
                os.remove(path)
            except OSError:
                errors += 1
                self.log.error("Failed to rollback created file: "
                               "{}".format(path),
                               exc_info=True)

        # Rollback the backups
        for backup, original in self._backup_to_original.items():
            try:
                os.rename(backup, original)
            except OSError:
                errors += 1
                self.log.error("Failed to restore original file: "
                               "{} -> {}".format(backup, original),
                               exc_info=True)

        if errors:
            self.log.error("{} errors occurred during "
                           "rollback.".format(errors), exc_info=True)
            six.reraise(*sys.exc_info())

    @property
    def transferred(self):
        """Return the processed transfers destination paths"""
        return list(self._transferred)

    @property
    def backups(self):
        """Return the backup file paths"""
        return list(self._backup_to_original.keys())

    def _create_folder_for_file(self, path):
        dirname = os.path.dirname(path)
        try:
            os.makedirs(dirname)
        except OSError as e:
            if e.errno == errno.EEXIST:
                pass
            else:
                self.log.critical("An unexpected error occurred.")
                six.reraise(*sys.exc_info())
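The docstring above implies a specific call pattern: queue transfers with `add()`, `process()` them, then `finalize()` on success or `rollback()` on failure. A sketch with placeholder paths:

```
from openpype.lib.file_transaction import FileTransaction

transaction = FileTransaction()
transaction.add("/tmp/render/beauty.0001.exr",
                "/projects/demo/publish/beauty.0001.exr")
transaction.add("/tmp/render/beauty.0002.exr",
                "/projects/demo/publish/beauty.0002.exr",
                mode=FileTransaction.MODE_HARDLINK)

try:
    transaction.process()   # back up existing files, then copy/hardlink
    transaction.finalize()  # remove the .bak backups
except Exception:
    transaction.rollback()  # restore backups, remove transferred files
    raise
```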
@@ -409,6 +409,19 @@ class TemplateResult(str):
            self.invalid_types
        )

    def normalized(self):
        """Convert to normalized path."""

        cls = self.__class__
        return cls(
            os.path.normpath(self),
            self.template,
            self.solved,
            self.used_values,
            self.missing_keys,
            self.invalid_types
        )


class TemplatesResultDict(dict):
    """Holds and wrap TemplateResults for easy bug report."""
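The point of `normalized()` returning a new `TemplateResult` rather than a plain string is that template metadata (template, used values, missing keys) survives normalization, which the old `os.path.normpath` call in the workdir hunk discarded. A sketch, assuming `anatomy_filled` comes from `Anatomy.format(...)` and the template key is illustrative:

```
import os

output = anatomy_filled["work"]["folder"]  # TemplateResult, a str subclass
normalized = output.normalized()

assert isinstance(normalized, type(output))            # metadata kept
assert str(normalized) == os.path.normpath(str(output))  # same path value
```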
@@ -6,10 +6,10 @@ import logging
import re
import json

from .profiles_filtering import filter_profiles

from openpype.client import get_asset_by_id
from openpype.settings import get_project_settings

from .profiles_filtering import filter_profiles

log = logging.getLogger(__name__)


@@ -135,24 +135,17 @@ def get_subset_name(
    This is legacy function should be replaced with
    `get_subset_name_with_asset_doc` where asset document is expected.
    """
    if dbcon is None:
        from openpype.pipeline import AvalonMongoDB

        dbcon = AvalonMongoDB()
        dbcon.Session["AVALON_PROJECT"] = project_name
    if project_name is None:
        project_name = dbcon.project_name

    dbcon.install()

    asset_doc = dbcon.find_one(
        {"_id": asset_id},
        {"data.tasks": True}
    ) or {}
    asset_doc = get_asset_by_id(project_name, asset_id, fields=["data.tasks"])

    return get_subset_name_with_asset_doc(
        family,
        variant,
        task_name,
        asset_doc,
        asset_doc or {},
        project_name,
        host_name,
        default_template,

@@ -24,7 +24,10 @@ from bson.json_util import (
    dumps,
    CANONICAL_JSON_OPTIONS
)

from openpype.client import (
    get_project,
    get_whole_project,
)
from openpype.pipeline import AvalonMongoDB

DOCUMENTS_FILE_NAME = "database"

@@ -50,14 +53,12 @@ def pack_project(project_name, destination_dir=None):

    Args:
        project_name(str): Project that should be packaged.
        destination_dir(str): Optinal path where zip will be stored. Project's
        destination_dir(str): Optional path where zip will be stored. Project's
            root is used if not passed.
    """
    print("Creating package of project \"{}\"".format(project_name))
    # Validate existence of project
    dbcon = AvalonMongoDB()
    dbcon.Session["AVALON_PROJECT"] = project_name
    project_doc = dbcon.find_one({"type": "project"})
    project_doc = get_project(project_name)
    if not project_doc:
        raise ValueError("Project \"{}\" was not found in database".format(
            project_name

@@ -118,7 +119,7 @@ def pack_project(project_name, destination_dir=None):
        temp_docs_json = s.name

    # Query all project documents and store them to temp json
    docs = list(dbcon.find({}))
    docs = list(get_whole_project(project_name))
    data = dumps(
        docs, json_options=CANONICAL_JSON_OPTIONS
    )

@@ -147,7 +148,7 @@ def pack_project(project_name, destination_dir=None):
    # Cleanup
    os.remove(temp_docs_json)
    os.remove(temp_metadata_json)
    dbcon.uninstall()

    print("*** Packing finished ***")


@@ -207,7 +208,7 @@ def unpack_project(path_to_zip, new_root=None):
        print("Using different root path {}".format(new_root))
        root_path = new_root

    project_doc = collection.find_one({"type": "project"})
    project_doc = get_project(project_name)
    roots = project_doc["config"]["roots"]
    key = tuple(roots.keys())[0]
    update_key = "config.roots.{}.{}".format(key, low_platform)
@@ -8,10 +8,8 @@ except ImportError:
    # Allow to fall back on Multiverse 6.3.0+ pxr usd library
    from mvpxr import Usd, UsdGeom, Sdf, Kind

from openpype.pipeline import (
    registered_root,
    legacy_io,
)
from openpype.client import get_project, get_asset_by_name
from openpype.pipeline import legacy_io, Anatomy

log = logging.getLogger(__name__)


@@ -128,7 +126,8 @@ def create_model(filename, asset, variant_subsets):

    """

    asset_doc = legacy_io.find_one({"name": asset, "type": "asset"})
    project_name = legacy_io.active_project()
    asset_doc = get_asset_by_name(project_name, asset)
    assert asset_doc, "Asset not found: %s" % asset

    variants = []

@@ -178,7 +177,8 @@ def create_shade(filename, asset, variant_subsets):

    """

    asset_doc = legacy_io.find_one({"name": asset, "type": "asset"})
    project_name = legacy_io.active_project()
    asset_doc = get_asset_by_name(project_name, asset)
    assert asset_doc, "Asset not found: %s" % asset

    variants = []

@@ -213,7 +213,8 @@ def create_shade_variation(filename, asset, model_variant, shade_variants):

    """

    asset_doc = legacy_io.find_one({"name": asset, "type": "asset"})
    project_name = legacy_io.active_project()
    asset_doc = get_asset_by_name(project_name, asset)
    assert asset_doc, "Asset not found: %s" % asset

    variants = []

@@ -313,21 +314,25 @@ def get_usd_master_path(asset, subset, representation):

    """

    project = legacy_io.find_one(
        {"type": "project"}, projection={"config.template.publish": True}
    project_name = legacy_io.active_project()
    anatomy = Anatomy(project_name)
    project_doc = get_project(
        project_name,
        fields=["name", "data.code"]
    )
    template = project["config"]["template"]["publish"]

    if isinstance(asset, dict) and "name" in asset:
        # Allow explicitly passing asset document
        asset_doc = asset
    else:
        asset_doc = legacy_io.find_one({"name": asset, "type": "asset"})
        asset_doc = get_asset_by_name(project_name, asset, fields=["name"])

    path = template.format(
        **{
            "root": registered_root(),
            "project": legacy_io.Session["AVALON_PROJECT"],
    formatted_result = anatomy.format(
        {
            "project": {
                "name": project_name,
                "code": project_doc.get("data", {}).get("code")
            },
            "asset": asset_doc["name"],
            "subset": subset,
            "representation": representation,

@@ -335,6 +340,7 @@ def get_usd_master_path(asset, subset, representation):
        }
    )

    path = formatted_result["publish"]["path"]
    # Remove the version folder
    subset_folder = os.path.dirname(os.path.dirname(path))
    master_folder = os.path.join(subset_folder, "master")
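`Anatomy.format` replaces the raw `str.format` on the publish template: roots are resolved for the current platform and the result is a nested mapping keyed by template name. A hedged sketch of the pattern (field values are placeholders, and the exact key set depends on the project's publish template):

```
from openpype.pipeline import Anatomy

anatomy = Anatomy("my_project")  # placeholder project name
formatted_result = anatomy.format({
    "project": {"name": "my_project", "code": "mprj"},
    "asset": "characterA",
    "subset": "usdAsset",
    "representation": "usd",
})
publish_path = formatted_result["publish"]["path"]
```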
@@ -49,6 +49,7 @@ class _ModuleClass(object):
    Object of this class can be stored to `sys.modules` and used for storing
    dynamically imported modules.
    """

    def __init__(self, name):
        # Call setattr on super class
        super(_ModuleClass, self).__setattr__("name", name)

@@ -116,12 +117,13 @@ class _InterfacesClass(_ModuleClass):
    - this is because interfaces must be available even if are missing
        implementation
    """

    def __getattr__(self, attr_name):
        if attr_name not in self.__attributes__:
            if attr_name in ("__path__", "__file__"):
                return None

            raise ImportError((
            raise AttributeError((
                "cannot import name '{}' from 'openpype_interfaces'"
            ).format(attr_name))


@@ -15,7 +15,7 @@ import attr
import requests

import pyblish.api
from openpype.lib.abstract_metaplugins import AbstractMetaInstancePlugin
from openpype.pipeline.publish import AbstractMetaInstancePlugin


def requests_post(*args, **kwargs):

@@ -55,7 +55,7 @@ class HoudiniSubmitPublishDeadline(pyblish.api.ContextPlugin):
        scenename = os.path.basename(scene)

        # Get project code
        project = legacy_io.find_one({"type": "project"})
        project = context.data["projectEntity"]
        code = project["data"].get("code", project["name"])

        job_name = "{scene} [PUBLISH]".format(scene=scenename)

@@ -10,7 +10,10 @@ import clique

import pyblish.api

import openpype.api
from openpype.client import (
    get_last_version_by_subset_name,
    get_representations,
)
from openpype.pipeline import (
    get_representation_path,
    legacy_io,

@@ -18,15 +21,23 @@ from openpype.pipeline import (
from openpype.pipeline.farm.patterning import match_aov_pattern


def get_resources(version, extension=None):
def get_resources(project_name, version, extension=None):
    """Get the files from the specific version."""
    query = {"type": "representation", "parent": version["_id"]}

    # TODO this functions seems to be weird
    # - it's looking for representation with one extension or first (any)
    #   representation from a version?
    # - not sure how this should work, maybe it does for specific use cases
    #   but probably can't be used for all resources from 2D workflows
    extensions = None
    if extension:
        query["name"] = extension

    representation = legacy_io.find_one(query)
    assert representation, "This is a bug"
        extensions = [extension]
    repre_docs = list(get_representations(
        project_name, version_ids=[version["_id"]], extensions=extensions
    ))
    assert repre_docs, "This is a bug"

    representation = repre_docs[0]
    directory = get_representation_path(representation)
    print("Source: ", directory)
    resources = sorted(

@@ -330,13 +341,21 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin):
        self.log.info("Preparing to copy ...")
        start = instance.data.get("frameStart")
        end = instance.data.get("frameEnd")
        project_name = legacy_io.active_project()

        # get latest version of subset
        # this will stop if subset wasn't published yet
        version = openpype.api.get_latest_version(instance.data.get("asset"),
                                                  instance.data.get("subset"))
        project_name = legacy_io.active_project()
        version = get_last_version_by_subset_name(
            project_name,
            instance.data.get("subset"),
            asset_name=instance.data.get("asset")
        )

        # get its files based on extension
        subset_resources = get_resources(version, representation.get("ext"))
        subset_resources = get_resources(
            project_name, version, representation.get("ext")
        )
        r_col, _ = clique.assemble(subset_resources)

        # if override remove all frames we are expecting to be rendered

@@ -1013,9 +1032,12 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin):
            prev_start = None
            prev_end = None

            version = openpype.api.get_latest_version(asset_name=asset,
                                                      subset_name=subset
                                                      )
            project_name = legacy_io.active_project()
            version = get_last_version_by_subset_name(
                project_name,
                subset,
                asset_name=asset
            )

            # Set prev start / end frames for comparison
            if not prev_start and not prev_end:

@@ -1060,7 +1082,12 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin):
        based on 'publish' template
        """
        if not version:
            version = openpype.api.get_latest_version(asset, subset)
            project_name = legacy_io.active_project()
            version = get_last_version_by_subset_name(
                project_name,
                subset,
                asset_name=asset
            )
        if version:
            version = int(version["name"]) + 1
        else:
@@ -6,7 +6,10 @@ import collections

 import ftrack_api

 from openpype.lib import get_datetime_data
-from openpype.api import get_project_settings
+from openpype.settings.lib import (
+    get_project_settings,
+    get_default_project_settings
+)
 from openpype_modules.ftrack.lib import ServerAction


@@ -79,6 +82,35 @@ class CreateDailyReviewSessionServerAction(ServerAction):
         )
         return True

+    def _calculate_next_cycle_delta(self):
+        studio_default_settings = get_default_project_settings()
+        action_settings = (
+            studio_default_settings
+            ["ftrack"]
+            [self.settings_frack_subkey]
+            [self.settings_key]
+        )
+        cycle_hour_start = action_settings.get("cycle_hour_start")
+        if not cycle_hour_start:
+            h = m = s = 0
+        else:
+            h, m, s = cycle_hour_start
+
+        # Create threading timer which will trigger creation of report
+        # at the 00:00:01 of next day
+        # - callback will trigger another timer which will have 1 day offset
+        now = datetime.datetime.now()
+        # Create object of today morning
+        expected_next_trigger = datetime.datetime(
+            now.year, now.month, now.day, h, m, s
+        )
+        if expected_next_trigger > now:
+            seconds = (expected_next_trigger - now).total_seconds()
+        else:
+            expected_next_trigger += self._day_delta
+            seconds = (expected_next_trigger - now).total_seconds()
+        return seconds, expected_next_trigger
+
     def register(self, *args, **kwargs):
         """Override register to be able trigger """
         # Register server action as would be normally

@@ -86,22 +118,14 @@ class CreateDailyReviewSessionServerAction(ServerAction):
             *args, **kwargs
         )

-        # Create threading timer which will trigger creation of report
-        # at the 00:00:01 of next day
-        # - callback will trigger another timer which will have 1 day offset
-        now = datetime.datetime.now()
-        # Create object of today morning
-        today_morning = datetime.datetime(
-            now.year, now.month, now.day, 0, 0, 1
-        )
-        # Add a day delta (to calculate next day date)
-        next_day_morning = today_morning + self._day_delta
-        # Calculate first delta in seconds for first threading timer
-        first_delta = (next_day_morning - now).total_seconds()
+        seconds_delta, cycle_time = self._calculate_next_cycle_delta()

         # Store cycle time which will be used to create next timer
-        self._last_cyle_time = next_day_morning
+        self._last_cyle_time = cycle_time
         # Create timer thread
-        self._cycle_timer = threading.Timer(first_delta, self._timer_callback)
+        self._cycle_timer = threading.Timer(
+            seconds_delta, self._timer_callback
+        )
         self._cycle_timer.start()

         self._check_review_session()

@@ -111,13 +135,12 @@ class CreateDailyReviewSessionServerAction(ServerAction):
             self._cycle_timer is not None
             and self._last_cyle_time is not None
         ):
-            now = datetime.datetime.now()
-            while self._last_cyle_time < now:
-                self._last_cyle_time = self._last_cyle_time + self._day_delta
+            seconds_delta, cycle_time = self._calculate_next_cycle_delta()
+            self._last_cyle_time = cycle_time

-            delay = (self._last_cyle_time - now).total_seconds()
-
-            self._cycle_timer = threading.Timer(delay, self._timer_callback)
+            self._cycle_timer = threading.Timer(
+                seconds_delta, self._timer_callback
+            )
             self._cycle_timer.start()
         self._check_review_session()
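Editor's note: the self-rescheduling daily timer above can be reduced to a small standalone sketch. This is illustrative only — the names below are not from the repository:

```python
import datetime
import threading

DAY_DELTA = datetime.timedelta(days=1)


def seconds_until(hour, minute=0, second=0):
    """Return seconds from now until the next occurrence of the given time."""
    now = datetime.datetime.now()
    trigger = datetime.datetime(now.year, now.month, now.day, hour, minute, second)
    if trigger <= now:
        # Time already passed today, schedule for tomorrow
        trigger += DAY_DELTA
    return (trigger - now).total_seconds()


def run_daily(hour, callback):
    """Run 'callback' once a day at 'hour', rescheduling itself each time."""
    def _tick():
        callback()
        run_daily(hour, callback)  # schedule the next cycle

    timer = threading.Timer(seconds_until(hour), _tick)
    timer.daemon = True
    timer.start()
    return timer
```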
@@ -1,4 +1,5 @@
 import json
+import copy

 from openpype.client import get_project
 from openpype.api import ProjectSettings

@@ -373,6 +374,10 @@ class PrepareProjectServer(ServerAction):
                 project_name, project_code
             ))
             create_project(project_name, project_code)
+            self.trigger_event(
+                "openpype.project.created",
+                {"project_name": project_name}
+            )

         project_settings = ProjectSettings(project_name)
         project_anatomy_settings = project_settings["project_anatomy"]

@@ -400,6 +405,10 @@ class PrepareProjectServer(ServerAction):
             self.log.debug("- Key \"{}\" set to \"{}\"".format(key, value))
         session.commit()

+        event_data = copy.deepcopy(in_data)
+        event_data["project_name"] = project_name
+        self.trigger_event("openpype.project.prepared", event_data)
+
         return True
@@ -1,7 +1,8 @@
 import time
 import sys
 import json
 import traceback

+import ftrack_api
+
 from openpype_modules.ftrack.lib import ServerAction
 from openpype_modules.ftrack.lib.avalon_sync import SyncEntitiesFactory

@@ -180,6 +181,13 @@ class SyncToAvalonServer(ServerAction):
             "* Total time: {}".format(time_7 - time_start)
         )

+        if self.entities_factory.project_created:
+            event = ftrack_api.event.base.Event(
+                topic="openpype.project.created",
+                data={"project_name": project_name}
+            )
+            self.session.event_hub.publish(event)
+
         report = self.entities_factory.report()
         if report and report.get("items"):
             default_title = "Synchronization report ({}):".format(
@@ -84,6 +84,11 @@ class CreateProjectFolders(BaseAction):
                 create_project_folders(basic_paths, project_name)
                 self.create_ftrack_entities(basic_paths, project_entity)

+                self.trigger_event(
+                    "openpype.project.structure.created",
+                    {"project_name": project_name}
+                )
+
             except Exception as exc:
                 self.log.warning("Creating of structure crashed.", exc_info=True)
                 session.rollback()
@@ -1,4 +1,5 @@
 import json
+import copy

 from openpype.client import get_project
 from openpype.api import ProjectSettings

@@ -399,6 +400,10 @@ class PrepareProjectLocal(BaseAction):
                 project_name, project_code
             ))
             create_project(project_name, project_code)
+            self.trigger_event(
+                "openpype.project.created",
+                {"project_name": project_name}
+            )

         project_settings = ProjectSettings(project_name)
         project_anatomy_settings = project_settings["project_anatomy"]

@@ -433,6 +438,10 @@ class PrepareProjectLocal(BaseAction):
                 self.process_identifier()
             )
             self.trigger_action(trigger_identifier, event)
+
+        event_data = copy.deepcopy(in_data)
+        event_data["project_name"] = project_name
+        self.trigger_event("openpype.project.prepared", event_data)
         return True
@@ -1,7 +1,8 @@
 import time
 import sys
 import json
 import traceback

+import ftrack_api
+
 from openpype_modules.ftrack.lib import BaseAction, statics_icon
 from openpype_modules.ftrack.lib.avalon_sync import SyncEntitiesFactory

@@ -184,6 +185,13 @@ class SyncToAvalonLocal(BaseAction):
             "* Total time: {}".format(time_7 - time_start)
         )

+        if self.entities_factory.project_created:
+            event = ftrack_api.event.base.Event(
+                topic="openpype.project.created",
+                data={"project_name": project_name}
+            )
+            self.session.event_hub.publish(event)
+
         report = self.entities_factory.report()
         if report and report.get("items"):
             default_title = "Synchronization report ({}):".format(
@@ -443,6 +443,7 @@ class SyncEntitiesFactory:
         }

         self.create_list = []
+        self.project_created = False
         self.unarchive_list = []
         self.updates = collections.defaultdict(dict)

@@ -2214,6 +2215,7 @@ class SyncEntitiesFactory:
         self._avalon_ents_by_name[project_item["name"]] = str(new_id)

         self.create_list.append(project_item)
+        self.project_created = True

         # store mongo id to ftrack entity
         entity = self.entities_dict[self.ft_project_id]["entity"]
@@ -535,7 +535,7 @@ class BaseHandler(object):
         )

     def trigger_event(
-        self, topic, event_data={}, session=None, source=None,
+        self, topic, event_data=None, session=None, source=None,
         event=None, on_error="ignore"
     ):
         if session is None:

@@ -543,6 +543,9 @@ class BaseHandler(object):

         if not source and event:
             source = event.get("source")
+
+        if event_data is None:
+            event_data = {}
         # Create and trigger event
         event = ftrack_api.event.base.Event(
             topic=topic,
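Editor's note: the `event_data=None` change above is the standard fix for Python's shared-mutable-default pitfall — a `{}` default is created once at function definition and reused across calls. A minimal demonstration:

```python
def bad(event_data={}):  # one dict object shared across every call
    event_data["touched"] = True
    return event_data


def good(event_data=None):  # fresh dict per call
    if event_data is None:
        event_data = {}
    event_data["touched"] = True
    return event_data


assert bad() is bad()          # the same dict object leaks between calls
assert good() is not good()    # each call gets its own dict
```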
@@ -116,6 +116,7 @@ class IntegrateFtrackNote(pyblish.api.InstancePlugin):
             "app_name": app_name,
             "app_label": app_label,
             "published_paths": "<br/>".join(sorted(published_paths)),
+            "source": instance.data.get("source", '')
         }
         comment = template.format(**format_data)
         if not comment:
@@ -32,11 +32,17 @@ class CollectKitsuEntities(pyblish.api.ContextPlugin):
         context.data["kitsu_project"] = kitsu_project
         self.log.debug("Collect kitsu project: {}".format(kitsu_project))

-        kitsu_asset = gazu.asset.get_asset(zou_asset_data["id"])
-        if not kitsu_asset:
-            raise AssertionError("Asset not found in kitsu!")
-        context.data["kitsu_asset"] = kitsu_asset
-        self.log.debug("Collect kitsu asset: {}".format(kitsu_asset))
+        entity_type = zou_asset_data["type"]
+        if entity_type == "Shot":
+            kitsu_entity = gazu.shot.get_shot(zou_asset_data["id"])
+        else:
+            kitsu_entity = gazu.asset.get_asset(zou_asset_data["id"])
+
+        if not kitsu_entity:
+            raise AssertionError(f"{entity_type} not found in kitsu!")
+
+        context.data["kitsu_entity"] = kitsu_entity
+        self.log.debug(f"Collect kitsu {entity_type}: {kitsu_entity}")

         if zou_task_data:
             kitsu_task = gazu.task.get_task(zou_task_data["id"])

@@ -57,7 +63,7 @@ class CollectKitsuEntities(pyblish.api.ContextPlugin):
             )

             kitsu_task = gazu.task.get_task_by_name(
-                kitsu_asset, kitsu_task_type
+                kitsu_entity, kitsu_task_type
             )
             if not kitsu_task:
                 raise AssertionError("Task not found in kitsu!")
@@ -2,11 +2,17 @@ import os

 import gazu

+from openpype.client import (
+    get_project,
+    get_assets,
+    get_asset_by_name
+)
 from openpype.pipeline import AvalonMongoDB
 from .credentials import validate_credentials
 from .update_op_with_zou import (
     create_op_asset,
     set_op_project,
+    get_kitsu_project_name,
     write_project_to_op,
     update_op_assets,
 )

@@ -119,17 +125,16 @@ class Listener:

         # Write into DB
         if update_project:
-            self.dbcon = self.dbcon.database[project_name]
+            self.dbcon.Session["AVALON_PROJECT"] = project_name
             self.dbcon.bulk_write([update_project])

     def _delete_project(self, data):
         """Delete project."""
-        project_doc = self.dbcon.find_one(
-            {"type": "project", "data.zou_id": data["project_id"]}
-        )
+        project_name = get_kitsu_project_name(data["project_id"])

         # Delete project collection
-        self.dbcon.database[project_doc["name"]].drop()
+        self.dbcon.database[project_name].drop()

     # == Asset ==
@@ -150,7 +155,8 @@ class Listener:
     def _update_asset(self, data):
         """Update asset into OP DB."""
         set_op_project(self.dbcon, data["project_id"])
-        project_doc = self.dbcon.find_one({"type": "project"})
+        project_name = self.dbcon.active_project()
+        project_doc = get_project(project_name)

         # Get gazu entity
         asset = gazu.asset.get_asset(data["asset_id"])

@@ -159,16 +165,18 @@ class Listener:
         # Query all assets of the local project
         zou_ids_and_asset_docs = {
             asset_doc["data"]["zou"]["id"]: asset_doc
-            for asset_doc in self.dbcon.find({"type": "asset"})
+            for asset_doc in get_assets(project_name)
             if asset_doc["data"].get("zou", {}).get("id")
         }
         zou_ids_and_asset_docs[asset["project_id"]] = project_doc

         # Update
-        asset_doc_id, asset_update = update_op_assets(
+        update_op_result = update_op_assets(
             self.dbcon, project_doc, [asset], zou_ids_and_asset_docs
-        )[0]
-        self.dbcon.update_one({"_id": asset_doc_id}, asset_update)
+        )
+        if update_op_result:
+            asset_doc_id, asset_update = update_op_result[0]
+            self.dbcon.update_one({"_id": asset_doc_id}, asset_update)

     def _delete_asset(self, data):
         """Delete asset of OP DB."""
@@ -197,7 +205,8 @@ class Listener:
     def _update_episode(self, data):
         """Update episode into OP DB."""
         set_op_project(self.dbcon, data["project_id"])
-        project_doc = self.dbcon.find_one({"type": "project"})
+        project_name = self.dbcon.active_project()
+        project_doc = get_project(project_name)

         # Get gazu entity
         episode = gazu.shot.get_episode(data["episode_id"])

@@ -206,16 +215,18 @@ class Listener:
         # Query all assets of the local project
         zou_ids_and_asset_docs = {
             asset_doc["data"]["zou"]["id"]: asset_doc
-            for asset_doc in self.dbcon.find({"type": "asset"})
+            for asset_doc in get_assets(project_name)
             if asset_doc["data"].get("zou", {}).get("id")
         }
         zou_ids_and_asset_docs[episode["project_id"]] = project_doc

         # Update
-        asset_doc_id, asset_update = update_op_assets(
+        update_op_result = update_op_assets(
             self.dbcon, project_doc, [episode], zou_ids_and_asset_docs
-        )[0]
-        self.dbcon.update_one({"_id": asset_doc_id}, asset_update)
+        )
+        if update_op_result:
+            asset_doc_id, asset_update = update_op_result[0]
+            self.dbcon.update_one({"_id": asset_doc_id}, asset_update)

     def _delete_episode(self, data):
         """Delete shot of OP DB."""
@@ -245,7 +256,8 @@ class Listener:
     def _update_sequence(self, data):
         """Update sequence into OP DB."""
         set_op_project(self.dbcon, data["project_id"])
-        project_doc = self.dbcon.find_one({"type": "project"})
+        project_name = self.dbcon.active_project()
+        project_doc = get_project(project_name)

         # Get gazu entity
         sequence = gazu.shot.get_sequence(data["sequence_id"])

@@ -254,16 +266,18 @@ class Listener:
         # Query all assets of the local project
         zou_ids_and_asset_docs = {
             asset_doc["data"]["zou"]["id"]: asset_doc
-            for asset_doc in self.dbcon.find({"type": "asset"})
+            for asset_doc in get_assets(project_name)
             if asset_doc["data"].get("zou", {}).get("id")
         }
         zou_ids_and_asset_docs[sequence["project_id"]] = project_doc

         # Update
-        asset_doc_id, asset_update = update_op_assets(
+        update_op_result = update_op_assets(
             self.dbcon, project_doc, [sequence], zou_ids_and_asset_docs
-        )[0]
-        self.dbcon.update_one({"_id": asset_doc_id}, asset_update)
+        )
+        if update_op_result:
+            asset_doc_id, asset_update = update_op_result[0]
+            self.dbcon.update_one({"_id": asset_doc_id}, asset_update)

     def _delete_sequence(self, data):
         """Delete sequence of OP DB."""
@@ -293,7 +307,8 @@ class Listener:
     def _update_shot(self, data):
         """Update shot into OP DB."""
         set_op_project(self.dbcon, data["project_id"])
-        project_doc = self.dbcon.find_one({"type": "project"})
+        project_name = self.dbcon.active_project()
+        project_doc = get_project(project_name)

         # Get gazu entity
         shot = gazu.shot.get_shot(data["shot_id"])

@@ -302,16 +317,18 @@ class Listener:
         # Query all assets of the local project
         zou_ids_and_asset_docs = {
             asset_doc["data"]["zou"]["id"]: asset_doc
-            for asset_doc in self.dbcon.find({"type": "asset"})
+            for asset_doc in get_assets(project_name)
             if asset_doc["data"].get("zou", {}).get("id")
         }
         zou_ids_and_asset_docs[shot["project_id"]] = project_doc

         # Update
-        asset_doc_id, asset_update = update_op_assets(
+        update_op_result = update_op_assets(
             self.dbcon, project_doc, [shot], zou_ids_and_asset_docs
-        )[0]
-        self.dbcon.update_one({"_id": asset_doc_id}, asset_update)
+        )
+        if update_op_result:
+            asset_doc_id, asset_update = update_op_result[0]
+            self.dbcon.update_one({"_id": asset_doc_id}, asset_update)

     def _delete_shot(self, data):
         """Delete shot of OP DB."""
@@ -327,14 +344,15 @@ class Listener:
         """Create new task into OP DB."""
         # Get project entity
         set_op_project(self.dbcon, data["project_id"])
+        project_name = self.dbcon.active_project()

         # Get gazu entity
         task = gazu.task.get_task(data["task_id"])

         # Find asset doc
-        asset_doc = self.dbcon.find_one(
-            {"type": "asset", "data.zou.id": task["entity"]["id"]}
-        )
+        parent_name = task["entity"]["name"]
+
+        asset_doc = get_asset_by_name(project_name, parent_name)

         # Update asset tasks with new one
         asset_tasks = asset_doc["data"].get("tasks")

@@ -351,10 +369,11 @@ class Listener:

     def _delete_task(self, data):
         """Delete task of OP DB."""
         set_op_project(self.dbcon, data["project_id"])
-
+        project_name = self.dbcon.active_project()
         # Find asset doc
-        asset_docs = [doc for doc in self.dbcon.find({"type": "asset"})]
+        asset_docs = list(get_assets(project_name))
         for doc in asset_docs:
             # Match task
             for name, task in doc["data"]["tasks"].items():
@@ -10,6 +10,12 @@ from gazu.task import (
     all_tasks_for_shot,
 )

+from openpype.client import (
+    get_project,
+    get_assets,
+    get_asset_by_id,
+    get_asset_by_name,
+)
 from openpype.pipeline import AvalonMongoDB
 from openpype.api import get_project_settings
 from openpype.lib import create_project

@@ -33,6 +39,20 @@ def create_op_asset(gazu_entity: dict) -> dict:
     }


+def get_kitsu_project_name(project_id: str) -> str:
+    """Get project name based on project id in kitsu.
+
+    Args:
+        project_id (str): UUID of project in Kitsu.
+
+    Returns:
+        str: Name of Kitsu project.
+    """
+
+    project = gazu.project.get_project(project_id)
+    return project["name"]
+
+
 def set_op_project(dbcon: AvalonMongoDB, project_id: str):
     """Set project context.

@@ -40,9 +60,8 @@ def set_op_project(dbcon: AvalonMongoDB, project_id: str):
         dbcon (AvalonMongoDB): Connection to DB
         project_id (str): Project zou ID
     """
-    project = gazu.project.get_project(project_id)
-    project_name = project["name"]
-    dbcon.Session["AVALON_PROJECT"] = project_name
+    dbcon.Session["AVALON_PROJECT"] = get_kitsu_project_name(project_id)


 def update_op_assets(
@@ -72,9 +91,7 @@ def update_op_assets(
         if not item_doc:  # Create asset
             op_asset = create_op_asset(item)
             insert_result = dbcon.insert_one(op_asset)
-            item_doc = dbcon.find_one(
-                {"type": "asset", "_id": insert_result.inserted_id}
-            )
+            item_doc = get_asset_by_id(project_name, insert_result.inserted_id)

         # Update asset
         item_data = deepcopy(item_doc["data"])

@@ -82,22 +99,37 @@ def update_op_assets(
         item_data["zou"] = item

         # == Asset settings ==
-        # Frame in, fallback on 0
-        frame_in = int(item_data.get("frame_in") or 0)
+        # Frame in, fallback to project's value or default value (1001)
+        # TODO: get default from settings/project_anatomy/attributes.json
+        try:
+            frame_in = int(
+                item_data.pop(
+                    "frame_in", project_doc["data"].get("frameStart")
+                )
+            )
+        except (TypeError, ValueError):
+            frame_in = 1001
         item_data["frameStart"] = frame_in
-        item_data.pop("frame_in", None)
-        # Frame out, fallback on frame_in + duration
-        frames_duration = int(item.get("nb_frames") or 1)
-        frame_out = (
-            item_data["frame_out"]
-            if item_data.get("frame_out")
-            else frame_in + frames_duration
-        )
-        item_data["frameEnd"] = int(frame_out)
-        item_data.pop("frame_out", None)
-        # Fps, fallback to project's value when entity fps is deleted
-        if not item_data.get("fps") and item_doc["data"].get("fps"):
-            item_data["fps"] = project_doc["data"]["fps"]
+        # Frames duration, fallback on 0
+        try:
+            frames_duration = int(item_data.pop("nb_frames", 0))
+        except (TypeError, ValueError):
+            frames_duration = 0
+        # Frame out, fallback on frame_in + duration or project's value or 1001
+        frame_out = item_data.pop("frame_out", None)
+        if not frame_out:
+            frame_out = frame_in + frames_duration
+        try:
+            frame_out = int(frame_out)
+        except (TypeError, ValueError):
+            frame_out = 1001
+        item_data["frameEnd"] = frame_out
+        # Fps, fallback to project's value or default value (25.0)
+        try:
+            fps = float(item_data.get("fps", project_doc["data"].get("fps")))
+        except (TypeError, ValueError):
+            fps = 25.0
+        item_data["fps"] = fps

         # Tasks
         tasks_list = []
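Editor's note: the frame-range handling above repeats one pattern — coerce with `int()`/`float()` inside `try/except` and fall back on the project value, then a hard default, when entity data is missing or malformed. A standalone sketch under that assumption (the entity/project dicts here are hypothetical):

```python
def to_int(value, fallback):
    """Coerce 'value' to int, returning 'fallback' on None or bad input."""
    try:
        return int(value)
    except (TypeError, ValueError):
        return fallback


# Hypothetical entity/project data mirroring the fallback chain above
entity = {"frame_in": None, "nb_frames": "24", "frame_out": ""}
project = {"frameStart": 1001}

frame_in = to_int(entity.get("frame_in") or project.get("frameStart"), 1001)
duration = to_int(entity.get("nb_frames"), 0)
frame_out = to_int(entity.get("frame_out") or frame_in + duration, 1001)
print(frame_in, frame_out)  # 1001 1025
```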
@@ -106,9 +138,8 @@ def update_op_assets(
             tasks_list = all_tasks_for_asset(item)
         elif item_type == "Shot":
             tasks_list = all_tasks_for_shot(item)
-            # TODO frame in and out
         item_data["tasks"] = {
-            t["task_type_name"]: {"type": t["task_type_name"]}
+            t["task_type_name"]: {"type": t["task_type_name"], "zou": t}
            for t in tasks_list
        }

@@ -123,17 +154,23 @@ def update_op_assets(
             parent_zou_id = substitute_parent_item["parent_id"]
         else:
             parent_zou_id = (
-                item.get("parent_id")
+                # For Asset, put under asset type directory
+                item.get("entity_type_id")
+                if item_type == "Asset"
+                else None
+                # Else, fallback on usual hierarchy
+                or item.get("parent_id")
                 or item.get("episode_id")
                 or item.get("source_id")
-            )  # TODO check consistency
+            )

-        # Substitute Episode and Sequence by Shot
-        substitute_item_type = (
-            "shots"
-            if item_type in ["Episode", "Sequence"]
-            else f"{item_type.lower()}s"
-        )
+        # Substitute item type for general classification (assets or shots)
+        if item_type in ["Asset", "AssetType"]:
+            substitute_item_type = "assets"
+        elif item_type in ["Episode", "Sequence"]:
+            substitute_item_type = "shots"
+        else:
+            substitute_item_type = f"{item_type.lower()}s"
         entity_parent_folders = [
             f
             for f in project_module_settings["entities_root"]
@@ -147,15 +184,33 @@ def update_op_assets(
             asset_doc_ids[parent_zou_id]["_id"] if parent_zou_id else None
         )
         if visual_parent_doc_id is None:
-            # Find root folder doc
-            root_folder_doc = dbcon.find_one(
-                {
-                    "type": "asset",
-                    "name": entity_parent_folders[-1],
-                    "data.root_of": substitute_item_type,
-                },
-                ["_id"],
+            # Find root folder docs
+            root_folder_docs = get_assets(
+                project_name,
+                asset_names=[entity_parent_folders[-1]],
+                fields=["_id", "data.root_of"],
             )
+            # NOTE: Not sure why it's checking for entity type?
+            # OP3 does not support multiple assets with same names so type
+            # filtering is irelevant.
+            # This way mimics previous implementation:
+            # ```
+            # root_folder_doc = dbcon.find_one(
+            #     {
+            #         "type": "asset",
+            #         "name": entity_parent_folders[-1],
+            #         "data.root_of": substitute_item_type,
+            #     },
+            #     ["_id"],
+            # )
+            # ```
+            root_folder_doc = None
+            for folder_doc in root_folder_docs:
+                root_of = folder_doc.get("data", {}).get("root_of")
+                if root_of == substitute_item_type:
+                    root_folder_doc = folder_doc
+                    break
+
             if root_folder_doc:
                 visual_parent_doc_id = root_folder_doc["_id"]
@@ -170,7 +225,14 @@ def update_op_assets(

         # Get parent entity
         parent_entity = parent_doc["data"]["zou"]
-        parent_zou_id = parent_entity["parent_id"]
+        parent_zou_id = parent_entity.get("parent_id")
+
+        if item_type in ["Shot", "Sequence"]:
+            # Name with parents hierarchy "({episode}_){sequence}_{shot}"
+            # to avoid duplicate name issue
+            item_name = "_".join(item_data["parents"] + [item_doc["name"]])
+        else:
+            item_name = item_doc["name"]

         # Set root folders parents
         item_data["parents"] = entity_parent_folders + item_data["parents"]

@@ -185,9 +247,9 @@ def update_op_assets(
                 item_doc["_id"],
                 {
                     "$set": {
-                        "name": item["name"],
+                        "name": item_name,
                         "data": item_data,
-                        "parent": asset_doc_ids[item["project_id"]]["_id"],
+                        "parent": project_doc["_id"],
                     }
                 },
             )
@@ -208,7 +270,7 @@ def write_project_to_op(project: dict, dbcon: AvalonMongoDB) -> UpdateOne:
         UpdateOne: Update instance for the project
     """
     project_name = project["name"]
-    project_doc = dbcon.database[project_name].find_one({"type": "project"})
+    project_doc = get_project(project_name)
     if not project_doc:
         print(f"Creating project '{project_name}'")
         project_doc = create_project(project_name, project_name, dbcon=dbcon)

@@ -229,9 +291,9 @@ def write_project_to_op(project: dict, dbcon: AvalonMongoDB) -> UpdateOne:
     project_data.update(
         {
             "code": project_code,
-            "fps": project["fps"],
-            "resolutionWidth": project["resolution"].split("x")[0],
-            "resolutionHeight": project["resolution"].split("x")[1],
+            "fps": float(project["fps"]),
+            "resolutionWidth": int(project["resolution"].split("x")[0]),
+            "resolutionHeight": int(project["resolution"].split("x")[1]),
             "zou_id": project["id"],
         }
     )
@@ -278,6 +340,11 @@ def sync_all_projects(login: str, password: str):
 def sync_project_from_kitsu(dbcon: AvalonMongoDB, project: dict):
     """Update OP project in DB with Zou data.

+    `root_of` is meant to sort entities by type for a better readability in
+    the data tree. It puts all shot like (Shot and Episode and Sequence) and
+    asset entities under two different root folders or hierarchy, defined in
+    settings.
+
     Args:
         dbcon (AvalonMongoDB): MongoDB connection
         project (dict): Project dict got using gazu.

@@ -292,12 +359,17 @@ def sync_project_from_kitsu(dbcon: AvalonMongoDB, project: dict):

     # Get all assets from zou
     all_assets = gazu.asset.all_assets_for_project(project)
+    all_asset_types = gazu.asset.all_asset_types_for_project(project)
     all_episodes = gazu.shot.all_episodes_for_project(project)
     all_seqs = gazu.shot.all_sequences_for_project(project)
     all_shots = gazu.shot.all_shots_for_project(project)
     all_entities = [
         item
-        for item in all_assets + all_episodes + all_seqs + all_shots
+        for item in all_assets
+        + all_asset_types
+        + all_episodes
+        + all_seqs
+        + all_shots
         if naming_pattern.match(item["name"])
     ]
@@ -305,26 +377,44 @@ def sync_project_from_kitsu(dbcon: AvalonMongoDB, project: dict):
     bulk_writes.append(write_project_to_op(project, dbcon))

     # Try to find project document
-    dbcon.Session["AVALON_PROJECT"] = project["name"]
-    project_doc = dbcon.find_one({"type": "project"})
+    project_name = project["name"]
+    dbcon.Session["AVALON_PROJECT"] = project_name
+    project_doc = get_project(project_name)

     # Query all assets of the local project
     zou_ids_and_asset_docs = {
         asset_doc["data"]["zou"]["id"]: asset_doc
-        for asset_doc in dbcon.find({"type": "asset"})
+        for asset_doc in get_assets(project_name)
         if asset_doc["data"].get("zou", {}).get("id")
     }
     zou_ids_and_asset_docs[project["id"]] = project_doc

     # Create entities root folders
-    project_module_settings = get_project_settings(project["name"])["kitsu"]
+    project_module_settings = get_project_settings(project_name)["kitsu"]
     for entity_type, root in project_module_settings["entities_root"].items():
         parent_folders = root.split("/")
         direct_parent_doc = None
         for i, folder in enumerate(parent_folders, 1):
-            parent_doc = dbcon.find_one(
-                {"type": "asset", "name": folder, "data.root_of": entity_type}
+            parent_doc = get_asset_by_name(
+                project_name, folder, fields=["_id", "data.root_of"]
             )
+            # NOTE: Not sure why it's checking for entity type?
+            # OP3 does not support multiple assets with same names so type
+            # filtering is irelevant.
+            # Also all of the entities could find be queried at once using
+            # 'get_assets'.
+            # This way mimics previous implementation:
+            # ```
+            # parent_doc = dbcon.find_one(
+            #     {"type": "asset", "name": folder, "data.root_of": entity_type}
+            # )
+            # ```
+            if (
+                parent_doc
+                and parent_doc.get("data", {}).get("root_of") != entity_type
+            ):
+                parent_doc = None
+
             if not parent_doc:
                 direct_parent_doc = dbcon.insert_one(
                     {
@@ -334,21 +424,20 @@ def sync_project_from_kitsu(dbcon: AvalonMongoDB, project: dict):
                         "data": {
                             "root_of": entity_type,
                             "parents": parent_folders[:i],
-                            "visualParent": direct_parent_doc,
+                            "visualParent": direct_parent_doc.inserted_id
+                            if direct_parent_doc
+                            else None,
                             "tasks": {},
                         },
                     }
                 )

     # Create
-    to_insert = []
-    to_insert.extend(
-        [
-            create_op_asset(item)
-            for item in all_entities
-            if item["id"] not in zou_ids_and_asset_docs.keys()
-        ]
-    )
+    to_insert = [
+        create_op_asset(item)
+        for item in all_entities
+        if item["id"] not in zou_ids_and_asset_docs.keys()
+    ]
     if to_insert:
         # Insert doc in DB
         dbcon.insert_many(to_insert)

@@ -357,7 +446,7 @@ def sync_project_from_kitsu(dbcon: AvalonMongoDB, project: dict):
     zou_ids_and_asset_docs.update(
         {
             asset_doc["data"]["zou"]["id"]: asset_doc
-            for asset_doc in dbcon.find({"type": "asset"})
+            for asset_doc in get_assets(project_name)
             if asset_doc["data"].get("zou")
         }
     )
@@ -6,6 +6,7 @@ from typing import List
 import gazu
 from pymongo import UpdateOne

+from openpype.client import get_project, get_assets
 from openpype.pipeline import AvalonMongoDB
 from openpype.api import get_project_settings
 from openpype.modules.kitsu.utils.credentials import validate_credentials

@@ -53,9 +54,7 @@ def sync_zou_from_op_project(
     """
     # Get project doc if not provided
     if not project_doc:
-        project_doc = dbcon.database[project_name].find_one(
-            {"type": "project"}
-        )
+        project_doc = get_project(project_name)

     # Get all entities from zou
     print(f"Synchronizing {project_name}...")

@@ -96,7 +95,7 @@ def sync_zou_from_op_project(
     dbcon.Session["AVALON_PROJECT"] = project_name
     asset_docs = {
         asset_doc["_id"]: asset_doc
-        for asset_doc in dbcon.find({"type": "asset"})
+        for asset_doc in get_assets(project_name)
     }

     # Create new assets
@@ -2,13 +2,13 @@ import os
 import platform


+from openpype.client import get_asset_by_name
 from openpype.modules import OpenPypeModule
 from openpype_interfaces import (
     ITrayService,
     ILaunchHookPaths
 )
 from openpype.lib.events import register_event_callback
-from openpype.pipeline import AvalonMongoDB

 from .exceptions import InvalidContextError
@@ -197,22 +197,13 @@ class TimersManager(OpenPypeModule, ITrayService, ILaunchHookPaths):
                 " Project: \"{}\" Asset: \"{}\" Task: \"{}\""
             ).format(str(project_name), str(asset_name), str(task_name)))

-        dbconn = AvalonMongoDB()
-        dbconn.install()
-        dbconn.Session["AVALON_PROJECT"] = project_name
-
-        asset_doc = dbconn.find_one(
-            {
-                "type": "asset",
-                "name": asset_name
-            },
-            {
-                "data.tasks": True,
-                "data.parents": True
-            }
+        asset_doc = get_asset_by_name(
+            project_name,
+            asset_name,
+            fields=["_id", "name", "data.tasks", "data.parents"]
         )

         if not asset_doc:
-            dbconn.uninstall()
             raise InvalidContextError((
                 "Asset \"{}\" not found in project \"{}\""
             ).format(asset_name, project_name))

@@ -220,7 +211,6 @@ class TimersManager(OpenPypeModule, ITrayService, ILaunchHookPaths):
         asset_data = asset_doc.get("data") or {}
         asset_tasks = asset_data.get("tasks") or {}
         if task_name not in asset_tasks:
-            dbconn.uninstall()
             raise InvalidContextError((
                 "Task \"{}\" not found on asset \"{}\" in project \"{}\""
             ).format(task_name, asset_name, project_name))

@@ -238,9 +228,10 @@ class TimersManager(OpenPypeModule, ITrayService, ILaunchHookPaths):
         hierarchy_items = asset_data.get("parents") or []
         hierarchy_items.append(asset_name)

-        dbconn.uninstall()
         return {
+            "project_name": project_name,
             "asset_id": str(asset_doc["_id"]),
+            "asset_name": asset_doc["name"],
             "task_name": task_name,
             "task_type": task_type,
             "hierarchy": hierarchy_items
@@ -380,6 +380,19 @@ class AnatomyTemplateResult(TemplateResult):
         )
         return self.__class__(tmp, self.rootless)

+    def normalized(self):
+        """Convert to normalized path."""
+
+        tmp = TemplateResult(
+            os.path.normpath(self),
+            self.template,
+            self.solved,
+            self.used_values,
+            self.missing_keys,
+            self.invalid_types
+        )
+        return self.__class__(tmp, self.rootless)
+

 class AnatomyTemplates(TemplatesDict):
     inner_key_pattern = re.compile(r"(\{@.*?[^{}0]*\})")
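Editor's note: `normalized()` is a thin wrapper around `os.path.normpath`, which collapses `.`/`..` segments and duplicate separators (using the OS-native separator):

```python
import os

raw = "projects/./shots//sh010/../sh020"
print(os.path.normpath(raw))  # projects/shots/sh020 on POSIX; backslashes on Windows
```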
@@ -10,6 +10,12 @@ import pyblish.api
 from pyblish.lib import MessageHandler

 import openpype
+from openpype.client import (
+    get_project,
+    get_asset_by_id,
+    get_asset_by_name,
+    version_is_latest,
+)
 from openpype.modules import load_modules, ModulesManager
 from openpype.settings import get_project_settings
 from openpype.lib import filter_pyblish_plugins
@@ -240,29 +246,7 @@ def registered_host():


 def deregister_host():
-    _registered_host["_"] = default_host()
-
-
-def default_host():
-    """A default host, in place of anything better
-
-    This may be considered as reference for the
-    interface a host must implement. It also ensures
-    that the system runs, even when nothing is there
-    to support it.
-
-    """
-
-    host = types.ModuleType("defaultHost")
-
-    def ls():
-        return list()
-
-    host.__dict__.update({
-        "ls": ls
-    })
-
-    return host
+    _registered_host["_"] = None


 def debug_host():
@@ -304,3 +288,63 @@ def debug_host():
     })

     return host
+
+
+def get_current_project(fields=None):
+    """Helper function to get project document based on global Session.
+
+    This function should be called only in process where host is installed.
+
+    Returns:
+        dict: Project document.
+        None: Project is not set.
+    """
+
+    project_name = legacy_io.active_project()
+    return get_project(project_name, fields=fields)
+
+
+def get_current_project_asset(asset_name=None, asset_id=None, fields=None):
+    """Helper function to get asset document based on global Session.
+
+    This function should be called only in process where host is installed.
+
+    Asset is found out based on passed asset name or id (not both). Asset name
+    is not used for filtering if asset id is passed. When both asset name and
+    id are missing then asset name from current process is used.
+
+    Args:
+        asset_name (str): Name of asset used for filter.
+        asset_id (Union[str, ObjectId]): Asset document id. If entered then
+            is used as only filter.
+        fields (Union[List[str], None]): Limit returned data of asset documents
+            to specific keys.
+
+    Returns:
+        dict: Asset document.
+        None: Asset is not set or not exist.
+    """
+
+    project_name = legacy_io.active_project()
+    if asset_id:
+        return get_asset_by_id(project_name, asset_id, fields=fields)
+
+    if not asset_name:
+        asset_name = legacy_io.Session.get("AVALON_ASSET")
+        # Skip if is not set even on context
+        if not asset_name:
+            return None
+    return get_asset_by_name(project_name, asset_name, fields=fields)
+
+
+def is_representation_from_latest(representation):
+    """Return whether the representation is from latest version
+
+    Args:
+        representation (dict): The representation document from the database.
+
+    Returns:
+        bool: Whether the representation is of latest version.
+    """
+
+    project_name = legacy_io.active_project()
+    return version_is_latest(project_name, representation["parent"])
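Editor's note: a hedged usage sketch of the helpers added above, assuming a host process where `legacy_io` is installed; the import path shown is illustrative:

```python
from openpype.pipeline.context_tools import (
    get_current_project,
    get_current_project_asset,
)

# Both helpers resolve the project from the active session
project_doc = get_current_project(fields=["name", "data.fps"])
asset_doc = get_current_project_asset(fields=["_id", "data.frameStart"])
if asset_doc:
    frame_start = asset_doc.get("data", {}).get("frameStart")
```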
@@ -29,6 +29,7 @@ UpdateData = collections.namedtuple("UpdateData", ["instance", "changes"])

 class ImmutableKeyError(TypeError):
     """Accessed key is immutable so does not allow changes or removements."""
+
     def __init__(self, key, msg=None):
         self.immutable_key = key
         if not msg:

@@ -40,6 +41,7 @@ class ImmutableKeyError(TypeError):

 class HostMissRequiredMethod(Exception):
     """Host does not have implemented required functions for creation."""
+
     def __init__(self, host, missing_methods):
         self.missing_methods = missing_methods
         self.host = host

@@ -66,6 +68,7 @@ class InstanceMember:
     TODO:
         Implement and use!
     """
+
     def __init__(self, instance, name):
         self.instance = instance

@@ -94,6 +97,7 @@ class AttributeValues:
         values(dict): Values after possible conversion.
         origin_data(dict): Values loaded from host before conversion.
     """
+
     def __init__(self, attr_defs, values, origin_data=None):
         from openpype.lib.attribute_definitions import UnknownDef

@@ -174,6 +178,10 @@ class AttributeValues:
         output = {}
         for key in self._data:
             output[key] = self[key]
+
+        for key, attr_def in self._attr_defs_by_key.items():
+            if key not in output:
+                output[key] = attr_def.default
         return output

     @staticmethod

@@ -196,6 +204,7 @@ class CreatorAttributeValues(AttributeValues):
     Args:
         instance (CreatedInstance): Instance for which are values hold.
     """
+
     def __init__(self, instance, *args, **kwargs):
         self.instance = instance
         super(CreatorAttributeValues, self).__init__(*args, **kwargs)

@@ -211,6 +220,7 @@ class PublishAttributeValues(AttributeValues):
         publish_attributes(PublishAttributes): Wrapper for multiple publish
             attributes is used as parent object.
     """
+
     def __init__(self, publish_attributes, *args, **kwargs):
         self.publish_attributes = publish_attributes
         super(PublishAttributeValues, self).__init__(*args, **kwargs)

@@ -232,6 +242,7 @@ class PublishAttributes:
         attr_plugins(list): List of publish plugins that may have defined
             attribute definitions.
     """
+
     def __init__(self, parent, origin_data, attr_plugins=None):
         self.parent = parent
         self._origin_data = copy.deepcopy(origin_data)

@@ -270,6 +281,7 @@ class PublishAttributes:
             key(str): Plugin name.
             default: Default value if plugin was not found.
         """
+
         if key not in self._data:
             return default

@@ -287,11 +299,13 @@ class PublishAttributes:

     def plugin_names_order(self):
         """Plugin names order by their 'order' attribute."""
+
         for name in self._plugin_names_order:
             yield name

     def data_to_store(self):
         """Convert attribute values to "data to store"."""
+
         output = {}
         for key, attr_value in self._data.items():
             output[key] = attr_value.data_to_store()

@@ -299,6 +313,7 @@ class PublishAttributes:

     def changes(self):
         """Return changes per each key."""
+
         changes = {}
         for key, attr_val in self._data.items():
             attr_changes = attr_val.changes()

@@ -314,6 +329,7 @@ class PublishAttributes:

     def set_publish_plugins(self, attr_plugins):
         """Set publish plugins attribute definitions."""
+
         self._plugin_names_order = []
         self._missing_plugins = []
         self.attr_plugins = attr_plugins or []
@@ -365,6 +381,7 @@ class CreatedInstance:
             `openpype.pipeline.registered_host`.
         new(bool): Is instance new.
     """
+
     # Keys that can't be changed or removed from data after loading using
     # creator.
     # - 'creator_attributes' and 'publish_attributes' can change values of

@@ -496,6 +513,20 @@ class CreatedInstance:
     def subset_name(self):
         return self._data["subset"]

+    @property
+    def label(self):
+        label = self._data.get("label")
+        if not label:
+            label = self.subset_name
+        return label
+
+    @property
+    def group_label(self):
+        label = self._data.get("group")
+        if label:
+            return label
+        return self.creator.get_group_label()
+
     @property
     def creator_identifier(self):
         return self.creator.identifier
@@ -552,6 +583,7 @@ class CreatedInstance:
     @property
     def id(self):
         """Instance identifier."""
+
         return self._data["instance_id"]

     @property

@@ -560,10 +592,12 @@ class CreatedInstance:

         Access to data is needed to modify values.
         """
+
         return self

     def changes(self):
         """Calculate and return changes."""
+
         changes = {}
         new_keys = set()
         for key, new_value in self._data.items():
@@ -702,6 +736,7 @@ class CreateContext:
         self.manual_creators = {}

         self.publish_discover_result = None
+        self.publish_plugins_mismatch_targets = []
         self.publish_plugins = []
         self.plugins_with_defs = []
         self._attr_plugins_by_family = {}

@@ -748,6 +783,10 @@ class CreateContext:
    def host_name(self):
        return os.environ["AVALON_APP"]

+    @property
+    def project_name(self):
+        return self.dbcon.active_project()
+
     @property
     def log(self):
         """Dynamic access to logger."""
@@ -820,6 +859,7 @@ class CreateContext:
         discover_result = DiscoverResult()
         plugins_with_defs = []
         plugins_by_targets = []
+        plugins_mismatch_targets = []
         if discover_publish_plugins:
             discover_result = publish_plugins_discover()
             publish_plugins = discover_result.plugins

@@ -829,19 +869,26 @@ class CreateContext:
             plugins_by_targets = pyblish.logic.plugins_by_targets(
                 publish_plugins, list(targets)
             )
+
             # Collect plugins that can have attribute definitions
             for plugin in publish_plugins:
                 if OpenPypePyblishPluginMixin in inspect.getmro(plugin):
                     plugins_with_defs.append(plugin)

+            plugins_mismatch_targets = [
+                plugin
+                for plugin in publish_plugins
+                if plugin not in plugins_by_targets
+            ]
+
+        self.publish_plugins_mismatch_targets = plugins_mismatch_targets
         self.publish_discover_result = discover_result
         self.publish_plugins = plugins_by_targets
         self.plugins_with_defs = plugins_with_defs

         # Prepare settings
-        project_name = self.dbcon.Session["AVALON_PROJECT"]
         system_settings = get_system_settings()
-        project_settings = get_project_settings(project_name)
+        project_settings = get_project_settings(self.project_name)

         # Discover and prepare creators
         creators = {}

@@ -873,9 +920,9 @@ class CreateContext:
                 continue

             creator = creator_class(
-                self,
-                system_settings,
                 project_settings,
+                system_settings,
+                self,
                 self.headless
             )
             creators[creator_identifier] = creator
@@ -1,5 +1,4 @@
 import copy
-import logging

 from abc import (
     ABCMeta,

@@ -47,6 +46,9 @@ class BaseCreator:

     # Label shown in UI
     label = None
+    group_label = None
+    # Cached group label after first call 'get_group_label'
+    _cached_group_label = None

     # Variable to store logger
     _log = None

@@ -70,7 +72,7 @@ class BaseCreator:
     host_name = None

     def __init__(
-        self, create_context, system_settings, project_settings, headless=False
+        self, project_settings, system_settings, create_context, headless=False
     ):
         # Reference to CreateContext
         self.create_context = create_context
@@ -85,15 +87,54 @@ class BaseCreator:

         Default implementation returns plugin's family.
         """
+
         return self.family

     @abstractproperty
     def family(self):
         """Family that plugin represents."""
+
         pass

+    @property
+    def project_name(self):
+        """Family that plugin represents."""
+
+        return self.create_context.project_name
+
+    @property
+    def host(self):
+        return self.create_context.host
+
+    def get_group_label(self):
+        """Group label under which are instances grouped in UI.
+
+        Default implementation use attributes in this order:
+            - 'group_label' -> 'label' -> 'identifier'
+                Keep in mind that 'identifier' use 'family' by default.
+
+        Returns:
+            str: Group label that can be used for grouping of instances in UI.
+                Group label can be overriden by instance itself.
+        """
+
+        if self._cached_group_label is None:
+            label = self.identifier
+            if self.group_label:
+                label = self.group_label
+            elif self.label:
+                label = self.label
+            self._cached_group_label = label
+        return self._cached_group_label
+
     @property
     def log(self):
         """Logger of the plugin.
+
         Returns:
             logging.Logger: Logger with name of the plugin.
         """
+
         if self._log is None:
             from openpype.api import Logger
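Editor's note: the `get_group_label` resolution order ('group_label' -> 'label' -> 'identifier') with its one-shot cache can be exercised in isolation; this sketch mirrors the logic above with an illustrative class name:

```python
class _Sketch:
    label = None
    group_label = None
    identifier = "render"
    _cached_group_label = None

    def get_group_label(self):
        if self._cached_group_label is None:
            label = self.identifier       # last fallback
            if self.group_label:
                label = self.group_label  # highest priority
            elif self.label:
                label = self.label
            self._cached_group_label = label  # computed only once
        return self._cached_group_label


plugin = _Sketch()
assert plugin.get_group_label() == "render"  # falls back to identifier

plugin2 = _Sketch()
plugin2.label = "Render in Maya"
assert plugin2.get_group_label() == "Render in Maya"
```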
@@ -101,10 +142,30 @@ class BaseCreator:
         return self._log

     def _add_instance_to_context(self, instance):
-        """Helper method to ad d"""
+        """Helper method to add instance to create context.
+
+        Instances should be stored to DCC workfile metadata to be able reload
+        them and also stored to CreateContext in which is creator plugin
+        existing at the moment to be able use it without refresh of
+        CreateContext.
+
+        Args:
+            instance (CreatedInstance): New created instance.
+        """
+
         self.create_context.creator_adds_instance(instance)

     def _remove_instance_from_context(self, instance):
+        """Helper method to remove instance from create context.
+
+        Instances must be removed from DCC workfile metadat aand from create
+        context in which plugin is existing at the moment of removement to
+        propagate the change without restarting create context.
+
+        Args:
+            instance (CreatedInstance): Instance which should be removed.
+        """
+
         self.create_context.creator_removed_instance(instance)

     @abstractmethod
@@ -115,6 +176,7 @@ class BaseCreator:
         - must expect all data that were passed to init in previous
             implementation
         """
+
         pass

     @abstractmethod

@@ -141,6 +203,7 @@ class BaseCreator:
         self._add_instance_to_context(instance)
         ```
         """
+
         pass

     @abstractmethod

@@ -148,9 +211,10 @@ class BaseCreator:
         """Store changes of existing instances so they can be recollected.

         Args:
-            update_list(list<UpdateData>): Gets list of tuples. Each item
+            update_list(List[UpdateData]): Gets list of tuples. Each item
                 contain changed instance and it's changes.
         """
+
         pass

     @abstractmethod

@@ -161,9 +225,10 @@ class BaseCreator:
         'True' if did so.

         Args:
-            instance(list<CreatedInstance>): Instance objects which should be
+            instance(List[CreatedInstance]): Instance objects which should be
                 removed.
         """
+
         pass

     def get_icon(self):

@@ -171,6 +236,7 @@ class BaseCreator:

         Can return path to image file or awesome icon name.
         """
+
         return self.icon

     def get_dynamic_data(

@@ -181,6 +247,7 @@ class BaseCreator:
         These may be get dynamically created based on current context of
         workfile.
         """
+
         return {}

     def get_subset_name(

@@ -205,6 +272,7 @@ class BaseCreator:
             project_name(str): Project name.
             host_name(str): Which host creates subset.
         """
+
         dynamic_data = self.get_dynamic_data(
             variant, task_name, asset_doc, project_name, host_name
         )
@@ -231,9 +299,10 @@ class BaseCreator:
             keys/values when plugin attributes change.

         Returns:
-            list<AbtractAttrDef>: Attribute definitions that can be tweaked for
+            List[AbtractAttrDef]: Attribute definitions that can be tweaked for
                 created instance.
         """
+
         return self.instance_attr_defs

@@ -291,6 +360,7 @@ class Creator(BaseCreator):
         Returns:
             str: Short description of family.
         """
+
         return self.description

     def get_detail_description(self):

@@ -301,6 +371,7 @@ class Creator(BaseCreator):
         Returns:
             str: Detailed description of family for artist.
         """
+
         return self.detailed_description

     def get_default_variants(self):

@@ -312,8 +383,9 @@ class Creator(BaseCreator):
         By default returns `default_variants` value.

         Returns:
-            list<str>: Whisper variants for user input.
+            List[str]: Whisper variants for user input.
         """
+
         return copy.deepcopy(self.default_variants)

     def get_default_variant(self):

@@ -332,11 +404,13 @@ class Creator(BaseCreator):
         """Plugin attribute definitions needed for creation.
         Attribute definitions of plugin that define how creation will work.
         Values of these definitions are passed to `create` method.
-        NOTE:
-        Convert method should be implemented which should care about updating
-        keys/values when plugin attributes change.
+
+        Note:
+            Convert method should be implemented which should care about
+            updating keys/values when plugin attributes change.

         Returns:
-            list<AbtractAttrDef>: Attribute definitions that can be tweaked for
+            List[AbtractAttrDef]: Attribute definitions that can be tweaked for
                 created instance.
         """
         return self.pre_create_attr_defs
@@ -24,6 +24,10 @@ from .utils import (
     loaders_from_repre_context,
     loaders_from_representation,
+
+    any_outdated_containers,
+    get_outdated_containers,
+    filter_containers,
 )

 from .plugins import (

@@ -66,6 +70,10 @@ __all__ = (
     "loaders_from_repre_context",
     "loaders_from_representation",
+
+    "any_outdated_containers",
+    "get_outdated_containers",
+    "filter_containers",

     # plugins.py
     "LoaderPlugin",
     "SubsetLoaderPlugin",
@@ -4,8 +4,10 @@ import copy
 import getpass
 import logging
 import inspect
+import collections
 import numbers

+from openpype.host import ILoadHost
 from openpype.client import (
     get_project,
     get_assets,

@@ -15,6 +17,7 @@ from openpype.client import (
     get_last_version_by_subset_id,
     get_hero_version_by_subset_id,
     get_version_by_name,
+    get_last_versions,
     get_representations,
     get_representation_by_id,
     get_representation_by_name,

@@ -28,6 +31,11 @@ from openpype.pipeline import (

 log = logging.getLogger(__name__)

+ContainersFilterResult = collections.namedtuple(
+    "ContainersFilterResult",
+    ["latest", "outdated", "not_foud", "invalid"]
+)
+

 class HeroVersionType(object):
     def __init__(self, version):
@@ -208,10 +216,12 @@ def get_representation_context(representation):

     assert representation is not None, "This is a bug"

-    if not isinstance(representation, dict):
-        representation = get_representation_by_id(representation)
-
     project_name = legacy_io.active_project()
+    if not isinstance(representation, dict):
+        representation = get_representation_by_id(
+            project_name, representation
+        )
+
     version, subset, asset, project = get_representation_parents(
         project_name, representation
     )

@@ -394,7 +404,7 @@ def update_container(container, version=-1):
     assert current_representation is not None, "This is a bug"

     current_version = get_version_by_id(
-        project_name, current_representation["_id"], fields=["parent"]
+        project_name, current_representation["parent"], fields=["parent"]
     )
     if version == -1:
         new_version = get_last_version_by_subset_id(
@ -683,3 +693,164 @@ def loaders_from_representation(loaders, representation):
|
|||
|
||||
context = get_representation_context(representation)
|
||||
return loaders_from_repre_context(loaders, context)
|
||||
|
||||
|
||||
def any_outdated_containers(host=None, project_name=None):
|
||||
"""Check if there are any outdated containers in scene."""
|
||||
|
||||
if get_outdated_containers(host, project_name):
|
||||
return True
|
||||
return False
|
||||
|
||||
|
||||
def get_outdated_containers(host=None, project_name=None):
|
||||
"""Collect outdated containers from host scene.
|
||||
|
||||
Currently registered host and project in global session are used if
|
||||
arguments are not passed.
|
||||
|
||||
Args:
|
||||
host (ModuleType): Host implementation with 'ls' function available.
|
||||
project_name (str): Name of project in which context we are.
|
||||
"""
|
||||
|
||||
if host is None:
|
||||
from openpype.pipeline import registered_host
|
||||
host = registered_host()
|
||||
|
||||
if project_name is None:
|
||||
project_name = legacy_io.active_project()
|
||||
|
||||
if isinstance(host, ILoadHost):
|
||||
containers = host.get_containers()
|
||||
else:
|
||||
containers = host.ls()
|
||||
return filter_containers(containers, project_name).outdated
|
||||
|
||||
|
||||
def filter_containers(containers, project_name):
|
||||
"""Filter containers and split them into 4 categories.
|
||||
|
||||
Categories are 'latest', 'outdated', 'invalid' and 'not_found'.
|
||||
The 'lastest' containers are from last version, 'outdated' are not,
|
||||
'invalid' are invalid containers (invalid content) and 'not_foud' has
|
||||
some missing entity in database.
|
||||
|
||||
Args:
|
||||
containers (Iterable[dict]): List of containers referenced into scene.
|
||||
project_name (str): Name of project in which context shoud look for
|
||||
versions.
|
||||
|
||||
Returns:
|
||||
ContainersFilterResult: Named tuple with 'latest', 'outdated',
|
||||
'invalid' and 'not_found' containers.
|
||||
"""
|
||||
|
||||
    # Make sure containers is a list that won't change
    containers = list(containers)

    outdated_containers = []
    uptodate_containers = []
    not_found_containers = []
    invalid_containers = []
    output = ContainersFilterResult(
        uptodate_containers,
        outdated_containers,
        not_found_containers,
        invalid_containers
    )
    # Query representation docs to get their version ids
    repre_ids = {
        container["representation"]
        for container in containers
        if container["representation"]
    }
    if not repre_ids:
        if containers:
            invalid_containers.extend(containers)
        return output

    repre_docs = get_representations(
        project_name,
        representation_ids=repre_ids,
        fields=["_id", "parent"]
    )
    # Store representations by stringified representation id
    repre_docs_by_str_id = {}
    repre_docs_by_version_id = collections.defaultdict(list)
    for repre_doc in repre_docs:
        repre_id = str(repre_doc["_id"])
        version_id = repre_doc["parent"]
        repre_docs_by_str_id[repre_id] = repre_doc
        repre_docs_by_version_id[version_id].append(repre_doc)

    # Query version docs to get their subset ids
    # - also query hero versions to be able to identify if a representation
    #   belongs to an existing version
    version_docs = get_versions(
        project_name,
        version_ids=repre_docs_by_version_id.keys(),
        hero=True,
        fields=["_id", "parent", "type"]
    )
    versions_by_id = {}
    versions_by_subset_id = collections.defaultdict(list)
    hero_version_ids = set()
    for version_doc in version_docs:
        version_id = version_doc["_id"]
        # Store versions by their ids
        versions_by_id[version_id] = version_doc
        # There's no need to query subsets for hero versions
        # - they are considered to be latest
        if version_doc["type"] == "hero_version":
            hero_version_ids.add(version_id)
            continue
        subset_id = version_doc["parent"]
        versions_by_subset_id[subset_id].append(version_doc)

    last_versions = get_last_versions(
        project_name,
        subset_ids=versions_by_subset_id.keys(),
        fields=["_id"]
    )
    # Figure out which versions are outdated
    outdated_version_ids = set()
    for subset_id, last_version_doc in last_versions.items():
        for version_doc in versions_by_subset_id[subset_id]:
            version_id = version_doc["_id"]
            if version_id != last_version_doc["_id"]:
                outdated_version_ids.add(version_id)

    # Based on all collected data figure out which containers are outdated
    # - log when representation or version documents are missing
    for container in containers:
        container_name = container["objectName"]
        repre_id = container["representation"]
        if not repre_id:
            invalid_containers.append(container)
            continue

        repre_doc = repre_docs_by_str_id.get(repre_id)
        if not repre_doc:
            log.debug((
                "Container '{}' has an invalid representation."
                " It is missing in the database."
            ).format(container_name))
            not_found_containers.append(container)
            continue

        version_id = repre_doc["parent"]
        if version_id in outdated_version_ids:
            outdated_containers.append(container)

        elif version_id not in versions_by_id:
            log.debug((
                "Representation on container '{}' has an invalid version."
                " It is missing in the database."
            ).format(container_name))
            not_found_containers.append(container)

        else:
            uptodate_containers.append(container)

    return output
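A minimal usage sketch of the function above. It is hypothetical: it assumes a registered `host` and an active `project_name` are available, as in `get_outdated_containers`, and uses the result fields named in the docstring:

    containers = host.ls()
    result = filter_containers(containers, project_name)
    for container in result.outdated:
        print("Outdated container: {}".format(container["objectName"]))
    for container in result.not_found:
        print("Missing in database: {}".format(container["objectName"]))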
@@ -1,4 +1,7 @@
from .publish_plugins import (
    AbstractMetaInstancePlugin,
    AbstractMetaContextPlugin,

    PublishValidationError,
    PublishXmlValidationError,
    KnownPublishError,
@@ -13,8 +16,17 @@ from .lib import (
    load_help_content_from_filepath,
)

from .abstract_expected_files import ExpectedFiles
from .abstract_collect_render import (
    RenderInstance,
    AbstractCollectRender,
)


__all__ = (
    "AbstractMetaInstancePlugin",
    "AbstractMetaContextPlugin",

    "PublishValidationError",
    "PublishXmlValidationError",
    "KnownPublishError",
@@ -25,4 +37,9 @@ __all__ = (
"publish_plugins_discover",
|
||||
"load_help_content_from_plugin",
|
||||
"load_help_content_from_filepath",
|
||||
|
||||
"ExpectedFiles",
|
||||
|
||||
"RenderInstance",
|
||||
"AbstractCollectRender",
|
||||
)
|
||||
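With these exports in place, host integrations can pull the render-collection API from the package root. A minimal sketch, with the import path assumed from the relative imports in the hunks above:

    from openpype.pipeline.publish import (
        AbstractCollectRender,
        RenderInstance,
        ExpectedFiles,
    )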
268 openpype/pipeline/publish/abstract_collect_render.py Normal file

@@ -0,0 +1,268 @@
# -*- coding: utf-8 -*-
"""Collect render template.

TODO: use @dataclass when the time comes.

"""
from abc import abstractmethod

import attr
import six

import pyblish.api

from openpype.pipeline import legacy_io
from .publish_plugins import AbstractMetaContextPlugin


@attr.s
class RenderInstance(object):
    """Data collected by collectors.

    This data class is later passed to the created instances.
    Those attributes are required later on.

    """

    # metadata
    version = attr.ib()  # instance version
    time = attr.ib()  # time of instance creation (get_formatted_current_time)
    source = attr.ib()  # path to source scene file
    label = attr.ib()  # label to show in GUI
    subset = attr.ib()  # subset name
    task = attr.ib()  # task name
    asset = attr.ib()  # asset name (AVALON_ASSET)
    attachTo = attr.ib()  # subset name to attach render to
    setMembers = attr.ib()  # list of nodes/members producing render output
    publish = attr.ib()  # bool, True to publish instance
    name = attr.ib()  # instance name

    # format settings
    resolutionWidth = attr.ib()  # resolution width (1920)
    resolutionHeight = attr.ib()  # resolution height (1080)
    pixelAspect = attr.ib()  # pixel aspect (1.0)

    # time settings
    frameStart = attr.ib()  # start frame
    frameEnd = attr.ib()  # end frame
    frameStep = attr.ib()  # frame step

    handleStart = attr.ib(default=None)  # handle at frame start
    handleEnd = attr.ib(default=None)  # handle at frame end

    # for software (like Harmony) where frame range cannot be set by DB,
    # handles need to be propagated if they exist
    ignoreFrameHandleCheck = attr.ib(default=False)

    # --------------------
    # With default values
    # metadata
    renderer = attr.ib(default="")  # renderer - can be used in Deadline
    review = attr.ib(default=False)  # generate review from instance (bool)
    priority = attr.ib(default=50)  # job priority on farm

    family = attr.ib(default="renderlayer")
    families = attr.ib(default=["renderlayer"])  # list of families

    # format settings
    multipartExr = attr.ib(default=False)  # flag for multipart exrs
    convertToScanline = attr.ib(default=False)  # flag for exr conversion

    tileRendering = attr.ib(default=False)  # bool: treat render as tiles
    tilesX = attr.ib(default=0)  # number of tiles in X
    tilesY = attr.ib(default=0)  # number of tiles in Y

    # submit_publish_job
    toBeRenderedOn = attr.ib(default=None)
    deadlineSubmissionJob = attr.ib(default=None)
    anatomyData = attr.ib(default=None)
    outputDir = attr.ib(default=None)
    context = attr.ib(default=None)

    @frameStart.validator
    def check_frame_start(self, _, value):
        """Validate that frame start is not larger than frame end."""
        if value > self.frameEnd:
            raise ValueError("frameStart must be smaller "
                             "than or equal to frameEnd")

    @frameEnd.validator
    def check_frame_end(self, _, value):
        """Validate that frame end is not less than frame start."""
        if value < self.frameStart:
            raise ValueError("frameEnd must be greater "
                             "than or equal to frameStart")

    @tilesX.validator
    def check_tiles_x(self, _, value):
        """Validate that tile count in X isn't less than 1."""
        if not self.tileRendering:
            return
        if value < 1:
            raise ValueError("tile X size cannot be less than 1")

        if value == 1 and self.tilesY == 1:
            raise ValueError("both tile X and Y sizes are set to 1")

    @tilesY.validator
    def check_tiles_y(self, _, value):
        """Validate that tile count in Y isn't less than 1."""
        if not self.tileRendering:
            return
        if value < 1:
            raise ValueError("tile Y size cannot be less than 1")

        if value == 1 and self.tilesX == 1:
            raise ValueError("both tile X and Y sizes are set to 1")


@six.add_metaclass(AbstractMetaContextPlugin)
class AbstractCollectRender(pyblish.api.ContextPlugin):
    """Gather all publishable render layers from renderSetup."""

    order = pyblish.api.CollectorOrder + 0.01
    label = "Collect Render"
    sync_workfile_version = False

    def __init__(self, *args, **kwargs):
        """Constructor."""
        super(AbstractCollectRender, self).__init__(*args, **kwargs)
        self._file_path = None
        self._asset = legacy_io.Session["AVALON_ASSET"]
        self._context = None

    def process(self, context):
        """Entry point to collector."""
        self._context = context
        for instance in context:
            # make sure workfile instance publishing is enabled
            try:
                if "workfile" in instance.data["families"]:
                    instance.data["publish"] = True
                # TODO merge renderFarm and render.farm
                if ("renderFarm" in instance.data["families"] or
                        "render.farm" in instance.data["families"]):
                    instance.data["remove"] = True
            except KeyError:
                # be tolerant if 'families' is missing.
                pass

        self._file_path = context.data["currentFile"].replace("\\", "/")

        render_instances = self.get_instances(context)
        for render_instance in render_instances:
            exp_files = self.get_expected_files(render_instance)
            assert exp_files, "no file names were generated, this is a bug"

            # if we want to attach the render to a subset, check if we have
            # AOVs in expectedFiles. If so, raise an error as we cannot
            # attach an AOV (considered to be a subset on its own) to
            # another subset
            if render_instance.attachTo:
                assert isinstance(exp_files, list), (
                    "attaching multiple AOVs or renderable cameras to "
                    "subset is not supported"
                )

            frame_start_render = int(render_instance.frameStart)
            frame_end_render = int(render_instance.frameEnd)
            if (render_instance.ignoreFrameHandleCheck or
                    int(context.data['frameStartHandle']) == frame_start_render
                    and int(context.data['frameEndHandle']) == frame_end_render):  # noqa: W503, E501

                handle_start = context.data['handleStart']
                handle_end = context.data['handleEnd']
                frame_start = context.data['frameStart']
                frame_end = context.data['frameEnd']
                frame_start_handle = context.data['frameStartHandle']
                frame_end_handle = context.data['frameEndHandle']
            else:
                handle_start = 0
                handle_end = 0
                frame_start = frame_start_render
                frame_end = frame_end_render
                frame_start_handle = frame_start_render
                frame_end_handle = frame_end_render

            data = {
                "handleStart": handle_start,
                "handleEnd": handle_end,
                "frameStart": frame_start,
                "frameEnd": frame_end,
                "frameStartHandle": frame_start_handle,
                "frameEndHandle": frame_end_handle,
                "byFrameStep": int(render_instance.frameStep),

                "author": context.data["user"],
                # Add source to allow tracing back to the scene from
                # which it was submitted originally
                "expectedFiles": exp_files,
            }
            if self.sync_workfile_version:
                data["version"] = context.data["version"]

            # add additional data
            data = self.add_additional_data(data)
            render_instance_dict = attr.asdict(render_instance)

            instance = context.create_instance(render_instance.name)
            instance.data["label"] = render_instance.label
            instance.data.update(render_instance_dict)
            instance.data.update(data)

        self.post_collecting_action()

    @abstractmethod
    def get_instances(self, context):
        """Get all renderable instances and their data.

        Args:
            context (pyblish.api.Context): Context object.

        Returns:
            list of :class:`RenderInstance`: All collected renderable
                instances (like render layers, write nodes, etc.)

        """
        pass

    @abstractmethod
    def get_expected_files(self, render_instance):
        """Get list of expected files.

        Returns:
            list: Expected files. This can either be a simple list of file
                paths, or a list of dictionaries where each key is, for
                example, an AOV name and the value is the list of files
                for that AOV.

        Example::

            ['/path/to/file.001.exr', '/path/to/file.002.exr']

        or as dictionary:

            [
                {
                    "beauty": ['/path/to/beauty.001.exr', ...],
                    "mask": ['/path/to/mask.001.exr']
                }
            ]

        """
        pass

    def add_additional_data(self, data):
        """Add additional data to collected instance.

        This can be overridden by a host implementation to add custom
        additional data.

        """
        return data

    def post_collecting_action(self):
        """Execute some code after collection is done.

        This is useful, for example, for restoring the current render layer.

        """
        pass
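A minimal host-side sketch of a concrete collector built on the two classes above. Everything host-specific here is hypothetical: the class name, asset/task names, paths and frame values are placeholders, not part of the API shown in the diff:

    class CollectFooRender(AbstractCollectRender):
        """Example collector for an imaginary 'foo' host."""

        def get_instances(self, context):
            # One hard-coded renderable instance; a real implementation
            # would inspect the host scene here.
            return [
                RenderInstance(
                    version=1,
                    time="2022-07-22T10:00:00",
                    source=context.data["currentFile"],
                    label="renderMain",
                    subset="renderMain",
                    task="lighting",
                    asset="sh010",
                    attachTo=None,
                    setMembers=["renderLayerMain"],
                    publish=True,
                    name="renderMain",
                    resolutionWidth=1920,
                    resolutionHeight=1080,
                    pixelAspect=1.0,
                    frameStart=1001,
                    frameEnd=1050,
                    frameStep=1,
                )
            ]

        def get_expected_files(self, render_instance):
            # Flat-list variant of expected files (see docstring above).
            return [
                "/renders/sh010/renderMain.{:04d}.exr".format(frame)
                for frame in range(
                    render_instance.frameStart, render_instance.frameEnd + 1
                )
            ]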
53 openpype/pipeline/publish/abstract_expected_files.py Normal file

@@ -0,0 +1,53 @@
# -*- coding: utf-8 -*-
"""Abstract ExpectedFiles class definition."""
from abc import ABCMeta, abstractmethod
import six


@six.add_metaclass(ABCMeta)
class ExpectedFiles:
    """Class grouping functionality for all supported renderers.

    Attributes:
        multipart (bool): Flag whether multipart exrs are used.

    """

    multipart = False

    @abstractmethod
    def get(self, render_instance):
        """Get expected files for given renderer and render layer.

        This method should return a list of all files we expect to be
        rendered from the host. Usually `render_instance` corresponds
        to a *render layer*. The result can either be a flat list of
        file paths, or a list of dictionaries where each key is, for
        example, an AOV name or channel, etc.

        Example::

            ['/path/to/file.001.exr', '/path/to/file.002.exr']

        or as dictionary:

            [
                {
                    "beauty": ['/path/to/beauty.001.exr', ...],
                    "mask": ['/path/to/mask.001.exr']
                }
            ]


        Args:
            render_instance (:class:`RenderInstance`): Data passed from
                the collector to determine files. This should be an
                instance of :class:`abstract_collect_render.RenderInstance`

        Returns:
            list: Full paths to expected rendered files.
            list of dict: Paths to expected rendered files categorized by
                AOVs, etc.

        """
        raise NotImplementedError()
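A minimal sketch of a concrete implementation of the abstract class above. The renderer name and output paths are hypothetical placeholders:

    class FooExpectedFiles(ExpectedFiles):
        """Expected files for an imaginary 'foo' renderer."""

        def get(self, render_instance):
            # Flat-list variant; an AOV-aware renderer could return the
            # list-of-dict variant described in the docstring instead.
            return [
                "/renders/{}/{}.{:04d}.exr".format(
                    render_instance.asset,
                    render_instance.subset,
                    frame,
                )
                for frame in range(
                    render_instance.frameStart,
                    render_instance.frameEnd + 1,
                )
            ]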
@@ -1,7 +1,17 @@
from abc import ABCMeta
from pyblish.plugin import MetaPlugin, ExplicitMetaPlugin
from openpype.lib import BoolDef
from .lib import load_help_content_from_plugin


class AbstractMetaInstancePlugin(ABCMeta, MetaPlugin):
    pass


class AbstractMetaContextPlugin(ABCMeta, ExplicitMetaPlugin):
    pass


class PublishValidationError(Exception):
    """Validation error happened during publishing.

@@ -16,6 +26,7 @@ class PublishValidationError(Exception):
        description(str): Detailed description of an error. It is possible
            to use Markdown syntax.
    """

    def __init__(self, message, title=None, description=None, detail=None):
        self.message = message
        self.title = title or "< Missing title >"
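A sketch of how a validator might raise this error so the publisher UI can show a title and a Markdown description. The plugin and its check are hypothetical:

    import pyblish.api


    class ValidateSceneSaved(pyblish.api.ContextPlugin):
        order = pyblish.api.ValidatorOrder
        label = "Validate Scene Saved"

        def process(self, context):
            if not context.data.get("currentFile"):
                raise PublishValidationError(
                    "Scene is not saved.",
                    title="Unsaved scene",
                    description="## Unsaved scene\n"
                                "Save the workfile before publishing.",
                )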
@@ -49,6 +60,7 @@ class KnownPublishError(Exception):

    Message will be shown in UI for artist.
    """

    pass

@@ -92,6 +104,7 @@ class OpenPypePyblishPluginMixin:
        Returns:
            list<AbstractAttrDef>: Attribute definitions for plugin.
        """

        return []

    @classmethod
@@ -116,6 +129,7 @@ class OpenPypePyblishPluginMixin:
        Args:
            data(dict): Data from instance or context.
        """

        return (
            data
            .get("publish_attributes", {})
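A sketch of a plugin combining the mixin with a pyblish plugin. The `BoolDef` keyword arguments and the 'publish_attributes' lookup keyed by class name are assumptions based on the hunks above; the plugin itself is hypothetical:

    import pyblish.api


    class CollectReviewToggle(pyblish.api.InstancePlugin,
                              OpenPypePyblishPluginMixin):
        order = pyblish.api.CollectorOrder
        label = "Collect Review Toggle"

        @classmethod
        def get_attribute_defs(cls):
            # BoolDef comes from openpype.lib (imported in the hunk above);
            # the keyword names are assumptions.
            return [BoolDef("review", default=True, label="Create review")]

        def process(self, instance):
            # Mirror the lookup shown above: values are stored under
            # 'publish_attributes', keyed by plugin class name (assumed).
            values = (
                instance.data
                .get("publish_attributes", {})
                .get(self.__class__.__name__, {})
            )
            instance.data["review"] = values.get("review", True)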
Some files were not shown because too many files have changed in this diff.