Mirror of https://github.com/ynput/ayon-core.git (synced 2025-12-24 21:04:40 +01:00)

Merge branch 'develop' into feature/OP-2267_Standalone-publisher-using-new-publisher

Commit 68cf09d61c: 490 changed files with 44835 additions and 5644 deletions
.github/workflows/documentation.yml (vendored, 7 lines changed)

@@ -20,7 +20,8 @@ jobs:
      - uses: actions/checkout@v1
      - uses: actions/setup-node@v1
        with:
          node-version: '12.x'
          node-version: 14.x
          cache: yarn
      - name: Test Build
        run: |
          cd website

@@ -41,8 +42,8 @@ jobs:
      - uses: actions/setup-node@v1
        with:
          node-version: '12.x'
          node-version: 14.x
          cache: yarn
      - name: 🔨 Build
        run: |
          cd website
.gitmodules (vendored, 8 lines changed)

@@ -3,10 +3,4 @@
	url = https://github.com/pypeclub/avalon-core.git
[submodule "repos/avalon-unreal-integration"]
	path = repos/avalon-unreal-integration
	url = https://github.com/pypeclub/avalon-unreal-integration.git
[submodule "openpype/modules/default_modules/ftrack/python2_vendor/arrow"]
	path = openpype/modules/default_modules/ftrack/python2_vendor/arrow
	url = https://github.com/arrow-py/arrow.git
[submodule "openpype/modules/default_modules/ftrack/python2_vendor/ftrack-python-api"]
	path = openpype/modules/default_modules/ftrack/python2_vendor/ftrack-python-api
	url = https://bitbucket.org/ftrack/ftrack-python-api.git
	url = https://github.com/pypeclub/avalon-unreal-integration.git
CHANGELOG.md (57 lines changed)

@@ -1,6 +1,6 @@
# Changelog

## [3.9.0-nightly.2](https://github.com/pypeclub/OpenPype/tree/HEAD)
## [3.9.0-nightly.3](https://github.com/pypeclub/OpenPype/tree/HEAD)

[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.8.2...HEAD)

@@ -10,30 +10,42 @@
### 📖 Documentation

- Update docusaurus to latest version [\#2760](https://github.com/pypeclub/OpenPype/pull/2760)
- documentation: add example to `repack-version` command [\#2669](https://github.com/pypeclub/OpenPype/pull/2669)
- Update docusaurus [\#2639](https://github.com/pypeclub/OpenPype/pull/2639)
- Documentation: Fixed relative links [\#2621](https://github.com/pypeclub/OpenPype/pull/2621)

**🆕 New features**

- General: Store settings by OpenPype version [\#2570](https://github.com/pypeclub/OpenPype/pull/2570)
- Flame: loading clips to reels [\#2622](https://github.com/pypeclub/OpenPype/pull/2622)

**🚀 Enhancements**

- Pyblish Pype: Remove redundant new line in installed fonts printing [\#2758](https://github.com/pypeclub/OpenPype/pull/2758)
- Flame: adding validator source clip [\#2746](https://github.com/pypeclub/OpenPype/pull/2746)
- Work Files: Preserve subversion comment of current filename by default [\#2734](https://github.com/pypeclub/OpenPype/pull/2734)
- Ftrack: Disable ftrack module by default [\#2732](https://github.com/pypeclub/OpenPype/pull/2732)
- Project Manager: Disable add task, add asset and save button when not in a project [\#2727](https://github.com/pypeclub/OpenPype/pull/2727)
- dropbox handle big file [\#2718](https://github.com/pypeclub/OpenPype/pull/2718)
- Fusion Move PR: Minor tweaks to Fusion integration [\#2716](https://github.com/pypeclub/OpenPype/pull/2716)
- Nuke: prerender with review knob [\#2691](https://github.com/pypeclub/OpenPype/pull/2691)
- Maya configurable unit validator [\#2680](https://github.com/pypeclub/OpenPype/pull/2680)
- General: Add settings for CleanUpFarm and disable the plugin by default [\#2679](https://github.com/pypeclub/OpenPype/pull/2679)
- Project Manager: Only allow scroll wheel edits when spinbox is active [\#2678](https://github.com/pypeclub/OpenPype/pull/2678)
- Ftrack: Sync description to assets [\#2670](https://github.com/pypeclub/OpenPype/pull/2670)
- Houdini: Moved to OpenPype [\#2658](https://github.com/pypeclub/OpenPype/pull/2658)
- Maya: Move implementation to OpenPype [\#2649](https://github.com/pypeclub/OpenPype/pull/2649)
- General: FFmpeg conversion also check attribute string length [\#2635](https://github.com/pypeclub/OpenPype/pull/2635)
- Global: adding studio name/code to anatomy template formatting data [\#2630](https://github.com/pypeclub/OpenPype/pull/2630)
- Houdini: Load Arnold .ass procedurals into Houdini [\#2606](https://github.com/pypeclub/OpenPype/pull/2606)
- Deadline: Simplify GlobalJobPreLoad logic [\#2605](https://github.com/pypeclub/OpenPype/pull/2605)
- Houdini: Implement Arnold .ass standin extraction from Houdini \(also support .ass.gz\) [\#2603](https://github.com/pypeclub/OpenPype/pull/2603)
- New Publisher: New features and preparations for new standalone publisher [\#2556](https://github.com/pypeclub/OpenPype/pull/2556)

**🐛 Bug fixes**

- Loader UI: Fix right click in representation widget [\#2757](https://github.com/pypeclub/OpenPype/pull/2757)
- Aftereffects 2022 and Deadline [\#2748](https://github.com/pypeclub/OpenPype/pull/2748)
- Flame: bunch of bugs [\#2745](https://github.com/pypeclub/OpenPype/pull/2745)
- Maya: Save current scene on workfile publish [\#2744](https://github.com/pypeclub/OpenPype/pull/2744)
- Version Up: Preserve parts of filename after version number \(like subversion\) on version\_up [\#2741](https://github.com/pypeclub/OpenPype/pull/2741)
- Loader UI: Multiple asset selection and underline colors fixed [\#2731](https://github.com/pypeclub/OpenPype/pull/2731)
- General: Fix loading of unused chars in xml format [\#2729](https://github.com/pypeclub/OpenPype/pull/2729)
- TVPaint: Set objectName with members [\#2725](https://github.com/pypeclub/OpenPype/pull/2725)
- General: Don't use 'objectName' from loaded references [\#2715](https://github.com/pypeclub/OpenPype/pull/2715)
- Settings: Studio Project anatomy is queried using right keys [\#2711](https://github.com/pypeclub/OpenPype/pull/2711)

@@ -46,21 +58,21 @@
- Houdini Explicitly collect correct frame name even in case of single frame render when `frameStart` is provided [\#2676](https://github.com/pypeclub/OpenPype/pull/2676)
- hiero: fix effect collector name and order [\#2673](https://github.com/pypeclub/OpenPype/pull/2673)
- Maya: Fix menu callbacks [\#2671](https://github.com/pypeclub/OpenPype/pull/2671)
- hiero: removing obsolete unsupported plugin [\#2667](https://github.com/pypeclub/OpenPype/pull/2667)
- Launcher: Fix access to 'data' attribute on actions [\#2659](https://github.com/pypeclub/OpenPype/pull/2659)
- Houdini: fix usd family in loader and integrators [\#2631](https://github.com/pypeclub/OpenPype/pull/2631)

**Merged pull requests:**

- General: Fix loading of unused chars in xml format [\#2729](https://github.com/pypeclub/OpenPype/pull/2729)
- Harmony: Rendering in Deadline didn't work in other machines than submitter [\#2754](https://github.com/pypeclub/OpenPype/pull/2754)
- Maya: set Deadline job/batch name to original source workfile name instead of published workfile [\#2733](https://github.com/pypeclub/OpenPype/pull/2733)
- Fusion: Moved implementation into OpenPype [\#2713](https://github.com/pypeclub/OpenPype/pull/2713)
- TVPaint: Plugin build without dependencies [\#2705](https://github.com/pypeclub/OpenPype/pull/2705)
- Webpublisher: Photoshop create a beauty png [\#2689](https://github.com/pypeclub/OpenPype/pull/2689)
- Ftrack: Hierarchical attributes are queried properly [\#2682](https://github.com/pypeclub/OpenPype/pull/2682)
- Fix python install in docker for centos7 [\#2664](https://github.com/pypeclub/OpenPype/pull/2664)
- Maya: Add Validate Frame Range settings [\#2661](https://github.com/pypeclub/OpenPype/pull/2661)
- Harmony: move to Openpype [\#2657](https://github.com/pypeclub/OpenPype/pull/2657)
- General: Show applications without integration in project [\#2656](https://github.com/pypeclub/OpenPype/pull/2656)
- Maya: cleanup duplicate rendersetup code [\#2642](https://github.com/pypeclub/OpenPype/pull/2642)
- Deadline: Be able to pass Mongo url to job [\#2616](https://github.com/pypeclub/OpenPype/pull/2616)

## [3.8.2](https://github.com/pypeclub/OpenPype/tree/3.8.2) (2022-02-07)

@@ -72,16 +84,13 @@
**🚀 Enhancements**

- TVPaint: Image loaders also work on review family [\#2638](https://github.com/pypeclub/OpenPype/pull/2638)
- General: Project backup tools [\#2629](https://github.com/pypeclub/OpenPype/pull/2629)
- nuke: adding clear button to write nodes [\#2627](https://github.com/pypeclub/OpenPype/pull/2627)
- Ftrack: Family to Asset type mapping is in settings [\#2602](https://github.com/pypeclub/OpenPype/pull/2602)
- Nuke: load color space from representation data [\#2576](https://github.com/pypeclub/OpenPype/pull/2576)

**🐛 Bug fixes**

- Fix pulling of cx\_freeze 6.10 [\#2628](https://github.com/pypeclub/OpenPype/pull/2628)
- Global: fix broken otio review extractor [\#2590](https://github.com/pypeclub/OpenPype/pull/2590)

**Merged pull requests:**

@@ -102,35 +111,15 @@
- Release/3.8.0 [\#2619](https://github.com/pypeclub/OpenPype/pull/2619)
- hotfix: OIIO tool path - add extension on windows [\#2618](https://github.com/pypeclub/OpenPype/pull/2618)
- Settings: Enum does not store empty string if has single item to select [\#2615](https://github.com/pypeclub/OpenPype/pull/2615)
- switch distutils to sysconfig for `get\_platform\(\)` [\#2594](https://github.com/pypeclub/OpenPype/pull/2594)
- Fix poetry index and speedcopy update [\#2589](https://github.com/pypeclub/OpenPype/pull/2589)
- Webpublisher: Fix - subset names from processed .psd used wrong value for task [\#2586](https://github.com/pypeclub/OpenPype/pull/2586)
- `vrscene` creator Deadline webservice URL handling [\#2580](https://github.com/pypeclub/OpenPype/pull/2580)
- global: track name was failing if duplicated root word in name [\#2568](https://github.com/pypeclub/OpenPype/pull/2568)

**Merged pull requests:**

- Bump pillow from 8.4.0 to 9.0.0 [\#2595](https://github.com/pypeclub/OpenPype/pull/2595)
- Webpublisher: Skip version collect [\#2591](https://github.com/pypeclub/OpenPype/pull/2591)

## [3.8.0](https://github.com/pypeclub/OpenPype/tree/3.8.0) (2022-01-24)

[Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.8.0-nightly.7...3.8.0)

**🚀 Enhancements**

- Webpublisher: Moved error at the beginning of the log [\#2559](https://github.com/pypeclub/OpenPype/pull/2559)
- Ftrack: Use ApplicationManager to get DJV path [\#2558](https://github.com/pypeclub/OpenPype/pull/2558)
- Webpublisher: Added endpoint to reprocess batch through UI [\#2555](https://github.com/pypeclub/OpenPype/pull/2555)

**🐛 Bug fixes**

- AfterEffects: Fix - removed obsolete import [\#2577](https://github.com/pypeclub/OpenPype/pull/2577)
- General: OpenPype version updates [\#2575](https://github.com/pypeclub/OpenPype/pull/2575)
- Ftrack: Delete action revision [\#2563](https://github.com/pypeclub/OpenPype/pull/2563)
- Webpublisher: ftrack shows incorrect user names [\#2560](https://github.com/pypeclub/OpenPype/pull/2560)
- General: Do not validate version if build does not support it [\#2557](https://github.com/pypeclub/OpenPype/pull/2557)

## [3.7.0](https://github.com/pypeclub/OpenPype/tree/3.7.0) (2022-01-04)

[Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.7.0-nightly.14...3.7.0)
@@ -17,11 +17,12 @@ class AddLastWorkfileToLaunchArgs(PreLaunchHook):
        "nuke",
        "nukex",
        "hiero",
        "houdini",
        "nukestudio",
        "blender",
        "photoshop",
        "tvpaint",
        "afftereffects"
        "aftereffects"
    ]

    def execute(self):
@@ -19,7 +19,7 @@ class CopyTemplateWorkfile(PreLaunchHook):

    # Before `AddLastWorkfileToLaunchArgs`
    order = 0
    app_groups = ["blender", "photoshop", "tvpaint", "afftereffects"]
    app_groups = ["blender", "photoshop", "tvpaint", "aftereffects"]

    def execute(self):
        """Check if can copy template for context and do it if possible.

@@ -44,7 +44,7 @@ class CopyTemplateWorkfile(PreLaunchHook):
            return

        if os.path.exists(last_workfile):
            self.log.debug("Last workfile exits. Skipping {} process.".format(
            self.log.debug("Last workfile exists. Skipping {} process.".format(
                self.__class__.__name__
            ))
            return

@@ -120,7 +120,7 @@ class CopyTemplateWorkfile(PreLaunchHook):
            f"Creating workfile from template: \"{template_path}\""
        )

        # Copy template workfile to new destinantion
        # Copy template workfile to new destination
        shutil.copy2(
            os.path.normpath(template_path),
            os.path.normpath(last_workfile)
Binary file not shown.
@@ -1,5 +1,5 @@
<?xml version="1.0" encoding="UTF-8"?>
<ExtensionManifest Version="8.0" ExtensionBundleId="com.openpype.AE.panel" ExtensionBundleVersion="1.0.21"
<ExtensionManifest Version="8.0" ExtensionBundleId="com.openpype.AE.panel" ExtensionBundleVersion="1.0.22"
    ExtensionBundleName="openpype" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
    <ExtensionList>
        <Extension Id="com.openpype.AE.panel" Version="1.0" />
@@ -301,6 +301,15 @@ function main(websocket_url){
        return get_extension_version();
    });

    RPC.addRoute('AfterEffects.get_app_version', function (data) {
        log.warn('Server called client route "get_app_version":', data);
        return runEvalScript("getAppVersion()")
            .then(function(result){
                log.warn("get_app_version: " + result);
                return result;
            });
    });

    RPC.addRoute('AfterEffects.close', function (data) {
        log.warn('Server called client route "close":', data);
        return runEvalScript("close()");
@@ -438,7 +438,10 @@ function getAudioUrlForComp(comp_id){
    for (i = 1; i <= item.numLayers; ++i){
        var layer = item.layers[i];
        if (layer instanceof AVLayer){
            return layer.source.file.fsName.toString();
            if (layer.hasAudio){
                source_url = layer.source.file.fsName.toString()
                return _prepareSingleValue(source_url);
            }
        }
    }

@@ -715,6 +718,10 @@ function close(){
    app.quit();
}

function getAppVersion(){
    return _prepareSingleValue(app.version);
}

function _prepareSingleValue(value){
    return JSON.stringify({"result": value})
}
@@ -537,6 +537,13 @@ class AfterEffectsServerStub():

        return self._handle_return(res)

    def get_app_version(self):
        """Returns version number of installed application (17.5...)."""
        res = self.websocketserver.call(self.client.call(
            'AfterEffects.get_app_version'))

        return self._handle_return(res)

    def close(self):
        res = self.websocketserver.call(self.client.call('AfterEffects.close'))
@@ -20,6 +20,7 @@ class AERenderInstance(RenderInstance):
    fps = attr.ib(default=None)
    projectEntity = attr.ib(default=None)
    stagingDir = attr.ib(default=None)
    app_version = attr.ib(default=None)


class CollectAERender(abstract_collect_render.AbstractCollectRender):

@@ -41,6 +42,9 @@ class CollectAERender(abstract_collect_render.AbstractCollectRender):
    def get_instances(self, context):
        instances = []

        app_version = self.stub.get_app_version()
        app_version = app_version[0:4]

        current_file = context.data["currentFile"]
        version = context.data["version"]
        asset_entity = context.data["assetEntity"]

@@ -105,7 +109,8 @@ class CollectAERender(abstract_collect_render.AbstractCollectRender):
            frameEnd=frameEnd,
            frameStep=1,
            toBeRenderedOn='deadline',
            fps=fps
            fps=fps,
            app_version=app_version
        )

        comp = compositions_by_id.get(int(item_id))
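Illustrative aside, not part of the commit: the collector above keeps only the first four characters of the version string returned by the extension, which for typical After Effects versions preserves just the major.minor part. A minimal sketch of that assumption (the sample version string is hypothetical):

    # Hypothetical values; only the [0:4] slice mirrors the collector above.
    raw_version = "18.4.1.32"       # what the extension's app.version might return
    app_version = raw_version[0:4]  # -> "18.4", stored on the render instance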
@@ -3,3 +3,20 @@ import os
HOST_DIR = os.path.dirname(
    os.path.abspath(__file__)
)


def add_implementation_envs(env, _app):
    # Add requirements to DL_PYTHON_HOOK_PATH
    pype_root = os.environ["OPENPYPE_REPOS_ROOT"]

    env["DL_PYTHON_HOOK_PATH"] = os.path.join(
        pype_root, "openpype", "hosts", "flame", "startup")
    env.pop("QT_AUTO_SCREEN_SCALE_FACTOR", None)

    # Set default values if are not already set via settings
    defaults = {
        "LOGLEVEL": "DEBUG"
    }
    for key, value in defaults.items():
        if not env.get(key):
            env[key] = value
@@ -527,6 +527,7 @@ def get_segment_attributes(segment):

    # Add timeline segment to tree
    clip_data = {
        "shot_name": segment.shot_name.get_value(),
        "segment_name": segment.name.get_value(),
        "segment_comment": segment.comment.get_value(),
        "tape_name": segment.tape_name,
@@ -361,6 +361,7 @@ class PublishableClip:
    vertical_sync_default = False
    driving_layer_default = ""
    index_from_segment_default = False
    use_shot_name_default = False

    def __init__(self, segment, **kwargs):
        self.rename_index = kwargs["rename_index"]

@@ -376,6 +377,7 @@ class PublishableClip:
        # segment (clip) main attributes
        self.cs_name = self.clip_data["segment_name"]
        self.cs_index = int(self.clip_data["segment"])
        self.shot_name = self.clip_data["shot_name"]

        # get track name and index
        self.track_index = int(self.clip_data["track"])

@@ -419,18 +421,21 @@ class PublishableClip:
        # deal with clip name
        new_name = self.marker_data.pop("newClipName")

        if self.rename:
        if self.rename and not self.use_shot_name:
            # rename segment
            self.current_segment.name = str(new_name)
            self.marker_data["asset"] = str(new_name)
        elif self.use_shot_name:
            self.marker_data["asset"] = self.shot_name
            self.marker_data["hierarchyData"]["shot"] = self.shot_name
        else:
            self.marker_data["asset"] = self.cs_name
            self.marker_data["hierarchyData"]["shot"] = self.cs_name

        if self.marker_data["heroTrack"] and self.review_layer:
            self.marker_data.update({"reviewTrack": self.review_layer})
            self.marker_data["reviewTrack"] = self.review_layer
        else:
            self.marker_data.update({"reviewTrack": None})
            self.marker_data["reviewTrack"] = None

        # create pype tag on track_item and add data
        fpipeline.imprint(self.current_segment, self.marker_data)

@@ -463,6 +468,8 @@ class PublishableClip:
        # ui_inputs data or default values if gui was not used
        self.rename = self.ui_inputs.get(
            "clipRename", {}).get("value") or self.rename_default
        self.use_shot_name = self.ui_inputs.get(
            "useShotName", {}).get("value") or self.use_shot_name_default
        self.clip_name = self.ui_inputs.get(
            "clipName", {}).get("value") or self.clip_name_default
        self.hierarchy = self.ui_inputs.get(
@@ -4,9 +4,13 @@ import tempfile
import contextlib
import socket
from openpype.lib import (
    PreLaunchHook, get_openpype_username)
    PreLaunchHook,
    get_openpype_username
)
from openpype.lib.applications import (
    ApplicationLaunchFailed
)
from openpype.hosts import flame as opflame
import openpype.hosts.flame.api as opfapi
import openpype
from pprint import pformat

@@ -33,7 +37,25 @@ class FlamePrelaunch(PreLaunchHook):

        """Hook entry method."""
        project_doc = self.data["project_doc"]
        project_name = project_doc["name"]

        # get image io
        project_anatomy = self.data["anatomy"]

        # make sure anatomy settings are having flame key
        if not project_anatomy["imageio"].get("flame"):
            raise ApplicationLaunchFailed((
                "Anatomy project settings are missing `flame` key. "
                "Please make sure you remove project overides on "
                "Anatomy Image io")
            )

        imageio_flame = project_anatomy["imageio"]["flame"]

        # get user name and host name
        user_name = get_openpype_username()
        user_name = user_name.replace(".", "_")

        hostname = socket.gethostname()  # not returning wiretap host name

        self.log.debug("Collected user \"{}\"".format(user_name))

@@ -41,7 +63,7 @@ class FlamePrelaunch(PreLaunchHook):
        _db_p_data = project_doc["data"]
        width = _db_p_data["resolutionWidth"]
        height = _db_p_data["resolutionHeight"]
        fps = int(_db_p_data["fps"])
        fps = float(_db_p_data["fps"])

        project_data = {
            "Name": project_doc["name"],

@@ -52,8 +74,8 @@ class FlamePrelaunch(PreLaunchHook):
            "FrameHeight": int(height),
            "AspectRatio": float((width / height) * _db_p_data["pixelAspect"]),
            "FrameRate": "{} fps".format(fps),
            "FrameDepth": "16-bit fp",
            "FieldDominance": "PROGRESSIVE"
            "FrameDepth": str(imageio_flame["project"]["frameDepth"]),
            "FieldDominance": str(imageio_flame["project"]["fieldDominance"])
        }

        data_to_script = {

@@ -61,10 +83,10 @@ class FlamePrelaunch(PreLaunchHook):
            "host_name": _env.get("FLAME_WIRETAP_HOSTNAME") or hostname,
            "volume_name": _env.get("FLAME_WIRETAP_VOLUME"),
            "group_name": _env.get("FLAME_WIRETAP_GROUP"),
            "color_policy": "ACES 1.1",
            "color_policy": str(imageio_flame["project"]["colourPolicy"]),

            # from project
            "project_name": project_doc["name"],
            "project_name": project_name,
            "user_name": user_name,
            "project_data": project_data
        }

@@ -77,8 +99,6 @@ class FlamePrelaunch(PreLaunchHook):

        app_arguments = self._get_launch_arguments(data_to_script)

        opfapi.setup(self.launch_context.env)

        self.launch_context.launch_args.extend(app_arguments)

    def _add_pythonpath(self):
@@ -87,41 +87,48 @@ class CreateShotClip(opfapi.Creator):
            "target": "tag",
            "toolTip": "Parents folder for shot root folder, Template filled with `Hierarchy Data` section",  # noqa
            "order": 0},
        "useShotName": {
            "value": True,
            "type": "QCheckBox",
            "label": "Use Shot Name",
            "target": "ui",
            "toolTip": "Use name form Shot name clip attribute",  # noqa
            "order": 1},
        "clipRename": {
            "value": False,
            "type": "QCheckBox",
            "label": "Rename clips",
            "target": "ui",
            "toolTip": "Renaming selected clips on fly",  # noqa
            "order": 1},
            "order": 2},
        "clipName": {
            "value": "{sequence}{shot}",
            "type": "QLineEdit",
            "label": "Clip Name Template",
            "target": "ui",
            "toolTip": "template for creating shot namespaused for renaming (use rename: on)",  # noqa
            "order": 2},
            "order": 3},
        "segmentIndex": {
            "value": True,
            "type": "QCheckBox",
            "label": "Segment index",
            "target": "ui",
            "toolTip": "Take number from segment index",  # noqa
            "order": 3},
            "order": 4},
        "countFrom": {
            "value": 10,
            "type": "QSpinBox",
            "label": "Count sequence from",
            "target": "ui",
            "toolTip": "Set when the sequence number stafrom",  # noqa
            "order": 4},
            "order": 5},
        "countSteps": {
            "value": 10,
            "type": "QSpinBox",
            "label": "Stepping number",
            "target": "ui",
            "toolTip": "What number is adding every new step",  # noqa
            "order": 5},
            "order": 6},
        }
    },
    "hierarchyData": {
openpype/hosts/flame/plugins/publish/validate_source_clip.py (new file, 24 lines)

@@ -0,0 +1,24 @@
import pyblish


@pyblish.api.log
class ValidateSourceClip(pyblish.api.InstancePlugin):
    """Validate instance is not having empty `flameSourceClip`"""

    order = pyblish.api.ValidatorOrder
    label = "Validate Source Clip"
    hosts = ["flame"]
    families = ["clip"]

    def process(self, instance):
        flame_source_clip = instance.data["flameSourceClip"]

        self.log.debug("_ flame_source_clip: {}".format(flame_source_clip))

        if flame_source_clip is None:
            raise AttributeError((
                "Timeline segment `{}` is not having "
                "relative clip in reels. Please make sure "
                "you push `Save Sources` button in Conform Tab").format(
                    instance.data["asset"]
            ))
@@ -406,31 +406,46 @@ def imprint(node, data):
    node.setParmTemplateGroup(parm_group)


def lsattr(attr, value=None):
def lsattr(attr, value=None, root="/"):
    """Return nodes that have `attr`
    When `value` is not None it will only return nodes matching that value
    for the given attribute.
    Args:
        attr (str): Name of the attribute (hou.Parm)
        value (object, Optional): The value to compare the attribute too.
            When the default None is provided the value check is skipped.
        root (str): The root path in Houdini to search in.
    Returns:
        list: Matching nodes that have attribute with value.
    """
    if value is None:
        nodes = list(hou.node("/obj").allNodes())
        # Use allSubChildren() as allNodes() errors on nodes without
        # permission to enter without a means to continue of querying
        # the rest
        nodes = hou.node(root).allSubChildren()
        return [n for n in nodes if n.parm(attr)]
    return lsattrs({attr: value})


def lsattrs(attrs):
def lsattrs(attrs, root="/"):
    """Return nodes matching `key` and `value`

    Arguments:
        attrs (dict): collection of attribute: value

        root (str): The root path in Houdini to search in.
    Example:
        >> lsattrs({"id": "myId"})
        ["myNode"]
        >> lsattr("id")
        ["myNode", "myOtherNode"]

    Returns:
        list
        list: Matching nodes that have attribute with value.
    """

    matches = set()
    nodes = list(hou.node("/obj").allNodes())  # returns generator object
    # Use allSubChildren() as allNodes() errors on nodes without
    # permission to enter without a means to continue of querying
    # the rest
    nodes = hou.node(root).allSubChildren()
    for node in nodes:
        for attr in attrs:
            if not node.parm(attr):

@@ -527,3 +542,37 @@ def maintained_selection():
    if previous_selection:
        for node in previous_selection:
            node.setSelected(on=True)


def reset_framerange():
    """Set frame range to current asset"""

    asset_name = api.Session["AVALON_ASSET"]
    asset = io.find_one({"name": asset_name, "type": "asset"})

    frame_start = asset["data"].get("frameStart")
    frame_end = asset["data"].get("frameEnd")
    # Backwards compatibility
    if frame_start is None or frame_end is None:
        frame_start = asset["data"].get("edit_in")
        frame_end = asset["data"].get("edit_out")

    if frame_start is None or frame_end is None:
        log.warning("No edit information found for %s" % asset_name)
        return

    handles = asset["data"].get("handles") or 0
    handle_start = asset["data"].get("handleStart")
    if handle_start is None:
        handle_start = handles

    handle_end = asset["data"].get("handleEnd")
    if handle_end is None:
        handle_end = handles

    frame_start -= int(handle_start)
    frame_end += int(handle_end)

    hou.playbar.setFrameRange(frame_start, frame_end)
    hou.playbar.setPlaybackRange(frame_start, frame_end)
    hou.setFrame(frame_start)
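Illustrative aside, not part of the commit: the new `root` argument lets callers scope the search to one branch of the Houdini scene instead of walking everything under "/". The attribute values and the exact module path below are assumptions made only for the sake of the example:

    # Hypothetical usage sketch of the extended signatures shown above.
    from openpype.hosts.houdini.api.lib import lsattr, lsattrs

    nodes_with_id = lsattr("id", root="/obj")            # nodes under /obj that have an "id" parm
    tagged_nodes = lsattrs({"id": "myId"}, root="/stage")  # nodes under /stage whose "id" parm equals "myId"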
@@ -4,6 +4,7 @@ import logging
import contextlib

import hou
import hdefereval

import pyblish.api
import avalon.api

@@ -66,9 +67,10 @@ def install():

    sys.path.append(hou_pythonpath)

    # Set asset FPS for the empty scene directly after launch of Houdini
    # so it initializes into the correct scene FPS
    _set_asset_fps()
    # Set asset settings for the empty scene directly after launch of Houdini
    # so it initializes into the correct scene FPS, Frame Range, etc.
    # todo: make sure this doesn't trigger when opening with last workfile
    _set_context_settings()


def uninstall():

@@ -279,18 +281,49 @@ def on_open(*args):

def on_new(_):
    """Set project resolution and fps when create a new file"""

    if hou.hipFile.isLoadingHipFile():
        # This event also triggers when Houdini opens a file due to the
        # new event being registered to 'afterClear'. As such we can skip
        # 'new' logic if the user is opening a file anyway
        log.debug("Skipping on new callback due to scene being opened.")
        return

    log.info("Running callback on new..")
    _set_asset_fps()
    _set_context_settings()

    # It seems that the current frame always gets reset to frame 1 on
    # new scene. So we enforce current frame to be at the start of the playbar
    # with execute deferred
    def _enforce_start_frame():
        start = hou.playbar.playbackRange()[0]
        hou.setFrame(start)

    hdefereval.executeDeferred(_enforce_start_frame)


def _set_asset_fps():
    """Set Houdini scene FPS to the default required for current asset"""
def _set_context_settings():
    """Apply the project settings from the project definition

    Settings can be overwritten by an asset if the asset.data contains
    any information regarding those settings.

    Examples of settings:
        fps
        resolution
        renderer

    Returns:
        None
    """

    # Set new scene fps
    fps = get_asset_fps()
    print("Setting scene FPS to %i" % fps)
    lib.set_scene_fps(fps)

    lib.reset_framerange()


def on_pyblish_instance_toggled(instance, new_value, old_value):
    """Toggle saver tool passthrough states on instance toggles."""
@@ -6,10 +6,7 @@ class USDSublayerLoader(api.Loader):
    """Sublayer USD file in Solaris"""

    families = [
        "colorbleed.usd",
        "colorbleed.pointcache",
        "colorbleed.animation",
        "colorbleed.camera",
        "usd",
        "usdCamera",
    ]
    label = "Sublayer USD"

@@ -6,10 +6,7 @@ class USDReferenceLoader(api.Loader):
    """Reference USD file in Solaris"""

    families = [
        "colorbleed.usd",
        "colorbleed.pointcache",
        "colorbleed.animation",
        "colorbleed.camera",
        "usd",
        "usdCamera",
    ]
    label = "Reference USD"
@@ -48,7 +48,7 @@ class CollectUsdLayers(pyblish.api.InstancePlugin):
            label = "{0} -> {1}".format(instance.data["name"], name)
            layer_inst = context.create_instance(name)

            family = "colorbleed.usdlayer"
            family = "usdlayer"
            layer_inst.data["family"] = family
            layer_inst.data["families"] = [family]
            layer_inst.data["subset"] = "__stub__"
@@ -2,26 +2,14 @@ import pyblish.api
import avalon.api


class SaveCurrentScene(pyblish.api.InstancePlugin):
class SaveCurrentScene(pyblish.api.ContextPlugin):
    """Save current scene"""

    label = "Save current file"
    order = pyblish.api.IntegratorOrder - 0.49
    order = pyblish.api.ExtractorOrder - 0.49
    hosts = ["houdini"]
    families = ["usdrender",
                "redshift_rop"]
    targets = ["local"]

    def process(self, instance):

        # This should be a ContextPlugin, but this is a workaround
        # for a bug in pyblish to run once for a family: issue #250
        context = instance.context
        key = "__hasRun{}".format(self.__class__.__name__)
        if context.data.get(key, False):
            return
        else:
            context.data[key] = True
    def process(self, context):

        # Filename must not have changed since collecting
        host = avalon.api.registered_host()
@@ -1,23 +0,0 @@
import pyblish.api


class SaveCurrentSceneDeadline(pyblish.api.ContextPlugin):
    """Save current scene"""

    label = "Save current file"
    order = pyblish.api.IntegratorOrder - 0.49
    hosts = ["houdini"]
    targets = ["deadline"]

    def process(self, context):
        import hou

        assert (
            context.data["currentFile"] == hou.hipFile.path()
        ), "Collected filename from current scene name."

        if hou.hipFile.hasUnsavedChanges():
            self.log.info("Saving current file..")
            hou.hipFile.save(save_to_recent_files=True)
        else:
            self.log.debug("No unsaved changes, skipping file save..")
@@ -1,77 +0,0 @@
import pyblish.api


class ValidateOutputNode(pyblish.api.InstancePlugin):
    """Validate the instance SOP Output Node.

    This will ensure:
        - The SOP Path is set.
        - The SOP Path refers to an existing object.
        - The SOP Path node is a SOP node.
        - The SOP Path node has at least one input connection (has an input)
        - The SOP Path has geometry data.

    """

    order = pyblish.api.ValidatorOrder
    families = ["pointcache", "vdbcache"]
    hosts = ["houdini"]
    label = "Validate Output Node"

    def process(self, instance):

        invalid = self.get_invalid(instance)
        if invalid:
            raise RuntimeError(
                "Output node(s) `%s` are incorrect. "
                "See plug-in log for details." % invalid
            )

    @classmethod
    def get_invalid(cls, instance):

        import hou

        output_node = instance.data["output_node"]

        if output_node is None:
            node = instance[0]
            cls.log.error(
                "SOP Output node in '%s' does not exist. "
                "Ensure a valid SOP output path is set." % node.path()
            )

            return [node.path()]

        # Output node must be a Sop node.
        if not isinstance(output_node, hou.SopNode):
            cls.log.error(
                "Output node %s is not a SOP node. "
                "SOP Path must point to a SOP node, "
                "instead found category type: %s"
                % (output_node.path(), output_node.type().category().name())
            )
            return [output_node.path()]

        # For the sake of completeness also assert the category type
        # is Sop to avoid potential edge case scenarios even though
        # the isinstance check above should be stricter than this category
        assert output_node.type().category().name() == "Sop", (
            "Output node %s is not of category Sop. This is a bug.."
            % output_node.path()
        )

        # Check if output node has incoming connections
        if not output_node.inputConnections():
            cls.log.error(
                "Output node `%s` has no incoming connections"
                % output_node.path()
            )
            return [output_node.path()]

        # Ensure the output node has at least Geometry data
        if not output_node.geometry():
            cls.log.error(
                "Output node `%s` has no geometry data." % output_node.path()
            )
            return [output_node.path()]
@@ -66,6 +66,14 @@ host_tools.show_workfiles(parent)
]]></scriptCode>
    </scriptItem>

    <scriptItem id="reset_frame_range">
        <label>Reset Frame Range</label>
        <scriptCode><![CDATA[
import openpype.hosts.houdini.api.lib
openpype.hosts.houdini.api.lib.reset_framerange()
]]></scriptCode>
    </scriptItem>

    <separatorItem/>
    <scriptItem id="experimental_tools">
        <label>Experimental tools...</label>
@@ -10,12 +10,6 @@ from .pipeline import (

    ls,
    containerise,

    lock,
    unlock,
    is_locked,
    lock_ignored,

)
from .plugin import (
    Creator,

@@ -38,11 +32,9 @@ from .lib import (
    read,

    apply_shaders,
    without_extension,
    maintained_selection,
    suspended_refresh,

    unique_name,
    unique_namespace,
)

@@ -54,11 +46,6 @@ __all__ = [
    "ls",
    "containerise",

    "lock",
    "unlock",
    "is_locked",
    "lock_ignored",

    "Creator",
    "Loader",

@@ -76,11 +63,9 @@ __all__ = [
    "lsattrs",
    "read",

    "unique_name",
    "unique_namespace",

    "apply_shaders",
    "without_extension",
    "maintained_selection",
    "suspended_refresh",
@@ -2,7 +2,6 @@

import os
import sys
import re
import platform
import uuid
import math
|
|
@ -154,73 +153,59 @@ def maintained_selection():
|
|||
cmds.select(clear=True)
|
||||
|
||||
|
||||
def unique_name(name, format="%02d", namespace="", prefix="", suffix=""):
|
||||
"""Return unique `name`
|
||||
|
||||
The function takes into consideration an optional `namespace`
|
||||
and `suffix`. The suffix is included in evaluating whether a
|
||||
name exists - such as `name` + "_GRP" - but isn't included
|
||||
in the returned value.
|
||||
|
||||
If a namespace is provided, only names within that namespace
|
||||
are considered when evaluating whether the name is unique.
|
||||
|
||||
Arguments:
|
||||
format (str, optional): The `name` is given a number, this determines
|
||||
how this number is formatted. Defaults to a padding of 2.
|
||||
E.g. my_name01, my_name02.
|
||||
namespace (str, optional): Only consider names within this namespace.
|
||||
suffix (str, optional): Only consider names with this suffix.
|
||||
|
||||
Example:
|
||||
>>> name = cmds.createNode("transform", name="MyName")
|
||||
>>> cmds.objExists(name)
|
||||
True
|
||||
>>> unique = unique_name(name)
|
||||
>>> cmds.objExists(unique)
|
||||
False
|
||||
|
||||
"""
|
||||
|
||||
iteration = 1
|
||||
unique = prefix + (name + format % iteration) + suffix
|
||||
|
||||
while cmds.objExists(namespace + ":" + unique):
|
||||
iteration += 1
|
||||
unique = prefix + (name + format % iteration) + suffix
|
||||
|
||||
if suffix:
|
||||
return unique[:-len(suffix)]
|
||||
|
||||
return unique
|
||||
|
||||
|
||||
def unique_namespace(namespace, format="%02d", prefix="", suffix=""):
|
||||
"""Return unique namespace
|
||||
|
||||
Similar to :func:`unique_name` but evaluating namespaces
|
||||
as opposed to object names.
|
||||
|
||||
Arguments:
|
||||
namespace (str): Name of namespace to consider
|
||||
format (str, optional): Formatting of the given iteration number
|
||||
suffix (str, optional): Only consider namespaces with this suffix.
|
||||
|
||||
>>> unique_namespace("bar")
|
||||
# bar01
|
||||
>>> unique_namespace(":hello")
|
||||
# :hello01
|
||||
>>> unique_namespace("bar:", suffix="_NS")
|
||||
# bar01_NS:
|
||||
|
||||
"""
|
||||
|
||||
def current_namespace():
|
||||
current = cmds.namespaceInfo(currentNamespace=True,
|
||||
absoluteName=True)
|
||||
# When inside a namespace Maya adds no trailing :
|
||||
if not current.endswith(":"):
|
||||
current += ":"
|
||||
return current
|
||||
|
||||
# Always check against the absolute namespace root
|
||||
# There's no clash with :x if we're defining namespace :a:x
|
||||
ROOT = ":" if namespace.startswith(":") else current_namespace()
|
||||
|
||||
# Strip trailing `:` tokens since we might want to add a suffix
|
||||
start = ":" if namespace.startswith(":") else ""
|
||||
end = ":" if namespace.endswith(":") else ""
|
||||
namespace = namespace.strip(":")
|
||||
if ":" in namespace:
|
||||
# Split off any nesting that we don't uniqify anyway.
|
||||
parents, namespace = namespace.rsplit(":", 1)
|
||||
start += parents + ":"
|
||||
ROOT += start
|
||||
|
||||
def exists(n):
|
||||
# Check for clash with nodes and namespaces
|
||||
fullpath = ROOT + n
|
||||
return cmds.objExists(fullpath) or cmds.namespace(exists=fullpath)
|
||||
|
||||
iteration = 1
|
||||
unique = prefix + (namespace + format % iteration) + suffix
|
||||
while True:
|
||||
nr_namespace = namespace + format % iteration
|
||||
unique = prefix + nr_namespace + suffix
|
||||
|
||||
if not exists(unique):
|
||||
return start + unique + end
|
||||
|
||||
# The `existing` set does not just contain the namespaces but *all* nodes
|
||||
# within "current namespace". We need all because the namespace could
|
||||
# also clash with a node name. To be truly unique and valid one needs to
|
||||
# check against all.
|
||||
existing = set(cmds.namespaceInfo(listNamespace=True))
|
||||
while unique in existing:
|
||||
iteration += 1
|
||||
unique = prefix + (namespace + format % iteration) + suffix
|
||||
|
||||
return unique
|
||||
|
||||
|
||||
def read(node):
|
||||
|
|
@ -286,153 +271,6 @@ def pairwise(iterable):
|
|||
return itertools.izip(a, a)
|
||||
|
||||
|
||||
def unique(name):
|
||||
assert isinstance(name, string_types), "`name` must be string"
|
||||
|
||||
while cmds.objExists(name):
|
||||
matches = re.findall(r"\d+$", name)
|
||||
|
||||
if matches:
|
||||
match = matches[-1]
|
||||
name = name.rstrip(match)
|
||||
number = int(match) + 1
|
||||
else:
|
||||
number = 1
|
||||
|
||||
name = name + str(number)
|
||||
|
||||
return name
|
||||
|
||||
|
||||
def uv_from_element(element):
|
||||
"""Return the UV coordinate of given 'element'
|
||||
|
||||
Supports components, meshes, nurbs.
|
||||
|
||||
"""
|
||||
|
||||
supported = ["mesh", "nurbsSurface"]
|
||||
|
||||
uv = [0.5, 0.5]
|
||||
|
||||
if "." not in element:
|
||||
type = cmds.nodeType(element)
|
||||
if type == "transform":
|
||||
geometry_shape = cmds.listRelatives(element, shapes=True)
|
||||
|
||||
if len(geometry_shape) >= 1:
|
||||
geometry_shape = geometry_shape[0]
|
||||
else:
|
||||
return
|
||||
|
||||
elif type in supported:
|
||||
geometry_shape = element
|
||||
|
||||
else:
|
||||
cmds.error("Could not do what you wanted..")
|
||||
return
|
||||
else:
|
||||
# If it is indeed a component - get the current Mesh
|
||||
try:
|
||||
parent = element.split(".", 1)[0]
|
||||
|
||||
# Maya is funny in that when the transform of the shape
|
||||
# of the component element has children, the name returned
|
||||
# by that elementection is the shape. Otherwise, it is
|
||||
# the transform. So lets see what type we're dealing with here.
|
||||
if cmds.nodeType(parent) in supported:
|
||||
geometry_shape = parent
|
||||
else:
|
||||
geometry_shape = cmds.listRelatives(parent, shapes=1)[0]
|
||||
|
||||
if not geometry_shape:
|
||||
cmds.error("Skipping %s: Could not find shape." % element)
|
||||
return
|
||||
|
||||
if len(cmds.ls(geometry_shape)) > 1:
|
||||
cmds.warning("Multiple shapes with identical "
|
||||
"names found. This might not work")
|
||||
|
||||
except TypeError as e:
|
||||
cmds.warning("Skipping %s: Didn't find a shape "
|
||||
"for component elementection. %s" % (element, e))
|
||||
return
|
||||
|
||||
try:
|
||||
type = cmds.nodeType(geometry_shape)
|
||||
|
||||
if type == "nurbsSurface":
|
||||
# If a surfacePoint is elementected on a nurbs surface
|
||||
root, u, v = element.rsplit("[", 2)
|
||||
uv = [float(u[:-1]), float(v[:-1])]
|
||||
|
||||
if type == "mesh":
|
||||
# -----------
|
||||
# Average the U and V values
|
||||
# ===========
|
||||
uvs = cmds.polyListComponentConversion(element, toUV=1)
|
||||
if not uvs:
|
||||
cmds.warning("Couldn't derive any UV's from "
|
||||
"component, reverting to default U and V")
|
||||
raise TypeError
|
||||
|
||||
# Flatten list of Uv's as sometimes it returns
|
||||
# neighbors like this [2:3] instead of [2], [3]
|
||||
flattened = []
|
||||
|
||||
for uv in uvs:
|
||||
flattened.extend(cmds.ls(uv, flatten=True))
|
||||
|
||||
uvs = flattened
|
||||
|
||||
sumU = 0
|
||||
sumV = 0
|
||||
for uv in uvs:
|
||||
try:
|
||||
u, v = cmds.polyEditUV(uv, query=True)
|
||||
except Exception:
|
||||
cmds.warning("Couldn't find any UV coordinated, "
|
||||
"reverting to default U and V")
|
||||
raise TypeError
|
||||
|
||||
sumU += u
|
||||
sumV += v
|
||||
|
||||
averagedU = sumU / len(uvs)
|
||||
averagedV = sumV / len(uvs)
|
||||
|
||||
uv = [averagedU, averagedV]
|
||||
except TypeError:
|
||||
pass
|
||||
|
||||
return uv
|
||||
|
||||
|
||||
def shape_from_element(element):
|
||||
"""Return shape of given 'element'
|
||||
|
||||
Supports components, meshes, and surfaces
|
||||
|
||||
"""
|
||||
|
||||
try:
|
||||
# Get either shape or transform, based on element-type
|
||||
node = cmds.ls(element, objectsOnly=True)[0]
|
||||
except Exception:
|
||||
cmds.warning("Could not find node in %s" % element)
|
||||
return None
|
||||
|
||||
if cmds.nodeType(node) == 'transform':
|
||||
try:
|
||||
return cmds.listRelatives(node, shapes=True)[0]
|
||||
except Exception:
|
||||
cmds.warning("Could not find shape in %s" % element)
|
||||
return None
|
||||
|
||||
else:
|
||||
return node
|
||||
|
||||
|
||||
def export_alembic(nodes,
|
||||
file,
|
||||
frame_range=None,
|
||||
|
|
@ -577,115 +415,6 @@ def imprint(node, data):
|
|||
cmds.setAttr(node + "." + key, value, **set_type)
|
||||
|
||||
|
||||
def serialise_shaders(nodes):
|
||||
"""Generate a shader set dictionary
|
||||
|
||||
Arguments:
|
||||
nodes (list): Absolute paths to nodes
|
||||
|
||||
Returns:
|
||||
dictionary of (shader: id) pairs
|
||||
|
||||
Schema:
|
||||
{
|
||||
"shader1": ["id1", "id2"],
|
||||
"shader2": ["id3", "id1"]
|
||||
}
|
||||
|
||||
Example:
|
||||
{
|
||||
"Bazooka_Brothers01_:blinn4SG": [
|
||||
"f9520572-ac1d-11e6-b39e-3085a99791c9.f[4922:5001]",
|
||||
"f9520572-ac1d-11e6-b39e-3085a99791c9.f[4587:4634]",
|
||||
"f9520572-ac1d-11e6-b39e-3085a99791c9.f[1120:1567]",
|
||||
"f9520572-ac1d-11e6-b39e-3085a99791c9.f[4251:4362]"
|
||||
],
|
||||
"lambert2SG": [
|
||||
"f9520571-ac1d-11e6-9dbb-3085a99791c9"
|
||||
]
|
||||
}
|
||||
|
||||
"""
|
||||
|
||||
valid_nodes = cmds.ls(
|
||||
nodes,
|
||||
long=True,
|
||||
recursive=True,
|
||||
showType=True,
|
||||
objectsOnly=True,
|
||||
type="transform"
|
||||
)
|
||||
|
||||
meshes_by_id = {}
|
||||
for mesh in valid_nodes:
|
||||
shapes = cmds.listRelatives(valid_nodes[0],
|
||||
shapes=True,
|
||||
fullPath=True) or list()
|
||||
|
||||
if shapes:
|
||||
shape = shapes[0]
|
||||
if not cmds.nodeType(shape):
|
||||
continue
|
||||
|
||||
try:
|
||||
id_ = cmds.getAttr(mesh + ".mbID")
|
||||
|
||||
if id_ not in meshes_by_id:
|
||||
meshes_by_id[id_] = list()
|
||||
|
||||
meshes_by_id[id_].append(mesh)
|
||||
|
||||
except ValueError:
|
||||
continue
|
||||
|
||||
meshes_by_shader = dict()
|
||||
for mesh in meshes_by_id.values():
|
||||
shape = cmds.listRelatives(mesh,
|
||||
shapes=True,
|
||||
fullPath=True) or list()
|
||||
|
||||
for shader in cmds.listConnections(shape,
|
||||
type="shadingEngine") or list():
|
||||
|
||||
# Objects in this group are those that haven't got
|
||||
# any shaders. These are expected to be managed
|
||||
# elsewhere, such as by the default model loader.
|
||||
if shader == "initialShadingGroup":
|
||||
continue
|
||||
|
||||
if shader not in meshes_by_shader:
|
||||
meshes_by_shader[shader] = list()
|
||||
|
||||
shaded = cmds.sets(shader, query=True) or list()
|
||||
meshes_by_shader[shader].extend(shaded)
|
||||
|
||||
shader_by_id = {}
|
||||
for shader, shaded in meshes_by_shader.items():
|
||||
|
||||
if shader not in shader_by_id:
|
||||
shader_by_id[shader] = list()
|
||||
|
||||
for mesh in shaded:
|
||||
|
||||
# Enable shader assignment to faces.
|
||||
name = mesh.split(".f[")[0]
|
||||
|
||||
transform = name
|
||||
if cmds.objectType(transform) == "mesh":
|
||||
transform = cmds.listRelatives(name, parent=True)[0]
|
||||
|
||||
try:
|
||||
id_ = cmds.getAttr(transform + ".mbID")
|
||||
shader_by_id[shader].append(mesh.replace(name, id_))
|
||||
except KeyError:
|
||||
continue
|
||||
|
||||
# Remove duplicates
|
||||
shader_by_id[shader] = list(set(shader_by_id[shader]))
|
||||
|
||||
return shader_by_id
|
||||
|
||||
|
||||
def lsattr(attr, value=None):
|
||||
"""Return nodes matching `key` and `value`
|
||||
|
||||
@@ -764,17 +493,6 @@ def lsattrs(attrs):
    return list(matches)


@contextlib.contextmanager
def without_extension():
    """Use cmds.file with defaultExtensions=False"""
    previous_setting = cmds.file(defaultExtensions=True, query=True)
    try:
        cmds.file(defaultExtensions=False)
        yield
    finally:
        cmds.file(defaultExtensions=previous_setting)


@contextlib.contextmanager
def attribute_values(attr_values):
    """Remaps node attributes to values during context.

@@ -853,26 +571,6 @@ def evaluation(mode="off"):
        cmds.evaluationManager(mode=original)


@contextlib.contextmanager
def no_refresh():
    """Temporarily disables Maya's UI updates

    Note:
        This only disabled the main pane and will sometimes still
        trigger updates in torn off panels.

    """

    pane = _get_mel_global('gMainPane')
    state = cmds.paneLayout(pane, query=True, manage=True)
    cmds.paneLayout(pane, edit=True, manage=False)

    try:
        yield
    finally:
        cmds.paneLayout(pane, edit=True, manage=state)


@contextlib.contextmanager
def empty_sets(sets, force=False):
    """Remove all members of the sets during the context"""

@@ -1539,15 +1237,6 @@ def extract_alembic(file,
    return file


def maya_temp_folder():
    scene_dir = os.path.dirname(cmds.file(query=True, sceneName=True))
    tmp_dir = os.path.abspath(os.path.join(scene_dir, "..", "tmp"))
    if not os.path.isdir(tmp_dir):
        os.makedirs(tmp_dir)

    return tmp_dir


# region ID
def get_id_required_nodes(referenced_nodes=False, nodes=None):
    """Filter out any node which are locked (reference) or readOnly

@@ -1732,22 +1421,6 @@ def set_id(node, unique_id, overwrite=False):
        cmds.setAttr(attr, unique_id, type="string")


def remove_id(node):
    """Remove the id attribute from the input node.

    Args:
        node (str): The node name

    Returns:
        bool: Whether an id attribute was deleted

    """
    if cmds.attributeQuery("cbId", node=node, exists=True):
        cmds.deleteAttr("{}.cbId".format(node))
        return True
    return False


# endregion ID
def get_reference_node(path):
    """

@@ -1820,6 +1493,40 @@ def apply_attributes(attributes, nodes_by_id):
        set_attribute(attr, value, node)


def get_container_members(container):
    """Returns the members of a container.
    This includes the nodes from any loaded references in the container.
    """
    if isinstance(container, dict):
        # Assume it's a container dictionary
        container = container["objectName"]

    members = cmds.sets(container, query=True) or []
    members = cmds.ls(members, long=True, objectsOnly=True) or []
    members = set(members)

    # Include any referenced nodes from any reference in the container
    # This is required since we've removed adding ALL nodes of a reference
    # into the container set and only add the reference node now.
    for ref in cmds.ls(members, exactType="reference", objectsOnly=True):

        # Ignore any `:sharedReferenceNode`
        if ref.rsplit(":", 1)[-1].startswith("sharedReferenceNode"):
            continue

        # Ignore _UNKNOWN_REF_NODE_ (PLN-160)
        if ref.rsplit(":", 1)[-1].startswith("_UNKNOWN_REF_NODE_"):
            continue

        reference_members = cmds.referenceQuery(ref, nodes=True)
        reference_members = cmds.ls(reference_members,
                                    long=True,
                                    objectsOnly=True)
        members.update(reference_members)

    return members


# region LOOKDEV
def list_looks(asset_id):
    """Return all look subsets for the given asset

@@ -1882,7 +1589,7 @@ def assign_look_by_version(nodes, version_id):
    container_node = pipeline.load(Loader, look_representation)

    # Get container members
    shader_nodes = cmds.sets(container_node, query=True)
    shader_nodes = get_container_members(container_node)

    # Load relationships
    shader_relation = api.get_representation_path(json_representation)

@@ -2108,7 +1815,7 @@ def get_container_transforms(container, members=None, root=False):
    """

    if not members:
        members = cmds.sets(container["objectName"], query=True)
        members = get_container_members(container)

    results = cmds.ls(members, type="transform", long=True)
    if root:

@@ -2389,6 +2096,7 @@ def reset_scene_resolution():

    set_scene_resolution(width, height, pixelAspect)


def set_context_settings():
    """Apply the project settings from the project definition

@@ -2847,7 +2555,7 @@ def get_attr_in_layer(attr, layer):


def fix_incompatible_containers():
    """Return whether the current scene has any outdated content"""
    """Backwards compatibility: old containers to use new ReferenceLoader"""

    host = api.registered_host()
    for container in host.ls():

@@ -3086,7 +2794,7 @@ class RenderSetupListObserver:
        cmds.delete(render_layer_set_name)


class RenderSetupItemObserver():
class RenderSetupItemObserver:
    """Handle changes in render setup items."""

    def __init__(self, item):

@@ -3322,7 +3030,7 @@ def set_colorspace():
@contextlib.contextmanager
def root_parent(nodes):
    # type: (list) -> list
    """Context manager to un-parent provided nodes and return then back."""
    """Context manager to un-parent provided nodes and return them back."""
    import pymel.core as pm  # noqa

    node_parents = []
|
|
|||
|
|
@ -1,924 +0,0 @@
|
|||
[
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\others\\save_scene_incremental.py",
|
||||
"sourcetype": "file",
|
||||
"title": "# Version Up",
|
||||
"tooltip": "Incremental save with a specific format"
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\others\\open_current_folder.py",
|
||||
"sourcetype": "file",
|
||||
"title": "Open working folder..",
|
||||
"tooltip": "Show current scene in Explorer"
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\avalon\\launch_manager.py",
|
||||
"sourcetype": "file",
|
||||
"title": "# Project Manager",
|
||||
"tooltip": "Add assets to the project"
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"command": "from openpype.tools.assetcreator import app as assetcreator; assetcreator.show(context='maya')",
|
||||
"sourcetype": "python",
|
||||
"title": "Asset Creator",
|
||||
"tooltip": "Open the Asset Creator"
|
||||
},
|
||||
{
|
||||
"type": "separator"
|
||||
},
|
||||
{
|
||||
"type": "menu",
|
||||
"title": "Modeling",
|
||||
"items": [
|
||||
{
|
||||
"type": "action",
|
||||
"command": "import easyTreezSource; reload(easyTreezSource); easyTreezSource.easyTreez()",
|
||||
"sourcetype": "python",
|
||||
"tags": ["modeling", "trees", "generate", "create", "plants"],
|
||||
"title": "EasyTreez",
|
||||
"tooltip": ""
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\modeling\\separateMeshPerShader.py",
|
||||
"sourcetype": "file",
|
||||
"tags": ["modeling", "separateMeshPerShader"],
|
||||
"title": "# Separate Mesh Per Shader",
|
||||
"tooltip": ""
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\modeling\\polyDetachSeparate.py",
|
||||
"sourcetype": "file",
|
||||
"tags": ["modeling", "poly", "detach", "separate"],
|
||||
"title": "# Polygon Detach and Separate",
|
||||
"tooltip": ""
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\modeling\\polySelectEveryNthEdgeUI.py",
|
||||
"sourcetype": "file",
|
||||
"tags": ["modeling", "select", "nth", "edge", "ui"],
|
||||
"title": "# Select Every Nth Edge"
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\modeling\\djPFXUVs.py",
|
||||
"sourcetype": "file",
|
||||
"tags": ["modeling", "djPFX", "UVs"],
|
||||
"title": "# dj PFX UVs",
|
||||
"tooltip": ""
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"type": "menu",
|
||||
"title": "Rigging",
|
||||
"items": [
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\rigging\\advancedSkeleton.py",
|
||||
"sourcetype": "file",
|
||||
"tags": [
|
||||
"rigging",
|
||||
"autorigger",
|
||||
"advanced",
|
||||
"skeleton",
|
||||
"advancedskeleton",
|
||||
"file"
|
||||
],
|
||||
"title": "Advanced Skeleton"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"type": "menu",
|
||||
"title": "Shading",
|
||||
"items": [
|
||||
{
|
||||
"type": "menu",
|
||||
"title": "# VRay",
|
||||
"items": [
|
||||
{
|
||||
"type": "action",
|
||||
"title": "# Import Proxies",
|
||||
"command": "$OPENPYPE_SCRIPTS\\shading\\vray\\vrayImportProxies.py",
|
||||
"sourcetype": "file",
|
||||
"tags": ["shading", "vray", "import", "proxies"],
|
||||
"tooltip": ""
|
||||
},
|
||||
{
|
||||
"type": "separator"
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"title": "# Select All GES",
|
||||
"command": "$OPENPYPE_SCRIPTS\\shading\\vray\\selectAllGES.py",
|
||||
"sourcetype": "file",
|
||||
"tooltip": "",
|
||||
"tags": ["shading", "vray", "select All GES"]
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"title": "# Select All GES Under Selection",
|
||||
"command": "$OPENPYPE_SCRIPTS\\shading\\vray\\selectAllGESUnderSelection.py",
|
||||
"sourcetype": "file",
|
||||
"tooltip": "",
|
||||
"tags": ["shading", "vray", "select", "all", "GES"]
|
||||
},
|
||||
{
|
||||
"type": "separator"
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"title": "# Selection To VRay Mesh",
|
||||
"command": "$OPENPYPE_SCRIPTS\\shading\\vray\\selectionToVrayMesh.py",
|
||||
"sourcetype": "file",
|
||||
"tooltip": "",
|
||||
"tags": ["shading", "vray", "selection", "vraymesh"]
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"title": "# Add VRay Round Edges Attribute",
|
||||
"command": "$OPENPYPE_SCRIPTS\\shading\\vray\\addVrayRoundEdgesAttribute.py",
|
||||
"sourcetype": "file",
|
||||
"tooltip": "",
|
||||
"tags": ["shading", "vray", "round edges", "attribute"]
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"title": "# Add Gamma",
|
||||
"command": "$OPENPYPE_SCRIPTS\\shading\\vray\\vrayAddGamma.py",
|
||||
"sourcetype": "file",
|
||||
"tooltip": "",
|
||||
"tags": ["shading", "vray", "add gamma"]
|
||||
},
|
||||
{
|
||||
"type": "separator"
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\shading\\vray\\select_vraymesh_materials_with_unconnected_shader_slots.py",
|
||||
"sourcetype": "file",
|
||||
"title": "# Select Unconnected Shader Materials",
|
||||
"tags": [
|
||||
"shading",
|
||||
"vray",
|
||||
"select",
|
||||
"vraymesh",
|
||||
"materials",
|
||||
"unconnected shader slots"
|
||||
],
|
||||
"tooltip": ""
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\shading\\vray\\vrayMergeSimilarVRayMeshMaterials.py",
|
||||
"sourcetype": "file",
|
||||
"title": "# Merge Similar VRay Mesh Materials",
|
||||
"tags": [
|
||||
"shading",
|
||||
"vray",
|
||||
"Merge",
|
||||
"VRayMesh",
|
||||
"Materials"
|
||||
],
|
||||
"tooltip": ""
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"title": "# Create Two Sided Material",
|
||||
"command": "$OPENPYPE_SCRIPTS\\shading\\vray\\vrayCreate2SidedMtlForSelectedMtlRenamed.py",
|
||||
"sourcetype": "file",
|
||||
"tooltip": "Creates two sided material for selected material and renames it",
|
||||
"tags": ["shading", "vray", "two sided", "material"]
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"title": "# Create Two Sided Material For Selected",
|
||||
"command": "$OPENPYPE_SCRIPTS\\shading\\vray\\vrayCreate2SidedMtlForSelectedMtl.py",
|
||||
"sourcetype": "file",
|
||||
"tooltip": "Select material to create a two sided version from it",
|
||||
"tags": [
|
||||
"shading",
|
||||
"vray",
|
||||
"Create2SidedMtlForSelectedMtl.py"
|
||||
]
|
||||
},
|
||||
{
|
||||
"type": "separator"
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"title": "# Add OpenSubdiv Attribute",
|
||||
"command": "$OPENPYPE_SCRIPTS\\shading\\vray\\addVrayOpenSubdivAttribute.py",
|
||||
"sourcetype": "file",
|
||||
"tooltip": "",
|
||||
"tags": [
|
||||
"shading",
|
||||
"vray",
|
||||
"add",
|
||||
"open subdiv",
|
||||
"attribute"
|
||||
]
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"title": "# Remove OpenSubdiv Attribute",
|
||||
"command": "$OPENPYPE_SCRIPTS\\shading\\vray\\removeVrayOpenSubdivAttribute.py",
|
||||
"sourcetype": "file",
|
||||
"tooltip": "",
|
||||
"tags": [
|
||||
"shading",
|
||||
"vray",
|
||||
"remove",
|
||||
"opensubdiv",
|
||||
"attributee"
|
||||
]
|
||||
},
|
||||
{
|
||||
"type": "separator"
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"title": "# Add Subdivision Attribute",
|
||||
"command": "$OPENPYPE_SCRIPTS\\shading\\vray\\addVraySubdivisionAttribute.py",
|
||||
"sourcetype": "file",
|
||||
"tooltip": "",
|
||||
"tags": [
|
||||
"shading",
|
||||
"vray",
|
||||
"addVraySubdivisionAttribute"
|
||||
]
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"title": "# Remove Subdivision Attribute.py",
|
||||
"command": "$OPENPYPE_SCRIPTS\\shading\\vray\\removeVraySubdivisionAttribute.py",
|
||||
"sourcetype": "file",
|
||||
"tooltip": "",
|
||||
"tags": [
|
||||
"shading",
|
||||
"vray",
|
||||
"remove",
|
||||
"subdivision",
|
||||
"attribute"
|
||||
]
|
||||
},
|
||||
{
|
||||
"type": "separator"
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"title": "# Add Vray Object Ids",
|
||||
"command": "$OPENPYPE_SCRIPTS\\shading\\vray\\addVrayObjectIds.py",
|
||||
"sourcetype": "file",
|
||||
"tooltip": "",
|
||||
"tags": ["shading", "vray", "add", "object id"]
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"title": "# Add Vray Material Ids",
|
||||
"command": "$OPENPYPE_SCRIPTS\\shading\\vray\\addVrayMaterialIds.py",
|
||||
"sourcetype": "file",
|
||||
"tooltip": "",
|
||||
"tags": ["shading", "vray", "addVrayMaterialIds.py"]
|
||||
},
|
||||
{
|
||||
"type": "separator"
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"title": "# Set Physical DOF Depth",
|
||||
"command": "$OPENPYPE_SCRIPTS\\shading\\vray\\vrayPhysicalDOFSetDepth.py",
|
||||
"sourcetype": "file",
|
||||
"tooltip": "",
|
||||
"tags": ["shading", "vray", "physical", "DOF ", "Depth"]
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"title": "# Magic Vray Proxy UI",
|
||||
"command": "$OPENPYPE_SCRIPTS\\shading\\vray\\magicVrayProxyUI.py",
|
||||
"sourcetype": "file",
|
||||
"tooltip": "",
|
||||
"tags": ["shading", "vray", "magicVrayProxyUI"]
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\pyblish\\lighting\\set_filename_prefix.py",
|
||||
"sourcetype": "file",
|
||||
"tags": [
|
||||
"shading",
|
||||
"lookdev",
|
||||
"assign",
|
||||
"shaders",
|
||||
"prefix",
|
||||
"filename",
|
||||
"render"
|
||||
],
|
||||
"title": "# Set filename prefix",
|
||||
"tooltip": "Set the render file name prefix."
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"command": "import mayalookassigner; mayalookassigner.show()",
|
||||
"sourcetype": "python",
|
||||
"tags": ["shading", "look", "assign", "shaders", "auto"],
|
||||
"title": "Look Manager",
|
||||
"tooltip": "Open the Look Manager UI for look assignment"
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\shading\\LightLinkUi.py",
|
||||
"sourcetype": "file",
|
||||
"tags": ["shading", "light", "link", "ui"],
|
||||
"title": "# Light Link UI",
|
||||
"tooltip": ""
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\shading\\vdviewer_ui.py",
|
||||
"sourcetype": "file",
|
||||
"tags": [
|
||||
"shading",
|
||||
"look",
|
||||
"vray",
|
||||
"displacement",
|
||||
"shaders",
|
||||
"auto"
|
||||
],
|
||||
"title": "# VRay Displ Viewer",
|
||||
"tooltip": "Open the VRay Displacement Viewer, select and control the content of the set"
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\shading\\setTexturePreviewToCLRImage.py",
|
||||
"sourcetype": "file",
|
||||
"tags": ["shading", "CLRImage", "textures", "preview"],
|
||||
"title": "# Set Texture Preview To CLRImage",
|
||||
"tooltip": ""
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\shading\\fixDefaultShaderSetBehavior.py",
|
||||
"sourcetype": "file",
|
||||
"tags": ["shading", "fix", "DefaultShaderSet", "Behavior"],
|
||||
"title": "# Fix Default Shader Set Behavior",
|
||||
"tooltip": ""
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\shading\\fixSelectedShapesReferenceAssignments.py",
|
||||
"sourcetype": "file",
|
||||
"tags": [
|
||||
"shading",
|
||||
"fix",
|
||||
"Selected",
|
||||
"Shapes",
|
||||
"Reference",
|
||||
"Assignments"
|
||||
],
|
||||
"title": "# Fix Shapes Reference Assignments",
|
||||
"tooltip": "Select shapes to fix the reference assignments"
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\shading\\selectLambert1Members.py",
|
||||
"sourcetype": "file",
|
||||
"tags": ["shading", "selectLambert1Members"],
|
||||
"title": "# Select Lambert1 Members",
|
||||
"tooltip": "Selects all objects which have the Lambert1 shader assigned"
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\shading\\selectShapesWithoutShader.py",
|
||||
"sourcetype": "file",
|
||||
"tags": ["shading", "selectShapesWithoutShader"],
|
||||
"title": "# Select Shapes Without Shader",
|
||||
"tooltip": ""
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\shading\\fixRenderLayerOutAdjustmentErrors.py",
|
||||
"sourcetype": "file",
|
||||
"tags": ["shading", "fixRenderLayerOutAdjustmentErrors"],
|
||||
"title": "# Fix RenderLayer Out Adjustment Errors",
|
||||
"tooltip": ""
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\shading\\fix_renderlayer_missing_node_override.py",
|
||||
"sourcetype": "file",
|
||||
"tags": [
|
||||
"shading",
|
||||
"renderlayer",
|
||||
"missing",
|
||||
"reference",
|
||||
"switch",
|
||||
"layer"
|
||||
],
|
||||
"title": "# Fix RenderLayer Missing Referenced Nodes Overrides",
|
||||
"tooltip": ""
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"title": "# Image 2 Tiled EXR",
|
||||
"command": "$OPENPYPE_SCRIPTS\\shading\\open_img2exr.py",
|
||||
"sourcetype": "file",
|
||||
"tooltip": "",
|
||||
"tags": ["shading", "vray", "exr"]
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"type": "menu",
|
||||
"title": "# Rendering",
|
||||
"items": [
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\pyblish\\open_deadline_submission_settings.py",
|
||||
"sourcetype": "file",
|
||||
"tags": ["settings", "deadline", "globals", "render"],
|
||||
"title": "# DL Submission Settings UI",
|
||||
"tooltip": "Open the Deadline Submission Settings UI"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"type": "menu",
|
||||
"title": "Animation",
|
||||
"items": [
|
||||
{
|
||||
"type": "menu",
|
||||
"title": "# Attributes",
|
||||
"tooltip": "",
|
||||
"items": [
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\animation\\attributes\\copyValues.py",
|
||||
"sourcetype": "file",
|
||||
"tags": ["animation", "copy", "attributes"],
|
||||
"title": "# Copy Values",
|
||||
"tooltip": "Copy attribute values"
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\animation\\attributes\\copyInConnections.py",
|
||||
"sourcetype": "file",
|
||||
"tags": [
|
||||
"animation",
|
||||
"copy",
|
||||
"attributes",
|
||||
"connections",
|
||||
"incoming"
|
||||
],
|
||||
"title": "# Copy In Connections",
|
||||
"tooltip": "Copy incoming connections"
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\animation\\attributes\\copyOutConnections.py",
|
||||
"sourcetype": "file",
|
||||
"tags": [
|
||||
"animation",
|
||||
"copy",
|
||||
"attributes",
|
||||
"connections",
|
||||
"out"
|
||||
],
|
||||
"title": "# Copy Out Connections",
|
||||
"tooltip": "Copy outcoming connections"
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\animation\\attributes\\copyTransformLocal.py",
|
||||
"sourcetype": "file",
|
||||
"tags": [
|
||||
"animation",
|
||||
"copy",
|
||||
"attributes",
|
||||
"transforms",
|
||||
"local"
|
||||
],
|
||||
"title": "# Copy Local Transforms",
|
||||
"tooltip": "Copy local transforms"
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\animation\\attributes\\copyTransformMatrix.py",
|
||||
"sourcetype": "file",
|
||||
"tags": [
|
||||
"animation",
|
||||
"copy",
|
||||
"attributes",
|
||||
"transforms",
|
||||
"matrix"
|
||||
],
|
||||
"title": "# Copy Matrix Transforms",
|
||||
"tooltip": "Copy Matrix transforms"
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\animation\\attributes\\copyTransformUI.py",
|
||||
"sourcetype": "file",
|
||||
"tags": [
|
||||
"animation",
|
||||
"copy",
|
||||
"attributes",
|
||||
"transforms",
|
||||
"UI"
|
||||
],
|
||||
"title": "# Copy Transforms UI",
|
||||
"tooltip": "Open the Copy Transforms UI"
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\animation\\attributes\\simpleCopyUI.py",
|
||||
"sourcetype": "file",
|
||||
"tags": [
|
||||
"animation",
|
||||
"copy",
|
||||
"attributes",
|
||||
"transforms",
|
||||
"UI",
|
||||
"simple"
|
||||
],
|
||||
"title": "# Simple Copy UI",
|
||||
"tooltip": "Open the simple Copy Transforms UI"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"type": "menu",
|
||||
"title": "# Optimize",
|
||||
"tooltip": "Optimization scripts",
|
||||
"items": [
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\animation\\optimize\\toggleFreezeHierarchy.py",
|
||||
"sourcetype": "file",
|
||||
"tags": ["animation", "hierarchy", "toggle", "freeze"],
|
||||
"title": "# Toggle Freeze Hierarchy",
|
||||
"tooltip": "Freeze and unfreeze hierarchy"
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\animation\\optimize\\toggleParallelNucleus.py",
|
||||
"sourcetype": "file",
|
||||
"tags": ["animation", "nucleus", "toggle", "parallel"],
|
||||
"title": "# Toggle Parallel Nucleus",
|
||||
"tooltip": "Toggle parallel nucleus"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"sourcetype": "file",
|
||||
"command": "$OPENPYPE_SCRIPTS\\animation\\bakeSelectedToWorldSpace.py",
|
||||
"tags": ["animation", "bake", "selection", "worldspace.py"],
|
||||
"title": "# Bake Selected To Worldspace",
|
||||
"type": "action"
|
||||
},
|
||||
{
|
||||
"sourcetype": "file",
|
||||
"command": "$OPENPYPE_SCRIPTS\\animation\\timeStepper.py",
|
||||
"tags": ["animation", "time", "stepper"],
|
||||
"title": "# Time Stepper",
|
||||
"type": "action"
|
||||
},
|
||||
{
|
||||
"sourcetype": "file",
|
||||
"command": "$OPENPYPE_SCRIPTS\\animation\\capture_ui.py",
|
||||
"tags": [
|
||||
"animation",
|
||||
"capture",
|
||||
"ui",
|
||||
"screen",
|
||||
"movie",
|
||||
"image"
|
||||
],
|
||||
"title": "# Capture UI",
|
||||
"type": "action"
|
||||
},
|
||||
{
|
||||
"sourcetype": "file",
|
||||
"command": "$OPENPYPE_SCRIPTS\\animation\\simplePlayblastUI.py",
|
||||
"tags": ["animation", "simple", "playblast", "ui"],
|
||||
"title": "# Simple Playblast UI",
|
||||
"type": "action"
|
||||
},
|
||||
{
|
||||
"sourcetype": "file",
|
||||
"command": "$OPENPYPE_SCRIPTS\\animation\\tweenMachineUI.py",
|
||||
"tags": ["animation", "tween", "machine"],
|
||||
"title": "# Tween Machine UI",
|
||||
"type": "action"
|
||||
},
|
||||
{
|
||||
"sourcetype": "file",
|
||||
"command": "$OPENPYPE_SCRIPTS\\animation\\selectAllAnimationCurves.py",
|
||||
"tags": ["animation", "select", "curves"],
|
||||
"title": "# Select All Animation Curves",
|
||||
"type": "action"
|
||||
},
|
||||
{
|
||||
"sourcetype": "file",
|
||||
"command": "$OPENPYPE_SCRIPTS\\animation\\pathAnimation.py",
|
||||
"tags": ["animation", "path", "along"],
|
||||
"title": "# Path Animation",
|
||||
"type": "action"
|
||||
},
|
||||
{
|
||||
"sourcetype": "file",
|
||||
"command": "$OPENPYPE_SCRIPTS\\animation\\offsetSelectedObjectsUI.py",
|
||||
"tags": ["animation", "offsetSelectedObjectsUI.py"],
|
||||
"title": "# Offset Selected Objects UI",
|
||||
"type": "action"
|
||||
},
|
||||
{
|
||||
"sourcetype": "file",
|
||||
"command": "$OPENPYPE_SCRIPTS\\animation\\key_amplifier_ui.py",
|
||||
"tags": ["animation", "key", "amplifier"],
|
||||
"title": "# Key Amplifier UI",
|
||||
"type": "action"
|
||||
},
|
||||
{
|
||||
"sourcetype": "file",
|
||||
"command": "$OPENPYPE_SCRIPTS\\animation\\anim_scene_optimizer.py",
|
||||
"tags": ["animation", "anim_scene_optimizer.py"],
|
||||
"title": "# Anim_Scene_Optimizer",
|
||||
"type": "action"
|
||||
},
|
||||
{
|
||||
"sourcetype": "file",
|
||||
"command": "$OPENPYPE_SCRIPTS\\animation\\zvParentMaster.py",
|
||||
"tags": ["animation", "zvParentMaster.py"],
|
||||
"title": "# ZV Parent Master",
|
||||
"type": "action"
|
||||
},
|
||||
{
|
||||
"sourcetype": "file",
|
||||
"command": "$OPENPYPE_SCRIPTS\\animation\\animLibrary.py",
|
||||
"tags": ["animation", "studiolibrary.py"],
|
||||
"title": "Anim Library",
|
||||
"type": "action"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"type": "menu",
|
||||
"title": "# Layout",
|
||||
"items": [
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\layout\\alignDistributeUI.py",
|
||||
"sourcetype": "file",
|
||||
"tags": ["layout", "align", "Distribute", "UI"],
|
||||
"title": "# Align Distribute UI",
|
||||
"tooltip": ""
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\layout\\alignSimpleUI.py",
|
||||
"sourcetype": "file",
|
||||
"tags": ["layout", "align", "UI", "Simple"],
|
||||
"title": "# Align Simple UI",
|
||||
"tooltip": ""
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\layout\\center_locator.py",
|
||||
"sourcetype": "file",
|
||||
"tags": ["layout", "center", "locator"],
|
||||
"title": "# Center Locator",
|
||||
"tooltip": ""
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\layout\\average_locator.py",
|
||||
"sourcetype": "file",
|
||||
"tags": ["layout", "average", "locator"],
|
||||
"title": "# Average Locator",
|
||||
"tooltip": ""
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\layout\\selectWithinProximityUI.py",
|
||||
"sourcetype": "file",
|
||||
"tags": ["layout", "select", "proximity", "ui"],
|
||||
"title": "# Select Within Proximity UI",
|
||||
"tooltip": ""
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\layout\\dupCurveUI.py",
|
||||
"sourcetype": "file",
|
||||
"tags": ["layout", "Duplicate", "Curve", "UI"],
|
||||
"title": "# Duplicate Curve UI",
|
||||
"tooltip": ""
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\layout\\randomDeselectUI.py",
|
||||
"sourcetype": "file",
|
||||
"tags": ["layout", "random", "Deselect", "UI"],
|
||||
"title": "# Random Deselect UI",
|
||||
"tooltip": ""
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\layout\\multiReferencerUI.py",
|
||||
"sourcetype": "file",
|
||||
"tags": ["layout", "multi", "reference"],
|
||||
"title": "# Multi Referencer UI",
|
||||
"tooltip": ""
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\layout\\duplicateOffsetUI.py",
|
||||
"sourcetype": "file",
|
||||
"tags": ["layout", "duplicate", "offset", "UI"],
|
||||
"title": "# Duplicate Offset UI",
|
||||
"tooltip": ""
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\layout\\spPaint3d.py",
|
||||
"sourcetype": "file",
|
||||
"tags": ["layout", "spPaint3d", "paint", "tool"],
|
||||
"title": "# SP Paint 3d",
|
||||
"tooltip": ""
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\layout\\randomizeUI.py",
|
||||
"sourcetype": "file",
|
||||
"tags": ["layout", "randomize", "UI"],
|
||||
"title": "# Randomize UI",
|
||||
"tooltip": ""
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\layout\\distributeWithinObjectUI.py",
|
||||
"sourcetype": "file",
|
||||
"tags": ["layout", "distribute", "ObjectUI", "within"],
|
||||
"title": "# Distribute Within Object UI",
|
||||
"tooltip": ""
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"type": "menu",
|
||||
"title": "# Particles",
|
||||
"items": [
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\particles\\instancerToObjects.py",
|
||||
"sourcetype": "file",
|
||||
"tags": ["particles", "instancerToObjects"],
|
||||
"title": "# Instancer To Objects",
|
||||
"tooltip": ""
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\particles\\instancerToObjectsInstances.py",
|
||||
"sourcetype": "file",
|
||||
"tags": ["particles", "instancerToObjectsInstances"],
|
||||
"title": "# Instancer To Objects Instances",
|
||||
"tooltip": ""
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\particles\\instancerToObjectsInstancesWithAnimation.py",
|
||||
"sourcetype": "file",
|
||||
"tags": [
|
||||
"particles",
|
||||
"instancerToObjectsInstancesWithAnimation"
|
||||
],
|
||||
"title": "# Instancer To Objects Instances With Animation",
|
||||
"tooltip": ""
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\particles\\instancerToObjectsWithAnimation.py",
|
||||
"sourcetype": "file",
|
||||
"tags": ["particles", "instancerToObjectsWithAnimation"],
|
||||
"title": "# Instancer To Objects With Animation",
|
||||
"tooltip": ""
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"type": "menu",
|
||||
"title": "Cleanup",
|
||||
"items": [
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\cleanup\\repair_faulty_containers.py",
|
||||
"sourcetype": "file",
|
||||
"tags": ["cleanup", "repair", "containers"],
|
||||
"title": "# Find and Repair Containers",
|
||||
"tooltip": ""
|
||||
},
|
||||
{
|
||||
"type": "separator"
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\cleanup\\removeNamespaces.py",
|
||||
"sourcetype": "file",
|
||||
"tags": ["cleanup", "remove", "namespaces"],
|
||||
"title": "# Remove Namespaces",
|
||||
"tooltip": "Remove all namespaces"
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\cleanup\\remove_user_defined_attributes.py",
|
||||
"sourcetype": "file",
|
||||
"tags": ["cleanup", "remove_user_defined_attributes"],
|
||||
"title": "# Remove User Defined Attributes",
|
||||
"tooltip": "Remove all user-defined attributes from all nodes"
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\cleanup\\removeUnknownNodes.py",
|
||||
"sourcetype": "file",
|
||||
"tags": ["cleanup", "removeUnknownNodes"],
|
||||
"title": "# Remove Unknown Nodes",
|
||||
"tooltip": "Remove all unknown nodes"
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\cleanup\\removeUnloadedReferences.py",
|
||||
"sourcetype": "file",
|
||||
"tags": ["cleanup", "removeUnloadedReferences"],
|
||||
"title": "# Remove Unloaded References",
|
||||
"tooltip": "Remove all unloaded references"
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\cleanup\\removeReferencesFailedEdits.py",
|
||||
"sourcetype": "file",
|
||||
"tags": ["cleanup", "removeReferencesFailedEdits"],
|
||||
"title": "# Remove References Failed Edits",
|
||||
"tooltip": "Remove failed edits for all references"
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\cleanup\\remove_unused_looks.py",
|
||||
"sourcetype": "file",
|
||||
"tags": ["cleanup", "removeUnusedLooks"],
|
||||
"title": "# Remove Unused Looks",
|
||||
"tooltip": "Remove all loaded yet unused Avalon look containers"
|
||||
},
|
||||
{
|
||||
"type": "separator"
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\cleanup\\uniqifyNodeNames.py",
|
||||
"sourcetype": "file",
|
||||
"tags": ["cleanup", "uniqifyNodeNames"],
|
||||
"title": "# Uniqify Node Names",
|
||||
"tooltip": ""
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\cleanup\\autoRenameFileNodes.py",
|
||||
"sourcetype": "file",
|
||||
"tags": ["cleanup", "auto", "rename", "filenodes"],
|
||||
"title": "# Auto Rename File Nodes",
|
||||
"tooltip": ""
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\cleanup\\update_asset_id.py",
|
||||
"sourcetype": "file",
|
||||
"tags": ["cleanup", "update", "database", "asset", "id"],
|
||||
"title": "# Update Asset ID",
|
||||
"tooltip": "Will replace the Colorbleed ID with a new one (asset ID : Unique number)"
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\cleanup\\ccRenameReplace.py",
|
||||
"sourcetype": "file",
|
||||
"tags": ["cleanup", "rename", "ui"],
|
||||
"title": "Renamer",
|
||||
"tooltip": "Rename UI"
|
||||
},
|
||||
{
|
||||
"type": "action",
|
||||
"command": "$OPENPYPE_SCRIPTS\\cleanup\\renameShapesToTransform.py",
|
||||
"sourcetype": "file",
|
||||
"tags": ["cleanup", "renameShapesToTransform"],
|
||||
"title": "# Rename Shapes To Transform",
|
||||
"tooltip": ""
|
||||
}
|
||||
]
|
||||
}
|
||||
]
|
||||
File diff suppressed because it is too large
@ -184,76 +184,6 @@ def uninstall():
menu.uninstall()


def lock():
"""Lock scene

Add an invisible node to your Maya scene with the name of the
current file, indicating that this file is "locked" and cannot
be modified any further.

"""

if not cmds.objExists("lock"):
with lib.maintained_selection():
cmds.createNode("objectSet", name="lock")
cmds.addAttr("lock", ln="basename", dataType="string")

# Permanently hide from outliner
cmds.setAttr("lock.verticesOnlySet", True)

fname = cmds.file(query=True, sceneName=True)
basename = os.path.basename(fname)
cmds.setAttr("lock.basename", basename, type="string")


def unlock():
"""Permanently unlock a locked scene

Doesn't throw an error if scene is already unlocked.

"""

try:
cmds.delete("lock")
except ValueError:
pass


def is_locked():
"""Query whether current scene is locked"""
fname = cmds.file(query=True, sceneName=True)
basename = os.path.basename(fname)

if self._ignore_lock:
return False

try:
return cmds.getAttr("lock.basename") == basename
except ValueError:
return False


@contextlib.contextmanager
def lock_ignored():
"""Context manager for temporarily ignoring the lock of a scene

The purpose of this function is to enable locking a scene and
saving it with the lock still in place.

Example:
>>> with lock_ignored():
... pass # Do things without lock

"""

self._ignore_lock = True

try:
yield
finally:
self._ignore_lock = False


def parse_container(container):
"""Return the container node's full container data.
@ -166,24 +166,15 @@ class ReferenceLoader(Loader):
nodes = self[:]
if not nodes:
return
# FIXME: there is probably better way to do this for looks.
if "look" in self.families:
loaded_containers.append(containerise(
name=name,
namespace=namespace,
nodes=nodes,
context=context,
loader=self.__class__.__name__
))
else:
ref_node = get_reference_node(nodes, self.log)
loaded_containers.append(containerise(
name=name,
namespace=namespace,
nodes=[ref_node],
context=context,
loader=self.__class__.__name__
))

ref_node = get_reference_node(nodes, self.log)
loaded_containers.append(containerise(
name=name,
namespace=namespace,
nodes=[ref_node],
context=context,
loader=self.__class__.__name__
))

c += 1
namespace = None

@ -193,17 +184,18 @@ class ReferenceLoader(Loader):
"""To be implemented by subclass"""
raise NotImplementedError("Must be implemented by subclass")


def update(self, container, representation):
from maya import cmds
from openpype.hosts.maya.api.lib import get_container_members

node = container["objectName"]

path = api.get_representation_path(representation)

# Get reference node from container members
members = cmds.sets(node, query=True, nodesOnly=True)
members = get_container_members(node)
reference_node = get_reference_node(members, self.log)
namespace = cmds.referenceQuery(reference_node, namespace=True)

file_type = {
"ma": "mayaAscii",

@ -221,18 +213,14 @@ class ReferenceLoader(Loader):
alembic_data = {}
if representation["name"] == "abc":
alembic_nodes = cmds.ls(
"{}:*".format(members[0].split(":")[0]), type="AlembicNode"
"{}:*".format(namespace), type="AlembicNode"
)
if alembic_nodes:
for attr in alembic_attrs:
node_attr = "{}.{}".format(alembic_nodes[0], attr)
alembic_data[attr] = cmds.getAttr(node_attr)
else:
cmds.warning(
"No alembic nodes found in {}".format(
cmds.ls("{}:*".format(members[0].split(":")[0]))
)
)
self.log.debug("No alembic nodes found in {}".format(members))

try:
content = cmds.file(path,

@ -256,9 +244,9 @@ class ReferenceLoader(Loader):
self.log.warning("Ignoring file read error:\n%s", exc)

# Reapply alembic settings.
if representation["name"] == "abc":
if representation["name"] == "abc" and alembic_data:
alembic_nodes = cmds.ls(
"{}:*".format(members[0].split(":")[0]), type="AlembicNode"
"{}:*".format(namespace), type="AlembicNode"
)
if alembic_nodes:
for attr, value in alembic_data.items():
@ -25,18 +25,6 @@ class LookLoader(openpype.hosts.maya.api.plugin.ReferenceLoader):
color = "orange"

def process_reference(self, context, name, namespace, options):
"""
Load and try to assign Lookdev to nodes based on relationship data.

Args:
name:
namespace:
context:
options:

Returns:

"""
import maya.cmds as cmds

with lib.maintained_selection():

@ -66,36 +54,17 @@ class LookLoader(openpype.hosts.maya.api.plugin.ReferenceLoader):
Returns:
None
"""
import os
from maya import cmds
node = container["objectName"]
path = api.get_representation_path(representation)

# Get reference node from container members
members = cmds.sets(node, query=True, nodesOnly=True)
members = lib.get_container_members(container)
reference_node = get_reference_node(members, log=self.log)

shader_nodes = cmds.ls(members, type='shadingEngine')
orig_nodes = set(self._get_nodes_with_shader(shader_nodes))

file_type = {
"ma": "mayaAscii",
"mb": "mayaBinary",
"abc": "Alembic"
}.get(representation["name"])

assert file_type, "Unsupported representation: %s" % representation

assert os.path.exists(path), "%s does not exist." % path

self._load_reference(file_type, node, path, reference_node)

# Remove any placeHolderList attribute entries from the set that
# are remaining from nodes being removed from the referenced file.
members = cmds.sets(node, query=True)
invalid = [x for x in members if ".placeHolderList" in x]
if invalid:
cmds.sets(invalid, remove=node)
# Trigger the regular reference update on the ReferenceLoader
super(LookLoader, self).update(container, representation)

# get new applied shaders and nodes from new version
shader_nodes = cmds.ls(members, type='shadingEngine')

@ -112,30 +81,12 @@ class LookLoader(openpype.hosts.maya.api.plugin.ReferenceLoader):
with open(shader_relation, "r") as f:
json_data = json.load(f)

for rel, data in json_data["relationships"].items():
# process only non-shading nodes
current_node = "{}:{}".format(container["namespace"], rel)
if current_node in shader_nodes:
continue
print("processing {}".format(rel))
current_members = set(cmds.ls(
cmds.sets(current_node, query=True) or [], long=True))
new_members = {"{}".format(
m["name"]) for m in data["members"] or []}
dif = new_members.difference(current_members)

# add to set
cmds.sets(
dif, forceElement="{}:{}".format(container["namespace"], rel))

# update of reference could result in failed edits - material is not
# present because of renaming etc.
# present because of renaming etc. If so highlight failed edits to user
failed_edits = cmds.referenceQuery(reference_node,
editStrings=True,
failedEdits=True,
successfulEdits=False)

# highlight failed edits to user
if failed_edits:
# clean references - removes failed reference edits
cmds.file(cr=reference_node) # cleanReference

@ -161,11 +112,6 @@ class LookLoader(openpype.hosts.maya.api.plugin.ReferenceLoader):
nodes_by_id[lib.get_id(n)].append(n)
lib.apply_attributes(attributes, nodes_by_id)

# Update metadata
cmds.setAttr("{}.representation".format(node),
str(representation["_id"]),
type="string")

def _get_nodes_with_shader(self, shader_nodes):
"""
Returns list of nodes belonging to specific shaders

@ -175,7 +121,6 @@ class LookLoader(openpype.hosts.maya.api.plugin.ReferenceLoader):
<list> node names
"""
import maya.cmds as cmds
# Get container members

nodes_list = []
for shader in shader_nodes:

@ -186,45 +131,3 @@ class LookLoader(openpype.hosts.maya.api.plugin.ReferenceLoader):
nodes_list.extend(cmds.listRelatives(connection,
shapes=True))
return nodes_list

def _load_reference(self, file_type, node, path, reference_node):
"""
Load reference from 'path' on 'reference_node'. Used when change
of look (version/update) is triggered.
Args:
file_type: extension of referenced file
node:
path: (string) location of referenced file
reference_node: (string) - name of node that should be applied
on
Returns:
None
"""
import maya.cmds as cmds
try:
content = cmds.file(path,
loadReference=reference_node,
type=file_type,
returnNewNodes=True)
except RuntimeError as exc:
# When changing a reference to a file that has load errors the
# command will raise an error even if the file is still loaded
# correctly (e.g. when raising errors on Arnold attributes)
# When the file is loaded and has content, we consider it's fine.
if not cmds.referenceQuery(reference_node, isLoaded=True):
raise

content = cmds.referenceQuery(reference_node,
nodes=True,
dagPath=True)
if not content:
raise

self.log.warning("Ignoring file read error:\n%s", exc)
# Fix PLN-40 for older containers created with Avalon that had the
# `.verticesOnlySet` set to True.
if cmds.getAttr("{}.verticesOnlySet".format(node)):
self.log.info("Setting %s.verticesOnlySet to False", node)
cmds.setAttr("{}.verticesOnlySet".format(node), False)
# Add new nodes of the reference to the container
cmds.sets(content, forceElement=node)
@ -7,9 +7,9 @@ class SaveCurrentScene(pyblish.api.ContextPlugin):
"""

label = "Save current file"
order = pyblish.api.IntegratorOrder - 0.49
order = pyblish.api.ExtractorOrder - 0.49
hosts = ["maya"]
families = ["renderlayer"]
families = ["renderlayer", "workfile"]

def process(self, context):
import maya.cmds as cmds

@ -17,5 +17,11 @@ class SaveCurrentScene(pyblish.api.ContextPlugin):
current = cmds.file(query=True, sceneName=True)
assert context.data['currentFile'] == current

# If file has no modifications, skip forcing a file save
if not cmds.file(query=True, modified=True):
self.log.debug("Skipping file save as there "
"are no modifications..")
return

self.log.info("Saving current file..")
cmds.file(save=True, force=True)
@ -70,9 +70,9 @@ def get_resolve_module():
sys.exit()
# assign global var and return
bmdvr = bmd.scriptapp("Resolve")
# bmdvf = bmd.scriptapp("Fusion")
bmdvf = bmd.scriptapp("Fusion")
resolve.api.bmdvr = bmdvr
resolve.api.bmdvf = bmdvr.Fusion()
resolve.api.bmdvf = bmdvf
log.info(("Assigning resolve module to "
f"`pype.hosts.resolve.api.bmdvr`: {resolve.api.bmdvr}"))
log.info(("Assigning resolve module to "
@ -51,12 +51,6 @@ def version_up(filepath):
padding=padding)
new_label = label.replace(version, new_version, 1)
new_basename = _rreplace(basename, label, new_label)

if not new_basename.endswith(new_label):
index = (new_basename.find(new_label))
index += len(new_label)
new_basename = new_basename[:index]

new_filename = "{}{}".format(new_basename, ext)
new_filename = os.path.join(dirname, new_filename)
new_filename = os.path.normpath(new_filename)

@ -65,8 +59,19 @@ def version_up(filepath):
raise RuntimeError("Created path is the same as current file,"
"this is a bug")

# We check for version clashes against the current file for any file
# that matches completely in name up to the {version} label found. Thus
# if source file was test_v001_test.txt we want to also check clashes
# against test_v002.txt but do want to preserve the part after the version
# label for our new filename
clash_basename = new_basename
if not clash_basename.endswith(new_label):
index = (clash_basename.find(new_label))
index += len(new_label)
clash_basename = clash_basename[:index]

for file in os.listdir(dirname):
if file.endswith(ext) and file.startswith(new_basename):
if file.endswith(ext) and file.startswith(clash_basename):
log.info("Skipping existing version %s" % new_label)
return version_up(new_filename)
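To make the clash-check comment above concrete, here is a small illustrative sketch (the file name is invented for the example) of how `clash_basename` differs from `new_basename` when the work file carries a suffix after its version label:

```python
# Illustrative only, not part of the diff: clash check for a name with a
# suffix after the version label.
new_label = "v002"
new_basename = "shot010_v002_cleanup"  # suffix after the version is preserved

clash_basename = new_basename
if not clash_basename.endswith(new_label):
    index = clash_basename.find(new_label) + len(new_label)
    clash_basename = clash_basename[:index]

print(new_basename)    # shot010_v002_cleanup -> used for the new file name
print(clash_basename)  # shot010_v002 -> prefix used to detect existing versions
```

The shorter `clash_basename` makes the directory listing check also match existing versions that lack the suffix, while the full `new_basename` keeps the suffix in the file name that is actually written.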
@ -49,11 +49,13 @@ class Terminal:
"""

from openpype.lib import env_value_to_bool
use_colors = env_value_to_bool(
"OPENPYPE_LOG_NO_COLORS", default=Terminal.use_colors
log_no_colors = env_value_to_bool(
"OPENPYPE_LOG_NO_COLORS", default=None
)
if not use_colors:
Terminal.use_colors = use_colors
if log_no_colors is not None:
Terminal.use_colors = not log_no_colors

if not Terminal.use_colors:
Terminal._initialized = True
return
@ -33,8 +33,11 @@ DEFAULT_OPENPYPE_MODULES = (
"avalon_apps",
"clockify",
"log_viewer",
"deadline",
"muster",
"royalrender",
"python_console_interpreter",
"ftrack",
"slack",
"webserver",
"launcher_action",

@ -44,6 +47,7 @@ DEFAULT_OPENPYPE_MODULES = (
"traypublish_action",
"job_queue",
"timers_manager",
"sync_server",
)


@ -219,8 +223,6 @@ def load_interfaces(force=False):

def _load_interfaces():
# Key under which will be modules imported in `sys.modules`
from openpype.lib import import_filepath

modules_key = "openpype_interfaces"

sys.modules[modules_key] = openpype_interfaces = (
@ -15,7 +15,7 @@ import attr
import requests

import pyblish.api
from .abstract_metaplugins import AbstractMetaInstancePlugin
from openpype.lib.abstract_metaplugins import AbstractMetaInstancePlugin


def requests_post(*args, **kwargs):
@ -1,11 +1,14 @@
from openpype.lib import abstract_submit_deadline
from openpype.lib.abstract_submit_deadline import DeadlineJobInfo
import pyblish.api
import os
import attr
import getpass
import pyblish.api

from avalon import api

from openpype.lib import env_value_to_bool
from openpype_modules.deadline import abstract_submit_deadline
from openpype_modules.deadline.abstract_submit_deadline import DeadlineJobInfo


@attr.s
class DeadlinePluginInfo():

@ -21,7 +24,9 @@ class DeadlinePluginInfo():
MultiProcess = attr.ib(default=None)


class AfterEffectsSubmitDeadline(abstract_submit_deadline.AbstractSubmitDeadline):
class AfterEffectsSubmitDeadline(
abstract_submit_deadline.AbstractSubmitDeadline
):

label = "Submit AE to Deadline"
order = pyblish.api.IntegratorOrder + 0.1

@ -29,7 +34,13 @@ class AfterEffectsSubmitDeadline(abstract_submit_deadline.AbstractSubmitDeadline
families = ["render.farm"] # cannot be "render' as that is integrated
use_published = True

priority = 50
chunk_size = 1000000
primary_pool = None
secondary_pool = None
group = None
department = None
multiprocess = True

def get_job_info(self):
dln_job_info = DeadlineJobInfo(Plugin="AfterEffects")

@ -49,6 +60,11 @@ class AfterEffectsSubmitDeadline(abstract_submit_deadline.AbstractSubmitDeadline
int(round(self._instance.data["frameEnd"])))
dln_job_info.Frames = frame_range

dln_job_info.Priority = self.priority
dln_job_info.Pool = self.primary_pool
dln_job_info.SecondaryPool = self.secondary_pool
dln_job_info.Group = self.group
dln_job_info.Department = self.department
dln_job_info.ChunkSize = self.chunk_size
dln_job_info.OutputFilename = \
os.path.basename(self._instance.data["expectedFiles"][0])

@ -105,9 +121,13 @@ class AfterEffectsSubmitDeadline(abstract_submit_deadline.AbstractSubmitDeadline
'{}.{}.{}'.format(arr[0], hashed,
arr[2]))

deadline_plugin_info.MultiProcess = True
deadline_plugin_info.Comp = self._instance.data["comp_name"]
deadline_plugin_info.Version = "17.5"
deadline_plugin_info.Version = self._instance.data["app_version"]
# must be here because of DL AE plugin
# added override of multiprocess by env var, if shouldn't be used for
# some app variant use MULTIPROCESS:false in Settings, default is True
env_multi = env_value_to_bool("MULTIPROCESS", default=True)
deadline_plugin_info.MultiProcess = env_multi and self.multiprocess
deadline_plugin_info.SceneFile = self.scene_path
deadline_plugin_info.Output = render_path.replace("\\", "/")
@ -8,11 +8,11 @@ import re

import attr
import pyblish.api

import openpype.lib.abstract_submit_deadline
from openpype.lib.abstract_submit_deadline import DeadlineJobInfo
from avalon import api

from openpype_modules.deadline import abstract_submit_deadline
from openpype_modules.deadline.abstract_submit_deadline import DeadlineJobInfo


class _ZipFile(ZipFile):
"""Extended check for windows invalid characters."""

@ -217,7 +217,8 @@ class PluginInfo(object):


class HarmonySubmitDeadline(
openpype.lib.abstract_submit_deadline.AbstractSubmitDeadline):
abstract_submit_deadline.AbstractSubmitDeadline
):
"""Submit render write of Harmony scene to Deadline.

Renders are submitted to a Deadline Web Service as

@ -320,7 +321,6 @@ class HarmonySubmitDeadline(
/ published_scene.stem
/ f"{published_scene.stem}.xstage"
)

unzip_dir = (published_scene.parent / published_scene.stem)
with _ZipFile(published_scene, "r") as zip_ref:
zip_ref.extractall(unzip_dir.as_posix())

@ -351,12 +351,9 @@ class HarmonySubmitDeadline(
# use that one.
if not ideal_scene:
xstage_path = xstage_files[0]

return xstage_path

def get_plugin_info(self):
work_scene = Path(self._instance.data["source"])

# this is path to published scene workfile _ZIP_. Before
# rendering, we need to unzip it.
published_scene = Path(

@ -368,14 +365,13 @@ class HarmonySubmitDeadline(
# for submit_publish job to create .json file in
self._instance.data["outputDir"] = render_path
new_expected_files = []
work_path_str = str(work_scene.parent.as_posix())
render_path_str = str(render_path.as_posix())
for file in self._instance.data["expectedFiles"]:
_file = str(Path(file).as_posix())
expected_dir_str = os.path.dirname(_file)
new_expected_files.append(
_file.replace(work_path_str, render_path_str)
_file.replace(expected_dir_str, render_path_str)
)

audio_file = self._instance.data.get("audioFile")
if audio_file:
abs_path = xstage_path.parent / audio_file
@ -404,7 +404,13 @@ class MayaSubmitDeadline(pyblish.api.InstancePlugin):
dirname = os.path.join(workspace, default_render_file)
renderlayer = instance.data['setMembers'] # rs_beauty
deadline_user = context.data.get("user", getpass.getuser())
jobname = "%s - %s" % (filename, instance.name)

# Always use the original work file name for the Job name even when
# rendering is done from the published Work File. The original work
# file name is clearer because it can also have subversion strings,
# etc. which are stripped for the published file.
src_filename = os.path.basename(context.data["currentFile"])
jobname = "%s - %s" % (src_filename, instance.name)

# Get the variables depending on the renderer
render_variables = get_renderer_variables(renderlayer, dirname)

@ -452,7 +458,7 @@ class MayaSubmitDeadline(pyblish.api.InstancePlugin):
self.payload_skeleton["JobInfo"]["Plugin"] = self._instance.data.get(
"mayaRenderPlugin", "MayaBatch")

self.payload_skeleton["JobInfo"]["BatchName"] = filename
self.payload_skeleton["JobInfo"]["BatchName"] = src_filename
# Job name, as seen in Monitor
self.payload_skeleton["JobInfo"]["Name"] = jobname
# Arbitrary username, for visualisation in Monitor
@ -4,8 +4,8 @@ import requests

import pyblish.api

from openpype.lib.abstract_submit_deadline import requests_get
from openpype.lib.delivery import collect_frames
from openpype_modules.deadline.abstract_submit_deadline import requests_get


class ValidateExpectedFiles(pyblish.api.InstancePlugin):
@ -8,18 +8,23 @@ import platform
from Deadline.Scripting import RepositoryUtils, FileUtils


def get_openpype_executable():
"""Return OpenPype Executable from Event Plug-in Settings"""
config = RepositoryUtils.GetPluginConfig("OpenPype")
return config.GetConfigEntryWithDefault("OpenPypeExecutable", "")


def inject_openpype_environment(deadlinePlugin):
""" Pull env vars from OpenPype and push them to rendering process.

Used for correct paths, configuration from OpenPype etc.
"""
job = deadlinePlugin.GetJob()
job = RepositoryUtils.GetJob(job.JobId, True) # invalidates cache

print(">>> Injecting OpenPype environments ...")
try:
print(">>> Getting OpenPype executable ...")
exe_list = job.GetJobExtraInfoKeyValue("openpype_executables")
exe_list = get_openpype_executable()
openpype_app = FileUtils.SearchFileList(exe_list)
if openpype_app == "":
raise RuntimeError(

@ -96,7 +101,6 @@ def inject_render_job_id(deadlinePlugin):
"""Inject dependency ids to publish process as env var for validation."""
print(">>> Injecting render job id ...")
job = deadlinePlugin.GetJob()
job = RepositoryUtils.GetJob(job.JobId, True) # invalidates cache

dependency_ids = job.JobDependencyIDs
print(">>> Dependency IDs: {}".format(dependency_ids))

@ -183,7 +187,6 @@ def __main__(deadlinePlugin):
print("*** GlobalJobPreload start ...")
print(">>> Getting job ...")
job = deadlinePlugin.GetJob()
job = RepositoryUtils.GetJob(job.JobId, True) # invalidates cache

openpype_render_job = \
job.GetJobEnvironmentKeyValue('OPENPYPE_RENDER_JOB') or '0'
(Binary image changes: three images, dimensions unchanged, sizes 1.1 KiB, 124 KiB and 124 KiB.)
29
openpype/modules/deadline/repository/readme.md
Normal file

@ -0,0 +1,29 @@
## OpenPype Deadline repository overlay

This directory is an overlay for Deadline repository.
It means that you can copy the whole hierarchy to Deadline repository and it
should work.

Logic:
-----
GlobalJobPreLoad
-----

The `GlobalJobPreLoad` will retrieve the OpenPype executable path from the
`OpenPype` Deadline Plug-in's settings. Then it will call the executable to
retrieve the environment variables needed for the Deadline Job.
These environment variables are injected into rendering process.

Deadline triggers the `GlobalJobPreLoad.py` for each Worker as it starts the
Job.

*Note*: It also contains backward compatible logic to preserve functionality
for old Pype2 and non-OpenPype triggered jobs.

Plugin
------
For each render and publishing job the `OpenPype` Deadline Plug-in is checked
for the configured location of the OpenPype executable (needs to be configured
in `Deadline's Configure Plugins > OpenPype`) through `GlobalJobPreLoad`.
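As a rough illustration of the flow described in the readme above, the sketch below shows how a pre-load hook could call a configured executable, read back a JSON dictionary of environment variables, and push each one onto the rendering process. It is only a sketch under assumptions, not the shipped `GlobalJobPreLoad.py`: the `extractenvironments` argument and the `set_env` callback are placeholders for whatever the real plug-in uses.

```python
import json
import subprocess
import tempfile


def inject_environment(openpype_executable, set_env):
    """Illustrative sketch: ask the configured executable to dump environment
    variables to a JSON file, then hand them to the render process via
    ``set_env`` (for example Deadline's SetProcessEnvironmentVariable)."""
    with tempfile.NamedTemporaryFile(suffix=".json", delete=False) as tmp:
        output_path = tmp.name

    # Placeholder CLI arguments; the real plug-in builds these from the job.
    subprocess.check_call([openpype_executable, "extractenvironments", output_path])

    with open(output_path) as stream:
        environment = json.load(stream)

    # Inject every variable into the rendering process.
    for key, value in environment.items():
        set_env(key, str(value))
```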
@ -1 +0,0 @@
Subproject commit b746fedf7286c3755a46f07ab72f4c414cd41fc0

@ -1 +0,0 @@
Subproject commit d277f474ab016e7b53479c36af87cb861d0cc53e
Some files were not shown because too many files have changed in this diff.