Mirror of https://github.com/ynput/ayon-core.git, synced 2025-12-25 05:14:40 +01:00
Merge branch 'develop' into feature/OP-2590_Applications-without-integration
This commit is contained in:
commit d282e4d1d5
178 changed files with 8028 additions and 5391 deletions
48
CHANGELOG.md
@@ -1,8 +1,34 @@
# Changelog

## [3.8.2-nightly.2](https://github.com/pypeclub/OpenPype/tree/HEAD)
## [3.8.3-nightly.1](https://github.com/pypeclub/OpenPype/tree/HEAD)

[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.8.1...HEAD)
[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.8.2...HEAD)

### 📖 Documentation

- documentation: add example to `repack-version` command [\#2669](https://github.com/pypeclub/OpenPype/pull/2669)
- Update docusaurus [\#2639](https://github.com/pypeclub/OpenPype/pull/2639)
- Documentation: Fixed relative links [\#2621](https://github.com/pypeclub/OpenPype/pull/2621)

**🚀 Enhancements**

- Ftrack: Sync description to assets [\#2670](https://github.com/pypeclub/OpenPype/pull/2670)
- Houdini: Moved to OpenPype [\#2658](https://github.com/pypeclub/OpenPype/pull/2658)
- Maya: Move implementation to OpenPype [\#2649](https://github.com/pypeclub/OpenPype/pull/2649)

**🐛 Bug fixes**

- Maya: Fix menu callbacks [\#2671](https://github.com/pypeclub/OpenPype/pull/2671)
- hiero: removing obsolete unsupported plugin [\#2667](https://github.com/pypeclub/OpenPype/pull/2667)

**Merged pull requests:**

- Fix python install in docker for centos7 [\#2664](https://github.com/pypeclub/OpenPype/pull/2664)
- Deadline: Be able to pass Mongo url to job [\#2616](https://github.com/pypeclub/OpenPype/pull/2616)

## [3.8.2](https://github.com/pypeclub/OpenPype/tree/3.8.2) (2022-02-07)

[Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.8.2-nightly.3...3.8.2)

### 📖 Documentation
@@ -10,8 +36,12 @@

**🚀 Enhancements**

- TVPaint: Image loaders also work on review family [\#2638](https://github.com/pypeclub/OpenPype/pull/2638)
- General: Project backup tools [\#2629](https://github.com/pypeclub/OpenPype/pull/2629)
- nuke: adding clear button to write nodes [\#2627](https://github.com/pypeclub/OpenPype/pull/2627)
- Ftrack: Family to Asset type mapping is in settings [\#2602](https://github.com/pypeclub/OpenPype/pull/2602)
- Nuke: load color space from representation data [\#2576](https://github.com/pypeclub/OpenPype/pull/2576)
- New Publisher: New features and preparations for new standalone publisher [\#2556](https://github.com/pypeclub/OpenPype/pull/2556)

**🐛 Bug fixes**
@@ -20,6 +50,8 @@

**Merged pull requests:**

- Docker: enhance dockerfiles with metadata, fix pyenv initialization [\#2647](https://github.com/pypeclub/OpenPype/pull/2647)
- WebPublisher: fix instance duplicates [\#2641](https://github.com/pypeclub/OpenPype/pull/2641)
- Fix - safer pulling of task name for webpublishing from PS [\#2613](https://github.com/pypeclub/OpenPype/pull/2613)

## [3.8.1](https://github.com/pypeclub/OpenPype/tree/3.8.1) (2022-02-01)
@@ -30,6 +62,7 @@

- Webpublisher: Thumbnail extractor [\#2600](https://github.com/pypeclub/OpenPype/pull/2600)
- Loader: Allow to toggle default family filters between "include" or "exclude" filtering [\#2541](https://github.com/pypeclub/OpenPype/pull/2541)
- Launcher: Added context menu to skip opening last workfile [\#2536](https://github.com/pypeclub/OpenPype/pull/2536)

**🐛 Bug fixes**
@@ -41,7 +74,6 @@

- Webpublisher: Fix - subset names from processed .psd used wrong value for task [\#2586](https://github.com/pypeclub/OpenPype/pull/2586)
- `vrscene` creator Deadline webservice URL handling [\#2580](https://github.com/pypeclub/OpenPype/pull/2580)
- global: track name was failing if duplicated root word in name [\#2568](https://github.com/pypeclub/OpenPype/pull/2568)
- Validate Maya Rig produces no cycle errors [\#2484](https://github.com/pypeclub/OpenPype/pull/2484)

**Merged pull requests:**
@@ -63,7 +95,6 @@

- Maya : V-Ray Proxy - load all ABC files via proxy [\#2544](https://github.com/pypeclub/OpenPype/pull/2544)
- Maya to Unreal: Extended static mesh workflow [\#2537](https://github.com/pypeclub/OpenPype/pull/2537)
- Flame: collecting publishable instances [\#2519](https://github.com/pypeclub/OpenPype/pull/2519)
- Flame: create publishable clips [\#2495](https://github.com/pypeclub/OpenPype/pull/2495)

**🚀 Enhancements**
@@ -73,15 +104,10 @@

- Settings: PathInput strip passed string [\#2550](https://github.com/pypeclub/OpenPype/pull/2550)
- Global: Extract Review anatomy fill data with output name [\#2548](https://github.com/pypeclub/OpenPype/pull/2548)
- Cosmetics: Clean up some cosmetics / typos [\#2542](https://github.com/pypeclub/OpenPype/pull/2542)
- Launcher: Added context menu to skip opening last workfile [\#2536](https://github.com/pypeclub/OpenPype/pull/2536)
- General: Validate if current process OpenPype version is requested version [\#2529](https://github.com/pypeclub/OpenPype/pull/2529)
- General: Be able to use anatomy data in ffmpeg output arguments [\#2525](https://github.com/pypeclub/OpenPype/pull/2525)
- Expose toggle publish plug-in settings for Maya Look Shading Engine Naming [\#2521](https://github.com/pypeclub/OpenPype/pull/2521)
- Photoshop: Move implementation to OpenPype [\#2510](https://github.com/pypeclub/OpenPype/pull/2510)
- TimersManager: Move module one hierarchy higher [\#2501](https://github.com/pypeclub/OpenPype/pull/2501)
- Slack: notifications are sent with Openpype logo and bot name [\#2499](https://github.com/pypeclub/OpenPype/pull/2499)
- Slack: Add review to notification message [\#2498](https://github.com/pypeclub/OpenPype/pull/2498)
- Maya: Collect 'fps' animation data only for "review" instances [\#2486](https://github.com/pypeclub/OpenPype/pull/2486)

**🐛 Bug fixes**
@@ -115,10 +141,6 @@

[Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.7.0-nightly.14...3.7.0)

**🐛 Bug fixes**

- TVPaint: Create render layer dialog is in front [\#2471](https://github.com/pypeclub/OpenPype/pull/2471)

## [3.6.4](https://github.com/pypeclub/OpenPype/tree/3.6.4) (2021-11-23)

[Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.7.0-nightly.1...3.6.4)
@@ -42,7 +42,8 @@ RUN yum -y install https://dl.fedoraproject.org/pub/epel/epel-release-latest-7.n
patchelf \
automake \
autoconf \
ncurses \
patch \
ncurses \
ncurses-devel \
qt5-qtbase-devel \
xcb-util-wm \
@@ -118,6 +118,7 @@ class CollectAERender(abstract_collect_render.AbstractCollectRender):
instance.anatomyData = context.data["anatomyData"]

instance.outputDir = self._get_output_dir(instance)
instance.context = context

settings = get_project_settings(os.getenv("AVALON_PROJECT"))
reviewable_subset_filter = \
@@ -142,7 +143,6 @@ class CollectAERender(abstract_collect_render.AbstractCollectRender):
break

self.log.info("New instance:: {}".format(instance))

instances.append(instance)

return instances
@@ -176,6 +176,7 @@ class CollectFarmRender(openpype.lib.abstract_collect_render.
ignoreFrameHandleCheck=True

)
render_instance.context = context
self.log.debug(render_instance)
instances.append(render_instance)
@@ -18,6 +18,7 @@ def add_implementation_envs(env, _app):
new_hiero_paths.append(norm_path)

env["HIERO_PLUGIN_PATH"] = os.pathsep.join(new_hiero_paths)
env.pop("QT_AUTO_SCREEN_SCALE_FACTOR", None)

# Try to add QuickTime to PATH
quick_time_path = "C:/Program Files (x86)/QuickTime/QTSystem"
@@ -1,352 +0,0 @@
# version_up_everywhere.py
# Adds action to enable a Clip/Shot to be Min/Max/Next/Prev versioned in all shots used in a Project.
#
# Usage:
# 1) Copy file to <HIERO_PLUGIN_PATH>/Python/Startup
# 2) Right-click on Clip(s) or Bins containing Clips in the Bin View, or on Shots in the Timeline/Spreadsheet
# 3) Set Version for all Shots > OPTION to update the version in all shots where the Clip is used in the Project.

import hiero.core

try:
    from PySide.QtGui import *
    from PySide.QtCore import *
except ImportError:
    from PySide2.QtGui import *
    from PySide2.QtWidgets import *
    from PySide2.QtCore import *


def whereAmI(self, searchType="TrackItem"):
    """Returns a list of TrackItem or Sequence objects in the Project which contain this Clip.

    By default this returns a list of TrackItems where the Clip is used in its project.
    You can also return a list of Sequences by specifying the searchType to be "Sequence".
    Should consider putting this into hiero.core.Clip by default?

    Example usage:

    shotsForClip = clip.whereAmI("TrackItem")
    sequencesForClip = clip.whereAmI("Sequence")
    """
    proj = self.project()

    if ("TrackItem" not in searchType) and ("Sequence" not in searchType):
        print("searchType argument must be \"TrackItem\" or \"Sequence\"")
        return None

    # If user specifies a TrackItem, then it will return
    searches = hiero.core.findItemsInProject(proj, searchType)

    if len(searches) == 0:
        print("Unable to find {} in any items of type: {}".format(
            str(self), searchType))
        return None

    # Case 1: Looking for Shots (trackItems)
    clipUsedIn = []
    if isinstance(searches[0], hiero.core.TrackItem):
        for shot in searches:
            # We have to wrap this in a try/except because it's possible through the Python API for a Shot to exist without a Clip in the Bin
            try:
                # For versioning to work, we must look to the BinItem that a Clip is wrapped in.
                if shot.source().binItem() == self.binItem():
                    clipUsedIn.append(shot)
            # If we throw an exception here it's because the Shot did not have a Source Clip in the Bin.
            except RuntimeError:
                hiero.core.log.info(
                    'Unable to find Parent Clip BinItem for Shot: %s, Source:%s'
                    % (shot, shot.source()))

    # Case 2: Looking for Sequences
    elif isinstance(searches[0], hiero.core.Sequence):
        for seq in searches:
            # Iterate tracks > shots...
            tracks = seq.items()
            for track in tracks:
                shots = track.items()
                for shot in shots:
                    if shot.source().binItem() == self.binItem():
                        clipUsedIn.append(seq)

    return clipUsedIn


# Add whereAmI method to Clip object
hiero.core.Clip.whereAmI = whereAmI


#### MAIN VERSION EVERYWHERE GUBBINS #####
class VersionAllMenu(object):

    # These are a set of action names we can use for operating on multiple Clip/TrackItems
    eMaxVersion = "Max Version"
    eMinVersion = "Min Version"
    eNextVersion = "Next Version"
    ePreviousVersion = "Previous Version"

    # This is the title used for the Version Menu title. It's long isn't it?
    actionTitle = "Set Version for all Shots"

    def __init__(self):
        self._versionEverywhereMenu = None
        self._versionActions = []

        hiero.core.events.registerInterest("kShowContextMenu/kBin",
                                           self.binViewEventHandler)
        hiero.core.events.registerInterest("kShowContextMenu/kTimeline",
                                           self.binViewEventHandler)
        hiero.core.events.registerInterest("kShowContextMenu/kSpreadsheet",
                                           self.binViewEventHandler)

    def showVersionUpdateReportFromShotManifest(self, sequenceShotManifest):
        """This just displays an info Message box, based on a Sequence[Shot] manifest dictionary"""

        # Now present an info dialog, explaining where shots were updated
        updateReportString = "The following Versions were updated:\n"
        for seq in sequenceShotManifest.keys():
            updateReportString += "%s:\n Shots:\n" % (seq.name())
            for shot in sequenceShotManifest[seq]:
                updateReportString += ' %s\n (New Version: %s)\n' % (
                    shot.name(), shot.currentVersion().name())
            updateReportString += "\n"

        infoBox = QMessageBox(hiero.ui.mainWindow())
        infoBox.setIcon(QMessageBox.Information)

        if len(sequenceShotManifest) <= 0:
            infoBox.setText("No Shot Versions were updated")
            infoBox.setInformativeText(
                "Clip could not be found in any Shots in this Project")
        else:
            infoBox.setText(
                "Versions were updated in %i Sequences of this Project." %
                (len(sequenceShotManifest)))
            infoBox.setInformativeText("Show Details for more info.")
            infoBox.setDetailedText(updateReportString)

        infoBox.exec_()

    def makeVersionActionForSingleClip(self, version):
        """This is used to populate the QAction list of Versions when a single Clip is selected in the BinView.
        It also triggers the Version Update action based on the version passed to it.
        (Not sure if this is good design practice, but it's compact!)"""
        action = QAction(version.name(), None)
        action.setData(lambda: version)

        def updateAllTrackItems():
            currentClip = version.item()
            trackItems = currentClip.whereAmI()
            if not trackItems:
                return

            proj = currentClip.project()

            # A Sequence-Shot manifest dictionary
            sequenceShotManifest = {}

            # Make this all undo-able in a single Group undo
            with proj.beginUndo(
                    "Update All Versions for %s" % currentClip.name()):
                for shot in trackItems:
                    seq = shot.parentSequence()
                    if seq not in sequenceShotManifest.keys():
                        sequenceShotManifest[seq] = [shot]
                    else:
                        sequenceShotManifest[seq] += [shot]
                    shot.setCurrentVersion(version)

                # We also should update the current Version of the selected Clip for completeness...
                currentClip.binItem().setActiveVersion(version)

            # Now display a Dialog which informs the user of where and what was changed
            self.showVersionUpdateReportFromShotManifest(sequenceShotManifest)

        action.triggered.connect(updateAllTrackItems)
        return action
    # This is just a convenience method for returning QActions with a title, triggered method and icon.
    def makeAction(self, title, method, icon=None):
        action = QAction(title, None)
        action.setIcon(QIcon(icon))

        # We do this magic, so that the title string from the action is used to trigger the version change
        def methodWrapper():
            method(title)

        action.triggered.connect(methodWrapper)
        return action

    def clipSelectionFromView(self, view):
        """Helper method to return a list of Clips in the Active View"""
        selection = hiero.ui.activeView().selection()

        if len(selection) == 0:
            return None

        if isinstance(view, hiero.ui.BinView):
            # We could have a mixture of Bins and Clips selected, so sort out the Clips and Clips inside Bins
            clipItems = [
                item.activeItem() for item in selection
                if hasattr(item, "activeItem")
                and isinstance(item.activeItem(), hiero.core.Clip)
            ]

            # We'll also append Bins here, and see if we can find Clips inside
            bins = [
                item for item in selection if isinstance(item, hiero.core.Bin)
            ]

            # We search inside of a Bin for a Clip which is not already in clipItems
            if len(bins) > 0:
                # Grab the Clips inside of a Bin and append them to a list
                for bin in bins:
                    clips = hiero.core.findItemsInBin(bin, "Clip")
                    for clip in clips:
                        if clip not in clipItems:
                            clipItems.append(clip)

        elif isinstance(view,
                        (hiero.ui.TimelineEditor, hiero.ui.SpreadsheetView)):
            # Here, we have shots. To get to the Clip from a TrackItem, just call source()
            clipItems = [
                item.source() for item in selection if hasattr(item, "source")
                and isinstance(item, hiero.core.TrackItem)
            ]

        return clipItems

    # This generates the Version Up Everywhere menu
    def createVersionEveryWhereMenuForView(self, view):

        versionEverywhereMenu = QMenu(self.actionTitle)
        self._versionActions = []
        # We look to the activeView for a selection of Clips
        clips = self.clipSelectionFromView(view)

        # And bail if nothing is found (clipSelectionFromView may return None)
        if not clips:
            return versionEverywhereMenu

        # Now, if we have just one Clip selected, we'll form a special menu, which lists all versions
        if len(clips) == 1:

            # Get a reversed list of Versions, so that bigger ones appear at top
            versions = list(reversed(clips[0].binItem().items()))
            for version in versions:
                self._versionActions += [
                    self.makeVersionActionForSingleClip(version)
                ]

        elif len(clips) > 1:
            # We will add Max/Min/Prev/Next options, which can be called on a TrackItem, without the need for a Version object
            for option in (self.eMaxVersion, self.eMinVersion,
                           self.eNextVersion, self.ePreviousVersion):
                self._versionActions += [
                    self.makeAction(
                        option,
                        self.setTrackItemVersionForClipSelection,
                        icon=None)
                ]

        for act in self._versionActions:
            versionEverywhereMenu.addAction(act)

        return versionEverywhereMenu
    def setTrackItemVersionForClipSelection(self, versionOption):

        view = hiero.ui.activeView()
        if not view:
            return

        clipSelection = self.clipSelectionFromView(view)

        if not clipSelection:
            return

        proj = clipSelection[0].project()

        # Create a Sequence-Shot Manifest, to report to users where a Shot was updated
        sequenceShotManifest = {}

        with proj.beginUndo("Update multiple Versions"):
            for clip in clipSelection:

                # Look to see if it exists in a TrackItem somewhere...
                shotUsage = clip.whereAmI("TrackItem")
                if not shotUsage:
                    continue

                # Next, depending on the versionOption, make the appropriate update
                # There's probably a more neat/compact way of doing this...
                for shot in shotUsage:

                    # This step is done for reporting reasons
                    seq = shot.parentSequence()
                    if seq not in sequenceShotManifest.keys():
                        sequenceShotManifest[seq] = [shot]
                    else:
                        sequenceShotManifest[seq] += [shot]

                    if versionOption == self.eMaxVersion:
                        shot.maxVersion()
                    elif versionOption == self.eMinVersion:
                        shot.minVersion()
                    elif versionOption == self.eNextVersion:
                        shot.nextVersion()
                    elif versionOption == self.ePreviousVersion:
                        shot.prevVersion()

                # Finally, for completeness, set the Max/Min version of the Clip too (if chosen)
                # Note: It doesn't make sense to do Next/Prev on a Clip here because next/prev means different things for different Shots
                if versionOption == self.eMaxVersion:
                    clip.binItem().maxVersion()
                elif versionOption == self.eMinVersion:
                    clip.binItem().minVersion()

        # Now display a Dialog which informs the user of where and what was changed
        self.showVersionUpdateReportFromShotManifest(sequenceShotManifest)

    # This handles events from the Project Bin View
    def binViewEventHandler(self, event):

        if not hasattr(event.sender, "selection"):
            # Something has gone wrong, we should only be here if raised
            # by the Bin view which gives a selection.
            return
        selection = event.sender.selection()

        # Return if there's no Selection. We won't add the Localise Menu.
        if selection is None:
            return

        view = hiero.ui.activeView()
        # Only add the Menu if Bins or Sequences are selected (this ensures menu isn't added in the Tags Pane)
        if len(selection) > 0:
            self._versionEverywhereMenu = self.createVersionEveryWhereMenuForView(
                view)
            hiero.ui.insertMenuAction(
                self._versionEverywhereMenu.menuAction(),
                event.menu,
                after="foundry.menu.version")
        return


# Instantiate the Menu to get it to register itself.
versionAllMenu = VersionAllMenu()
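The deleted script above attaches `whereAmI` to `hiero.core.Clip` at import time, so every existing Clip instance gains the method. That monkey-patching pattern, sketched standalone in plain Python (hypothetical `Clip` class, no Hiero required):

```python
# Monkey-patching: assign a plain function onto an existing class so all
# instances gain it as a bound method, as done with hiero.core.Clip above.
class Clip:
    def __init__(self, name):
        self.name = name

def where_am_i(self, search_type="TrackItem"):
    # Hypothetical stand-in for the real project-wide search.
    return "searching %ss for %s" % (search_type, self.name)

Clip.where_am_i = where_am_i  # every Clip instance now has the method

result = Clip("shot010").where_am_i()
# result == "searching TrackItems for shot010"
```

Because Python looks methods up on the class at call time, the assignment retroactively affects instances created before the patch as well.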
@@ -2,11 +2,11 @@ import re
import pyblish.api


class PreCollectClipEffects(pyblish.api.InstancePlugin):
class CollectClipEffects(pyblish.api.InstancePlugin):
    """Collect soft effects instances."""

    order = pyblish.api.CollectorOrder - 0.479
    label = "Precollect Clip Effects Instances"
    order = pyblish.api.CollectorOrder - 0.078
    label = "Collect Clip Effects Instances"
    families = ["clip"]

    def process(self, instance):
@@ -1,174 +1,60 @@
import os
import sys
import logging
import contextlib
from .pipeline import (
install,
uninstall,

import hou

from pyblish import api as pyblish
from avalon import api as avalon

import openpype.hosts.houdini
from openpype.hosts.houdini.api import lib

from openpype.lib import (
any_outdated
ls,
containerise,
)

from .lib import get_asset_fps
from .plugin import (
Creator,
)

log = logging.getLogger("openpype.hosts.houdini")
from .workio import (
open_file,
save_file,
current_file,
has_unsaved_changes,
file_extensions,
work_root
)

HOST_DIR = os.path.dirname(os.path.abspath(openpype.hosts.houdini.__file__))
PLUGINS_DIR = os.path.join(HOST_DIR, "plugins")
PUBLISH_PATH = os.path.join(PLUGINS_DIR, "publish")
LOAD_PATH = os.path.join(PLUGINS_DIR, "load")
CREATE_PATH = os.path.join(PLUGINS_DIR, "create")
INVENTORY_PATH = os.path.join(PLUGINS_DIR, "inventory")
from .lib import (
lsattr,
lsattrs,
read,

maintained_selection,
unique_name
)


def install():
__all__ = [
"install",
"uninstall",

pyblish.register_plugin_path(PUBLISH_PATH)
avalon.register_plugin_path(avalon.Loader, LOAD_PATH)
avalon.register_plugin_path(avalon.Creator, CREATE_PATH)
"ls",
"containerise",

log.info("Installing callbacks ... ")
# avalon.on("init", on_init)
avalon.before("save", before_save)
avalon.on("save", on_save)
avalon.on("open", on_open)
avalon.on("new", on_new)
"Creator",

pyblish.register_callback("instanceToggled", on_pyblish_instance_toggled)
# Workfiles API
"open_file",
"save_file",
"current_file",
"has_unsaved_changes",
"file_extensions",
"work_root",

log.info("Setting default family states for loader..")
avalon.data["familiesStateToggled"] = [
"imagesequence",
"review"
]
# Utility functions
"lsattr",
"lsattrs",
"read",

# add houdini vendor packages
hou_pythonpath = os.path.join(os.path.dirname(HOST_DIR), "vendor")
"maintained_selection",
"unique_name"
]

sys.path.append(hou_pythonpath)

# Set asset FPS for the empty scene directly after launch of Houdini
# so it initializes into the correct scene FPS
_set_asset_fps()


def before_save(*args):
return lib.validate_fps()


def on_save(*args):

avalon.logger.info("Running callback on save..")

nodes = lib.get_id_required_nodes()
for node, new_id in lib.generate_ids(nodes):
lib.set_id(node, new_id, overwrite=False)


def on_open(*args):

if not hou.isUIAvailable():
log.debug("Batch mode detected, ignoring `on_open` callbacks..")
return

avalon.logger.info("Running callback on open..")

# Validate FPS after update_task_from_path to
# ensure it is using correct FPS for the asset
lib.validate_fps()

if any_outdated():
from openpype.widgets import popup

log.warning("Scene has outdated content.")

# Get main window
parent = hou.ui.mainQtWindow()
if parent is None:
log.info("Skipping outdated content pop-up "
"because Houdini window can't be found.")
else:

# Show outdated pop-up
def _on_show_inventory():
import avalon.tools.sceneinventory as tool
tool.show(parent=parent)

dialog = popup.Popup(parent=parent)
dialog.setWindowTitle("Houdini scene has outdated content")
dialog.setMessage("There are outdated containers in "
"your Houdini scene.")
dialog.on_clicked.connect(_on_show_inventory)
dialog.show()


def on_new(_):
"""Set project resolution and fps when create a new file"""
avalon.logger.info("Running callback on new..")
_set_asset_fps()


def _set_asset_fps():
"""Set Houdini scene FPS to the default required for current asset"""

# Set new scene fps
fps = get_asset_fps()
print("Setting scene FPS to %i" % fps)
lib.set_scene_fps(fps)


def on_pyblish_instance_toggled(instance, new_value, old_value):
"""Toggle saver tool passthrough states on instance toggles."""
@contextlib.contextmanager
def main_take(no_update=True):
"""Enter root take during context"""
original_take = hou.takes.currentTake()
original_update_mode = hou.updateModeSetting()
root = hou.takes.rootTake()
has_changed = False
try:
if original_take != root:
has_changed = True
if no_update:
hou.setUpdateMode(hou.updateMode.Manual)
hou.takes.setCurrentTake(root)
yield
finally:
if has_changed:
if no_update:
hou.setUpdateMode(original_update_mode)
hou.takes.setCurrentTake(original_take)

if not instance.data.get("_allowToggleBypass", True):
return

nodes = instance[:]
if not nodes:
return

# Assume instance node is first node
instance_node = nodes[0]

if not hasattr(instance_node, "isBypassed"):
# Likely not a node that can actually be bypassed
log.debug("Can't bypass node: %s", instance_node.path())
return

if instance_node.isBypassed() != (not old_value):
print("%s old bypass state didn't match old instance state, "
"updating anyway.." % instance_node.path())

try:
# Go into the main take, because when in another take changing
# the bypass state of a node cannot be done due to it being locked
# by default.
with main_take(no_update=True):
instance_node.bypass(not new_value)
except hou.PermissionError as exc:
log.warning("%s - %s", instance_node.path(), exc)
# Backwards API compatibility
open = open_file
save = save_file
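The `main_take` context manager in the diff above follows the classic save/restore pattern: record the current take and update mode, switch to the root take (optionally in manual update mode), then restore both on exit even if the body raises. A minimal standalone sketch, with a hypothetical `SceneState` object standing in for Houdini's `hou.takes` / `hou.updateModeSetting` globals:

```python
from contextlib import contextmanager

class SceneState:
    """Hypothetical stand-in for Houdini's global take/update-mode state."""
    def __init__(self):
        self.current_take = "take1"
        self.update_mode = "auto"

@contextmanager
def main_take(state, root="main", no_update=True):
    """Switch to the root take for the duration of the block,
    restoring the previous take and update mode afterwards."""
    original_take = state.current_take
    original_mode = state.update_mode
    has_changed = False
    try:
        if original_take != root:
            has_changed = True
            if no_update:
                state.update_mode = "manual"
            state.current_take = root
        yield state
    finally:
        # Runs even if the body raised, so state is always restored.
        if has_changed:
            if no_update:
                state.update_mode = original_mode
            state.current_take = original_take

state = SceneState()
with main_take(state) as s:
    inside = (s.current_take, s.update_mode)      # ("main", "manual")
after = (state.current_take, state.update_mode)   # ("take1", "auto")
```

The `finally` clause is what makes the pattern safe: a `hou.PermissionError` raised inside the block (as handled in the diff) still leaves the scene in its original take and update mode.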
@@ -2,9 +2,11 @@ import uuid
import logging
from contextlib import contextmanager

from openpype.api import get_asset
import six

from avalon import api, io
from avalon.houdini import lib as houdini
from openpype.api import get_asset


import hou
@@ -15,11 +17,11 @@ def get_asset_fps():
    """Return current asset fps."""
    return get_asset()["data"].get("fps")

def set_id(node, unique_id, overwrite=False):

def set_id(node, unique_id, overwrite=False):
    exists = node.parm("id")
    if not exists:
        houdini.imprint(node, {"id": unique_id})
        imprint(node, {"id": unique_id})

    if not exists and overwrite:
        node.setParm("id", unique_id)
@@ -342,3 +344,183 @@ def render_rop(ropnode):
        import traceback
        traceback.print_exc()
        raise RuntimeError("Render failed: {0}".format(exc))


def children_as_string(node):
    return [c.name() for c in node.children()]


def imprint(node, data):
    """Store attributes with value on a node

    Depending on the type of attribute it creates the correct parameter
    template. Houdini uses a template per type, see the docs for more
    information.

    http://www.sidefx.com/docs/houdini/hom/hou/ParmTemplate.html

    Args:
        node(hou.Node): node object from Houdini
        data(dict): collection of attributes and their value

    Returns:
        None

    """

    parm_group = node.parmTemplateGroup()

    parm_folder = hou.FolderParmTemplate("folder", "Extra")
    for key, value in data.items():
        if value is None:
            continue

        if isinstance(value, float):
            parm = hou.FloatParmTemplate(name=key,
                                         label=key,
                                         num_components=1,
                                         default_value=(value,))
        elif isinstance(value, bool):
            parm = hou.ToggleParmTemplate(name=key,
                                          label=key,
                                          default_value=value)
        elif isinstance(value, int):
            parm = hou.IntParmTemplate(name=key,
                                       label=key,
                                       num_components=1,
                                       default_value=(value,))
        elif isinstance(value, six.string_types):
            parm = hou.StringParmTemplate(name=key,
                                          label=key,
                                          num_components=1,
                                          default_value=(value,))
        else:
            raise TypeError("Unsupported type: %r" % type(value))

        parm_folder.addParmTemplate(parm)

    parm_group.append(parm_folder)
    node.setParmTemplateGroup(parm_group)
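`imprint` dispatches on the Python type of each value to choose a parm template; the `bool` branch must be tested before the `int` branch because `bool` is a subclass of `int` in Python. The same dispatch, sketched standalone with hypothetical template-kind names (no Houdini required):

```python
def template_for(key, value):
    """Map a Python value to a (template_kind, default) pair,
    mirroring the isinstance chain used by imprint()."""
    if isinstance(value, float):
        return ("float", (value,))
    # bool before int: isinstance(True, int) is True in Python
    if isinstance(value, bool):
        return ("toggle", value)
    if isinstance(value, int):
        return ("int", (value,))
    if isinstance(value, str):
        return ("string", (value,))
    raise TypeError("Unsupported type: %r" % type(value))

kinds = {k: template_for(k, v)[0]
         for k, v in {"fps": 25.0, "active": True,
                      "count": 3, "id": "abc"}.items()}
# kinds == {"fps": "float", "active": "toggle", "count": "int", "id": "string"}
```

If the `int` check came first, `True` would be stored as an integer parm rather than a toggle, which is why the ordering in `imprint` matters.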


def lsattr(attr, value=None):
    if value is None:
        nodes = list(hou.node("/obj").allNodes())
        return [n for n in nodes if n.parm(attr)]
    return lsattrs({attr: value})


def lsattrs(attrs):
    """Return nodes matching `key` and `value`

    Arguments:
        attrs (dict): collection of attribute: value

    Example:
        >> lsattrs({"id": "myId"})
        ["myNode"]
        >> lsattr("id")
        ["myNode", "myOtherNode"]

    Returns:
        list
    """

    matches = set()
    nodes = list(hou.node("/obj").allNodes())  # returns generator object
    for node in nodes:
        for attr in attrs:
            if not node.parm(attr):
                continue
            elif node.evalParm(attr) != attrs[attr]:
                continue
            else:
                matches.add(node)

    return list(matches)
|
||||
|
||||
|
||||
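As written, the `lsattrs` loop adds a node as soon as *any* single requested attribute both exists on it and matches, rather than requiring all of them. A small host-free sketch (plain dicts stand in for `hou.Node` parameters) makes that behaviour visible:

```python
def matching(nodes, attrs):
    """Mirror the lsattrs loop: a node matches when at least one
    requested attribute exists on it and has the requested value."""
    matches = []
    for node in nodes:
        for attr, value in attrs.items():
            if attr in node and node[attr] == value:
                matches.append(node)
                break
    return matches


nodes = [{"id": "myId"}, {"id": "other"}, {"name": "x"}]
# Both the first and third node match via a single attribute each
print(matching(nodes, {"id": "myId", "name": "x"}))
```

With a single-entry `attrs` dict (the common case, via `lsattr`) this is equivalent to an exact match; with multiple entries it is an OR, not an AND.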
def read(node):
    """Read the container data into a dict

    Args:
        node(hou.Node): Houdini node

    Returns:
        dict

    """
    # `spareParms` returns a tuple of hou.Parm objects
    return {parameter.name(): parameter.eval()
            for parameter in node.spareParms()}

def unique_name(name, format="%03d", namespace="", prefix="", suffix="",
                separator="_"):
    """Return unique `name`

    The function takes into consideration an optional `namespace`
    and `suffix`. The suffix is included in evaluating whether a
    name exists - such as `name` + "_GRP" - but isn't included
    in the returned value.

    If a namespace is provided, only names within that namespace
    are considered when evaluating whether the name is unique.

    Arguments:
        format (str, optional): The `name` is given a number, this determines
            how this number is formatted. Defaults to a padding of 3.
            E.g. my_name_001, my_name_002.
        namespace (str, optional): Only consider names within this namespace.
        suffix (str, optional): Only consider names with this suffix.

    Example:
        >>> node = hou.node("/obj").createNode("geo", name="MyName")
        >>> unique = unique_name(node.name())
        >>> assert hou.node("/obj/{}".format(unique)) is None

    """

    iteration = 1

    parts = [prefix, name, format % iteration, suffix]
    if namespace:
        parts.insert(0, namespace)

    unique = separator.join(parts)
    children = children_as_string(hou.node("/obj"))
    while unique in children:
        iteration += 1
        # Refresh the formatted number, otherwise `unique` would never
        # change and the loop would not terminate
        parts[-2] = format % iteration
        unique = separator.join(parts)

    if suffix:
        return unique[:-len(suffix)]

    return unique

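The numbering loop in `unique_name` boils down to probing successive zero-padded suffixes against the existing child names. A minimal host-free sketch of that probe (a plain set stands in for the names under `/obj`):

```python
def next_unique(name, existing, format="%03d", separator="_"):
    """Return `name` joined with the first zero-padded number
    that is not already present in `existing`."""
    iteration = 1
    candidate = separator.join([name, format % iteration])
    while candidate in existing:
        iteration += 1
        candidate = separator.join([name, format % iteration])
    return candidate


print(next_unique("geo", {"geo_001", "geo_002"}))  # geo_003
```

The candidate string must be rebuilt on every pass; reusing the stale parts list is exactly what would make the loop spin forever on a collision.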
@contextmanager
def maintained_selection():
    """Maintain selection during context

    Example:
        >>> with maintained_selection():
        ...     # Modify selection
        ...     node.setSelected(on=False, clear_all_selected=True)
        >>> # Selection restored

    """

    previous_selection = hou.selectedNodes()
    try:
        yield
    finally:
        # Clear the selection
        # todo: does hou.clearAllSelected() do the same?
        for node in hou.selectedNodes():
            node.setSelected(on=False)

        if previous_selection:
            for node in previous_selection:
                node.setSelected(on=True)

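`maintained_selection` is one instance of a generic save/restore context manager; the same shape works for any state with a getter and a setter. A host-free sketch of the pattern (the names here are illustrative, not part of the API):

```python
from contextlib import contextmanager


@contextmanager
def maintained(get_state, set_state):
    """Save state on enter, restore it on exit, even on error."""
    previous = get_state()
    try:
        yield
    finally:
        set_state(previous)


selection = ["nodeA"]


def restore(prev):
    selection[:] = prev


with maintained(lambda: list(selection), restore):
    selection.append("nodeB")

print(selection)  # ['nodeA']
```

The `finally` block is what guarantees the restore runs even when the body raises.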
openpype/hosts/houdini/api/pipeline.py (new file, 349 lines)
@@ -0,0 +1,349 @@
import os
import sys
import logging
import contextlib

import hou

import pyblish.api
import avalon.api
from avalon.pipeline import AVALON_CONTAINER_ID
from avalon.lib import find_submodule

import openpype.hosts.houdini
from openpype.hosts.houdini.api import lib

from openpype.lib import (
    any_outdated
)

from .lib import get_asset_fps

log = logging.getLogger("openpype.hosts.houdini")

AVALON_CONTAINERS = "/obj/AVALON_CONTAINERS"
IS_HEADLESS = not hasattr(hou, "ui")

HOST_DIR = os.path.dirname(os.path.abspath(openpype.hosts.houdini.__file__))
PLUGINS_DIR = os.path.join(HOST_DIR, "plugins")
PUBLISH_PATH = os.path.join(PLUGINS_DIR, "publish")
LOAD_PATH = os.path.join(PLUGINS_DIR, "load")
CREATE_PATH = os.path.join(PLUGINS_DIR, "create")
INVENTORY_PATH = os.path.join(PLUGINS_DIR, "inventory")


self = sys.modules[__name__]
self._has_been_setup = False
self._parent = None
self._events = dict()


def install():
    _register_callbacks()

    pyblish.api.register_host("houdini")
    pyblish.api.register_host("hython")
    pyblish.api.register_host("hpython")

    pyblish.api.register_plugin_path(PUBLISH_PATH)
    avalon.api.register_plugin_path(avalon.api.Loader, LOAD_PATH)
    avalon.api.register_plugin_path(avalon.api.Creator, CREATE_PATH)

    log.info("Installing callbacks ... ")
    # avalon.on("init", on_init)
    avalon.api.before("save", before_save)
    avalon.api.on("save", on_save)
    avalon.api.on("open", on_open)
    avalon.api.on("new", on_new)

    pyblish.api.register_callback(
        "instanceToggled", on_pyblish_instance_toggled
    )

    log.info("Setting default family states for loader..")
    avalon.api.data["familiesStateToggled"] = [
        "imagesequence",
        "review"
    ]

    self._has_been_setup = True
    # add houdini vendor packages
    hou_pythonpath = os.path.join(os.path.dirname(HOST_DIR), "vendor")

    sys.path.append(hou_pythonpath)

    # Set asset FPS for the empty scene directly after launch of Houdini
    # so it initializes into the correct scene FPS
    _set_asset_fps()


def uninstall():
    """Uninstall Houdini-specific functionality of avalon-core.

    This function is called automatically on calling `api.uninstall()`.
    """

    pyblish.api.deregister_host("hython")
    pyblish.api.deregister_host("hpython")
    pyblish.api.deregister_host("houdini")


def _register_callbacks():
    for event in self._events.copy().values():
        if event is None:
            continue

        try:
            hou.hipFile.removeEventCallback(event)
        except RuntimeError as e:
            log.info(e)

    self._events[on_file_event_callback] = hou.hipFile.addEventCallback(
        on_file_event_callback
    )

def on_file_event_callback(event):
    if event == hou.hipFileEventType.AfterLoad:
        avalon.api.emit("open", [event])
    elif event == hou.hipFileEventType.AfterSave:
        avalon.api.emit("save", [event])
    elif event == hou.hipFileEventType.BeforeSave:
        avalon.api.emit("before_save", [event])
    elif event == hou.hipFileEventType.AfterClear:
        avalon.api.emit("new", [event])


def get_main_window():
    """Acquire Houdini's main window"""
    if self._parent is None:
        self._parent = hou.ui.mainQtWindow()
    return self._parent

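`on_file_event_callback` is a plain dispatch from hip-file event types to avalon event names. The mapping can be sketched host-free (string keys stand in here for the `hou.hipFileEventType` members):

```python
# Hypothetical stand-ins for hou.hipFileEventType member names
EVENT_MAP = {
    "AfterLoad": "open",
    "AfterSave": "save",
    "BeforeSave": "before_save",
    "AfterClear": "new",
}


def emitted_event(hip_event):
    """Return the avalon event a hip-file event maps to, or None."""
    return EVENT_MAP.get(hip_event)


print(emitted_event("AfterSave"))  # save
```

Events outside the mapping (e.g. before-load) fall through and emit nothing, matching the if/elif chain above.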
def teardown():
    """Remove integration"""
    if not self._has_been_setup:
        return

    self._has_been_setup = False
    print("pyblish: Integration torn down successfully")


def containerise(name,
                 namespace,
                 nodes,
                 context,
                 loader=None,
                 suffix=""):
    """Bundle `nodes` into a subnet and imprint it with metadata

    Containerisation enables tracking of version, author and origin
    for loaded assets.

    Arguments:
        name (str): Name of resulting assembly
        namespace (str): Namespace under which to host container
        nodes (list): Long names of nodes to containerise
        context (dict): Asset information
        loader (str, optional): Name of loader used to produce this container.
        suffix (str, optional): Suffix of container, defaults to `_CON`.

    Returns:
        container (str): Name of container assembly

    """

    # Ensure AVALON_CONTAINERS subnet exists
    subnet = hou.node(AVALON_CONTAINERS)
    if subnet is None:
        obj_network = hou.node("/obj")
        subnet = obj_network.createNode("subnet",
                                        node_name="AVALON_CONTAINERS")

    # Create proper container name
    container_name = "{}_{}".format(name, suffix or "CON")
    container = hou.node("/obj/{}".format(name))
    container.setName(container_name, unique_name=True)

    data = {
        "schema": "openpype:container-2.0",
        "id": AVALON_CONTAINER_ID,
        "name": name,
        "namespace": namespace,
        "loader": str(loader),
        "representation": str(context["representation"]["_id"]),
    }

    lib.imprint(container, data)

    # "Parent" the container under the container network
    hou.moveNodesTo([container], subnet)

    subnet.node(container_name).moveToGoodPosition()

    return container

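The metadata block written by `containerise` can be built and sanity-checked without Houdini. A hypothetical helper mirroring that dict (the default `container_id` value here is a placeholder; the real value comes from `AVALON_CONTAINER_ID` in `avalon.pipeline`):

```python
def make_container_data(name, namespace, loader, representation_id,
                        container_id="pyblish.avalon.container"):
    """Build the dict that containerise imprints on the container subnet."""
    return {
        "schema": "openpype:container-2.0",
        "id": container_id,  # stands in for AVALON_CONTAINER_ID
        "name": name,
        "namespace": namespace,
        "loader": str(loader),
        "representation": str(representation_id),
    }


data = make_container_data("hero", "asset_01", "AbcLoader", "some_id")
```

Because every value is stringified and scalar, the whole dict round-trips cleanly through `imprint`/`read` as spare parameters.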
def parse_container(container):
    """Return the container node's full container data.

    Args:
        container (hou.Node): A container node name.

    Returns:
        dict: The container schema data for this container node.

    """
    data = lib.read(container)

    # Backwards compatibility pre-schemas for containers
    data["schema"] = data.get("schema", "openpype:container-1.0")

    # Append transient data
    data["objectName"] = container.path()
    data["node"] = container

    return data

def ls():
    containers = []
    for identifier in (AVALON_CONTAINER_ID,
                       "pyblish.mindbender.container"):
        containers += lib.lsattr("id", identifier)

    has_metadata_collector = False
    config_host = find_submodule(avalon.api.registered_config(), "houdini")
    if hasattr(config_host, "collect_container_metadata"):
        has_metadata_collector = True

    for container in sorted(containers,
                            # Hou 19+ Python 3 hou.ObjNode are not
                            # sortable due to not supporting greater
                            # than comparisons
                            key=lambda node: node.path()):
        data = parse_container(container)

        # Collect custom data if attribute is present
        if has_metadata_collector:
            metadata = config_host.collect_container_metadata(container)
            data.update(metadata)

        yield data


def before_save(*args):
    return lib.validate_fps()


def on_save(*args):

    log.info("Running callback on save..")

    nodes = lib.get_id_required_nodes()
    for node, new_id in lib.generate_ids(nodes):
        lib.set_id(node, new_id, overwrite=False)


def on_open(*args):

    if not hou.isUIAvailable():
        log.debug("Batch mode detected, ignoring `on_open` callbacks..")
        return

    log.info("Running callback on open..")

    # Validate FPS after update_task_from_path to
    # ensure it is using correct FPS for the asset
    lib.validate_fps()

    if any_outdated():
        from openpype.widgets import popup

        log.warning("Scene has outdated content.")

        # Get main window
        parent = get_main_window()
        if parent is None:
            log.info("Skipping outdated content pop-up "
                     "because Houdini window can't be found.")
        else:

            # Show outdated pop-up
            def _on_show_inventory():
                from openpype.tools.utils import host_tools
                host_tools.show_scene_inventory(parent=parent)

            dialog = popup.Popup(parent=parent)
            dialog.setWindowTitle("Houdini scene has outdated content")
            dialog.setMessage("There are outdated containers in "
                              "your Houdini scene.")
            dialog.on_clicked.connect(_on_show_inventory)
            dialog.show()


def on_new(_):
    """Set project resolution and fps when creating a new file"""
    log.info("Running callback on new..")
    _set_asset_fps()


def _set_asset_fps():
    """Set Houdini scene FPS to the default required for current asset"""

    # Set new scene fps
    fps = get_asset_fps()
    print("Setting scene FPS to %i" % fps)
    lib.set_scene_fps(fps)


def on_pyblish_instance_toggled(instance, new_value, old_value):
    """Toggle saver tool passthrough states on instance toggles."""

    @contextlib.contextmanager
    def main_take(no_update=True):
        """Enter root take during context"""
        original_take = hou.takes.currentTake()
        original_update_mode = hou.updateModeSetting()
        root = hou.takes.rootTake()
        has_changed = False
        try:
            if original_take != root:
                has_changed = True
                if no_update:
                    hou.setUpdateMode(hou.updateMode.Manual)
                hou.takes.setCurrentTake(root)
            yield
        finally:
            if has_changed:
                if no_update:
                    hou.setUpdateMode(original_update_mode)
                hou.takes.setCurrentTake(original_take)

    if not instance.data.get("_allowToggleBypass", True):
        return

    nodes = instance[:]
    if not nodes:
        return

    # Assume instance node is first node
    instance_node = nodes[0]

    if not hasattr(instance_node, "isBypassed"):
        # Likely not a node that can actually be bypassed
        log.debug("Can't bypass node: %s", instance_node.path())
        return

    if instance_node.isBypassed() != (not old_value):
        print("%s old bypass state didn't match old instance state, "
              "updating anyway.." % instance_node.path())

    try:
        # Go into the main take, because when in another take changing
        # the bypass state of a node cannot be done due to it being locked
        # by default.
        with main_take(no_update=True):
            instance_node.bypass(not new_value)
    except hou.PermissionError as exc:
        log.warning("%s - %s", instance_node.path(), exc)
@@ -1,25 +1,82 @@
# -*- coding: utf-8 -*-
"""Houdini specific Avalon/Pyblish plugin definitions."""
import sys
from avalon.api import CreatorError
from avalon import houdini
import six
import avalon.api
from avalon.api import CreatorError

import hou
from openpype.api import PypeCreatorMixin
from .lib import imprint


class OpenPypeCreatorError(CreatorError):
    pass


class Creator(PypeCreatorMixin, houdini.Creator):
class Creator(PypeCreatorMixin, avalon.api.Creator):
    """Creator plugin to create instances in Houdini

    To support the wide range of node types for render output (Alembic, VDB,
    Mantra) the Creator needs a node type to create the correct instance.

    By default, if none is given, it is `geometry`. Examples of accepted node
    types: geometry, alembic, ifd (mantra).

    Please check the Houdini documentation for more node types.

    Tip: to find the exact node type to create, press the `i` left of the node
    when hovering over a node. The information is visible under the name of
    the node.

    """

    def __init__(self, *args, **kwargs):
        super(Creator, self).__init__(*args, **kwargs)
        self.nodes = list()

    def process(self):
        """This is the base functionality to create instances in Houdini

        The selected nodes are stored in self to be used in an override method.
        This is currently necessary in order to support the multiple output
        types in Houdini which can only be rendered through their own node.

        Default node type if none is given is `geometry`.

        It also makes it easier to apply custom settings per instance type.

        Example of an override method for Alembic:

            def process(self):
                instance = super(CreateEpicNode, self).process()
                # Set parameters for Alembic node
                instance.setParms(
                    {"sop_path": "$HIP/%s.abc" % self.nodes[0]}
                )

        Returns:
            hou.Node

        """
        try:
            # re-raise as standard Python exception so
            # Avalon can catch it
            instance = super(Creator, self).process()
            if (self.options or {}).get("useSelection"):
                self.nodes = hou.selectedNodes()

            # Get the node type and remove it from the data, not needed
            node_type = self.data.pop("node_type", None)
            if node_type is None:
                node_type = "geometry"

            # Get out node
            out = hou.node("/out")
            instance = out.createNode(node_type, node_name=self.name)
            instance.moveToGoodPosition()

            imprint(instance, self.data)

            self._process(instance)

        except hou.Error as er:
            six.reraise(
                OpenPypeCreatorError,
openpype/hosts/houdini/api/workio.py (new file, 58 lines)
@@ -0,0 +1,58 @@
"""Host API required Work Files tool"""
import os

import hou
from avalon import api


def file_extensions():
    return api.HOST_WORKFILE_EXTENSIONS["houdini"]


def has_unsaved_changes():
    return hou.hipFile.hasUnsavedChanges()


def save_file(filepath):

    # Force forward slashes to avoid segfault
    filepath = filepath.replace("\\", "/")

    hou.hipFile.save(file_name=filepath,
                     save_to_recent_files=True)

    return filepath


def open_file(filepath):

    # Force forward slashes to avoid segfault
    filepath = filepath.replace("\\", "/")

    hou.hipFile.load(filepath,
                     suppress_save_prompt=True,
                     ignore_load_warnings=False)

    return filepath


def current_file():

    current_filepath = hou.hipFile.path()
    if (os.path.basename(current_filepath) == "untitled.hip" and
            not os.path.exists(current_filepath)):
        # By default a new scene in Houdini is saved in the current
        # working directory as "untitled.hip" so we need to capture
        # that and consider it 'not saved' when it's in that state.
        return None

    return current_filepath


def work_root(session):
    work_dir = session["AVALON_WORKDIR"]
    scene_dir = session.get("AVALON_SCENEDIR")
    if scene_dir:
        return os.path.join(work_dir, scene_dir)
    else:
        return work_dir
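The "not saved yet" check in `current_file` can be exercised on its own; this host-free sketch reproduces its two conditions (the default `untitled.hip` basename plus the file not existing on disk):

```python
import os


def is_unsaved_untitled(filepath):
    """True for Houdini's default placeholder scene that was never saved."""
    return (os.path.basename(filepath) == "untitled.hip"
            and not os.path.exists(filepath))


print(is_unsaved_untitled("scene_v001.hip"))  # False: basename differs
```

Both conditions matter: a scene genuinely saved as `untitled.hip` exists on disk and is therefore still reported as the current file.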
@@ -1,8 +1,8 @@
# -*- coding: utf-8 -*-
from openpype.hosts.houdini.api import plugin
from avalon.houdini import lib
from avalon import io
import hou
from avalon import io
from openpype.hosts.houdini.api import lib
from openpype.hosts.houdini.api import plugin


class CreateHDA(plugin.Creator):
@@ -1,6 +1,7 @@
import os
from avalon import api

from avalon.houdini import pipeline
from openpype.hosts.houdini.api import pipeline


class AbcLoader(api.Loader):

@@ -14,8 +15,6 @@ class AbcLoader(api.Loader):
    color = "orange"

    def load(self, context, name=None, namespace=None, data=None):

        import os
        import hou

        # Format file name, Houdini only wants forward slashes
@@ -1,5 +1,5 @@
from avalon import api
from avalon.houdini import pipeline
from openpype.hosts.houdini.api import pipeline


ARCHIVE_EXPRESSION = ('__import__("_alembic_hom_extensions")'
@@ -1,7 +1,7 @@
# -*- coding: utf-8 -*-
from avalon import api

from avalon.houdini import pipeline
from openpype.hosts.houdini.api import pipeline


class HdaLoader(api.Loader):
@@ -1,7 +1,7 @@
import os

from avalon import api
from avalon.houdini import pipeline, lib
from openpype.hosts.houdini.api import lib, pipeline

import hou

@@ -1,5 +1,5 @@
from avalon import api
from avalon.houdini import pipeline, lib
from openpype.hosts.houdini.api import lib, pipeline


class USDSublayerLoader(api.Loader):
@@ -1,5 +1,5 @@
from avalon import api
from avalon.houdini import pipeline, lib
from openpype.hosts.houdini.api import lib, pipeline


class USDReferenceLoader(api.Loader):
@@ -2,7 +2,7 @@ import os
import re
from avalon import api

from avalon.houdini import pipeline
from openpype.hosts.houdini.api import pipeline


class VdbLoader(api.Loader):
@@ -2,7 +2,7 @@ import hou

import pyblish.api

from avalon.houdini import lib
from openpype.hosts.houdini.api import lib


class CollectInstances(pyblish.api.ContextPlugin):
@@ -1,6 +1,6 @@
import hou
import pyblish.api
from avalon.houdini import lib
from openpype.hosts.houdini.api import lib
import openpype.hosts.houdini.api.usd as hou_usdlib
import openpype.lib.usdlib as usdlib

@@ -2,7 +2,7 @@ import pyblish.api
import openpype.api

import hou
from avalon.houdini import lib
from openpype.hosts.houdini.api import lib


class CollectRemotePublishSettings(pyblish.api.ContextPlugin):
@@ -56,18 +56,6 @@ host_tools.show_workfiles(parent)
]]></scriptCode>
    </scriptItem>

    <separatorItem/>

    <subMenu id="avalon_reload_pipeline">
      <label>System</label>
      <scriptItem>
        <label>Reload Pipeline (unstable)</label>
        <scriptCode><![CDATA[
from avalon.houdini import pipeline
pipeline.reload_pipeline()]]></scriptCode>
      </scriptItem>
    </subMenu>

    <separatorItem/>
    <scriptItem id="experimental_tools">
      <label>Experimental tools...</label>
@@ -1,9 +1,10 @@
from avalon import api, houdini
import avalon.api
from openpype.hosts.houdini import api


def main():
    print("Installing OpenPype ...")
    api.install(houdini)
    avalon.api.install(api)


main()
@@ -1,9 +1,10 @@
from avalon import api, houdini
import avalon.api
from openpype.hosts.houdini import api


def main():
    print("Installing OpenPype ...")
    api.install(houdini)
    avalon.api.install(api)


main()
@@ -5,9 +5,7 @@ def add_implementation_envs(env, _app):
    # Add requirements to PYTHONPATH
    pype_root = os.environ["OPENPYPE_REPOS_ROOT"]
    new_python_paths = [
        os.path.join(pype_root, "openpype", "hosts", "maya", "startup"),
        os.path.join(pype_root, "repos", "avalon-core", "setup", "maya"),
        os.path.join(pype_root, "tools", "mayalookassigner")
        os.path.join(pype_root, "openpype", "hosts", "maya", "startup")
    ]
    old_python_path = env.get("PYTHONPATH") or ""
    for path in old_python_path.split(os.pathsep):
@ -1,233 +1,91 @@
|
|||
import os
|
||||
import logging
|
||||
import weakref
|
||||
"""Public API
|
||||
|
||||
from maya import utils, cmds
|
||||
Anything that isn't defined here is INTERNAL and unreliable for external use.
|
||||
|
||||
from avalon import api as avalon
|
||||
from avalon import pipeline
|
||||
from avalon.maya import suspended_refresh
|
||||
from avalon.maya.pipeline import IS_HEADLESS
|
||||
from openpype.tools.utils import host_tools
|
||||
from pyblish import api as pyblish
|
||||
from openpype.lib import any_outdated
|
||||
import openpype.hosts.maya
|
||||
from openpype.hosts.maya.lib import copy_workspace_mel
|
||||
from openpype.lib.path_tools import HostDirmap
|
||||
from . import menu, lib
|
||||
"""
|
||||
|
||||
log = logging.getLogger("openpype.hosts.maya")
|
||||
from .pipeline import (
|
||||
install,
|
||||
uninstall,
|
||||
|
||||
HOST_DIR = os.path.dirname(os.path.abspath(openpype.hosts.maya.__file__))
|
||||
PLUGINS_DIR = os.path.join(HOST_DIR, "plugins")
|
||||
PUBLISH_PATH = os.path.join(PLUGINS_DIR, "publish")
|
||||
LOAD_PATH = os.path.join(PLUGINS_DIR, "load")
|
||||
CREATE_PATH = os.path.join(PLUGINS_DIR, "create")
|
||||
INVENTORY_PATH = os.path.join(PLUGINS_DIR, "inventory")
|
||||
ls,
|
||||
containerise,
|
||||
|
||||
lock,
|
||||
unlock,
|
||||
is_locked,
|
||||
lock_ignored,
|
||||
|
||||
)
|
||||
from .plugin import (
|
||||
Creator,
|
||||
Loader
|
||||
)
|
||||
|
||||
from .workio import (
|
||||
open_file,
|
||||
save_file,
|
||||
current_file,
|
||||
has_unsaved_changes,
|
||||
file_extensions,
|
||||
work_root
|
||||
)
|
||||
|
||||
from .lib import (
|
||||
export_alembic,
|
||||
lsattr,
|
||||
lsattrs,
|
||||
read,
|
||||
|
||||
apply_shaders,
|
||||
without_extension,
|
||||
maintained_selection,
|
||||
suspended_refresh,
|
||||
|
||||
unique_name,
|
||||
unique_namespace,
|
||||
)
|
||||
|
||||
|
||||
def install():
|
||||
from openpype.settings import get_project_settings
|
||||
__all__ = [
|
||||
"install",
|
||||
"uninstall",
|
||||
|
||||
project_settings = get_project_settings(os.getenv("AVALON_PROJECT"))
|
||||
# process path mapping
|
||||
dirmap_processor = MayaDirmap("maya", project_settings)
|
||||
dirmap_processor.process_dirmap()
|
||||
"ls",
|
||||
"containerise",
|
||||
|
||||
pyblish.register_plugin_path(PUBLISH_PATH)
|
||||
avalon.register_plugin_path(avalon.Loader, LOAD_PATH)
|
||||
avalon.register_plugin_path(avalon.Creator, CREATE_PATH)
|
||||
avalon.register_plugin_path(avalon.InventoryAction, INVENTORY_PATH)
|
||||
log.info(PUBLISH_PATH)
|
||||
menu.install()
|
||||
"lock",
|
||||
"unlock",
|
||||
"is_locked",
|
||||
"lock_ignored",
|
||||
|
||||
log.info("Installing callbacks ... ")
|
||||
avalon.on("init", on_init)
|
||||
"Creator",
|
||||
"Loader",
|
||||
|
||||
# Callbacks below are not required for headless mode, the `init` however
|
||||
# is important to load referenced Alembics correctly at rendertime.
|
||||
if IS_HEADLESS:
|
||||
log.info("Running in headless mode, skipping Maya "
|
||||
"save/open/new callback installation..")
|
||||
return
|
||||
# Workfiles API
|
||||
"open_file",
|
||||
"save_file",
|
||||
"current_file",
|
||||
"has_unsaved_changes",
|
||||
"file_extensions",
|
||||
"work_root",
|
||||
|
||||
avalon.on("save", on_save)
|
||||
avalon.on("open", on_open)
|
||||
avalon.on("new", on_new)
|
||||
avalon.before("save", on_before_save)
|
||||
avalon.on("taskChanged", on_task_changed)
|
||||
avalon.on("before.workfile.save", before_workfile_save)
|
||||
# Utility functions
|
||||
"export_alembic",
|
||||
"lsattr",
|
||||
"lsattrs",
|
||||
"read",
|
||||
|
||||
log.info("Setting default family states for loader..")
|
||||
avalon.data["familiesStateToggled"] = ["imagesequence"]
|
||||
"unique_name",
|
||||
"unique_namespace",
|
||||
|
||||
"apply_shaders",
|
||||
"without_extension",
|
||||
"maintained_selection",
|
||||
"suspended_refresh",
|
||||
|
||||
def uninstall():
|
||||
pyblish.deregister_plugin_path(PUBLISH_PATH)
|
||||
avalon.deregister_plugin_path(avalon.Loader, LOAD_PATH)
|
||||
avalon.deregister_plugin_path(avalon.Creator, CREATE_PATH)
|
||||
avalon.deregister_plugin_path(avalon.InventoryAction, INVENTORY_PATH)
|
||||
]
|
||||
|
||||
menu.uninstall()
|
||||
|
||||
|
||||
def on_init(_):
|
||||
avalon.logger.info("Running callback on init..")
|
||||
|
||||
def safe_deferred(fn):
|
||||
"""Execute deferred the function in a try-except"""
|
||||
|
||||
def _fn():
|
||||
"""safely call in deferred callback"""
|
||||
try:
|
||||
fn()
|
||||
except Exception as exc:
|
||||
print(exc)
|
||||
|
||||
try:
|
||||
utils.executeDeferred(_fn)
|
||||
except Exception as exc:
|
||||
print(exc)
|
||||
|
||||
# Force load Alembic so referenced alembics
|
||||
# work correctly on scene open
|
||||
cmds.loadPlugin("AbcImport", quiet=True)
|
||||
cmds.loadPlugin("AbcExport", quiet=True)
|
||||
|
||||
# Force load objExport plug-in (requested by artists)
|
||||
cmds.loadPlugin("objExport", quiet=True)
|
||||
|
||||
from .customize import (
|
||||
override_component_mask_commands,
|
||||
override_toolbox_ui
|
||||
)
|
||||
safe_deferred(override_component_mask_commands)
|
||||
|
||||
launch_workfiles = os.environ.get("WORKFILES_STARTUP")
|
||||
|
||||
if launch_workfiles:
|
||||
safe_deferred(host_tools.show_workfiles)
|
||||
|
||||
if not IS_HEADLESS:
|
||||
safe_deferred(override_toolbox_ui)
|
||||
|
||||
|
||||
def on_before_save(return_code, _):
|
||||
"""Run validation for scene's FPS prior to saving"""
|
||||
return lib.validate_fps()
|
||||
|
||||
|
||||
def on_save(_):
|
||||
"""Automatically add IDs to new nodes
|
||||
|
||||
Any transform of a mesh, without an existing ID, is given one
|
||||
automatically on file save.
|
||||
"""
|
||||
|
||||
avalon.logger.info("Running callback on save..")
|
||||
|
||||
# # Update current task for the current scene
|
||||
# update_task_from_path(cmds.file(query=True, sceneName=True))
|
||||
|
||||
# Generate ids of the current context on nodes in the scene
|
||||
nodes = lib.get_id_required_nodes(referenced_nodes=False)
|
||||
for node, new_id in lib.generate_ids(nodes):
|
||||
lib.set_id(node, new_id, overwrite=False)
|
||||
|
||||
|
||||
def on_open(_):
|
||||
"""On scene open let's assume the containers have changed."""
|
||||
|
||||
from Qt import QtWidgets
|
||||
from openpype.widgets import popup
|
||||
|
||||
cmds.evalDeferred(
|
||||
"from openpype.hosts.maya.api import lib;"
|
||||
"lib.remove_render_layer_observer()")
|
||||
cmds.evalDeferred(
|
||||
"from openpype.hosts.maya.api import lib;"
|
||||
"lib.add_render_layer_observer()")
|
||||
cmds.evalDeferred(
|
||||
"from openpype.hosts.maya.api import lib;"
|
||||
"lib.add_render_layer_change_observer()")
|
||||
# # Update current task for the current scene
|
||||
# update_task_from_path(cmds.file(query=True, sceneName=True))
|
||||
|
||||
# Validate FPS after update_task_from_path to
|
||||
# ensure it is using correct FPS for the asset
|
||||
lib.validate_fps()
|
||||
lib.fix_incompatible_containers()
|
||||
|
||||
if any_outdated():
|
||||
log.warning("Scene has outdated content.")
|
||||
|
||||
# Find maya main window
|
||||
top_level_widgets = {w.objectName(): w for w in
|
||||
QtWidgets.QApplication.topLevelWidgets()}
|
||||
parent = top_level_widgets.get("MayaWindow", None)
|
||||
|
||||
if parent is None:
|
||||
log.info("Skipping outdated content pop-up "
|
||||
"because Maya window can't be found.")
|
||||
else:
|
||||
|
||||
# Show outdated pop-up
|
||||
def _on_show_inventory():
|
||||
host_tools.show_scene_inventory(parent=parent)
|
||||
|
||||
dialog = popup.Popup(parent=parent)
|
||||
dialog.setWindowTitle("Maya scene has outdated content")
|
||||
dialog.setMessage("There are outdated containers in "
|
||||
"your Maya scene.")
|
||||
dialog.on_show.connect(_on_show_inventory)
|
||||
dialog.show()
|
||||
|
||||
|

def on_new(_):
    """Set project resolution and FPS when creating a new file."""
    avalon.logger.info("Running callback on new..")
    with suspended_refresh():
        cmds.evalDeferred(
            "from openpype.hosts.maya.api import lib;"
            "lib.remove_render_layer_observer()")
        cmds.evalDeferred(
            "from openpype.hosts.maya.api import lib;"
            "lib.add_render_layer_observer()")
        cmds.evalDeferred(
            "from openpype.hosts.maya.api import lib;"
            "lib.add_render_layer_change_observer()")
        lib.set_context_settings()


def on_task_changed(*args):
    """Wrapped callback for app initialization and Maya's task change."""
    with suspended_refresh():
        lib.set_context_settings()
        lib.update_content_on_context_change()

    msg = " project: {}\n asset: {}\n task: {}".format(
        avalon.Session["AVALON_PROJECT"],
        avalon.Session["AVALON_ASSET"],
        avalon.Session["AVALON_TASK"]
    )

    lib.show_message(
        "Context was changed",
        ("Context was changed to:\n{}".format(msg)),
    )

def before_workfile_save(event):
    workdir_path = event.workdir_path
    if workdir_path:
        copy_workspace_mel(workdir_path)


class MayaDirmap(HostDirmap):
    def on_enable_dirmap(self):
        cmds.dirmap(en=True)

    def dirmap_routine(self, source_path, destination_path):
        cmds.dirmap(m=(source_path, destination_path))
        cmds.dirmap(m=(destination_path, source_path))


# Backwards API compatibility
open = open_file
save = save_file
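`MayaDirmap.dirmap_routine` above registers each mapping in both directions with two `cmds.dirmap(m=...)` calls. A minimal standalone sketch of that two-way table, with no Maya required (`build_dirmap` is an illustrative helper, not part of the codebase):

```python
def build_dirmap(pairs):
    """Build a bidirectional path-mapping table from (source, destination)
    pairs, mirroring the pair of cmds.dirmap(m=...) calls above."""
    table = {}
    for source, destination in pairs:
        table[source] = destination
        table[destination] = source
    return table
```

With such a table, a path resolved on one platform can be mapped back on the other without a second lookup structure.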
@@ -2,7 +2,7 @@
from __future__ import absolute_import

import pyblish.api
from avalon import io
from openpype.api import get_errored_instances_from_context

@@ -72,8 +72,7 @@ class GenerateUUIDsOnInvalidAction(pyblish.api.Action):
            nodes (list): all nodes to regenerate ids on
        """
-        from openpype.hosts.maya.api import lib
-        import avalon.io as io
+        from . import lib

        asset = instance.data['asset']
        asset_id = io.find_one({"name": asset, "type": "asset"},
@@ -1,5 +1,7 @@
# -*- coding: utf-8 -*-
"""OpenPype script commands to be used directly in Maya."""
from maya import cmds
from avalon import api, io


class ToolWindows:

@@ -51,3 +53,134 @@ def edit_shader_definitions():
    window = ShaderDefinitionsEditor(parent=main_window)
    ToolWindows.set_window("shader_definition_editor", window)
    window.show()


def reset_frame_range():
    """Set frame range to current asset"""
    # Set FPS first
    fps = {15: 'game',
           24: 'film',
           25: 'pal',
           30: 'ntsc',
           48: 'show',
           50: 'palf',
           60: 'ntscf',
           23.98: '23.976fps',
           23.976: '23.976fps',
           29.97: '29.97fps',
           47.952: '47.952fps',
           47.95: '47.952fps',
           59.94: '59.94fps',
           44100: '44100fps',
           48000: '48000fps'
           }.get(float(api.Session.get("AVALON_FPS", 25)), "pal")

    cmds.currentUnit(time=fps)

    # Set frame start/end
    asset_name = api.Session["AVALON_ASSET"]
    asset = io.find_one({"name": asset_name, "type": "asset"})

    frame_start = asset["data"].get("frameStart")
    frame_end = asset["data"].get("frameEnd")
    # Backwards compatibility
    if frame_start is None or frame_end is None:
        frame_start = asset["data"].get("edit_in")
        frame_end = asset["data"].get("edit_out")

    if frame_start is None or frame_end is None:
        cmds.warning("No edit information found for %s" % asset_name)
        return

    handles = asset["data"].get("handles") or 0
    handle_start = asset["data"].get("handleStart")
    if handle_start is None:
        handle_start = handles

    handle_end = asset["data"].get("handleEnd")
    if handle_end is None:
        handle_end = handles

    frame_start -= int(handle_start)
    frame_end += int(handle_end)

    cmds.playbackOptions(minTime=frame_start)
    cmds.playbackOptions(maxTime=frame_end)
    cmds.playbackOptions(animationStartTime=frame_start)
    cmds.playbackOptions(animationEndTime=frame_end)
    cmds.playbackOptions(minTime=frame_start)
    cmds.playbackOptions(maxTime=frame_end)
    cmds.currentTime(frame_start)

    cmds.setAttr("defaultRenderGlobals.startFrame", frame_start)
    cmds.setAttr("defaultRenderGlobals.endFrame", frame_end)
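The FPS dictionary in `reset_frame_range` above can be exercised outside Maya. A minimal sketch of the lookup, defaulting to PAL (25 fps) for unknown rates (the names `FPS_TO_UNIT` and `fps_unit` are mine, not from the codebase):

```python
# Mirror of the fps-to-time-unit mapping used by reset_frame_range.
FPS_TO_UNIT = {
    15: "game", 24: "film", 25: "pal", 30: "ntsc", 48: "show",
    50: "palf", 60: "ntscf", 23.98: "23.976fps", 23.976: "23.976fps",
    29.97: "29.97fps", 47.952: "47.952fps", 47.95: "47.952fps",
    59.94: "59.94fps", 44100: "44100fps", 48000: "48000fps",
}


def fps_unit(session_fps, default=25):
    """Return the Maya currentUnit name for a session FPS value.

    Integer dict keys still match float lookups (24.0 == 24), which is why
    float(...) on the session value is safe here, as in the original.
    """
    return FPS_TO_UNIT.get(float(session_fps or default), "pal")
```

Any FPS not in the table silently falls back to `"pal"`, which matches the `.get(..., "pal")` default in the diff.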

def _resolution_from_document(doc):
    if not doc or "data" not in doc:
        print("Entered document is not valid. \"{}\"".format(str(doc)))
        return None

    resolution_width = doc["data"].get("resolutionWidth")
    resolution_height = doc["data"].get("resolutionHeight")
    # Backwards compatibility
    if resolution_width is None or resolution_height is None:
        resolution_width = doc["data"].get("resolution_width")
        resolution_height = doc["data"].get("resolution_height")

    # Make sure both width and height are set
    if resolution_width is None or resolution_height is None:
        cmds.warning(
            "No resolution information found for \"{}\"".format(doc["name"])
        )
        return None

    return int(resolution_width), int(resolution_height)
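The fallback chain in `_resolution_from_document` above is pure dictionary logic and can be sketched without Maya or MongoDB: prefer the camelCase keys, fall back to the legacy snake_case keys, and return `None` when either dimension is missing (the helper name here is illustrative):

```python
def resolution_from_document(doc):
    """Sketch of the resolution lookup with backwards-compatible keys."""
    if not doc or "data" not in doc:
        return None

    data = doc["data"]
    width = data.get("resolutionWidth")
    height = data.get("resolutionHeight")
    # Backwards compatibility with older snake_case keys
    if width is None or height is None:
        width = data.get("resolution_width")
        height = data.get("resolution_height")

    if width is None or height is None:
        return None
    return int(width), int(height)
```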

def reset_resolution():
    # Default values
    resolution_width = 1920
    resolution_height = 1080

    # Get resolution from asset
    asset_name = api.Session["AVALON_ASSET"]
    asset_doc = io.find_one({"name": asset_name, "type": "asset"})
    resolution = _resolution_from_document(asset_doc)
    # Try to get resolution from project
    if resolution is None:
        # TODO go through visualParents
        print((
            "Asset \"{}\" does not have set resolution."
            " Trying to get resolution from project"
        ).format(asset_name))
        project_doc = io.find_one({"type": "project"})
        resolution = _resolution_from_document(project_doc)

    if resolution is None:
        msg = "Using default resolution {}x{}"
    else:
        resolution_width, resolution_height = resolution
        msg = "Setting resolution to {}x{}"

    print(msg.format(resolution_width, resolution_height))

    # Set for the different renderers:
    # arnold, vray, redshift, renderman
    renderer = cmds.getAttr("defaultRenderGlobals.currentRenderer").lower()
    # Handle various renderman names
    if renderer.startswith("renderman"):
        renderer = "renderman"

    # Default attributes are usable for Arnold, Renderman and Redshift
    width_attr_name = "defaultResolution.width"
    height_attr_name = "defaultResolution.height"

    # Vray has its own way
    if renderer == "vray":
        width_attr_name = "vraySettings.width"
        height_attr_name = "vraySettings.height"

    cmds.setAttr(width_attr_name, resolution_width)
    cmds.setAttr(height_attr_name, resolution_height)
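The renderer branching at the end of `reset_resolution` reduces to a small pure function: normalize any "renderman*" name, then pick V-Ray's own settings node or the shared `defaultResolution` attributes used by Arnold, Redshift and RenderMan. A sketch (function name is mine):

```python
def resolution_attributes(renderer):
    """Return the (width, height) attribute names for a renderer name."""
    renderer = renderer.lower()
    # Handle the various renderman renderer names
    if renderer.startswith("renderman"):
        renderer = "renderman"

    # V-Ray stores render resolution on its own settings node
    if renderer == "vray":
        return "vraySettings.width", "vraySettings.height"
    return "defaultResolution.width", "defaultResolution.height"
```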

@@ -8,10 +8,9 @@ from functools import partial
import maya.cmds as mc
import maya.mel as mel

-from avalon.maya import pipeline
from openpype.api import resources
from openpype.tools.utils import host_tools
+from .lib import get_main_window

log = logging.getLogger(__name__)


@@ -76,6 +75,7 @@ def override_component_mask_commands():
def override_toolbox_ui():
    """Add custom buttons in Toolbox as replacement for Maya web help icon."""
    icons = resources.get_resource("icons")
+    parent_widget = get_main_window()

    # Ensure the maya web icon on toolbox exists
    web_button = "ToolBox|MainToolboxLayout|mayaWebButton"

@@ -115,7 +115,7 @@ def override_toolbox_ui():
        label="Work Files",
        image=os.path.join(icons, "workfiles.png"),
        command=lambda: host_tools.show_workfiles(
-            parent=pipeline._parent
+            parent=parent_widget
        ),
        width=icon_size,
        height=icon_size,

@@ -130,7 +130,7 @@ def override_toolbox_ui():
        label="Loader",
        image=os.path.join(icons, "loader.png"),
        command=lambda: host_tools.show_loader(
-            parent=pipeline._parent, use_context=True
+            parent=parent_widget, use_context=True
        ),
        width=icon_size,
        height=icon_size,

@@ -145,7 +145,7 @@ def override_toolbox_ui():
        label="Inventory",
        image=os.path.join(icons, "inventory.png"),
        command=lambda: host_tools.show_scene_inventory(
-            parent=pipeline._parent
+            parent=parent_widget
        ),
        width=icon_size,
        height=icon_size,

@@ -1,7 +1,8 @@
"""Standalone helper functions"""

-import re
import os
import sys
+import re
import platform
import uuid
import math

@@ -18,16 +19,19 @@ import bson
from maya import cmds, mel
import maya.api.OpenMaya as om

-from avalon import api, maya, io, pipeline
-import avalon.maya.lib
-import avalon.maya.interactive
+from avalon import api, io, pipeline

from openpype import lib
from openpype.api import get_anatomy_settings
+from .commands import reset_frame_range


self = sys.modules[__name__]
+self._parent = None

log = logging.getLogger(__name__)

IS_HEADLESS = not hasattr(cmds, "about") or cmds.about(batch=True)
ATTRIBUTE_DICT = {"int": {"attributeType": "long"},
                  "str": {"dataType": "string"},
                  "unicode": {"dataType": "string"},
@@ -100,6 +104,155 @@ FLOAT_FPS = {23.98, 23.976, 29.97, 47.952, 59.94}
RENDERLIKE_INSTANCE_FAMILIES = ["rendering", "vrayscene"]


def get_main_window():
    """Acquire Maya's main window"""
    from Qt import QtWidgets

    if self._parent is None:
        self._parent = {
            widget.objectName(): widget
            for widget in QtWidgets.QApplication.topLevelWidgets()
        }["MayaWindow"]
    return self._parent


@contextlib.contextmanager
def suspended_refresh():
    """Suspend viewport refreshes"""

    try:
        cmds.refresh(suspend=True)
        yield
    finally:
        cmds.refresh(suspend=False)


@contextlib.contextmanager
def maintained_selection():
    """Maintain selection during context

    Example:
        >>> scene = cmds.file(new=True, force=True)
        >>> node = cmds.createNode("transform", name="Test")
        >>> cmds.select("persp")
        >>> with maintained_selection():
        ...     cmds.select("Test", replace=True)
        >>> "Test" in cmds.ls(selection=True)
        False

    """

    previous_selection = cmds.ls(selection=True)
    try:
        yield
    finally:
        if previous_selection:
            cmds.select(previous_selection,
                        replace=True,
                        noExpand=True)
        else:
            cmds.select(clear=True)

def unique_name(name, format="%02d", namespace="", prefix="", suffix=""):
    """Return unique `name`

    The function takes into consideration an optional `namespace`
    and `suffix`. The suffix is included in evaluating whether a
    name exists - such as `name` + "_GRP" - but isn't included
    in the returned value.

    If a namespace is provided, only names within that namespace
    are considered when evaluating whether the name is unique.

    Arguments:
        format (str, optional): The `name` is given a number, this determines
            how this number is formatted. Defaults to a padding of 2.
            E.g. my_name01, my_name02.
        namespace (str, optional): Only consider names within this namespace.
        suffix (str, optional): Only consider names with this suffix.

    Example:
        >>> name = cmds.createNode("transform", name="MyName")
        >>> cmds.objExists(name)
        True
        >>> unique = unique_name(name)
        >>> cmds.objExists(unique)
        False

    """

    iteration = 1
    unique = prefix + (name + format % iteration) + suffix

    while cmds.objExists(namespace + ":" + unique):
        iteration += 1
        unique = prefix + (name + format % iteration) + suffix

    if suffix:
        return unique[:-len(suffix)]

    return unique


def unique_namespace(namespace, format="%02d", prefix="", suffix=""):
    """Return unique namespace

    Similar to :func:`unique_name` but evaluating namespaces
    as opposed to object names.

    Arguments:
        namespace (str): Name of namespace to consider
        format (str, optional): Formatting of the given iteration number
        suffix (str, optional): Only consider namespaces with this suffix.

    """

    iteration = 1
    unique = prefix + (namespace + format % iteration) + suffix

    # The `existing` set does not just contain the namespaces but *all* nodes
    # within the current namespace. We need all because the namespace could
    # also clash with a node name. To be truly unique and valid one needs to
    # check against all.
    existing = set(cmds.namespaceInfo(listNamespace=True))
    while unique in existing:
        iteration += 1
        unique = prefix + (namespace + format % iteration) + suffix

    return unique
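The counter loop shared by `unique_name` and `unique_namespace` above is easy to test in isolation by swapping the `cmds.objExists` / `cmds.namespaceInfo` checks for a plain set of existing names. A sketch under that assumption (the `existing` parameter is my stand-in):

```python
def unique_name(name, existing, format="%02d", prefix="", suffix=""):
    """Append a formatted counter and bump it until the candidate
    (including any suffix) is unused; the suffix is checked but stripped
    from the returned value, as in the original."""
    iteration = 1
    unique = prefix + (name + format % iteration) + suffix

    while unique in existing:
        iteration += 1
        unique = prefix + (name + format % iteration) + suffix

    if suffix:
        return unique[:-len(suffix)]
    return unique
```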

def read(node):
    """Return user-defined attributes from `node`"""

    data = dict()

    for attr in cmds.listAttr(node, userDefined=True) or list():
        try:
            value = cmds.getAttr(node + "." + attr, asString=True)

        except RuntimeError:
            # For Message type attribute or others that have connections,
            # take source node name as value.
            source = cmds.listConnections(node + "." + attr,
                                          source=True,
                                          destination=False)
            source = cmds.ls(source, long=True) or [None]
            value = source[0]

        except ValueError:
            # Some attributes cannot be read directly,
            # such as mesh and color attributes. These
            # are considered non-essential to this
            # particular publishing pipeline.
            value = None

        data[attr] = value

    return data


def _get_mel_global(name):
    """Return the value of a mel global variable"""
    return mel.eval("$%s = $%s;" % (name, name))
@@ -280,6 +433,73 @@ def shape_from_element(element):
    return node


def export_alembic(nodes,
                   file,
                   frame_range=None,
                   write_uv=True,
                   write_visibility=True,
                   attribute_prefix=None):
    """Wrap native MEL command with limited set of arguments

    Arguments:
        nodes (list): Long names of nodes to cache

        file (str): Absolute path to output destination

        frame_range (tuple, optional): Start- and end-frame of cache,
            defaults to current animation range.

        write_uv (bool, optional): Whether or not to include UVs,
            defaults to True

        write_visibility (bool, optional): Turn on to store the visibility
            state of objects in the Alembic file. Otherwise, all objects are
            considered visible, defaults to True

        attribute_prefix (str, optional): Include all user-defined
            attributes with this prefix.

    """

    if frame_range is None:
        frame_range = (
            cmds.playbackOptions(query=True, ast=True),
            cmds.playbackOptions(query=True, aet=True)
        )

    options = [
        ("file", file),
        ("frameRange", "%s %s" % frame_range),
    ] + [("root", mesh) for mesh in nodes]

    if isinstance(attribute_prefix, string_types):
        # Include all attributes prefixed with "mb"
        # TODO(marcus): This would be a good candidate for
        # external registration, so that the developer
        # doesn't have to edit this function to modify
        # the behavior of Alembic export.
        options.append(("attrPrefix", str(attribute_prefix)))

    if write_uv:
        options.append(("uvWrite", ""))

    if write_visibility:
        options.append(("writeVisibility", ""))

    # Generate MEL command
    mel_args = list()
    for key, value in options:
        mel_args.append("-{0} {1}".format(key, value))

    mel_args_string = " ".join(mel_args)
    mel_cmd = "AbcExport -j \"{0}\"".format(mel_args_string)

    # For debuggability, put the string passed to MEL in the Script editor.
    print("mel.eval('%s')" % mel_cmd)

    return mel.eval(mel_cmd)
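The job-string assembly inside `export_alembic` is a plain string transformation and can be checked without Maya: each `(flag, value)` pair becomes a `-flag value` token, and boolean flags carry an empty value. A sketch (helper name is mine):

```python
def build_abc_job(options):
    """Join (flag, value) pairs into an AbcExport-style job string,
    mirroring the mel_args loop in export_alembic above."""
    return " ".join("-{0} {1}".format(key, value) for key, value in options)
```

Note that empty-valued flags such as `uvWrite` leave a trailing space in their token, exactly as the original `"-{0} {1}"` format does.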

def collect_animation_data(fps=False):
    """Get the basic animation data

@@ -305,6 +525,256 @@ def collect_animation_data(fps=False):
    return data

def imprint(node, data):
    """Write `data` to `node` as userDefined attributes

    Arguments:
        node (str): Long name of node
        data (dict): Dictionary of key/value pairs

    Example:
        >>> from maya import cmds
        >>> def compute():
        ...     return 6
        ...
        >>> cube, generator = cmds.polyCube()
        >>> imprint(cube, {
        ...     "regularString": "myFamily",
        ...     "computedValue": lambda: compute()
        ... })
        ...
        >>> cmds.getAttr(cube + ".computedValue")
        6

    """

    for key, value in data.items():

        if callable(value):
            # Support values evaluated at imprint
            value = value()

        if isinstance(value, bool):
            add_type = {"attributeType": "bool"}
            set_type = {"keyable": False, "channelBox": True}
        elif isinstance(value, string_types):
            add_type = {"dataType": "string"}
            set_type = {"type": "string"}
        elif isinstance(value, int):
            add_type = {"attributeType": "long"}
            set_type = {"keyable": False, "channelBox": True}
        elif isinstance(value, float):
            add_type = {"attributeType": "double"}
            set_type = {"keyable": False, "channelBox": True}
        elif isinstance(value, (list, tuple)):
            add_type = {"attributeType": "enum", "enumName": ":".join(value)}
            set_type = {"keyable": False, "channelBox": True}
            value = 0  # enum default
        else:
            raise TypeError("Unsupported type: %r" % type(value))

        cmds.addAttr(node, longName=key, **add_type)
        cmds.setAttr(node + "." + key, value, **set_type)
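The type dispatch in `imprint` can be isolated from the `addAttr`/`setAttr` calls. A sketch of just that mapping, using `str` in place of `six.string_types` (helper name is mine; note that the `bool` check must come before `int` because `bool` is a subclass of `int`):

```python
def attr_spec(value):
    """Return (stored_value, addAttr keyword dict) for a Python value,
    mirroring imprint's dispatch; callables are evaluated first and
    list/tuple values become enums stored at index 0."""
    if callable(value):
        value = value()

    if isinstance(value, bool):
        return value, {"attributeType": "bool"}
    if isinstance(value, str):
        return value, {"dataType": "string"}
    if isinstance(value, int):
        return value, {"attributeType": "long"}
    if isinstance(value, float):
        return value, {"attributeType": "double"}
    if isinstance(value, (list, tuple)):
        return 0, {"attributeType": "enum", "enumName": ":".join(value)}
    raise TypeError("Unsupported type: %r" % type(value))
```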

def serialise_shaders(nodes):
    """Generate a shader set dictionary

    Arguments:
        nodes (list): Absolute paths to nodes

    Returns:
        dictionary of (shader: id) pairs

    Schema:
        {
          "shader1": ["id1", "id2"],
          "shader2": ["id3", "id1"]
        }

    Example:
        {
          "Bazooka_Brothers01_:blinn4SG": [
            "f9520572-ac1d-11e6-b39e-3085a99791c9.f[4922:5001]",
            "f9520572-ac1d-11e6-b39e-3085a99791c9.f[4587:4634]",
            "f9520572-ac1d-11e6-b39e-3085a99791c9.f[1120:1567]",
            "f9520572-ac1d-11e6-b39e-3085a99791c9.f[4251:4362]"
          ],
          "lambert2SG": [
            "f9520571-ac1d-11e6-9dbb-3085a99791c9"
          ]
        }

    """

    valid_nodes = cmds.ls(
        nodes,
        long=True,
        recursive=True,
        showType=True,
        objectsOnly=True,
        type="transform"
    )

    meshes_by_id = {}
    for mesh in valid_nodes:
        shapes = cmds.listRelatives(mesh,
                                    shapes=True,
                                    fullPath=True) or list()

        if shapes:
            shape = shapes[0]
            if not cmds.nodeType(shape):
                continue

            try:
                id_ = cmds.getAttr(mesh + ".mbID")

                if id_ not in meshes_by_id:
                    meshes_by_id[id_] = list()

                meshes_by_id[id_].append(mesh)

            except ValueError:
                continue

    meshes_by_shader = dict()
    for mesh in meshes_by_id.values():
        shape = cmds.listRelatives(mesh,
                                   shapes=True,
                                   fullPath=True) or list()

        for shader in cmds.listConnections(shape,
                                           type="shadingEngine") or list():

            # Objects in this group are those that haven't got
            # any shaders. These are expected to be managed
            # elsewhere, such as by the default model loader.
            if shader == "initialShadingGroup":
                continue

            if shader not in meshes_by_shader:
                meshes_by_shader[shader] = list()

            shaded = cmds.sets(shader, query=True) or list()
            meshes_by_shader[shader].extend(shaded)

    shader_by_id = {}
    for shader, shaded in meshes_by_shader.items():

        if shader not in shader_by_id:
            shader_by_id[shader] = list()

        for mesh in shaded:

            # Enable shader assignment to faces.
            name = mesh.split(".f[")[0]

            transform = name
            if cmds.objectType(transform) == "mesh":
                transform = cmds.listRelatives(name, parent=True)[0]

            try:
                id_ = cmds.getAttr(transform + ".mbID")
                shader_by_id[shader].append(mesh.replace(name, id_))
            except KeyError:
                continue

        # Remove duplicates
        shader_by_id[shader] = list(set(shader_by_id[shader]))

    return shader_by_id
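The final step of `serialise_shaders` rewrites member names such as `"|node.f[4:10]"` to `"<mbID>.f[4:10]"` and deduplicates them per shading engine. A standalone sketch of that remapping, with a dict standing in for the `mbID` attribute lookups (helper and parameter names are mine):

```python
def remap_members(members, ids_by_name):
    """Rewrite set members to id-based names, skipping nodes without an
    id and removing duplicates, as serialise_shaders does per shader."""
    result = []
    for member in members:
        # Strip a face-assignment suffix to find the owning node name
        name = member.split(".f[")[0]
        id_ = ids_by_name.get(name)
        if id_ is None:
            continue
        result.append(member.replace(name, id_))
    return sorted(set(result))
```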

def lsattr(attr, value=None):
    """Return nodes matching `key` and `value`

    Arguments:
        attr (str): Name of Maya attribute
        value (object, optional): Value of attribute. If none
            is provided, return all nodes with this attribute.

    Example:
        >> lsattr("id", "myId")
        ["myNode"]
        >> lsattr("id")
        ["myNode", "myOtherNode"]

    """

    if value is None:
        return cmds.ls("*.%s" % attr,
                       recursive=True,
                       objectsOnly=True,
                       long=True)
    return lsattrs({attr: value})

def lsattrs(attrs):
    """Return nodes with the given attribute(s).

    Arguments:
        attrs (dict): Name and value pairs of expected matches

    Example:
        >> # Return nodes with an `age` of five.
        >> lsattrs({"age": "five"})
        >> # Return nodes with both `age` and `color` of five and blue.
        >> lsattrs({"age": "five", "color": "blue"})

    Returns:
        list: matching nodes.

    """

    dep_fn = om.MFnDependencyNode()
    dag_fn = om.MFnDagNode()
    selection_list = om.MSelectionList()

    first_attr = next(iter(attrs))

    try:
        selection_list.add("*.{0}".format(first_attr),
                           searchChildNamespaces=True)
    except RuntimeError as exc:
        if str(exc).endswith("Object does not exist"):
            return []

    matches = set()
    for i in range(selection_list.length()):
        node = selection_list.getDependNode(i)
        if node.hasFn(om.MFn.kDagNode):
            fn_node = dag_fn.setObject(node)
            full_path_names = [path.fullPathName()
                               for path in fn_node.getAllPaths()]
        else:
            fn_node = dep_fn.setObject(node)
            full_path_names = [fn_node.name()]

        for attr in attrs:
            try:
                plug = fn_node.findPlug(attr, True)
                if plug.asString() != attrs[attr]:
                    break
            except RuntimeError:
                break
        else:
            matches.update(full_path_names)

    return list(matches)
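The matching rule in `lsattrs` uses Python's `for`/`else`: a node is kept only when every requested attribute exists and compares equal, and any `break` skips it. A sketch with plain dicts standing in for `MFnDependencyNode` plugs (names are mine):

```python
def lsattrs_sketch(nodes, attrs):
    """Return sorted names of nodes whose attribute dicts match all of
    the expected values, mirroring lsattrs' break/else loop."""
    matches = set()
    for name, node_attrs in nodes.items():
        for attr, expected in attrs.items():
            if node_attrs.get(attr) != expected:
                break
        else:  # no break: every attribute matched
            matches.add(name)
    return sorted(matches)
```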

@contextlib.contextmanager
def without_extension():
    """Use cmds.file with defaultExtensions=False"""
    previous_setting = cmds.file(defaultExtensions=True, query=True)
    try:
        cmds.file(defaultExtensions=False)
        yield
    finally:
        cmds.file(defaultExtensions=previous_setting)


@contextlib.contextmanager
def attribute_values(attr_values):
    """Remaps node attributes to values during context.

@@ -736,7 +1206,7 @@ def namespaced(namespace, new=True):
    """
    original = cmds.namespaceInfo(cur=True, absoluteName=True)
    if new:
-        namespace = avalon.maya.lib.unique_namespace(namespace)
+        namespace = unique_namespace(namespace)
        cmds.namespace(add=namespace)

    try:
@@ -1408,7 +1878,7 @@ def assign_look_by_version(nodes, version_id):
        raise RuntimeError("Could not find LookLoader, this is a bug")

    # Reference the look file
-    with maya.maintained_selection():
+    with maintained_selection():
        container_node = pipeline.load(Loader, look_representation)

    # Get container members
@@ -1947,7 +2417,7 @@ def set_context_settings():
    reset_scene_resolution()

    # Set frame range.
-    avalon.maya.interactive.reset_frame_range()
+    reset_frame_range()

    # Set colorspace
    set_colorspace()
@@ -1970,34 +2440,27 @@ def validate_fps():
    # rounding, we have to round those numbers coming from Maya.
    current_fps = float_round(mel.eval('currentTimeUnitToFPS()'), 2)

-    if current_fps != fps:
-
-        from Qt import QtWidgets
-        from ...widgets import popup
-
-        # Find maya main window
-        top_level_widgets = {w.objectName(): w for w in
-                             QtWidgets.QApplication.topLevelWidgets()}
-
-        parent = top_level_widgets.get("MayaWindow", None)
-        if parent is None:
-            pass
-        else:
-            dialog = popup.Popup2(parent=parent)
-            dialog.setModal(True)
-            dialog.setWindowTitle("Maya scene not in line with project")
-            dialog.setMessage("The FPS is out of sync, please fix")
-
-            # Set new text for button (add optional argument for the popup?)
-            toggle = dialog.widgets["toggle"]
-            update = toggle.isChecked()
-            dialog.on_show.connect(lambda: set_scene_fps(fps, update))
-
-            dialog.show()
-
-            return False
-
-        return False
-
-    return True
+    fps_match = current_fps == fps
+    if not fps_match and not IS_HEADLESS:
+        from openpype.widgets import popup
+
+        parent = get_main_window()
+
+        dialog = popup.Popup2(parent=parent)
+        dialog.setModal(True)
+        dialog.setWindowTitle("Maya scene not in line with project")
+        dialog.setMessage("The FPS is out of sync, please fix")
+
+        # Set new text for button (add optional argument for the popup?)
+        toggle = dialog.widgets["toggle"]
+        update = toggle.isChecked()
+        dialog.on_show.connect(lambda: set_scene_fps(fps, update))
+
+        dialog.show()
+
+    return fps_match


def bake(nodes,
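The comparison in `validate_fps` only works because both sides are rounded: Maya reports rates such as 23.976023976..., which would never compare equal to a stored 23.98. A standalone sketch of that rounding rule using the built-in `round` in place of the project's `float_round` (helper name is mine):

```python
def fps_matches(scene_fps, project_fps, ndigits=2):
    """Compare two FPS values after rounding both to the same precision,
    as validate_fps does before deciding whether to warn the user."""
    return round(scene_fps, ndigits) == round(project_fps, ndigits)
```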
@@ -2386,7 +2849,7 @@ def get_attr_in_layer(attr, layer):
def fix_incompatible_containers():
    """Return whether the current scene has any outdated content"""

-    host = avalon.api.registered_host()
+    host = api.registered_host()
    for container in host.ls():
        loader = container['loader']
@ -1,58 +1,146 @@
|
|||
import sys
|
||||
import os
|
||||
import logging
|
||||
|
||||
from Qt import QtWidgets, QtGui
|
||||
|
||||
import maya.utils
|
||||
import maya.cmds as cmds
|
||||
|
||||
from avalon.maya import pipeline
|
||||
import avalon.api
|
||||
|
||||
from openpype.api import BuildWorkfile
|
||||
from openpype.settings import get_project_settings
|
||||
from openpype.tools.utils import host_tools
|
||||
from openpype.hosts.maya.api import lib
|
||||
from .lib import get_main_window, IS_HEADLESS
|
||||
from .commands import reset_frame_range
|
||||
|
||||
|
||||
log = logging.getLogger(__name__)
|
||||
|
||||
MENU_NAME = "op_maya_menu"
|
||||
|
||||
|
||||
def _get_menu(menu_name=None):
|
||||
"""Return the menu instance if it currently exists in Maya"""
|
||||
if menu_name is None:
|
||||
menu_name = pipeline._menu
|
||||
menu_name = MENU_NAME
|
||||
|
||||
widgets = {w.objectName(): w for w in QtWidgets.QApplication.allWidgets()}
|
||||
return widgets.get(menu_name)
|
||||
|
||||
|
||||
def deferred():
|
||||
def add_build_workfiles_item():
|
||||
# Add build first workfile
|
||||
cmds.menuItem(divider=True, parent=pipeline._menu)
|
||||
def install():
|
||||
if cmds.about(batch=True):
|
||||
log.info("Skipping openpype.menu initialization in batch mode..")
|
||||
return
|
||||
|
||||
def deferred():
|
||||
from avalon.tools import publish
|
||||
parent_widget = get_main_window()
|
||||
cmds.menu(
|
||||
MENU_NAME,
|
||||
label=avalon.api.Session["AVALON_LABEL"],
|
||||
tearOff=True,
|
||||
parent="MayaWindow"
|
||||
)
|
||||
|
||||
# Create context menu
|
||||
context_label = "{}, {}".format(
|
||||
avalon.api.Session["AVALON_ASSET"],
|
||||
avalon.api.Session["AVALON_TASK"]
|
||||
)
|
||||
cmds.menuItem(
|
||||
"currentContext",
|
||||
label=context_label,
|
||||
parent=MENU_NAME,
|
||||
enable=False
|
||||
)
|
||||
|
||||
cmds.setParent("..", menu=True)
|
||||
|
||||
cmds.menuItem(divider=True)
|
||||
|
||||
# Create default items
|
||||
cmds.menuItem(
|
||||
"Create...",
|
||||
command=lambda *args: host_tools.show_creator(parent=parent_widget)
|
||||
)
|
||||
|
||||
cmds.menuItem(
|
||||
"Load...",
|
||||
command=lambda *args: host_tools.show_loader(
|
||||
parent=parent_widget,
|
||||
use_context=True
|
||||
)
|
||||
)
|
||||
|
||||
cmds.menuItem(
|
||||
"Publish...",
|
||||
command=lambda *args: host_tools.show_publish(
|
||||
parent=parent_widget
|
||||
),
|
||||
image=publish.ICON
|
||||
)
|
||||
|
||||
cmds.menuItem(
|
||||
"Manage...",
|
||||
command=lambda *args: host_tools.show_scene_inventory(
|
||||
parent=parent_widget
|
||||
)
|
||||
)
|
||||
|
||||
cmds.menuItem(
|
||||
"Library...",
|
||||
command=lambda *args: host_tools.show_library_loader(
|
||||
parent=parent_widget
|
||||
)
|
||||
)
|
||||
|
||||
cmds.menuItem(divider=True)
|
||||
|
||||
cmds.menuItem(
|
||||
"Work Files...",
|
||||
command=lambda *args: host_tools.show_workfiles(
|
||||
parent=parent_widget
|
||||
),
|
||||
)
|
||||
|
||||
cmds.menuItem(
|
||||
"Reset Frame Range",
|
||||
command=lambda *args: reset_frame_range()
|
||||
)
|
||||
|
||||
cmds.menuItem(
|
||||
"Reset Resolution",
|
||||
command=lambda *args: lib.reset_scene_resolution()
|
||||
)
|
||||
|
||||
cmds.menuItem(
|
||||
"Set Colorspace",
|
||||
command=lambda *args: lib.set_colorspace(),
|
||||
)
|
||||
cmds.menuItem(divider=True, parent=MENU_NAME)
|
||||
cmds.menuItem(
|
||||
"Build First Workfile",
|
||||
parent=pipeline._menu,
|
||||
parent=MENU_NAME,
|
||||
command=lambda *args: BuildWorkfile().process()
|
||||
)
|
||||
|
||||
def add_look_assigner_item():
|
||||
cmds.menuItem(
|
||||
"Look assigner",
|
||||
parent=pipeline._menu,
|
||||
"Look assigner...",
|
||||
command=lambda *args: host_tools.show_look_assigner(
|
||||
pipeline._parent
|
||||
parent_widget
|
||||
)
|
||||
)
|
||||
|
||||
def add_experimental_item():
|
||||
cmds.menuItem(
|
||||
"Experimental tools...",
|
||||
parent=pipeline._menu,
|
||||
command=lambda *args: host_tools.show_experimental_tools_dialog(
|
||||
pipeline._parent
|
||||
parent_widget
|
||||
)
|
||||
)
|
||||
cmds.setParent(MENU_NAME, menu=True)
|
||||
|
||||
def add_scripts_menu():
|
||||
try:
|
||||
|
|
@ -82,124 +170,13 @@ def deferred():
|
|||
        # apply configuration
        studio_menu.build_from_configuration(studio_menu, config)

    def modify_workfiles():
        # Find the pipeline menu
        top_menu = _get_menu()

        # Try to find workfile tool action in the menu
        workfile_action = None
        for action in top_menu.actions():
            if action.text() == "Work Files":
                workfile_action = action
                break

        # Add at the top of menu if "Work Files" action was not found
        after_action = ""
        if workfile_action:
            # Use action's object name for `insertAfter` argument
            after_action = workfile_action.objectName()

        # Insert action to menu
        cmds.menuItem(
            "Work Files",
            parent=pipeline._menu,
            command=lambda *args: host_tools.show_workfiles(pipeline._parent),
            insertAfter=after_action
        )

        # Remove replaced action
        if workfile_action:
            top_menu.removeAction(workfile_action)

    def modify_resolution():
        # Find the pipeline menu
        top_menu = _get_menu()

        # Try to find resolution tool action in the menu
        resolution_action = None
        for action in top_menu.actions():
            if action.text() == "Reset Resolution":
                resolution_action = action
                break

        # Add at the top of menu if "Work Files" action was not found
        after_action = ""
        if resolution_action:
            # Use action's object name for `insertAfter` argument
            after_action = resolution_action.objectName()

        # Insert action to menu
        cmds.menuItem(
            "Reset Resolution",
            parent=pipeline._menu,
            command=lambda *args: lib.reset_scene_resolution(),
            insertAfter=after_action
        )

        # Remove replaced action
        if resolution_action:
            top_menu.removeAction(resolution_action)

    def remove_project_manager():
        top_menu = _get_menu()

        # Try to find "System" menu action in the menu
        system_menu = None
        for action in top_menu.actions():
            if action.text() == "System":
                system_menu = action
                break

        if system_menu is None:
            return

        # Try to find "Project manager" action in "System" menu
        project_manager_action = None
        for action in system_menu.menu().children():
            if hasattr(action, "text") and action.text() == "Project Manager":
                project_manager_action = action
                break

        # Remove "Project manager" action if was found
        if project_manager_action is not None:
            system_menu.menu().removeAction(project_manager_action)

    def add_colorspace():
        # Find the pipeline menu
        top_menu = _get_menu()

        # Try to find workfile tool action in the menu
        workfile_action = None
        for action in top_menu.actions():
            if action.text() == "Reset Resolution":
                workfile_action = action
                break

        # Add at the top of menu if "Work Files" action was not found
        after_action = ""
        if workfile_action:
            # Use action's object name for `insertAfter` argument
            after_action = workfile_action.objectName()

        # Insert action to menu
        cmds.menuItem(
            "Set Colorspace",
            parent=pipeline._menu,
            command=lambda *args: lib.set_colorspace(),
            insertAfter=after_action
        )

    log.info("Attempting to install scripts menu ...")

    # add_scripts_menu()
    add_build_workfiles_item()
    add_look_assigner_item()
    add_experimental_item()
    modify_workfiles()
    modify_resolution()
    remove_project_manager()
    add_colorspace()
    add_scripts_menu()
    # Allow time for uninstallation to finish.
    # We use Maya's executeDeferred instead of QTimer.singleShot
    # so that it only gets called after Maya UI has initialized too.
    # This is crucial with Maya 2020+ which initializes without UI
    # first as a QCoreApplication
    maya.utils.executeDeferred(deferred)
    cmds.evalDeferred(add_scripts_menu, lowestPriority=True)


def uninstall():

@@ -214,18 +191,27 @@ def uninstall():
        log.error(e)


def install():
    if cmds.about(batch=True):
        log.info("Skipping openpype.menu initialization in batch mode..")
        return

    # Allow time for uninstallation to finish.
    cmds.evalDeferred(deferred, lowestPriority=True)


def popup():
    """Pop-up the existing menu near the mouse cursor."""
    menu = _get_menu()
    cursor = QtGui.QCursor()
    point = cursor.pos()
    menu.exec_(point)


def update_menu_task_label():
    """Update the task label in Avalon menu to current session"""

    if IS_HEADLESS:
        return

    object_name = "{}|currentContext".format(MENU_NAME)
    if not cmds.menuItem(object_name, query=True, exists=True):
        log.warning("Can't find menuItem: {}".format(object_name))
        return

    label = "{}, {}".format(
        avalon.api.Session["AVALON_ASSET"],
        avalon.api.Session["AVALON_TASK"]
    )
    cmds.menuItem(object_name, edit=True, label=label)


596  openpype/hosts/maya/api/pipeline.py  Normal file

@@ -0,0 +1,596 @@
import os
import sys
import errno
import logging
import contextlib

from maya import utils, cmds, OpenMaya
import maya.api.OpenMaya as om

import pyblish.api
import avalon.api

from avalon.lib import find_submodule
from avalon.pipeline import AVALON_CONTAINER_ID

import openpype.hosts.maya
from openpype.tools.utils import host_tools
from openpype.lib import any_outdated
from openpype.lib.path_tools import HostDirmap
from openpype.hosts.maya.lib import copy_workspace_mel
from . import menu, lib

log = logging.getLogger("openpype.hosts.maya")

HOST_DIR = os.path.dirname(os.path.abspath(openpype.hosts.maya.__file__))
PLUGINS_DIR = os.path.join(HOST_DIR, "plugins")
PUBLISH_PATH = os.path.join(PLUGINS_DIR, "publish")
LOAD_PATH = os.path.join(PLUGINS_DIR, "load")
CREATE_PATH = os.path.join(PLUGINS_DIR, "create")
INVENTORY_PATH = os.path.join(PLUGINS_DIR, "inventory")

AVALON_CONTAINERS = ":AVALON_CONTAINERS"

self = sys.modules[__name__]
self._ignore_lock = False
self._events = {}


def install():
    from openpype.settings import get_project_settings

    project_settings = get_project_settings(os.getenv("AVALON_PROJECT"))
    # process path mapping
    dirmap_processor = MayaDirmap("maya", project_settings)
    dirmap_processor.process_dirmap()

    pyblish.api.register_plugin_path(PUBLISH_PATH)
    pyblish.api.register_host("mayabatch")
    pyblish.api.register_host("mayapy")
    pyblish.api.register_host("maya")

    avalon.api.register_plugin_path(avalon.api.Loader, LOAD_PATH)
    avalon.api.register_plugin_path(avalon.api.Creator, CREATE_PATH)
    avalon.api.register_plugin_path(avalon.api.InventoryAction, INVENTORY_PATH)
    log.info(PUBLISH_PATH)

    log.info("Installing callbacks ... ")
    avalon.api.on("init", on_init)

    # Callbacks below are not required for headless mode, the `init` however
    # is important to load referenced Alembics correctly at rendertime.
    if lib.IS_HEADLESS:
        log.info(("Running in headless mode, skipping Maya "
                  "save/open/new callback installation.."))
        return

    _set_project()
    _register_callbacks()

    menu.install()

    avalon.api.on("save", on_save)
    avalon.api.on("open", on_open)
    avalon.api.on("new", on_new)
    avalon.api.before("save", on_before_save)
    avalon.api.on("taskChanged", on_task_changed)
    avalon.api.on("before.workfile.save", before_workfile_save)

    log.info("Setting default family states for loader..")
    avalon.api.data["familiesStateToggled"] = ["imagesequence"]


def _set_project():
    """Sets the maya project to the current Session's work directory.

    Returns:
        None

    """
    workdir = avalon.api.Session["AVALON_WORKDIR"]

    try:
        os.makedirs(workdir)
    except OSError as e:
        # An already existing working directory is fine.
        if e.errno == errno.EEXIST:
            pass
        else:
            raise

    cmds.workspace(workdir, openWorkspace=True)
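The `os.makedirs` guard in `_set_project` tolerates a pre-existing directory by checking `errno.EEXIST`; a minimal standalone sketch of that pattern (pure Python, no Maya required):

```python
import errno
import os
import tempfile


def ensure_workdir(workdir):
    """Create the work directory, tolerating one that already exists."""
    try:
        os.makedirs(workdir)
    except OSError as e:
        # An already existing working directory is fine.
        if e.errno != errno.EEXIST:
            raise
    return workdir


root = tempfile.mkdtemp()
target = os.path.join(root, "work")
ensure_workdir(target)
ensure_workdir(target)  # second call must not raise
print(os.path.isdir(target))
```

On Python 3 the same effect is available via `os.makedirs(workdir, exist_ok=True)`; the explicit errno check above keeps Python 2 compatibility, which this codebase still targeted.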


def _register_callbacks():
    for handler, event in self._events.copy().items():
        if event is None:
            continue

        try:
            OpenMaya.MMessage.removeCallback(event)
            self._events[handler] = None
        except RuntimeError as e:
            log.info(e)

    self._events[_on_scene_save] = OpenMaya.MSceneMessage.addCallback(
        OpenMaya.MSceneMessage.kBeforeSave, _on_scene_save
    )

    self._events[_before_scene_save] = OpenMaya.MSceneMessage.addCheckCallback(
        OpenMaya.MSceneMessage.kBeforeSaveCheck, _before_scene_save
    )

    self._events[_on_scene_new] = OpenMaya.MSceneMessage.addCallback(
        OpenMaya.MSceneMessage.kAfterNew, _on_scene_new
    )

    self._events[_on_maya_initialized] = OpenMaya.MSceneMessage.addCallback(
        OpenMaya.MSceneMessage.kMayaInitialized, _on_maya_initialized
    )

    self._events[_on_scene_open] = OpenMaya.MSceneMessage.addCallback(
        OpenMaya.MSceneMessage.kAfterOpen, _on_scene_open
    )

    log.info("Installed event handler _on_scene_save..")
    log.info("Installed event handler _before_scene_save..")
    log.info("Installed event handler _on_scene_new..")
    log.info("Installed event handler _on_maya_initialized..")
    log.info("Installed event handler _on_scene_open..")


def _on_maya_initialized(*args):
    avalon.api.emit("init", args)

    if cmds.about(batch=True):
        log.warning("Running batch mode ...")
        return

    # Keep reference to the main Window, once a main window exists.
    lib.get_main_window()


def _on_scene_new(*args):
    avalon.api.emit("new", args)


def _on_scene_save(*args):
    avalon.api.emit("save", args)


def _on_scene_open(*args):
    avalon.api.emit("open", args)


def _before_scene_save(return_code, client_data):

    # Default to allowing the action. Registered
    # callbacks can optionally set this to False
    # in order to block the operation.
    OpenMaya.MScriptUtil.setBool(return_code, True)

    avalon.api.emit("before_save", [return_code, client_data])


def uninstall():
    pyblish.api.deregister_plugin_path(PUBLISH_PATH)
    pyblish.api.deregister_host("mayabatch")
    pyblish.api.deregister_host("mayapy")
    pyblish.api.deregister_host("maya")

    avalon.api.deregister_plugin_path(avalon.api.Loader, LOAD_PATH)
    avalon.api.deregister_plugin_path(avalon.api.Creator, CREATE_PATH)
    avalon.api.deregister_plugin_path(
        avalon.api.InventoryAction, INVENTORY_PATH
    )

    menu.uninstall()


def lock():
    """Lock scene

    Add an invisible node to your Maya scene with the name of the
    current file, indicating that this file is "locked" and cannot
    be modified any further.

    """

    if not cmds.objExists("lock"):
        with lib.maintained_selection():
            cmds.createNode("objectSet", name="lock")
            cmds.addAttr("lock", ln="basename", dataType="string")

            # Permanently hide from outliner
            cmds.setAttr("lock.verticesOnlySet", True)

    fname = cmds.file(query=True, sceneName=True)
    basename = os.path.basename(fname)
    cmds.setAttr("lock.basename", basename, type="string")


def unlock():
    """Permanently unlock a locked scene

    Doesn't throw an error if scene is already unlocked.

    """

    try:
        cmds.delete("lock")
    except ValueError:
        pass


def is_locked():
    """Query whether current scene is locked"""
    fname = cmds.file(query=True, sceneName=True)
    basename = os.path.basename(fname)

    if self._ignore_lock:
        return False

    try:
        return cmds.getAttr("lock.basename") == basename
    except ValueError:
        return False


@contextlib.contextmanager
def lock_ignored():
    """Context manager for temporarily ignoring the lock of a scene

    The purpose of this function is to enable locking a scene and
    saving it with the lock still in place.

    Example:
        >>> with lock_ignored():
        ...     pass  # Do things without lock

    """

    self._ignore_lock = True

    try:
        yield
    finally:
        self._ignore_lock = False
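`lock_ignored` is a flag-toggle context manager whose `finally` clause guarantees the lock check is restored even when the body raises. The same pattern in isolation, with module state replaced by a plain dict:

```python
import contextlib

state = {"ignore_lock": False}


@contextlib.contextmanager
def lock_ignored():
    # Set the flag, and restore it even if the body raises.
    state["ignore_lock"] = True
    try:
        yield
    finally:
        state["ignore_lock"] = False


with lock_ignored():
    inside = state["ignore_lock"]

try:
    with lock_ignored():
        raise RuntimeError("boom")
except RuntimeError:
    pass

print(inside, state["ignore_lock"])
```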


def parse_container(container):
    """Return the container node's full container data.

    Args:
        container (str): A container node name.

    Returns:
        dict: The container schema data for this container node.

    """
    data = lib.read(container)

    # Backwards compatibility pre-schemas for containers
    data["schema"] = data.get("schema", "openpype:container-1.0")

    # Append transient data
    data["objectName"] = container

    return data
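`parse_container` backfills a default schema and appends transient data onto whatever `lib.read` returns. The dict side of that behavior, exercised on made-up container data:

```python
def parse_container_data(data, object_name):
    """Backfill the schema default and append transient data (sketch)."""
    data = dict(data)
    # Backwards compatibility: pre-schema containers get the 1.0 schema.
    data["schema"] = data.get("schema", "openpype:container-1.0")
    # Transient data: the node name this data was read from.
    data["objectName"] = object_name
    return data


old_style = parse_container_data({"name": "hero"}, "hero_CON")
new_style = parse_container_data(
    {"name": "hero", "schema": "openpype:container-2.0"}, "hero_CON")
print(old_style["schema"], new_style["schema"])
```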


def _ls():
    """Yields Avalon container node names.

    Used by `ls()` to retrieve the nodes and then query the full container's
    data.

    Yields:
        str: Avalon container node name (objectSet)

    """

    def _maya_iterate(iterator):
        """Helper to iterate a maya iterator"""
        while not iterator.isDone():
            yield iterator.thisNode()
            iterator.next()

    ids = {AVALON_CONTAINER_ID,
           # Backwards compatibility
           "pyblish.mindbender.container"}

    # Iterate over all 'set' nodes in the scene to detect whether
    # they have the avalon container ".id" attribute.
    fn_dep = om.MFnDependencyNode()
    iterator = om.MItDependencyNodes(om.MFn.kSet)
    for mobject in _maya_iterate(iterator):
        if mobject.apiTypeStr != "kSet":
            # Only match by exact type
            continue

        fn_dep.setObject(mobject)
        if not fn_dep.hasAttribute("id"):
            continue

        plug = fn_dep.findPlug("id", True)
        value = plug.asString()
        if value in ids:
            yield fn_dep.name()
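`_maya_iterate` adapts Maya's cursor-style iterators (`isDone`/`thisNode`/`next`) into an ordinary Python generator. The adapter itself is host-agnostic; `CursorIterator` below is a made-up stand-in for `om.MItDependencyNodes`:

```python
class CursorIterator:
    """Minimal stand-in for a Maya MItDependencyNodes-style cursor."""

    def __init__(self, items):
        self._items = list(items)
        self._index = 0

    def isDone(self):
        return self._index >= len(self._items)

    def thisNode(self):
        return self._items[self._index]

    def next(self):
        self._index += 1


def iterate(iterator):
    """Yield each node from a cursor-style iterator."""
    while not iterator.isDone():
        yield iterator.thisNode()
        iterator.next()


nodes = list(iterate(CursorIterator(["setA", "setB", "setC"])))
print(nodes)
```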


def ls():
    """Yields containers from active Maya scene

    This is the host-equivalent of api.ls(), but instead of listing
    assets on disk, it lists assets already loaded in Maya; once loaded
    they are called 'containers'

    Yields:
        dict: container

    """
    container_names = _ls()

    has_metadata_collector = False
    config_host = find_submodule(avalon.api.registered_config(), "maya")
    if hasattr(config_host, "collect_container_metadata"):
        has_metadata_collector = True

    for container in sorted(container_names):
        data = parse_container(container)

        # Collect custom data if attribute is present
        if has_metadata_collector:
            metadata = config_host.collect_container_metadata(container)
            data.update(metadata)

        yield data


def containerise(name,
                 namespace,
                 nodes,
                 context,
                 loader=None,
                 suffix="CON"):
    """Bundle `nodes` into an assembly and imprint it with metadata

    Containerisation enables a tracking of version, author and origin
    for loaded assets.

    Arguments:
        name (str): Name of resulting assembly
        namespace (str): Namespace under which to host container
        nodes (list): Long names of nodes to containerise
        context (dict): Asset information
        loader (str, optional): Name of loader used to produce this container.
        suffix (str, optional): Suffix of container, defaults to `_CON`.

    Returns:
        container (str): Name of container assembly

    """
    container = cmds.sets(nodes, name="%s_%s_%s" % (namespace, name, suffix))

    data = [
        ("schema", "openpype:container-2.0"),
        ("id", AVALON_CONTAINER_ID),
        ("name", name),
        ("namespace", namespace),
        ("loader", str(loader)),
        ("representation", context["representation"]["_id"]),
    ]

    for key, value in data:
        if not value:
            continue

        if isinstance(value, (int, float)):
            cmds.addAttr(container, longName=key, attributeType="short")
            cmds.setAttr(container + "." + key, value)

        else:
            cmds.addAttr(container, longName=key, dataType="string")
            cmds.setAttr(container + "." + key, value, type="string")

    main_container = cmds.ls(AVALON_CONTAINERS, type="objectSet")
    if not main_container:
        main_container = cmds.sets(empty=True, name=AVALON_CONTAINERS)

        # Implement #399: Maya 2019+ hide AVALON_CONTAINERS on creation..
        if cmds.attributeQuery("hiddenInOutliner",
                               node=main_container,
                               exists=True):
            cmds.setAttr(main_container + ".hiddenInOutliner", True)
    else:
        main_container = main_container[0]

    cmds.sets(container, addElement=main_container)

    # Implement #399: Maya 2019+ hide containers in outliner
    if cmds.attributeQuery("hiddenInOutliner",
                           node=container,
                           exists=True):
        cmds.setAttr(container + ".hiddenInOutliner", True)

    return container
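`containerise` skips empty values and imprints each remaining value as either a numeric (`short`) or a `string` attribute. That dispatch, decoupled from `cmds` so it can run anywhere (the sample data is illustrative):

```python
def plan_imprint(data):
    """Return (key, attr_type, value) triples as containerise would imprint."""
    plan = []
    for key, value in data:
        if not value:
            # Empty values are skipped entirely.
            continue
        if isinstance(value, (int, float)):
            plan.append((key, "short", value))
        else:
            plan.append((key, "string", value))
    return plan


plan = plan_imprint([
    ("schema", "openpype:container-2.0"),
    ("loader", "ReferenceLoader"),
    ("version", 3),
    ("namespace", ""),  # empty: skipped
])
print(plan)
```

Note that the `if not value` guard also drops a legitimate numeric `0`, a quirk worth knowing if reusing this pattern.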


def on_init(_):
    log.info("Running callback on init..")

    def safe_deferred(fn):
        """Execute deferred the function in a try-except"""

        def _fn():
            """safely call in deferred callback"""
            try:
                fn()
            except Exception as exc:
                print(exc)

        try:
            utils.executeDeferred(_fn)
        except Exception as exc:
            print(exc)

    # Force load Alembic so referenced alembics
    # work correctly on scene open
    cmds.loadPlugin("AbcImport", quiet=True)
    cmds.loadPlugin("AbcExport", quiet=True)

    # Force load objExport plug-in (requested by artists)
    cmds.loadPlugin("objExport", quiet=True)

    from .customize import (
        override_component_mask_commands,
        override_toolbox_ui
    )
    safe_deferred(override_component_mask_commands)

    launch_workfiles = os.environ.get("WORKFILES_STARTUP")

    if launch_workfiles:
        safe_deferred(host_tools.show_workfiles)

    if not lib.IS_HEADLESS:
        safe_deferred(override_toolbox_ui)


def on_before_save(return_code, _):
    """Run validation for scene's FPS prior to saving"""
    return lib.validate_fps()


def on_save(_):
    """Automatically add IDs to new nodes

    Any transform of a mesh, without an existing ID, is given one
    automatically on file save.
    """

    log.info("Running callback on save..")

    # # Update current task for the current scene
    # update_task_from_path(cmds.file(query=True, sceneName=True))

    # Generate ids of the current context on nodes in the scene
    nodes = lib.get_id_required_nodes(referenced_nodes=False)
    for node, new_id in lib.generate_ids(nodes):
        lib.set_id(node, new_id, overwrite=False)
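The bodies of `lib.generate_ids` and `lib.set_id` are not part of this diff; a hypothetical sketch of the same save-time pass is below. The id format and node names are assumptions for illustration, not the library's actual scheme:

```python
import uuid


def generate_ids(nodes, asset_id="607f5c21"):
    """Yield (node, new_id) pairs; the id format here is illustrative only."""
    for node in nodes:
        yield node, "{}:{}".format(asset_id, uuid.uuid4().hex)


ids = {}  # stands in for per-node attributes


def set_id(node, new_id, overwrite=False):
    # Mirror the on_save behavior: never clobber an existing id.
    if node in ids and not overwrite:
        return
    ids[node] = new_id


ids["pCube1"] = "607f5c21:existing"
for node, new_id in generate_ids(["pCube1", "pSphere1"]):
    set_id(node, new_id, overwrite=False)
print(sorted(ids))
```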


def on_open(_):
    """On scene open let's assume the containers have changed."""

    from Qt import QtWidgets
    from openpype.widgets import popup

    cmds.evalDeferred(
        "from openpype.hosts.maya.api import lib;"
        "lib.remove_render_layer_observer()")
    cmds.evalDeferred(
        "from openpype.hosts.maya.api import lib;"
        "lib.add_render_layer_observer()")
    cmds.evalDeferred(
        "from openpype.hosts.maya.api import lib;"
        "lib.add_render_layer_change_observer()")
    # # Update current task for the current scene
    # update_task_from_path(cmds.file(query=True, sceneName=True))

    # Validate FPS after update_task_from_path to
    # ensure it is using correct FPS for the asset
    lib.validate_fps()
    lib.fix_incompatible_containers()

    if any_outdated():
        log.warning("Scene has outdated content.")

        # Find maya main window
        top_level_widgets = {w.objectName(): w for w in
                             QtWidgets.QApplication.topLevelWidgets()}
        parent = top_level_widgets.get("MayaWindow", None)

        if parent is None:
            log.info("Skipping outdated content pop-up "
                     "because Maya window can't be found.")
        else:

            # Show outdated pop-up
            def _on_show_inventory():
                host_tools.show_scene_inventory(parent=parent)

            dialog = popup.Popup(parent=parent)
            dialog.setWindowTitle("Maya scene has outdated content")
            dialog.setMessage("There are outdated containers in "
                              "your Maya scene.")
            dialog.on_show.connect(_on_show_inventory)
            dialog.show()


def on_new(_):
    """Set project resolution and fps when create a new file"""
    log.info("Running callback on new..")
    with lib.suspended_refresh():
        cmds.evalDeferred(
            "from openpype.hosts.maya.api import lib;"
            "lib.remove_render_layer_observer()")
        cmds.evalDeferred(
            "from openpype.hosts.maya.api import lib;"
            "lib.add_render_layer_observer()")
        cmds.evalDeferred(
            "from openpype.hosts.maya.api import lib;"
            "lib.add_render_layer_change_observer()")
        lib.set_context_settings()


def on_task_changed(*args):
    """Wrapped function of app initialize and maya's on task changed"""
    # Run
    menu.update_menu_task_label()

    workdir = avalon.api.Session["AVALON_WORKDIR"]
    if os.path.exists(workdir):
        log.info("Updating Maya workspace for task change to %s", workdir)

        _set_project()

        # Set Maya fileDialog's start-dir to /scenes
        frule_scene = cmds.workspace(fileRuleEntry="scene")
        cmds.optionVar(stringValue=("browserLocationmayaBinaryscene",
                                    workdir + "/" + frule_scene))

    else:
        log.warning((
            "Can't set project for new context because path does not exist: {}"
        ).format(workdir))

    with lib.suspended_refresh():
        lib.set_context_settings()
        lib.update_content_on_context_change()

    msg = "  project: {}\n  asset: {}\n  task:{}".format(
        avalon.api.Session["AVALON_PROJECT"],
        avalon.api.Session["AVALON_ASSET"],
        avalon.api.Session["AVALON_TASK"]
    )

    lib.show_message(
        "Context was changed",
        ("Context was changed to:\n{}".format(msg)),
    )


def before_workfile_save(event):
    workdir_path = event.workdir_path
    if workdir_path:
        copy_workspace_mel(workdir_path)


class MayaDirmap(HostDirmap):
    def on_enable_dirmap(self):
        cmds.dirmap(en=True)

    def dirmap_routine(self, source_path, destination_path):
        cmds.dirmap(m=(source_path, destination_path))
        cmds.dirmap(m=(destination_path, source_path))
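`MayaDirmap.dirmap_routine` registers each mapping in both directions, so a Windows-style path resolves on Linux and vice versa. The same idea as a plain lookup table (the example paths are made up):

```python
class DirmapTable:
    """Bidirectional path-prefix mapping, mirroring the two cmds.dirmap calls."""

    def __init__(self):
        self._mappings = []

    def add(self, source_path, destination_path):
        # One rule per direction, like the paired cmds.dirmap(m=...) calls.
        self._mappings.append((source_path, destination_path))
        self._mappings.append((destination_path, source_path))

    def convert(self, path):
        for src, dst in self._mappings:
            if path.startswith(src):
                return dst + path[len(src):]
        return path  # unmapped paths pass through unchanged


table = DirmapTable()
table.add("P:/projects", "/mnt/projects")
print(table.convert("P:/projects/show/scene.ma"))
print(table.convert("/mnt/projects/show/scene.ma"))
```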

@@ -1,8 +1,14 @@
import os

from maya import cmds

from avalon import api
from avalon.vendor import qargparse
import avalon.maya
from openpype.api import PypeCreatorMixin

from .pipeline import containerise
from . import lib


def get_reference_node(members, log=None):
    """Get the reference node from the container members

@@ -14,8 +20,6 @@ def get_reference_node(members, log=None):

    """

    from maya import cmds

    # Collect the references without .placeHolderList[] attributes as
    # unique entries (objects only) and skipping the sharedReferenceNode.
    references = set()

@@ -61,8 +65,6 @@ def get_reference_node_parents(ref):
        list: The upstream parent reference nodes.

    """
    from maya import cmds

    parent = cmds.referenceQuery(ref,
                                 referenceNode=True,
                                 parent=True)

@@ -75,11 +77,25 @@ def get_reference_node_parents(ref):
    return parents


class Creator(PypeCreatorMixin, avalon.maya.Creator):
    pass
class Creator(PypeCreatorMixin, api.Creator):
    def process(self):
        nodes = list()

        with lib.undo_chunk():
            if (self.options or {}).get("useSelection"):
                nodes = cmds.ls(selection=True)

            instance = cmds.sets(nodes, name=self.name)
            lib.imprint(instance, self.data)

        return instance


class ReferenceLoader(api.Loader):
class Loader(api.Loader):
    hosts = ["maya"]


class ReferenceLoader(Loader):
    """A basic ReferenceLoader for Maya

    This will implement the basic behavior for a loader to inherit from that

@@ -117,11 +133,6 @@ class ReferenceLoader(api.Loader):
        namespace=None,
        options=None
    ):

        import os
        from avalon.maya import lib
        from avalon.maya.pipeline import containerise

        assert os.path.exists(self.fname), "%s does not exist." % self.fname

        asset = context['asset']

@@ -182,8 +193,6 @@ class ReferenceLoader(api.Loader):


    def update(self, container, representation):

        import os
        from maya import cmds

        node = container["objectName"]

@@ -9,8 +9,10 @@ import six
from maya import cmds

from avalon import api, io
from avalon.maya.lib import unique_namespace
from openpype.hosts.maya.api.lib import matrix_equals
from openpype.hosts.maya.api.lib import (
    matrix_equals,
    unique_namespace
)

log = logging.getLogger("PackageLoader")


@@ -239,7 +241,7 @@ def get_contained_containers(container):
    """

    import avalon.schema
    from avalon.maya.pipeline import parse_container
    from .pipeline import parse_container

    # Get avalon containers in this package setdress container
    containers = []

67  openpype/hosts/maya/api/workio.py  Normal file

@@ -0,0 +1,67 @@
"""Host API required Work Files tool"""
import os
from maya import cmds
from avalon import api


def file_extensions():
    return api.HOST_WORKFILE_EXTENSIONS["maya"]


def has_unsaved_changes():
    return cmds.file(query=True, modified=True)


def save_file(filepath):
    cmds.file(rename=filepath)
    ext = os.path.splitext(filepath)[1]
    if ext == ".mb":
        file_type = "mayaBinary"
    else:
        file_type = "mayaAscii"
    cmds.file(save=True, type=file_type)
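`save_file` derives Maya's file type from the target extension, defaulting to ASCII for anything that is not `.mb`. That decision in isolation:

```python
import os


def maya_file_type(filepath):
    """Map a workfile extension to Maya's file type, as save_file does."""
    ext = os.path.splitext(filepath)[1]
    if ext == ".mb":
        return "mayaBinary"
    # Everything else (including .ma) falls back to ASCII.
    return "mayaAscii"


print(maya_file_type("/work/shot010_v001.mb"))
print(maya_file_type("/work/shot010_v001.ma"))
```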


def open_file(filepath):
    return cmds.file(filepath, open=True, force=True)


def current_file():

    current_filepath = cmds.file(query=True, sceneName=True)
    if not current_filepath:
        return None

    return current_filepath


def work_root(session):
    work_dir = session["AVALON_WORKDIR"]
    scene_dir = None

    # Query scene file rule from workspace.mel if it exists in WORKDIR
    # We are parsing the workspace.mel manually as opposed to temporarily
    # setting the Workspace in Maya in a context manager since Maya had a
    # tendency to crash on frequently changing the workspace when this
    # function was called many times as one scrolled through Work Files assets.
    workspace_mel = os.path.join(work_dir, "workspace.mel")
    if os.path.exists(workspace_mel):
        scene_rule = 'workspace -fr "scene" '
        # We need to use builtins as `open` is overridden by the workio API
        open_file = __builtins__["open"]
        with open_file(workspace_mel, "r") as f:
            for line in f:
                if line.strip().startswith(scene_rule):
                    # remainder == "rule";
                    remainder = line[len(scene_rule):]
                    # scene_dir == rule
                    scene_dir = remainder.split('"')[1]
    else:
        # We can't query a workspace that does not exist
        # so we return similar to what we do in other hosts.
        scene_dir = session.get("AVALON_SCENEDIR")

    if scene_dir:
        return os.path.join(work_dir, scene_dir)
    else:
        return work_dir
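`work_root` scrapes the `scene` file rule straight out of workspace.mel rather than switching workspaces in Maya. The parsing step, exercised on an in-memory file whose content mimics a typical workspace.mel:

```python
import io


def parse_scene_rule(lines):
    """Return the value of the `scene` file rule, or None if absent."""
    scene_rule = 'workspace -fr "scene" '
    for line in lines:
        if line.strip().startswith(scene_rule):
            remainder = line[len(scene_rule):]
            # The rule value is the first double-quoted token after the rule.
            return remainder.split('"')[1]
    return None


workspace_mel = io.StringIO(
    '//Maya 2022 Project Definition\n'
    'workspace -fr "scene" "scenes";\n'
    'workspace -fr "images" "renders";\n'
)
print(parse_scene_rule(workspace_mel))
```

Note the `line[len(scene_rule):]` slice assumes the rule starts at column 0, exactly as the original code does; an indented rule line would pass the `strip()` check but slice at the wrong offset.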

@@ -1,4 +1,9 @@
from avalon import api, io
import json
from avalon import api, io, pipeline
from openpype.hosts.maya.api.lib import (
    maintained_selection,
    apply_shaders
)


class ImportModelRender(api.InventoryAction):

@@ -49,10 +54,8 @@ class ImportModelRender(api.InventoryAction):
        Returns:
            None
        """
        import json

        from maya import cmds
        from avalon import maya, io, pipeline
        from openpype.hosts.maya.api import lib

        # Get representations of shader file and relationships
        look_repr = io.find_one({

@@ -77,7 +80,7 @@ class ImportModelRender(api.InventoryAction):
        json_file = pipeline.get_representation_path_from_context(context)

        # Import the look file
        with maya.maintained_selection():
        with maintained_selection():
            shader_nodes = cmds.file(maya_file,
                                     i=True,  # import
                                     returnNewNodes=True)

@@ -89,4 +92,4 @@ class ImportModelRender(api.InventoryAction):
            relationships = json.load(f)

        # Assign relationships
        lib.apply_shaders(relationships, shader_nodes, nodes)
        apply_shaders(relationships, shader_nodes, nodes)

@@ -17,7 +17,7 @@ class AbcLoader(openpype.hosts.maya.api.plugin.ReferenceLoader):
    def process_reference(self, context, name, namespace, data):

        import maya.cmds as cmds
        from avalon import maya
        from openpype.hosts.maya.api.lib import unique_namespace

        cmds.loadPlugin("AbcImport.mll", quiet=True)
        # Prevent identical alembic nodes from being shared

@@ -27,9 +27,11 @@ class AbcLoader(openpype.hosts.maya.api.plugin.ReferenceLoader):
        # Assuming name is subset name from the animation, we split the number
        # suffix from the name to ensure the namespace is unique
        name = name.split("_")[0]
        namespace = maya.unique_namespace("{}_".format(name),
                                          format="%03d",
                                          suffix="_abc")
        namespace = unique_namespace(
            "{}_".format(name),
            format="%03d",
            suffix="_abc"
        )

        # hero_001 (abc)
        # asset_counter{optional}
||||
|
|
|
|||
|
|
@ -3,6 +3,10 @@
|
|||
"""
|
||||
|
||||
from avalon import api
|
||||
from openpype.hosts.maya.api.lib import (
|
||||
maintained_selection,
|
||||
unique_namespace
|
||||
)
|
||||
|
||||
|
||||
class SetFrameRangeLoader(api.Loader):
|
||||
|
|
@ -98,22 +102,19 @@ class ImportMayaLoader(api.Loader):
|
|||
def load(self, context, name=None, namespace=None, data=None):
|
||||
import maya.cmds as cmds
|
||||
|
||||
from avalon import maya
|
||||
from avalon.maya import lib
|
||||
|
||||
choice = self.display_warning()
|
||||
if choice is False:
|
||||
return
|
||||
|
||||
asset = context['asset']
|
||||
|
||||
namespace = namespace or lib.unique_namespace(
|
||||
namespace = namespace or unique_namespace(
|
||||
asset["name"] + "_",
|
||||
prefix="_" if asset["name"][0].isdigit() else "",
|
||||
suffix="_",
|
||||
)
|
||||
|
||||
with maya.maintained_selection():
|
||||
with maintained_selection():
|
||||
cmds.file(self.fname,
|
||||
i=True,
|
||||
preserveReferences=True,
@@ -1,9 +1,15 @@
 import os
+import clique
+
+from avalon import api
 from openpype.api import get_project_settings
 import openpype.hosts.maya.api.plugin
 from openpype.hosts.maya.api.plugin import get_reference_node
-import os
-from openpype.api import get_project_settings
-import clique
+from openpype.hosts.maya.api.lib import (
+    maintained_selection,
+    unique_namespace
+)
+from openpype.hosts.maya.api.pipeline import containerise


 class AssProxyLoader(openpype.hosts.maya.api.plugin.ReferenceLoader):
@@ -20,7 +26,6 @@ class AssProxyLoader(openpype.hosts.maya.api.plugin.ReferenceLoader):
     def process_reference(self, context, name, namespace, options):

         import maya.cmds as cmds
-        from avalon import maya
         import pymel.core as pm

         version = context['version']
@@ -35,7 +40,7 @@ class AssProxyLoader(openpype.hosts.maya.api.plugin.ReferenceLoader):
         except ValueError:
             family = "ass"

-        with maya.maintained_selection():
+        with maintained_selection():

             groupName = "{}:{}".format(namespace, name)
             path = self.fname
@@ -95,8 +100,6 @@ class AssProxyLoader(openpype.hosts.maya.api.plugin.ReferenceLoader):
         self.update(container, representation)

     def update(self, container, representation):
-
-        import os
         from maya import cmds
         import pymel.core as pm

@@ -175,8 +178,6 @@ class AssStandinLoader(api.Loader):
     def load(self, context, name, namespace, options):

         import maya.cmds as cmds
-        import avalon.maya.lib as lib
-        from avalon.maya.pipeline import containerise
         import mtoa.ui.arnoldmenu
         import pymel.core as pm

@@ -188,7 +189,7 @@ class AssStandinLoader(api.Loader):
         frameStart = version_data.get("frameStart", None)

         asset = context['asset']['name']
-        namespace = namespace or lib.unique_namespace(
+        namespace = namespace or unique_namespace(
             asset + "_",
             prefix="_" if asset[0].isdigit() else "",
             suffix="_",
@@ -13,11 +13,11 @@ class AssemblyLoader(api.Loader):

     def load(self, context, name, namespace, data):

-        from avalon.maya.pipeline import containerise
-        from avalon.maya import lib
+        from openpype.hosts.maya.api.pipeline import containerise
+        from openpype.hosts.maya.api.lib import unique_namespace

         asset = context['asset']['name']
-        namespace = namespace or lib.unique_namespace(
+        namespace = namespace or unique_namespace(
             asset + "_",
             prefix="_" if asset[0].isdigit() else "",
             suffix="_",
@@ -25,9 +25,11 @@ class AssemblyLoader(api.Loader):

         from openpype.hosts.maya.api import setdress

-        containers = setdress.load_package(filepath=self.fname,
-                                           name=name,
-                                           namespace=namespace)
+        containers = setdress.load_package(
+            filepath=self.fname,
+            name=name,
+            namespace=namespace
+        )

         self[:] = containers
@@ -1,7 +1,7 @@
-from avalon import api, io
-from avalon.maya.pipeline import containerise
-from avalon.maya import lib
 from maya import cmds, mel
+from avalon import api, io
+from openpype.hosts.maya.api.pipeline import containerise
+from openpype.hosts.maya.api.lib import unique_namespace


 class AudioLoader(api.Loader):
@@ -27,7 +27,7 @@ class AudioLoader(api.Loader):
         )

         asset = context["asset"]["name"]
-        namespace = namespace or lib.unique_namespace(
+        namespace = namespace or unique_namespace(
             asset + "_",
             prefix="_" if asset[0].isdigit() else "",
             suffix="_",
@@ -17,11 +17,11 @@ class GpuCacheLoader(api.Loader):
     def load(self, context, name, namespace, data):

         import maya.cmds as cmds
-        import avalon.maya.lib as lib
-        from avalon.maya.pipeline import containerise
+        from openpype.hosts.maya.api.pipeline import containerise
+        from openpype.hosts.maya.api.lib import unique_namespace

         asset = context['asset']['name']
-        namespace = namespace or lib.unique_namespace(
+        namespace = namespace or unique_namespace(
             asset + "_",
             prefix="_" if asset[0].isdigit() else "",
             suffix="_",
@@ -1,8 +1,9 @@
-from avalon import api, io
-from avalon.maya.pipeline import containerise
-from avalon.maya import lib
 from Qt import QtWidgets, QtCore

+from avalon import api, io
+from openpype.hosts.maya.api.pipeline import containerise
+from openpype.hosts.maya.api.lib import unique_namespace
+
 from maya import cmds


@@ -88,7 +89,7 @@ class ImagePlaneLoader(api.Loader):
         new_nodes = []
         image_plane_depth = 1000
         asset = context['asset']['name']
-        namespace = namespace or lib.unique_namespace(
+        namespace = namespace or unique_namespace(
             asset + "_",
             prefix="_" if asset[0].isdigit() else "",
             suffix="_",
@@ -1,13 +1,15 @@
 # -*- coding: utf-8 -*-
 """Look loader."""
-import openpype.hosts.maya.api.plugin
-from avalon import api, io
 import json
-import openpype.hosts.maya.api.lib
 from collections import defaultdict
-from openpype.widgets.message_window import ScrollMessageBox

 from Qt import QtWidgets

+from avalon import api, io
+import openpype.hosts.maya.api.plugin
+from openpype.hosts.maya.api import lib
+from openpype.widgets.message_window import ScrollMessageBox
+
+from openpype.hosts.maya.api.plugin import get_reference_node


@@ -36,9 +38,8 @@ class LookLoader(openpype.hosts.maya.api.plugin.ReferenceLoader):

         """
         import maya.cmds as cmds
-        from avalon import maya

-        with maya.maintained_selection():
+        with lib.maintained_selection():
             nodes = cmds.file(self.fname,
                               namespace=namespace,
                               reference=True,
@@ -140,9 +141,7 @@ class LookLoader(openpype.hosts.maya.api.plugin.ReferenceLoader):
                 cmds.file(cr=reference_node)  # cleanReference

             # reapply shading groups from json representation on orig nodes
-            openpype.hosts.maya.api.lib.apply_shaders(json_data,
-                                                      shader_nodes,
-                                                      orig_nodes)
+            lib.apply_shaders(json_data, shader_nodes, orig_nodes)

             msg = ["During reference update some edits failed.",
                    "All successful edits were kept intact.\n",
@@ -159,8 +158,8 @@ class LookLoader(openpype.hosts.maya.api.plugin.ReferenceLoader):
         # region compute lookup
         nodes_by_id = defaultdict(list)
         for n in nodes:
-            nodes_by_id[openpype.hosts.maya.api.lib.get_id(n)].append(n)
-        openpype.hosts.maya.api.lib.apply_attributes(attributes, nodes_by_id)
+            nodes_by_id[lib.get_id(n)].append(n)
+        lib.apply_attributes(attributes, nodes_by_id)

         # Update metadata
         cmds.setAttr("{}.representation".format(node),
@@ -1,11 +1,18 @@
 # -*- coding: utf-8 -*-
 """Loader for Redshift proxy."""
-from avalon.maya import lib
+import os
+import clique
+
+import maya.cmds as cmds
+
 from avalon import api
 from openpype.api import get_project_settings
-import os
-import maya.cmds as cmds
-import clique
+from openpype.hosts.maya.api.lib import (
+    namespaced,
+    maintained_selection,
+    unique_namespace
+)
+from openpype.hosts.maya.api.pipeline import containerise


 class RedshiftProxyLoader(api.Loader):
@@ -21,17 +28,13 @@ class RedshiftProxyLoader(api.Loader):

     def load(self, context, name=None, namespace=None, options=None):
         """Plugin entry point."""
-
-        from avalon.maya.pipeline import containerise
-        from openpype.hosts.maya.api.lib import namespaced
-
         try:
             family = context["representation"]["context"]["family"]
         except ValueError:
             family = "redshiftproxy"

         asset_name = context['asset']["name"]
-        namespace = namespace or lib.unique_namespace(
+        namespace = namespace or unique_namespace(
             asset_name + "_",
             prefix="_" if asset_name[0].isdigit() else "",
             suffix="_",
@@ -40,7 +43,7 @@ class RedshiftProxyLoader(api.Loader):
         # Ensure Redshift for Maya is loaded.
         cmds.loadPlugin("redshift4maya", quiet=True)

-        with lib.maintained_selection():
+        with maintained_selection():
             cmds.namespace(addNamespace=namespace)
             with namespaced(namespace, new=False):
                 nodes, group_node = self.create_rs_proxy(
@@ -1,9 +1,10 @@
-import openpype.hosts.maya.api.plugin
-from avalon import api, maya
-from maya import cmds
 import os
+from maya import cmds
+from avalon import api
 from openpype.api import get_project_settings
 from openpype.lib import get_creator_by_name
+import openpype.hosts.maya.api.plugin
+from openpype.hosts.maya.api.lib import maintained_selection


 class ReferenceLoader(openpype.hosts.maya.api.plugin.ReferenceLoader):
@@ -32,7 +33,6 @@ class ReferenceLoader(openpype.hosts.maya.api.plugin.ReferenceLoader):

     def process_reference(self, context, name, namespace, options):
         import maya.cmds as cmds
-        from avalon import maya
         import pymel.core as pm

         try:
@@ -44,7 +44,7 @@ class ReferenceLoader(openpype.hosts.maya.api.plugin.ReferenceLoader):
         # True by default to keep legacy behaviours
         attach_to_root = options.get("attach_to_root", True)

-        with maya.maintained_selection():
+        with maintained_selection():
             cmds.loadPlugin("AbcImport.mll", quiet=True)
             nodes = cmds.file(self.fname,
                               namespace=namespace,
@@ -149,7 +149,7 @@ class ReferenceLoader(openpype.hosts.maya.api.plugin.ReferenceLoader):

         # Create the animation instance
         creator_plugin = get_creator_by_name(self.animation_creator_name)
-        with maya.maintained_selection():
+        with maintained_selection():
             cmds.select([output, controls] + roots, noExpand=True)
             api.create(
                 creator_plugin,
@@ -11,8 +11,8 @@ import six
 import sys

 from avalon import api
-from avalon.maya import lib
-from openpype.hosts.maya.api import lib as pypelib
+from openpype.hosts.maya.api import lib
+from openpype.hosts.maya.api.pipeline import containerise

 from maya import cmds
 import maya.app.renderSetup.model.renderSetup as renderSetup
@@ -31,7 +31,6 @@ class RenderSetupLoader(api.Loader):

     def load(self, context, name, namespace, data):
         """Load RenderSetup settings."""
-        from avalon.maya.pipeline import containerise

         # from openpype.hosts.maya.api.lib import namespaced

@@ -83,7 +82,7 @@ class RenderSetupLoader(api.Loader):

     def update(self, container, representation):
         """Update RenderSetup setting by overwriting existing settings."""
-        pypelib.show_message(
+        lib.show_message(
             "Render setup update",
             "Render setup setting will be overwritten by new version. All "
             "setting specified by user not included in loaded version "
@@ -1,7 +1,8 @@
-from avalon import api
 import os
+
+from avalon import api
 from openpype.api import get_project_settings


 class LoadVDBtoRedShift(api.Loader):
     """Load OpenVDB in a Redshift Volume Shape"""
@@ -15,8 +16,8 @@ class LoadVDBtoRedShift(api.Loader):
     def load(self, context, name=None, namespace=None, data=None):

         from maya import cmds
-        import avalon.maya.lib as lib
-        from avalon.maya.pipeline import containerise
+        from openpype.hosts.maya.api.pipeline import containerise
+        from openpype.hosts.maya.api.lib import unique_namespace

         try:
             family = context["representation"]["context"]["family"]
@@ -45,7 +46,7 @@ class LoadVDBtoRedShift(api.Loader):
         asset = context['asset']

         asset_name = asset["name"]
-        namespace = namespace or lib.unique_namespace(
+        namespace = namespace or unique_namespace(
             asset_name + "_",
             prefix="_" if asset_name[0].isdigit() else "",
             suffix="_",
@@ -1,6 +1,6 @@
+import os
 from avalon import api
 from openpype.api import get_project_settings
-import os

 from maya import cmds

@@ -80,8 +80,8 @@ class LoadVDBtoVRay(api.Loader):

     def load(self, context, name, namespace, data):

-        import avalon.maya.lib as lib
-        from avalon.maya.pipeline import containerise
+        from openpype.hosts.maya.api.lib import unique_namespace
+        from openpype.hosts.maya.api.pipeline import containerise

         assert os.path.exists(self.fname), (
             "Path does not exist: %s" % self.fname
@@ -111,7 +111,7 @@ class LoadVDBtoVRay(api.Loader):

         asset = context['asset']
         asset_name = asset["name"]
-        namespace = namespace or lib.unique_namespace(
+        namespace = namespace or unique_namespace(
             asset_name + "_",
             prefix="_" if asset_name[0].isdigit() else "",
             suffix="_",
@@ -9,9 +9,14 @@ import os

 import maya.cmds as cmds

-from avalon.maya import lib
 from avalon import api, io
 from openpype.api import get_project_settings
+from openpype.hosts.maya.api.lib import (
+    maintained_selection,
+    namespaced,
+    unique_namespace
+)
+from openpype.hosts.maya.api.pipeline import containerise


 class VRayProxyLoader(api.Loader):
@@ -36,8 +41,6 @@ class VRayProxyLoader(api.Loader):
             options (dict): Optional loader options.

         """
-        from avalon.maya.pipeline import containerise
-        from openpype.hosts.maya.api.lib import namespaced

         try:
             family = context["representation"]["context"]["family"]
@@ -48,7 +51,7 @@ class VRayProxyLoader(api.Loader):
         self.fname = self._get_abc(context["version"]["_id"]) or self.fname

         asset_name = context['asset']["name"]
-        namespace = namespace or lib.unique_namespace(
+        namespace = namespace or unique_namespace(
             asset_name + "_",
             prefix="_" if asset_name[0].isdigit() else "",
             suffix="_",
@@ -57,7 +60,7 @@ class VRayProxyLoader(api.Loader):
         # Ensure V-Ray for Maya is loaded.
         cmds.loadPlugin("vrayformaya", quiet=True)

-        with lib.maintained_selection():
+        with maintained_selection():
             cmds.namespace(addNamespace=namespace)
             with namespaced(namespace, new=False):
                 nodes, group_node = self.create_vray_proxy(
@@ -1,8 +1,13 @@
-from avalon.maya import lib
-from avalon import api
-from openpype.api import config
 import os
 import maya.cmds as cmds
+from avalon import api
+from openpype.api import get_project_settings
+from openpype.hosts.maya.api.lib import (
+    maintained_selection,
+    namespaced,
+    unique_namespace
+)
+from openpype.hosts.maya.api.pipeline import containerise


 class VRaySceneLoader(api.Loader):
@@ -18,8 +23,6 @@ class VRaySceneLoader(api.Loader):

     def load(self, context, name, namespace, data):

-        from avalon.maya.pipeline import containerise
-        from openpype.hosts.maya.lib import namespaced

         try:
             family = context["representation"]["context"]["family"]
@@ -27,7 +30,7 @@ class VRaySceneLoader(api.Loader):
             family = "vrayscene_layer"

         asset_name = context['asset']["name"]
-        namespace = namespace or lib.unique_namespace(
+        namespace = namespace or unique_namespace(
             asset_name + "_",
             prefix="_" if asset_name[0].isdigit() else "",
             suffix="_",
@@ -36,7 +39,7 @@ class VRaySceneLoader(api.Loader):
         # Ensure V-Ray for Maya is loaded.
         cmds.loadPlugin("vrayformaya", quiet=True)

-        with lib.maintained_selection():
+        with maintained_selection():
             cmds.namespace(addNamespace=namespace)
             with namespaced(namespace, new=False):
                 nodes, group_node = self.create_vray_scene(name,
@@ -47,8 +50,8 @@ class VRaySceneLoader(api.Loader):
             return

         # colour the group node
-        presets = config.get_presets(project=os.environ['AVALON_PROJECT'])
-        colors = presets['plugins']['maya']['load']['colors']
+        presets = get_project_settings(os.environ['AVALON_PROJECT'])
+        colors = presets['maya']['load']['colors']
         c = colors.get(family)
         if c is not None:
             cmds.setAttr("{0}.useOutlinerColor".format(group_node), 1)
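The hunk above moves the colour lookup from `config.get_presets(...)['plugins']['maya']['load']['colors']` to `get_project_settings(...)['maya']['load']['colors']`, i.e. one level shallower. A defensive walk over such nested settings can be sketched as follows; the dict shape is assumed from the new code path in the hunk, not taken from the real settings schema:

```python
def lookup(settings, *keys, default=None):
    """Walk nested dicts, returning `default` when any key is missing."""
    current = settings
    for key in keys:
        if not isinstance(current, dict) or key not in current:
            return default
        current = current[key]
    return current

# Shape assumed from the new code path above; values are placeholders
settings = {"maya": {"load": {"colors": {"vrayscene_layer": [0.2, 0.8, 0.3]}}}}
colors = lookup(settings, "maya", "load", "colors", default={})
print(colors.get("vrayscene_layer"))  # [0.2, 0.8, 0.3]
```

Such a helper keeps a missing settings branch from raising `KeyError` mid-load, which is the failure mode a direct `presets['maya']['load']['colors']` chain risks.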
@@ -3,14 +3,14 @@ import json
 import re
 import glob
 from collections import defaultdict
-from pprint import pprint

 from maya import cmds

 from avalon import api, io
-from avalon.maya import lib as avalon_lib, pipeline
-from openpype.hosts.maya.api import lib
 from openpype.api import get_project_settings
+from pprint import pprint
+from openpype.hosts.maya.api import lib
+from openpype.hosts.maya.api.pipeline import containerise


 class YetiCacheLoader(api.Loader):
@@ -75,11 +75,13 @@ class YetiCacheLoader(api.Loader):

         self[:] = nodes

-        return pipeline.containerise(name=name,
-                                     namespace=namespace,
-                                     nodes=nodes,
-                                     context=context,
-                                     loader=self.__class__.__name__)
+        return containerise(
+            name=name,
+            namespace=namespace,
+            nodes=nodes,
+            context=context,
+            loader=self.__class__.__name__
+        )

     def remove(self, container):

@@ -239,9 +241,11 @@ class YetiCacheLoader(api.Loader):

         asset_name = "{}_".format(asset)
         prefix = "_" if asset_name[0].isdigit() else ""
-        namespace = avalon_lib.unique_namespace(asset_name,
-                                                prefix=prefix,
-                                                suffix="_")
+        namespace = lib.unique_namespace(
+            asset_name,
+            prefix=prefix,
+            suffix="_"
+        )

         return namespace
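The `prefix = "_" if asset_name[0].isdigit() else ""` guard that recurs throughout these hunks exists because Maya namespace and node names cannot begin with a digit. A minimal sketch of the rule, using hypothetical asset names:

```python
def namespace_base(asset_name):
    """Build the namespace base used throughout these loaders.

    Maya identifiers cannot start with a digit, so digit-leading asset
    names get a "_" prefix. Sketch only; the real loaders pass the prefix
    and suffix straight to unique_namespace.
    """
    prefix = "_" if asset_name[0].isdigit() else ""
    return "{}{}_".format(prefix, asset_name)

print(namespace_base("hero"))     # hero_
print(namespace_base("01_shot"))  # _01_shot_
```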
@@ -25,7 +25,6 @@ class YetiRigLoader(openpype.hosts.maya.api.plugin.ReferenceLoader):
             self, context, name=None, namespace=None, options=None):

         import maya.cmds as cmds
-        from avalon import maya

         # get roots of selected hierarchies
         selected_roots = []
@@ -53,7 +52,7 @@ class YetiRigLoader(openpype.hosts.maya.api.plugin.ReferenceLoader):
                 scene_lookup[cb_id] = node

         # load rig
-        with maya.maintained_selection():
+        with lib.maintained_selection():
             nodes = cmds.file(self.fname,
                               namespace=namespace,
                               reference=True,
@@ -2,7 +2,7 @@ from collections import defaultdict
 import pyblish.api

 from maya import cmds, mel
-from avalon import maya as avalon
+from openpype.hosts.maya import api
 from openpype.hosts.maya.api import lib

 # TODO : Publish of assembly: -unique namespace for all assets, VALIDATOR!
@@ -30,7 +30,7 @@ class CollectAssembly(pyblish.api.InstancePlugin):
     def process(self, instance):

         # Find containers
-        containers = avalon.ls()
+        containers = api.ls()

         # Get all content from the instance
         instance_lookup = set(cmds.ls(instance, type="transform", long=True))
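`CollectAssembly` builds `instance_lookup` as a set of long node names so that each container's members can be tested for membership cheaply. The pattern, sketched with placeholder node paths (the plugin itself gets them from `cmds.ls(..., long=True)` and `api.ls()`):

```python
# Placeholder node paths standing in for cmds.ls(..., long=True) output
instance_nodes = ["|assembly|char_01", "|assembly|prop_01"]
container_members = {
    "char_container": ["|assembly|char_01"],
    "light_container": ["|rig|light_01"],
}

instance_lookup = set(instance_nodes)  # set gives O(1) membership tests
used = [
    name for name, members in sorted(container_members.items())
    if any(node in instance_lookup for node in members)
]
print(used)  # ['char_container']
```

Filtering containers against a precomputed set avoids a quadratic scan when scenes carry many containers and nodes.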
@@ -49,7 +49,7 @@ import maya.app.renderSetup.model.renderSetup as renderSetup

 import pyblish.api

-from avalon import maya, api
+from avalon import api
 from openpype.hosts.maya.api.lib_renderproducts import get as get_layer_render_products  # noqa: E501
 from openpype.hosts.maya.api import lib

@@ -409,7 +409,7 @@ class CollectMayaRender(pyblish.api.ContextPlugin):
             dict: only overrides with values

         """
         attributes = lib.read(render_globals)
-        attributes = maya.read(render_globals)

         options = {"renderGlobals": {}}
         options["renderGlobals"]["Priority"] = attributes["priority"]
@@ -2,9 +2,12 @@ import os

 from maya import cmds

-import avalon.maya
 import openpype.api
-from openpype.hosts.maya.api.lib import extract_alembic
+from openpype.hosts.maya.api.lib import (
+    extract_alembic,
+    suspended_refresh,
+    maintained_selection
+)


 class ExtractAnimation(openpype.api.Extractor):
@@ -71,8 +74,8 @@ class ExtractAnimation(openpype.api.Extractor):
         # Since Maya 2017 alembic supports multiple uv sets - write them.
         options["writeUVSets"] = True

-        with avalon.maya.suspended_refresh():
-            with avalon.maya.maintained_selection():
+        with suspended_refresh():
+            with maintained_selection():
                 cmds.select(nodes, noExpand=True)
                 extract_alembic(file=path,
                                 startFrame=float(start),
@@ -1,9 +1,9 @@
 import os

-import avalon.maya
 import openpype.api

 from maya import cmds
+from openpype.hosts.maya.api.lib import maintained_selection


 class ExtractAssStandin(openpype.api.Extractor):
@@ -30,7 +30,7 @@ class ExtractAssStandin(openpype.api.Extractor):

         # Write out .ass file
         self.log.info("Writing: '%s'" % file_path)
-        with avalon.maya.maintained_selection():
+        with maintained_selection():
             self.log.info("Writing: {}".format(instance.data["setMembers"]))
             cmds.select(instance.data["setMembers"], noExpand=True)
@@ -1,10 +1,10 @@
 import os

-from maya import cmds
 import contextlib

-import avalon.maya
+from maya import cmds
+
 import openpype.api
+from openpype.hosts.maya.api.lib import maintained_selection


 class ExtractAssProxy(openpype.api.Extractor):
@@ -54,7 +54,7 @@ class ExtractAssProxy(openpype.api.Extractor):
                                 noIntermediate=True)
         self.log.info(members)

-        with avalon.maya.maintained_selection():
+        with maintained_selection():
             with unparent(members[0]):
                 cmds.select(members, noExpand=True)
                 cmds.file(path,
@@ -2,9 +2,7 @@ import os

 from maya import cmds

-import avalon.maya
 import openpype.api
-
 from openpype.hosts.maya.api import lib


@@ -54,7 +52,7 @@ class ExtractCameraAlembic(openpype.api.Extractor):
         path = os.path.join(dir_path, filename)

         # Perform alembic extraction
-        with avalon.maya.maintained_selection():
+        with lib.maintained_selection():
             cmds.select(camera, replace=True, noExpand=True)

             # Enforce forward slashes for AbcExport because we're
@@ -86,7 +84,7 @@ class ExtractCameraAlembic(openpype.api.Extractor):
                     job_str += " -attr {0}".format(attr)

             with lib.evaluation("off"):
-                with avalon.maya.suspended_refresh():
+                with lib.suspended_refresh():
                     cmds.AbcExport(j=job_str, verbose=False)

         if "representations" not in instance.data:
@@ -5,7 +5,6 @@ import itertools

 from maya import cmds

-import avalon.maya
 import openpype.api
 from openpype.hosts.maya.api import lib

@@ -157,9 +156,9 @@ class ExtractCameraMayaScene(openpype.api.Extractor):
         path = os.path.join(dir_path, filename)

         # Perform extraction
-        with avalon.maya.maintained_selection():
+        with lib.maintained_selection():
             with lib.evaluation("off"):
-                with avalon.maya.suspended_refresh():
+                with lib.suspended_refresh():
                     if bake_to_worldspace:
                         self.log.info(
                             "Performing camera bakes: {}".format(transform))
@@ -3,12 +3,12 @@ import os

 from maya import cmds  # noqa
 import maya.mel as mel  # noqa
-from openpype.hosts.maya.api.lib import root_parent

 import pyblish.api
-import avalon.maya

 import openpype.api
+from openpype.hosts.maya.api.lib import (
+    root_parent,
+    maintained_selection
+)


 class ExtractFBX(openpype.api.Extractor):
@@ -205,13 +205,13 @@ class ExtractFBX(openpype.api.Extractor):

         # Export
         if "unrealStaticMesh" in instance.data["families"]:
-            with avalon.maya.maintained_selection():
+            with maintained_selection():
                 with root_parent(members):
                     self.log.info("Un-parenting: {}".format(members))
                     cmds.select(members, r=1, noExpand=True)
                     mel.eval('FBXExport -f "{}" -s'.format(path))
         else:
-            with avalon.maya.maintained_selection():
+            with maintained_selection():
                 cmds.select(members, r=1, noExpand=True)
                 mel.eval('FBXExport -f "{}" -s'.format(path))
@@ -11,8 +11,7 @@ from collections import OrderedDict
 from maya import cmds  # noqa

 import pyblish.api
-import avalon.maya
-from avalon import io, api
+from avalon import io

 import openpype.api
 from openpype.hosts.maya.api import lib
@@ -239,7 +238,7 @@ class ExtractLook(openpype.api.Extractor):
         # getting incorrectly remapped. (LKD-17, PLN-101)
         with no_workspace_dir():
             with lib.attribute_values(remap):
-                with avalon.maya.maintained_selection():
+                with lib.maintained_selection():
                     cmds.select(sets, noExpand=True)
                     cmds.file(
                         maya_path,
@@ -4,8 +4,8 @@ import os

 from maya import cmds

-import avalon.maya
 import openpype.api
+from openpype.hosts.maya.api.lib import maintained_selection


 class ExtractMayaSceneRaw(openpype.api.Extractor):
@@ -59,7 +59,7 @@ class ExtractMayaSceneRaw(openpype.api.Extractor):

         # Perform extraction
         self.log.info("Performing extraction ...")
-        with avalon.maya.maintained_selection():
+        with maintained_selection():
             cmds.select(members, noExpand=True)
             cmds.file(path,
                       force=True,
@@ -4,7 +4,6 @@ import os

 from maya import cmds

-import avalon.maya
 import openpype.api
 from openpype.hosts.maya.api import lib

@@ -74,7 +73,7 @@ class ExtractModel(openpype.api.Extractor):
                                   polygonObject=1):
                 with lib.shader(members,
                                 shadingEngine="initialShadingGroup"):
-                    with avalon.maya.maintained_selection():
+                    with lib.maintained_selection():
                         cmds.select(members, noExpand=True)
                         cmds.file(path,
                                   force=True,
@@ -2,9 +2,12 @@ import os

 from maya import cmds

-import avalon.maya
 import openpype.api
-from openpype.hosts.maya.api.lib import extract_alembic
+from openpype.hosts.maya.api.lib import (
+    extract_alembic,
+    suspended_refresh,
+    maintained_selection
+)


 class ExtractAlembic(openpype.api.Extractor):
@@ -70,8 +73,8 @@ class ExtractAlembic(openpype.api.Extractor):
         # Since Maya 2017 alembic supports multiple uv sets - write them.
         options["writeUVSets"] = True

-        with avalon.maya.suspended_refresh():
-            with avalon.maya.maintained_selection():
+        with suspended_refresh():
+            with maintained_selection():
                 cmds.select(nodes, noExpand=True)
                 extract_alembic(file=path,
                                 startFrame=start,
@@ -2,11 +2,11 @@
 """Redshift Proxy extractor."""
 import os

-import avalon.maya
-import openpype.api
-
 from maya import cmds

+import openpype.api
+from openpype.hosts.maya.api.lib import maintained_selection


 class ExtractRedshiftProxy(openpype.api.Extractor):
     """Extract the content of the instance to a redshift proxy file."""
@@ -54,7 +54,7 @@ class ExtractRedshiftProxy(openpype.api.Extractor):

         # Write out rs file
         self.log.info("Writing: '%s'" % file_path)
-        with avalon.maya.maintained_selection():
+        with maintained_selection():
             cmds.select(instance.data["setMembers"], noExpand=True)
             cmds.file(file_path,
                       pr=False,
@@ -4,8 +4,8 @@ import os

 from maya import cmds

-import avalon.maya
 import openpype.api
+from openpype.hosts.maya.api.lib import maintained_selection


 class ExtractRig(openpype.api.Extractor):
@@ -40,7 +40,7 @@ class ExtractRig(openpype.api.Extractor):

         # Perform extraction
         self.log.info("Performing extraction ...")
-        with avalon.maya.maintained_selection():
+        with maintained_selection():
             cmds.select(instance, noExpand=True)
             cmds.file(path,
                       force=True,
@@ -1,10 +1,10 @@
 import os

-import avalon.maya
-import openpype.api
-
 from maya import cmds

+import openpype.api
+from openpype.hosts.maya.api.lib import maintained_selection


 class ExtractVRayProxy(openpype.api.Extractor):
     """Extract the content of the instance to a vrmesh file
@@ -41,7 +41,7 @@ class ExtractVRayProxy(openpype.api.Extractor):

         # Write out vrmesh file
         self.log.info("Writing: '%s'" % file_path)
-        with avalon.maya.maintained_selection():
+        with maintained_selection():
             cmds.select(instance.data["setMembers"], noExpand=True)
             cmds.vrayCreateProxy(exportType=1,
                                  dir=staging_dir,
@@ -3,9 +3,9 @@
 import os
 import re

-import avalon.maya
 import openpype.api
 from openpype.hosts.maya.api.render_setup_tools import export_in_rs_layer
+from openpype.hosts.maya.api.lib import maintained_selection

 from maya import cmds

@@ -57,7 +57,7 @@ class ExtractVrayscene(openpype.api.Extractor):

         # Write out vrscene file
         self.log.info("Writing: '%s'" % file_path)
-        with avalon.maya.maintained_selection():
+        with maintained_selection():
             if "*" not in instance.data["setMembers"]:
                 self.log.info(
                     "Exporting: {}".format(instance.data["setMembers"]))
@@ -2,8 +2,11 @@ import os

 from maya import cmds

-import avalon.maya
 import openpype.api
+from openpype.hosts.maya.api.lib import (
+    suspended_refresh,
+    maintained_selection
+)


 class ExtractXgenCache(openpype.api.Extractor):
@@ -32,8 +35,8 @@ class ExtractXgenCache(openpype.api.Extractor):
         filename = "{name}.abc".format(**instance.data)
         path = os.path.join(parent_dir, filename)

-        with avalon.maya.suspended_refresh():
-            with avalon.maya.maintained_selection():
+        with suspended_refresh():
+            with maintained_selection():
                 command = (
                     '-file '
                     + path
|
||||
|
|
|
|||
|
|
@@ -7,9 +7,8 @@ import contextlib

 from maya import cmds

-import avalon.maya.lib as lib
 import openpype.api
-import openpype.hosts.maya.api.lib as maya
+from openpype.hosts.maya.api import lib


 @contextlib.contextmanager
@@ -2,10 +2,9 @@ from maya import cmds

 import pyblish.api

-from avalon import maya
-
 import openpype.api
 import openpype.hosts.maya.api.action
+from openpype.hosts.maya.api.lib import maintained_selection


 class ValidateCycleError(pyblish.api.InstancePlugin):

@@ -26,7 +25,7 @@ class ValidateCycleError(pyblish.api.InstancePlugin):
     @classmethod
     def get_invalid(cls, instance):

-        with maya.maintained_selection():
+        with maintained_selection():
             cmds.select(instance[:], noExpand=True)
             plugs = cmds.cycleCheck(all=False,  # check selection only
                                     list=True)
@@ -3,7 +3,7 @@ from maya import cmds
 import pyblish.api
 import openpype.api
 import openpype.hosts.maya.api.action
-from avalon import maya
+from openpype.hosts.maya.api.lib import maintained_selection


 class ValidateMeshArnoldAttributes(pyblish.api.InstancePlugin):

@@ -67,7 +67,7 @@ class ValidateMeshArnoldAttributes(pyblish.api.InstancePlugin):

     @classmethod
     def repair(cls, instance):
-        with maya.maintained_selection():
+        with maintained_selection():
             with pc.UndoChunk():
                 temp_transform = pc.polyCube()[0]
@@ -3,7 +3,6 @@ from maya import cmds
 import pyblish.api
 import openpype.api
 import openpype.hosts.maya.api.action
-from avalon import maya
 from openpype.hosts.maya.api import lib
@@ -5,8 +5,6 @@ import openpype.api
 import openpype.hosts.maya.api.action
 from openpype.hosts.maya.api import lib

-from avalon.maya import maintained_selection
-

 class ValidateShapeZero(pyblish.api.Validator):
     """Shape components may not have any "tweak" values

@@ -51,7 +49,7 @@ class ValidateShapeZero(pyblish.api.Validator):
         if not invalid_shapes:
             return

-        with maintained_selection():
+        with lib.maintained_selection():
             with lib.tool("selectSuperContext"):
                 for shape in invalid_shapes:
                     cmds.polyCollapseTweaks(shape)
@@ -1,8 +1,12 @@
 import os
 import avalon.api
 from openpype.api import get_project_settings
 from openpype.hosts.maya import api
+import openpype.hosts.maya.api.lib as mlib
 from maya import cmds

 avalon.api.install(api)

+
 print("starting OpenPype usersetup")
+
+
@@ -10,6 +10,7 @@ import avalon.api
 from openpype.api import Logger
 from openpype.tools.utils import host_tools
 from openpype.lib.remote_publish import headless_publish
+from openpype.lib import env_value_to_bool

 from .launch_logic import ProcessLauncher, stub

@@ -34,20 +35,19 @@ def main(*subprocess_args):
     launcher = ProcessLauncher(subprocess_args)
     launcher.start()

-    if os.environ.get("HEADLESS_PUBLISH"):
+    if env_value_to_bool("HEADLESS_PUBLISH"):
         launcher.execute_in_main_thread(
             headless_publish,
             log,
             "ClosePS",
             os.environ.get("IS_TEST")
         )
-    elif os.environ.get("AVALON_PHOTOSHOP_WORKFILES_ON_LAUNCH", True):
-        save = False
-        if os.getenv("WORKFILES_SAVE_AS"):
-            save = True
+    elif env_value_to_bool("AVALON_PHOTOSHOP_WORKFILES_ON_LAUNCH",
+                           default=True):

         launcher.execute_in_main_thread(
-            host_tools.show_workfiles, save=save
+            host_tools.show_workfiles,
+            save=env_value_to_bool("WORKFILES_SAVE_AS")
         )

     sys.exit(app.exec_())
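Reviewer note: the hunk above swaps plain `os.environ.get` truthiness for `env_value_to_bool`, which matters because any non-empty string (including `"0"` or `"false"`) is truthy in Python. A minimal sketch of such a helper — the real `openpype.lib.env_value_to_bool` may differ in signature and accepted spellings:

```python
import os


def env_value_to_bool(env_key, default=False):
    # Hypothetical sketch; the real openpype.lib implementation may differ.
    # Interprets common string spellings of booleans instead of relying on
    # plain truthiness, where "0" and "false" are truthy non-empty strings.
    value = os.environ.get(env_key)
    if value is None:
        return default
    return value.strip().lower() in ("1", "true", "yes", "on")
```

This makes `HEADLESS_PUBLISH=0` behave as "disabled", which the old `os.environ.get` check got wrong.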
@@ -11,7 +11,7 @@ class MyAutoCreator(AutoCreator):
     identifier = "workfile"
     family = "workfile"

-    def get_attribute_defs(self):
+    def get_instance_attr_defs(self):
         output = [
             lib.NumberDef("number_key", label="Number")
         ]
@@ -1,3 +1,4 @@
+import json
 from openpype import resources
 from openpype.hosts.testhost.api import pipeline
 from openpype.pipeline import (

@@ -13,6 +14,8 @@ class TestCreatorOne(Creator):
     family = "test"
     description = "Testing creator of testhost"

+    create_allow_context_change = False
+
     def get_icon(self):
         return resources.get_openpype_splash_filepath()

@@ -33,7 +36,10 @@ class TestCreatorOne(Creator):
         for instance in instances:
             self._remove_instance_from_context(instance)

-    def create(self, subset_name, data, options=None):
+    def create(self, subset_name, data, pre_create_data):
+        print("Data that can be used in create:\n{}".format(
+            json.dumps(pre_create_data, indent=4)
+        ))
         new_instance = CreatedInstance(self.family, subset_name, data, self)
         pipeline.HostContext.add_instance(new_instance.data_to_store())
         self.log.info(new_instance.data)

@@ -46,9 +52,21 @@ class TestCreatorOne(Creator):
             "different_variant"
         ]

-    def get_attribute_defs(self):
+    def get_instance_attr_defs(self):
         output = [
-            lib.NumberDef("number_key", label="Number")
+            lib.NumberDef("number_key", label="Number"),
         ]
         return output

+    def get_pre_create_attr_defs(self):
+        output = [
+            lib.BoolDef("use_selection", label="Use selection"),
+            lib.UISeparatorDef(),
+            lib.UILabelDef("Testing label"),
+            lib.FileDef("filepath", folders=True, label="Filepath"),
+            lib.FileDef(
+                "filepath_2", multipath=True, folders=True, label="Filepath 2"
+            )
+        ]
+        return output
+
@@ -15,7 +15,7 @@ class TestCreatorTwo(Creator):
     def get_icon(self):
         return "cube"

-    def create(self, subset_name, data, options=None):
+    def create(self, subset_name, data, pre_create_data):
         new_instance = CreatedInstance(self.family, subset_name, data, self)
         pipeline.HostContext.add_instance(new_instance.data_to_store())
         self.log.info(new_instance.data)

@@ -38,7 +38,7 @@ class TestCreatorTwo(Creator):
         for instance in instances:
             self._remove_instance_from_context(instance)

-    def get_attribute_defs(self):
+    def get_instance_attr_defs(self):
         output = [
             lib.NumberDef("number_key"),
             lib.TextDef("text_key")
@@ -19,7 +19,7 @@ class CollectContextDataTestHost(
     hosts = ["testhost"]

     @classmethod
-    def get_attribute_defs(cls):
+    def get_instance_attr_defs(cls):
         return [
             attribute_definitions.BoolDef(
                 "test_bool",
@@ -20,7 +20,7 @@ class CollectInstanceOneTestHost(
     hosts = ["testhost"]

     @classmethod
-    def get_attribute_defs(cls):
+    def get_instance_attr_defs(cls):
         return [
             attribute_definitions.NumberDef(
                 "version",
@@ -59,6 +59,7 @@ class CollectBatchData(pyblish.api.ContextPlugin):
         context.data["asset"] = asset_name
         context.data["task"] = task_name
         context.data["taskType"] = task_type
+        context.data["project_name"] = project_name

         self._set_ctx_path(batch_data)
@@ -13,8 +13,10 @@ import tempfile
 from avalon import io
 import pyblish.api
 from openpype.lib import prepare_template_data
-from openpype.lib.plugin_tools import parse_json
-
+from openpype.lib.plugin_tools import (
+    parse_json,
+    get_subset_name_with_asset_doc
+)

 class CollectPublishedFiles(pyblish.api.ContextPlugin):
     """

@@ -34,7 +36,7 @@ class CollectPublishedFiles(pyblish.api.ContextPlugin):
     targets = ["filespublish"]

     # from Settings
-    task_type_to_family = {}
+    task_type_to_family = []

     def process(self, context):
         batch_dir = context.data["batchDir"]

@@ -47,8 +49,13 @@ class CollectPublishedFiles(pyblish.api.ContextPlugin):
         self.log.info("task_sub:: {}".format(task_subfolders))

         asset_name = context.data["asset"]
+        asset_doc = io.find_one({
+            "type": "asset",
+            "name": asset_name
+        })
         task_name = context.data["task"]
         task_type = context.data["taskType"]
+        project_name = context.data["project_name"]
         for task_dir in task_subfolders:
             task_data = parse_json(os.path.join(task_dir,
                                                 "manifest.json"))

@@ -57,20 +64,21 @@ class CollectPublishedFiles(pyblish.api.ContextPlugin):
             is_sequence = len(task_data["files"]) > 1

             _, extension = os.path.splitext(task_data["files"][0])
-            family, families, subset_template, tags = self._get_family(
+            family, families, tags = self._get_family(
                 self.task_type_to_family,
                 task_type,
                 is_sequence,
                 extension.replace(".", ''))

-            subset = self._get_subset_name(
-                family, subset_template, task_name, task_data["variant"]
+            subset_name = get_subset_name_with_asset_doc(
+                family, task_data["variant"], task_name, asset_doc,
+                project_name=project_name, host_name="webpublisher"
             )
-            version = self._get_last_version(asset_name, subset) + 1
+            version = self._get_last_version(asset_name, subset_name) + 1

-            instance = context.create_instance(subset)
+            instance = context.create_instance(subset_name)
             instance.data["asset"] = asset_name
-            instance.data["subset"] = subset
+            instance.data["subset"] = subset_name
             instance.data["family"] = family
             instance.data["families"] = families
             instance.data["version"] = version

@@ -149,7 +157,7 @@ class CollectPublishedFiles(pyblish.api.ContextPlugin):
             extension (str): without '.'

         Returns:
-            (family, [families], subset_template_name, tags) tuple
+            (family, [families], tags) tuple
             AssertionError if not matching family found
         """
         task_type = task_type.lower()

@@ -160,12 +168,21 @@ class CollectPublishedFiles(pyblish.api.ContextPlugin):
         assert task_obj, "No family configuration for '{}'".format(task_type)

         found_family = None
-        for family, content in task_obj.items():
-            if is_sequence != content["is_sequence"]:
+        families_config = []
+        # backward compatibility, should be removed pretty soon
+        if isinstance(task_obj, dict):
+            for family, config in task_obj.items():
+                config["result_family"] = family
+                families_config.append(config)
+        else:
+            families_config = task_obj
+
+        for config in families_config:
+            if is_sequence != config["is_sequence"]:
                 continue
-            if extension in content["extensions"] or \
-                    '' in content["extensions"]:  # all extensions setting
-                found_family = family
+            if (extension in config["extensions"] or
+                    '' in config["extensions"]):  # all extensions setting
+                found_family = config["result_family"]
                 break

         msg = "No family found for combination of " +\

@@ -173,10 +190,9 @@ class CollectPublishedFiles(pyblish.api.ContextPlugin):
             task_type, is_sequence, extension)
         assert found_family, msg

-        return found_family, \
-            content["families"], \
-            content["subset_template_name"], \
-            content["tags"]
+        return (found_family,
+                config["families"],
+                config["tags"])

     def _get_last_version(self, asset_name, subset_name):
         """Returns version number or 0 for 'asset' and 'subset'"""
@@ -359,12 +359,19 @@ class ConfiguredExtensionsEndpoint(_RestApiEndpoint):
             "studio_exts": set(["psd", "psb", "tvpp", "tvp"])
         }
         collect_conf = sett["webpublisher"]["publish"]["CollectPublishedFiles"]
-        for _, mapping in collect_conf.get("task_type_to_family", {}).items():
-            for _family, config in mapping.items():
-                if config["is_sequence"]:
-                    configured["sequence_exts"].update(config["extensions"])
-                else:
-                    configured["file_exts"].update(config["extensions"])
+        configs = collect_conf.get("task_type_to_family", [])
+        mappings = []
+        for _, conf_mappings in configs.items():
+            if isinstance(conf_mappings, dict):
+                conf_mappings = conf_mappings.values()
+            for conf_mapping in conf_mappings:
+                mappings.append(conf_mapping)
+
+        for mapping in mappings:
+            if mapping["is_sequence"]:
+                configured["sequence_exts"].update(mapping["extensions"])
+            else:
+                configured["file_exts"].update(mapping["extensions"])

         return Response(
             status=200,
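Reviewer note: the endpoint change above normalizes `task_type_to_family` settings that may arrive either as the old dict-of-dicts shape or the new list-of-dicts shape per task type. A condensed, standalone sketch of that flattening, with hypothetical sample data:

```python
def flatten_task_mappings(task_type_to_family):
    # Sketch of the normalization above: each task type maps either to a
    # dict of family -> config (old settings shape) or to a list of configs
    # (new settings shape). Both are flattened to one list of configs.
    mappings = []
    for conf_mappings in task_type_to_family.values():
        if isinstance(conf_mappings, dict):
            conf_mappings = conf_mappings.values()
        mappings.extend(conf_mappings)
    return mappings
```

Handling both shapes in one pass is what lets the endpoint keep working while projects migrate their settings.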
@@ -84,6 +84,7 @@ from .avalon_context import (
     get_hierarchy,
     get_linked_assets,
     get_latest_version,
+    get_system_general_anatomy_data,

     get_workfile_template_key,
     get_workfile_template_key_from_context,

@@ -222,6 +223,7 @@ __all__ = [
     "get_hierarchy",
     "get_linked_assets",
     "get_latest_version",
+    "get_system_general_anatomy_data",

     "get_workfile_template_key",
     "get_workfile_template_key_from_context",
@@ -76,6 +76,7 @@ class RenderInstance(object):
     deadlineSubmissionJob = attr.ib(default=None)
     anatomyData = attr.ib(default=None)
     outputDir = attr.ib(default=None)
+    context = attr.ib(default=None)

     @frameStart.validator
     def check_frame_start(self, _, value):
@@ -1081,10 +1081,19 @@ class ApplicationLaunchContext:
         # Prepare data that will be passed to midprocess
         # - store arguments to a json and pass path to json as last argument
         # - pass environments to set
+        app_env = self.kwargs.pop("env", {})
         json_data = {
             "args": self.launch_args,
-            "env": self.kwargs.pop("env", {})
+            "env": app_env
         }
+        if app_env:
+            # Filter environments of subprocess
+            self.kwargs["env"] = {
+                key: value
+                for key, value in os.environ.items()
+                if key in app_env
+            }

         # Create temp file
         json_temp = tempfile.NamedTemporaryFile(
             mode="w", prefix="op_app_args", suffix=".json", delete=False
@@ -9,7 +9,10 @@ import collections
 import functools
 import getpass

-from openpype.settings import get_project_settings
+from openpype.settings import (
+    get_project_settings,
+    get_system_settings
+)
 from .anatomy import Anatomy
 from .profiles_filtering import filter_profiles

@@ -258,6 +261,18 @@ def get_hierarchy(asset_name=None):
     return "/".join(hierarchy_items)


+def get_system_general_anatomy_data():
+    system_settings = get_system_settings()
+    studio_name = system_settings["general"]["studio_name"]
+    studio_code = system_settings["general"]["studio_code"]
+    return {
+        "studio": {
+            "name": studio_name,
+            "code": studio_code
+        }
+    }
+
+
 def get_linked_asset_ids(asset_doc):
     """Return linked asset ids for `asset_doc` from DB

@@ -536,6 +551,10 @@ def get_workdir_data(project_doc, asset_doc, task_name, host_name):
         "user": getpass.getuser(),
         "hierarchy": hierarchy,
     }
+
+    system_general_data = get_system_general_anatomy_data()
+    data.update(system_general_data)
+
     return data

@@ -1505,7 +1524,7 @@ def _get_task_context_data_for_anatomy(
             "requested task type: `{}`".format(task_type)
         )

-    return {
+    data = {
         "project": {
             "name": project_doc["name"],
             "code": project_doc["data"].get("code")

@@ -1518,6 +1537,11 @@ def _get_task_context_data_for_anatomy(
         }
     }

+    system_general_data = get_system_general_anatomy_data()
+    data.update(system_general_data)
+
+    return data
+

 def get_custom_workfile_template_by_context(
     template_profiles, project_doc, asset_doc, task_name, anatomy=None
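Reviewer note: the new `get_system_general_anatomy_data` helper above feeds studio name and code into anatomy template data. A testable sketch (settings passed in explicitly for the demonstration; key names taken from the hunk above) of how the returned dict fills `{studio[name]}`-style template keys:

```python
def get_system_general_anatomy_data_sketch(system_settings):
    # Mirrors the helper added above, but takes settings as an argument so
    # it can run without a configured OpenPype environment.
    general = system_settings["general"]
    return {
        "studio": {
            "name": general["studio_name"],
            "code": general["studio_code"]
        }
    }
```

Because the dict nests under `"studio"`, anatomy templates can reference `{studio[name]}` and `{studio[code]}` directly via `str.format`.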
@@ -4,12 +4,35 @@ import logging
 import collections
 import tempfile

+import xml.etree.ElementTree
+
 from .execute import run_subprocess
 from .vendor_bin_utils import (
     get_oiio_tools_path,
     is_oiio_supported
 )

+# Max length of string that is supported by ffmpeg
+MAX_FFMPEG_STRING_LEN = 8196
+# OIIO known xml tags
+STRING_TAGS = {
+    "format"
+}
+INT_TAGS = {
+    "x", "y", "z",
+    "width", "height", "depth",
+    "full_x", "full_y", "full_z",
+    "full_width", "full_height", "full_depth",
+    "tile_width", "tile_height", "tile_depth",
+    "nchannels",
+    "alpha_channel",
+    "z_channel",
+    "deep",
+    "subimages",
+}
+# Regex to parse array attributes
+ARRAY_TYPE_REGEX = re.compile(r"^(int|float|string)\[\d+\]$")
+

 def get_transcode_temp_directory():
     """Creates temporary folder for transcoding.
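Reviewer note: `ARRAY_TYPE_REGEX` introduced above is used later in this file to detect array attribute types (e.g. `"float[2]"`) and recurse on their element type. A small standalone sketch of that check:

```python
import re

# Same pattern as ARRAY_TYPE_REGEX above: matches array attribute types
# such as "float[2]" or "string[4]" and captures the element type.
ARRAY_TYPE_REGEX = re.compile(r"^(int|float|string)\[\d+\]$")


def element_type_of(value_type):
    # Returns the element type for array types, or None for scalar types.
    match = ARRAY_TYPE_REGEX.findall(value_type)
    return match[0] if match else None
```

The capture group is what lets `convert_value_by_type_name` convert each comma-separated element with the scalar conversion path.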
@@ -24,87 +47,215 @@ def get_transcode_temp_directory():


 def get_oiio_info_for_input(filepath, logger=None):
-    """Call oiiotool to get information about input and return stdout."""
-    args = [
-        get_oiio_tools_path(), "--info", "-v", filepath
-    ]
-    return run_subprocess(args, logger=logger)
-
-
-def parse_oiio_info(oiio_info):
-    """Create an object based on output from oiiotool.
-
-    Removes quotation marks from compression value. Parse channels into
-    dictionary - key is channel name value is determined type of channel
-    (e.g. 'uint', 'float').
-
-    Args:
-        oiio_info (str): Output of calling "oiiotool --info -v <path>"
-
-    Returns:
-        dict: Loaded data from output.
-    """
-    lines = [
-        line.strip()
-        for line in oiio_info.split("\n")
-    ]
-    # Each line should contain information about one key
-    # key - value are separated with ": "
-    oiio_sep = ": "
-    data_map = {}
-    for line in lines:
-        parts = line.split(oiio_sep)
-        if len(parts) < 2:
-            continue
-        key = parts.pop(0)
-        value = oiio_sep.join(parts)
-        data_map[key] = value
-
-    if "compression" in data_map:
-        value = data_map["compression"]
-        data_map["compression"] = value.replace("\"", "")
-
-    channels_info = {}
-    channels_value = data_map.get("channel list") or ""
-    if channels_value:
-        channels = channels_value.split(", ")
-        type_regex = re.compile(r"(?P<name>[^\(]+) \((?P<type>[^\)]+)\)")
-        for channel in channels:
-            match = type_regex.search(channel)
-            if not match:
-                channel_name = channel
-                channel_type = "uint"
-            else:
-                channel_name = match.group("name")
-                channel_type = match.group("type")
-            channels_info[channel_name] = channel_type
-    data_map["channels_info"] = channels_info
-    return data_map
+    """Call oiiotool to get information about input and return stdout.
+
+    Stdout should contain xml format string.
+    """
+    args = [
+        get_oiio_tools_path(), "--info", "-v", "-i:infoformat=xml", filepath
+    ]
+    output = run_subprocess(args, logger=logger)
+    output = output.replace("\r\n", "\n")
+
+    xml_started = False
+    lines = []
+    for line in output.split("\n"):
+        if not xml_started:
+            if not line.startswith("<"):
+                continue
+            xml_started = True
+        if xml_started:
+            lines.append(line)
+
+    if not xml_started:
+        raise ValueError(
+            "Failed to read input file \"{}\".\nOutput:\n{}".format(
+                filepath, output
+            )
+        )
+
+    xml_text = "\n".join(lines)
+    return parse_oiio_xml_output(xml_text, logger=logger)
+
+
+class RationalToInt:
+    """Rational value stored as division of 2 integers using string."""
+    def __init__(self, string_value):
+        parts = string_value.split("/")
+        top = float(parts[0])
+        bottom = 1.0
+        if len(parts) != 1:
+            bottom = float(parts[1])
+
+        self._value = top / bottom
+        self._string_value = string_value
+
+    @property
+    def value(self):
+        return self._value
+
+    @property
+    def string_value(self):
+        return self._string_value
+
+    def __format__(self, *args, **kwargs):
+        return self._string_value.__format__(*args, **kwargs)
+
+    def __float__(self):
+        return self._value
+
+    def __str__(self):
+        return self._string_value
+
+    def __repr__(self):
+        return "<{}> {}".format(self.__class__.__name__, self._string_value)
+
+
+def convert_value_by_type_name(value_type, value, logger=None):
+    """Convert value to proper type based on type name.
+
+    In some cases value types have custom python class.
+    """
+    if logger is None:
+        logger = logging.getLogger(__name__)
+
+    # Simple types
+    if value_type == "string":
+        return value
+
+    if value_type == "int":
+        return int(value)
+
+    if value_type == "float":
+        return float(value)
+
+    # Vectors will probably have more types
+    if value_type == "vec2f":
+        return [float(item) for item in value.split(",")]
+
+    # Matrix should always have square size of element 3x3, 4x4
+    # - are returned as list of lists
+    if value_type == "matrix":
+        output = []
+        current_index = -1
+        parts = value.split(",")
+        parts_len = len(parts)
+        if parts_len == 1:
+            divisor = 1
+        elif parts_len == 4:
+            divisor = 2
+        elif parts_len == 9:
+            divisor = 3
+        elif parts_len == 16:
+            divisor = 4
+        else:
+            logger.info("Unknown matrix resolution {}. Value: \"{}\"".format(
+                parts_len, value
+            ))
+            for part in parts:
+                output.append(float(part))
+            return output
+
+        for idx, item in enumerate(parts):
+            list_index = idx % divisor
+            if list_index > current_index:
+                current_index = list_index
+                output.append([])
+            output[list_index].append(float(item))
+        return output
+
+    if value_type == "rational2i":
+        return RationalToInt(value)
+
+    # Array of other types is converted to list
+    re_result = ARRAY_TYPE_REGEX.findall(value_type)
+    if re_result:
+        array_type = re_result[0]
+        output = []
+        for item in value.split(","):
+            output.append(
+                convert_value_by_type_name(array_type, item, logger=logger)
+            )
+        return output
+
+    logger.info((
+        "MISSING IMPLEMENTATION:"
+        " Unknown attrib type \"{}\". Value: {}"
+    ).format(value_type, value))
+    return value
+
+
+def parse_oiio_xml_output(xml_string, logger=None):
+    """Parse xml output from OIIO info command."""
+    output = {}
+    if not xml_string:
+        return output
+
+    if logger is None:
+        logger = logging.getLogger("OIIO-xml-parse")
+
+    tree = xml.etree.ElementTree.fromstring(xml_string)
+    attribs = {}
+    output["attribs"] = attribs
+    for child in tree:
+        tag_name = child.tag
+        if tag_name == "attrib":
+            attrib_def = child.attrib
+            value = convert_value_by_type_name(
+                attrib_def["type"], child.text, logger=logger
+            )
+
+            attribs[attrib_def["name"]] = value
+            continue
+
+        # Channels are stored as text on each child
+        if tag_name == "channelnames":
+            value = []
+            for channel in child:
+                value.append(channel.text)
+
+        # Convert known integer type tags to int
+        elif tag_name in INT_TAGS:
+            value = int(child.text)
+
+        # Keep value of known string tags
+        elif tag_name in STRING_TAGS:
+            value = child.text
+
+        # Keep value as text for unknown tags
+        # - feel free to add more tags
+        else:
+            value = child.text
+            logger.info((
+                "MISSING IMPLEMENTATION:"
+                " Unknown tag \"{}\". Value \"{}\""
+            ).format(tag_name, value))
+
+        output[child.tag] = value
+
+    return output


-def get_convert_rgb_channels(channels_info):
+def get_convert_rgb_channels(channel_names):
     """Get first available RGB(A) group from channels info.

     ## Examples
     ```
     # Ideal situation
-    channels_info: {
-        "R": ...,
-        "G": ...,
-        "B": ...,
-        "A": ...
-    }
+    channels_info: [
+        "R", "G", "B", "A"
+    ]
     ```
     Result will be `("R", "G", "B", "A")`

     ```
     # Not ideal situation
-    channels_info: {
-        "beauty.red": ...,
-        "beauty.green": ...,
-        "beauty.blue": ...,
-        "depth.Z": ...
-    }
+    channels_info: [
+        "beauty.red",
+        "beauty.green",
+        "beauty.blue",
+        "depth.Z"
+    ]
     ```
     Result will be `("beauty.red", "beauty.green", "beauty.blue", None)`
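Reviewer note: `RationalToInt` above keeps rational attributes (e.g. a `rational2i` frame rate) printable as their original string while remaining usable as a float. A condensed copy of the class for a standalone demonstration (the sample frame-rate value is illustrative):

```python
class RationalToInt:
    # Condensed copy of the class added above, for demonstration only.
    def __init__(self, string_value):
        parts = string_value.split("/")
        top = float(parts[0])
        # A value without "/" is treated as division by 1.
        bottom = float(parts[1]) if len(parts) != 1 else 1.0
        self._value = top / bottom
        self._string_value = string_value

    def __float__(self):
        return self._value

    def __str__(self):
        return self._string_value
```

Keeping the original string matters for metadata round-trips: `"24000/1001"` stays exact while `float()` gives the usable ~23.976 rate.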
@@ -116,7 +267,7 @@ def get_convert_rgb_channels(channels_info):
     """
     rgb_by_main_name = collections.defaultdict(dict)
     main_name_order = [""]
-    for channel_name in channels_info.keys():
+    for channel_name in channel_names:
         name_parts = channel_name.split(".")
         rgb_part = name_parts.pop(-1).lower()
         main_name = ".".join(name_parts)
@@ -166,28 +317,35 @@ def should_convert_for_ffmpeg(src_filepath):
         return None

     # Load info about input from oiio tool
-    oiio_info = get_oiio_info_for_input(src_filepath)
-    input_info = parse_oiio_info(oiio_info)
+    input_info = get_oiio_info_for_input(src_filepath)
     if not input_info:
         return None

     # Check compression
-    compression = input_info["compression"]
+    compression = input_info["attribs"].get("compression")
     if compression in ("dwaa", "dwab"):
         return True

     # Check channels
-    channels_info = input_info["channels_info"]
-    review_channels = get_convert_rgb_channels(channels_info)
+    channel_names = input_info["channelnames"]
+    review_channels = get_convert_rgb_channels(channel_names)
     if review_channels is None:
         return None

+    for attr_value in input_info["attribs"].values():
+        if (
+            isinstance(attr_value, str)
+            and len(attr_value) > MAX_FFMPEG_STRING_LEN
+        ):
+            return True
     return False


 def convert_for_ffmpeg(
     first_input_path,
     output_dir,
-    input_frame_start,
-    input_frame_end,
+    input_frame_start=None,
+    input_frame_end=None,
     logger=None
 ):
     """Convert source file to format supported in ffmpeg.
@@ -221,46 +379,76 @@ def convert_for_ffmpeg(
     if input_frame_start is not None and input_frame_end is not None:
         is_sequence = int(input_frame_end) != int(input_frame_start)

-    oiio_info = get_oiio_info_for_input(first_input_path)
-    input_info = parse_oiio_info(oiio_info)
+    input_info = get_oiio_info_for_input(first_input_path)

     # Change compression only if source compression is "dwaa" or "dwab"
     # - they're not supported in ffmpeg
-    compression = input_info["compression"]
+    compression = input_info["attribs"].get("compression")
     if compression in ("dwaa", "dwab"):
         compression = "none"

     # Prepare subprocess arguments
-    oiio_cmd = [
-        get_oiio_tools_path(),
-        "--compression", compression,
-        first_input_path
-    ]
+    oiio_cmd = [get_oiio_tools_path()]
+    # Add input compression if available
+    if compression:
+        oiio_cmd.extend(["--compression", compression])

-    channels_info = input_info["channels_info"]
-    review_channels = get_convert_rgb_channels(channels_info)
+    # Collect channels to export
+    channel_names = input_info["channelnames"]
+    review_channels = get_convert_rgb_channels(channel_names)
     if review_channels is None:
         raise ValueError(
             "Couldn't find channels that can be used for conversion."
         )

     red, green, blue, alpha = review_channels
+    input_channels = [red, green, blue]
     channels_arg = "R={},G={},B={}".format(red, green, blue)
     if alpha is not None:
         channels_arg += ",A={}".format(alpha)
-    oiio_cmd.append("--ch")
-    oiio_cmd.append(channels_arg)
+        input_channels.append(alpha)
+    input_channels_str = ",".join(input_channels)
+
+    oiio_cmd.extend([
+        # Tell oiiotool which channels should be loaded
+        # - other channels are not loaded to memory so helps to avoid memory
+        #   leak issues
+        "-i:ch={}".format(input_channels_str), first_input_path,
+        # Tell oiiotool which channels should be put to top stack (and output)
+        "--ch", channels_arg
+    ])

     # Add frame definitions to arguments
     if is_sequence:
-        oiio_cmd.append("--frames")
-        oiio_cmd.append("{}-{}".format(input_frame_start, input_frame_end))
+        oiio_cmd.extend([
+            "--frames", "{}-{}".format(input_frame_start, input_frame_end)
+        ])

+    ignore_attr_changes_added = False
+    for attr_name, attr_value in input_info["attribs"].items():
+        if not isinstance(attr_value, str):
+            continue
+
+        # Remove attributes that have string value longer than allowed length
+        # for ffmpeg
+        if len(attr_value) > MAX_FFMPEG_STRING_LEN:
+            if not ignore_attr_changes_added:
+                # Attribute changes won't be added to attributes itself
+                ignore_attr_changes_added = True
+                oiio_cmd.append("--sansattrib")
+            # Set attribute to empty string
+            logger.info((
+                "Removed attribute \"{}\" from metadata"
+                " because has too long value ({} chars)."
+            ).format(attr_name, len(attr_value)))
+            oiio_cmd.extend(["--eraseattrib", attr_name])

     # Add last argument - path to output
     base_file_name = os.path.basename(first_input_path)
     output_path = os.path.join(output_dir, base_file_name)
-    oiio_cmd.append("-o")
-    oiio_cmd.append(output_path)
+    oiio_cmd.extend([
+        "-o", output_path
+    ])

     logger.debug("Conversion command: {}".format(" ".join(oiio_cmd)))
     run_subprocess(oiio_cmd, logger=logger)
@@ -9,6 +9,8 @@ class CollectDefaultDeadlineServer(pyblish.api.ContextPlugin):
     order = pyblish.api.CollectorOrder + 0.01
     label = "Default Deadline Webservice"

+    pass_mongo_url = False
+
     def process(self, context):
         try:
             deadline_module = context.data.get("openPypeModules")["deadline"]

@@ -19,3 +21,5 @@ class CollectDefaultDeadlineServer(pyblish.api.ContextPlugin):
         # get default deadline webservice url from deadline module
         self.log.debug(deadline_module.deadline_urls)
         context.data["defaultDeadline"] = deadline_module.deadline_urls["default"]  # noqa: E501
+
+        context.data["deadlinePassMongoUrl"] = self.pass_mongo_url

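The collector above publishes its class-level `pass_mongo_url` setting into the shared context data, where the submit plugins later read it. The hand-off can be sketched without pyblish; the plain dict standing in for the pyblish context is an assumption.

```python
# Sketch of the flag hand-off above: a collector copies a class-level
# setting into shared context data for later plugins to read.
# A plain dict stands in for the pyblish context object (an assumption).
class CollectDefaultDeadlineServer:
    # Class attribute, typically overridden from studio settings
    pass_mongo_url = False

    def process(self, context_data):
        # Store the flag where downstream submitters expect it
        context_data["deadlinePassMongoUrl"] = self.pass_mongo_url


context_data = {}
CollectDefaultDeadlineServer().process(context_data)
print(context_data)
# -> {'deadlinePassMongoUrl': False}
```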
@@ -67,6 +67,9 @@ class AfterEffectsSubmitDeadline(abstract_submit_deadline.AbstractSubmitDeadline
             "OPENPYPE_DEV",
             "OPENPYPE_LOG_NO_COLORS"
         ]
+        # Add mongo url if it's enabled
+        if self._instance.context.data.get("deadlinePassMongoUrl"):
+            keys.append("OPENPYPE_MONGO")

         environment = dict({key: os.environ[key] for key in keys
                             if key in os.environ}, **api.Session)

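Each submitter above uses the same pattern: a whitelist of environment variable names, conditionally extended with `OPENPYPE_MONGO`, then filtered against the actual process environment. A minimal sketch of that pattern; the function name and the sample environment are assumptions, and `**api.Session` from the original is left out to keep it self-contained.

```python
import os


# Sketch of the environment-gathering pattern above: copy whitelisted
# keys from the environment, conditionally including OPENPYPE_MONGO.
def gather_environment(keys, pass_mongo_url, environ=None):
    environ = os.environ if environ is None else environ
    keys = list(keys)
    if pass_mongo_url:
        keys.append("OPENPYPE_MONGO")
    # Keep only keys that are actually set in the environment
    return {key: environ[key] for key in keys if key in environ}


sample = {"OPENPYPE_DEV": "1", "OPENPYPE_MONGO": "mongodb://host:27017"}
print(gather_environment(["OPENPYPE_DEV", "OPENPYPE_LOG_NO_COLORS"], True, sample))
# -> {'OPENPYPE_DEV': '1', 'OPENPYPE_MONGO': 'mongodb://host:27017'}
```

Filtering with `if key in environ` means unset variables are silently skipped rather than raising `KeyError`, which is why the same dict comprehension appears in every host's submitter.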
@@ -276,6 +276,9 @@ class HarmonySubmitDeadline(
             "OPENPYPE_DEV",
             "OPENPYPE_LOG_NO_COLORS"
         ]
+        # Add mongo url if it's enabled
+        if self._instance.context.data.get("deadlinePassMongoUrl"):
+            keys.append("OPENPYPE_MONGO")

         environment = dict({key: os.environ[key] for key in keys
                             if key in os.environ}, **api.Session)

@@ -105,15 +105,21 @@ class HoudiniSubmitPublishDeadline(pyblish.api.ContextPlugin):
             # Clarify job name per submission (include instance name)
             payload["JobInfo"]["Name"] = job_name + " - %s" % instance
             self.submit_job(
-                payload, instances=[instance], deadline=AVALON_DEADLINE
+                context,
+                payload,
+                instances=[instance],
+                deadline=AVALON_DEADLINE
             )
         else:
             # Submit a single job
             self.submit_job(
-                payload, instances=instance_names, deadline=AVALON_DEADLINE
+                context,
+                payload,
+                instances=instance_names,
+                deadline=AVALON_DEADLINE
             )

-    def submit_job(self, payload, instances, deadline):
+    def submit_job(self, context, payload, instances, deadline):

         # Ensure we operate on a copy, a shallow copy is fine.
         payload = payload.copy()

@@ -125,6 +131,9 @@ class HoudiniSubmitPublishDeadline(pyblish.api.ContextPlugin):
             # similar environment using it, e.g. "houdini17.5;pluginx2.3"
             "AVALON_TOOLS",
         ]
+        # Add mongo url if it's enabled
+        if context.data.get("deadlinePassMongoUrl"):
+            keys.append("OPENPYPE_MONGO")

         environment = dict(
             {key: os.environ[key] for key in keys if key in os.environ},

@@ -101,6 +101,10 @@ class HoudiniSubmitRenderDeadline(pyblish.api.InstancePlugin):
             # similar environment using it, e.g. "maya2018;vray4.x;yeti3.1.9"
             "AVALON_TOOLS",
         ]
+        # Add mongo url if it's enabled
+        if context.data.get("deadlinePassMongoUrl"):
+            keys.append("OPENPYPE_MONGO")
+
         environment = dict({key: os.environ[key] for key in keys
                             if key in os.environ}, **api.Session)

@@ -498,6 +498,9 @@ class MayaSubmitDeadline(pyblish.api.InstancePlugin):
             "OPENPYPE_DEV",
             "OPENPYPE_LOG_NO_COLORS"
         ]
+        # Add mongo url if it's enabled
+        if instance.context.data.get("deadlinePassMongoUrl"):
+            keys.append("OPENPYPE_MONGO")

         environment = dict({key: os.environ[key] for key in keys
                             if key in os.environ}, **api.Session)

@@ -249,6 +249,10 @@ class NukeSubmitDeadline(pyblish.api.InstancePlugin):
             "TOOL_ENV",
             "FOUNDRY_LICENSE"
         ]
+        # Add mongo url if it's enabled
+        if instance.context.data.get("deadlinePassMongoUrl"):
+            keys.append("OPENPYPE_MONGO")
+
         # add allowed keys from preset if any
         if self.env_allowed_keys:
             keys += self.env_allowed_keys

@@ -227,12 +227,17 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin):
         environment["OPENPYPE_USERNAME"] = instance.context.data["user"]
         environment["OPENPYPE_PUBLISH_JOB"] = "1"
         environment["OPENPYPE_RENDER_JOB"] = "0"
+        # Add mongo url if it's enabled
+        if instance.context.data.get("deadlinePassMongoUrl"):
+            mongo_url = os.environ.get("OPENPYPE_MONGO")
+            if mongo_url:
+                environment["OPENPYPE_MONGO"] = mongo_url

         args = [
             'publish',
             roothless_metadata_path,
             "--targets", "deadline",
-            "--targets", "filesequence"
+            "--targets", "farm"
         ]

         # Generate the payload for Deadline submission

@@ -273,18 +278,18 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin):
         else:
             payload["JobInfo"]["JobDependency0"] = job["_id"]

-        i = 0
-        for index, key in enumerate(environment):
+        index = 0
+        for key in environment:
             if key.upper() in self.enviro_filter:
                 payload["JobInfo"].update(
                     {
                         "EnvironmentKeyValue%d"
-                        % i: "{key}={value}".format(
+                        % index: "{key}={value}".format(
                             key=key, value=environment[key]
                         )
                     }
                 )
-                i += 1
+                index += 1

         # remove secondary pool
         payload["JobInfo"].pop("SecondaryPool", None)

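The counter rename above matters because only filtered keys receive an `EnvironmentKeyValue<N>` entry, so the index must advance with the matches, not with the loop (which is why `enumerate()` was dropped). A minimal sketch of that logic; the function name and filter values are assumptions.

```python
# Sketch of the Deadline job-info logic above: number only the keys that
# pass the filter, so EnvironmentKeyValue indices stay contiguous.
def build_env_key_values(environment, enviro_filter):
    job_info = {}
    index = 0
    for key in environment:
        if key.upper() in enviro_filter:
            job_info["EnvironmentKeyValue%d" % index] = "{key}={value}".format(
                key=key, value=environment[key]
            )
            # Advance only on matches, keeping indices gap-free
            index += 1
    return job_info


env = {"OPENPYPE_MONGO": "mongodb://host", "PATH": "/usr/bin", "AVALON_DB": "avalon"}
print(build_env_key_values(env, {"OPENPYPE_MONGO", "AVALON_DB"}))
# -> {'EnvironmentKeyValue0': 'OPENPYPE_MONGO=mongodb://host', 'EnvironmentKeyValue1': 'AVALON_DB=avalon'}
```

With `enumerate()`, a skipped key (here `PATH`) would leave a gap in the numbering, which Deadline's sequential `EnvironmentKeyValueN` entries do not tolerate.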