Mirror of https://github.com/ynput/ayon-core.git, synced 2025-12-24 12:54:40 +01:00

Merge branch 'develop' into bugfix/OP-7281_Maya-Review---playblast-renders-without-textures

Commit f2eef86bd8: 43 changed files with 513 additions and 357 deletions
10 .github/ISSUE_TEMPLATE/bug_report.yml (vendored)
@@ -35,6 +35,11 @@ body:
       label: Version
       description: What version are you running? Look to OpenPype Tray
       options:
+        - 3.18.2
+        - 3.18.2-nightly.6
+        - 3.18.2-nightly.5
+        - 3.18.2-nightly.4
+        - 3.18.2-nightly.3
         - 3.18.2-nightly.2
         - 3.18.2-nightly.1
         - 3.18.1
@@ -130,11 +135,6 @@ body:
         - 3.15.6-nightly.2
         - 3.15.6-nightly.1
         - 3.15.5
-        - 3.15.5-nightly.2
-        - 3.15.5-nightly.1
-        - 3.15.4
-        - 3.15.4-nightly.3
-        - 3.15.4-nightly.2
       validations:
         required: true
   - type: dropdown
317 CHANGELOG.md
@@ -1,6 +1,323 @@
# Changelog

## [3.18.2](https://github.com/ynput/OpenPype/tree/3.18.2)

[Full Changelog](https://github.com/ynput/OpenPype/compare/3.18.1...3.18.2)

### **🚀 Enhancements**

<details>
<summary>Testing: Release Maya/Deadline job from pending when testing. <a href="https://github.com/ynput/OpenPype/pull/5988">#5988</a></summary>

When testing, we won't put the Deadline jobs into pending with dependencies, so the worker can start as soon as possible.

___

</details>

<details>
<summary>Max: Tweaks on Extractions for the exporters <a href="https://github.com/ynput/OpenPype/pull/5814">#5814</a></summary>

With this PR:
- Suspend Refresh is introduced in the abc & obj extractors for optimization.
- Users can choose the custom attributes to be included in abc exports.

___

</details>

<details>
<summary>Maya: Optional preserve references. <a href="https://github.com/ynput/OpenPype/pull/5994">#5994</a></summary>

Optionally preserve references when publishing Maya scenes.

___

</details>

<details>
<summary>AYON ftrack: Expect 'ayon' group in custom attributes <a href="https://github.com/ynput/OpenPype/pull/6066">#6066</a></summary>

Expect the `ayon` group as one of the options to get custom attributes.

___

</details>

<details>
<summary>AYON Chore: Remove dependencies related to separated addons <a href="https://github.com/ynput/OpenPype/pull/6074">#6074</a></summary>

Removed dependencies from the openpype client pyproject.toml that are already defined by the addons which require them.

___

</details>

<details>
<summary>Editorial & chore: Stop using pathlib2 <a href="https://github.com/ynput/OpenPype/pull/6075">#6075</a></summary>

Do not use `pathlib2`, which is a Python 2 backport of the `pathlib` module in Python 3.

___

</details>

<details>
<summary>Traypublisher: Correct validator label <a href="https://github.com/ynput/OpenPype/pull/6084">#6084</a></summary>

Use the correct label for Validate Filepaths.

___

</details>

<details>
<summary>Nuke: Extract Review Intermediate disabled when both Extract Review Mov and Extract Review Intermediate disabled in setting <a href="https://github.com/ynput/OpenPype/pull/6089">#6089</a></summary>

Reported in Discord: https://discord.com/channels/517362899170230292/563751989075378201/1187874498234556477

___

</details>

### **🐛 Bug fixes**

<details>
<summary>Maya: Bug fix the file from texture node not being collected correctly in Yeti Rig <a href="https://github.com/ynput/OpenPype/pull/5990">#5990</a></summary>

Fix the bug where Collect Yeti Rig was not able to get the file parameter(s) from the texture node(s), resulting in the failure of publishing the textures to the resource directory.

___

</details>

<details>
<summary>Bug: fix AYON settings for Maya workspace <a href="https://github.com/ynput/OpenPype/pull/6069">#6069</a></summary>

This fixes a bug in the default AYON settings for the Maya workspace, where a missing semicolon caused the workspace not to be set. This also syncs the default workspace settings to OpenPype.

___

</details>

<details>
<summary>Refactor colorspace handling in CollectColorspace plugin <a href="https://github.com/ynput/OpenPype/pull/6033">#6033</a></summary>

Traypublisher is now capable of setting available colorspaces or roles when publishing image sequences or video. This fixes the new implementation where we allowed the use of roles in the enumerator selector.

___

</details>

<details>
<summary>Bugfix: Houdini render split bugs <a href="https://github.com/ynput/OpenPype/pull/6037">#6037</a></summary>

This is a follow-up to https://github.com/ynput/OpenPype/pull/5420. This PR:
- refactors `get_output_parameter` back to what it used to be,
- fixes a bug with split render,
- renames the `exportJob` flag to `split_render`.

___

</details>

<details>
<summary>Fusion: fix for single frame rendering <a href="https://github.com/ynput/OpenPype/pull/6056">#6056</a></summary>

Fixes publishes of a single frame of the `render` product type.

___

</details>

<details>
<summary>Photoshop: fix layer publish thumbnail missing in loader <a href="https://github.com/ynput/OpenPype/pull/6061">#6061</a></summary>

Thumbnails from any products (either `review` or separate layer instances) weren't stored in AYON. This resulted in them not showing in the Loader and Server UI. After this PR thumbnails should be shown in the Loader and on the Server (`http://YOUR_AYON_HOSTNAME:5000/projects/YOUR_PROJECT/browser`).

___

</details>

<details>
<summary>AYON Chore: Do not use thumbnailSource for thumbnail integration <a href="https://github.com/ynput/OpenPype/pull/6063">#6063</a></summary>

Do not use `thumbnailSource` for thumbnail integration.

___

</details>

<details>
<summary>Photoshop: fix creation of .mov <a href="https://github.com/ynput/OpenPype/pull/6064">#6064</a></summary>

Generation of a .mov file with 1 frame per published layer was failing.

___

</details>

<details>
<summary>Photoshop: fix Collect Color Coded settings <a href="https://github.com/ynput/OpenPype/pull/6065">#6065</a></summary>

Fix for a wrong default value in the `Collect Color Coded Instances` settings.

___

</details>

<details>
<summary>Bug: Fix Publisher parent window in Nuke <a href="https://github.com/ynput/OpenPype/pull/6067">#6067</a></summary>

Fixes an issue where the Publisher parent window wasn't set because of wrong use of a version constant.

___

</details>

<details>
<summary>Python console widget: Save registry fix <a href="https://github.com/ynput/OpenPype/pull/6076">#6076</a></summary>

Do not save the registry until there is something to save.

___

</details>

<details>
<summary>Ftrack: update asset names for multiple reviewable items <a href="https://github.com/ynput/OpenPype/pull/6077">#6077</a></summary>

Multiple reviewable AssetVersion components with better grouping by asset version name.

___

</details>

<details>
<summary>Ftrack: DJV action fixes <a href="https://github.com/ynput/OpenPype/pull/6098">#6098</a></summary>

Fix bugs in the DJV ftrack action.

___

</details>

<details>
<summary>AYON Workfiles tool: Fix arrow to timezone typo <a href="https://github.com/ynput/OpenPype/pull/6099">#6099</a></summary>

Fix a parenthesis typo in the arrow local timezone function call.

___

</details>

### **🔀 Refactored code**

<details>
<summary>Chore: Update folder-favorite icon to ayon icon <a href="https://github.com/ynput/OpenPype/pull/5718">#5718</a></summary>

Updates the old "Pype-2.0-era" icon (from ancient Greece times) to the AYON logo equivalent. I believe it's only used in Nuke.

___

</details>

### **Merged pull requests**

<details>
<summary>Chore: Maya / Nuke remove publish gui filters from settings <a href="https://github.com/ynput/OpenPype/pull/5570">#5570</a></summary>

- Remove Publish GUI Filters from Nuke settings
- Remove Publish GUI Filters from Maya settings

___

</details>

<details>
<summary>Fusion: Project/User option for output format (create_saver) <a href="https://github.com/ynput/OpenPype/pull/6045">#6045</a></summary>

Adds an "Output Image Format" option which can be set via project settings and overridden by users in the "Create" menu. This replaces the current behaviour of being hardcoded to "exr", removing the need for people to manually edit the saver path when they require a different extension.

___

</details>

<details>
<summary>Fusion: Output Image Format Updating Instances (create_saver) <a href="https://github.com/ynput/OpenPype/pull/6060">#6060</a></summary>

Adds the ability to update the Saver image output format if it is changed in the Publish UI. ~~Adds an optional validator that compares "Output Image Format" in the Publish menu against the one currently found on the saver. It then offers a repair action to update the output extension on the saver.~~

___

</details>

<details>
<summary>Tests: Fix representation count for AE legacy test <a href="https://github.com/ynput/OpenPype/pull/6072">#6072</a></summary>

___

</details>

## [3.18.1](https://github.com/ynput/OpenPype/tree/3.18.1)
@@ -121,62 +121,6 @@ def get_id_required_nodes():
     return list(nodes)


-def get_export_parameter(node):
-    """Return the export output parameter of the given node
-
-    Example:
-        root = hou.node("/obj")
-        my_alembic_node = root.createNode("alembic")
-        get_output_parameter(my_alembic_node)
-        # Result: "output"
-
-    Args:
-        node(hou.Node): node instance
-
-    Returns:
-        hou.Parm
-
-    """
-    node_type = node.type().description()
-
-    # Ensures the proper Take is selected for each ROP to retrieve the correct
-    # ifd
-    try:
-        rop_take = hou.takes.findTake(node.parm("take").eval())
-        if rop_take is not None:
-            hou.takes.setCurrentTake(rop_take)
-    except AttributeError:
-        # hou object doesn't always have the 'takes' attribute
-        pass
-
-    if node_type == "Mantra" and node.parm("soho_outputmode").eval():
-        return node.parm("soho_diskfile")
-    elif node_type == "Alfred":
-        return node.parm("alf_diskfile")
-    elif (node_type == "RenderMan" or node_type == "RenderMan RIS"):
-        pre_ris22 = node.parm("rib_outputmode") and \
-            node.parm("rib_outputmode").eval()
-        ris22 = node.parm("diskfile") and node.parm("diskfile").eval()
-        if pre_ris22 or ris22:
-            return node.parm("soho_diskfile")
-    elif node_type == "Redshift" and node.parm("RS_archive_enable").eval():
-        return node.parm("RS_archive_file")
-    elif node_type == "Wedge" and node.parm("driver").eval():
-        return get_export_parameter(node.node(node.parm("driver").eval()))
-    elif node_type == "Arnold":
-        return node.parm("ar_ass_file")
-    elif node_type == "Alembic" and node.parm("use_sop_path").eval():
-        return node.parm("sop_path")
-    elif node_type == "Shotgun Mantra" and node.parm("soho_outputmode").eval():
-        return node.parm("sgtk_soho_diskfile")
-    elif node_type == "Shotgun Alembic" and node.parm("use_sop_path").eval():
-        return node.parm("sop_path")
-    elif node.type().nameWithCategory() == "Driver/vray_renderer":
-        return node.parm("render_export_filepath")
-
-    raise TypeError("Node type '%s' not supported" % node_type)
-
-
 def get_output_parameter(node):
     """Return the render output parameter of the given node
@@ -184,41 +128,59 @@ def get_output_parameter(node):
     root = hou.node("/obj")
     my_alembic_node = root.createNode("alembic")
     get_output_parameter(my_alembic_node)
-    # Result: "output"
+    >>> "filename"
+
+    Notes:
+        I'm using node.type().name() to get on par with the creators,
+        Because the return value of `node.type().name()` is the
+        same string value used in creators
+        e.g. instance_data.update({"node_type": "alembic"})
+
+        Rop nodes in different network categories have
+        the same output parameter.
+        So, I took that into consideration as a hint for
+        future development.

     Args:
         node(hou.Node): node instance

     Returns:
         hou.Parm

     """
-    node_type = node.type().description()
-    category = node.type().category().name()
-
-    # Figure out which type of node is being rendered
-    if node_type == "Geometry" or node_type == "Filmbox FBX" or \
-            (node_type == "ROP Output Driver" and category == "Sop"):
-        return node.parm("sopoutput")
-    elif node_type == "Composite":
-        return node.parm("copoutput")
-    elif node_type == "opengl":
-        return node.parm("picture")
-    elif node_type == "Octane":
-        return node.parm("HO_img_fileName")
-    elif node_type == "Fetch":
-        inner_node = node.node(node.parm("source").eval())
-        if inner_node:
-            return get_output_parameter(inner_node)
-    elif node.type().nameWithCategory() == "Driver/vray_renderer":
-        return node.parm("vm_picture")
-    elif node_type == "Redshift_Proxy_Output":
-        return node.parm("RS_archive_file")
+    node_type = node.type().name()
+
+    # Figure out which type of node is being rendered
+    if node_type in {"alembic", "rop_alembic"}:
+        return node.parm("filename")
+    elif node_type == "arnold":
+        if node.evalParm("ar_ass_export_enable"):
+            return node.parm("ar_ass_file")
+        return node.parm("ar_picture")
+    elif node_type in {
+        "geometry",
+        "rop_geometry",
+        "filmboxfbx",
+        "rop_fbx"
+    }:
+        return node.parm("sopoutput")
+    elif node_type == "comp":
+        return node.parm("copoutput")
+    elif node_type in {"karma", "opengl"}:
+        return node.parm("picture")
+    elif node_type == "ifd":  # Mantra
+        if node.evalParm("soho_outputmode"):
+            return node.parm("soho_diskfile")
+        return node.parm("vm_picture")
+    elif node_type == "Redshift_Proxy_Output":
+        return node.parm("RS_archive_file")
+    elif node_type == "Redshift_ROP":
+        return node.parm("RS_outputFileNamePrefix")
+    elif node_type in {"usd", "usd_rop", "usdexport"}:
+        return node.parm("lopoutput")
+    elif node_type in {"usdrender", "usdrender_rop"}:
+        return node.parm("outputimage")
+    elif node_type == "vray_renderer":
+        return node.parm("SettingsOutput_img_file_path")

     raise TypeError("Node type '%s' not supported" % node_type)
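The refactored function above is essentially a lookup from `node.type().name()` to the name of the ROP's output parameter. Outside Houdini, that dispatch can be sketched in plain Python. The dictionary below is an illustrative extract of the mapping (names and structure are an assumption for this sketch, not part of the actual module):

```python
# Illustrative extract of the name-based dispatch in get_output_parameter:
# node type name (as returned by hou.NodeType.name()) -> output parm name.
OUTPUT_PARM_BY_TYPE = {
    "alembic": "filename",
    "rop_alembic": "filename",
    "geometry": "sopoutput",
    "rop_geometry": "sopoutput",
    "filmboxfbx": "sopoutput",
    "rop_fbx": "sopoutput",
    "comp": "copoutput",
    "karma": "picture",
    "opengl": "picture",
    "Redshift_ROP": "RS_outputFileNamePrefix",
    "usd": "lopoutput",
    "usdrender": "outputimage",
    "vray_renderer": "SettingsOutput_img_file_path",
}


def output_parm_name(node_type_name):
    """Return the output parameter name for a ROP type, or raise.

    Mirrors the final `raise TypeError` fallback of the real function.
    """
    try:
        return OUTPUT_PARM_BY_TYPE[node_type_name]
    except KeyError:
        raise TypeError("Node type '%s' not supported" % node_type_name)


print(output_parm_name("rop_alembic"))  # -> filename
```

A table keeps the simple one-to-one cases declarative; the real function still needs explicit branches for conditional cases such as Arnold's `ar_ass_export_enable` toggle.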
@@ -41,11 +41,11 @@ class CollectArnoldROPRenderProducts(pyblish.api.InstancePlugin):
         render_products = []

         # Store whether we are splitting the render job (export + render)
-        export_job = bool(rop.parm("ar_ass_export_enable").eval())
-        instance.data["exportJob"] = export_job
+        split_render = bool(rop.parm("ar_ass_export_enable").eval())
+        instance.data["splitRender"] = split_render
         export_prefix = None
         export_products = []
-        if export_job:
+        if split_render:
             export_prefix = evalParmNoFrame(
                 rop, "ar_ass_file", pad_character="0"
             )
@@ -45,11 +45,11 @@ class CollectMantraROPRenderProducts(pyblish.api.InstancePlugin):
         render_products = []

         # Store whether we are splitting the render job (export + render)
-        export_job = bool(rop.parm("soho_outputmode").eval())
-        instance.data["exportJob"] = export_job
+        split_render = bool(rop.parm("soho_outputmode").eval())
+        instance.data["splitRender"] = split_render
         export_prefix = None
         export_products = []
-        if export_job:
+        if split_render:
             export_prefix = evalParmNoFrame(
                 rop, "soho_diskfile", pad_character="0"
             )
@@ -46,11 +46,11 @@ class CollectVrayROPRenderProducts(pyblish.api.InstancePlugin):
         # TODO: add render elements if render element

         # Store whether we are splitting the render job in an export + render
-        export_job = rop.parm("render_export_mode").eval() == "2"
-        instance.data["exportJob"] = export_job
+        split_render = rop.parm("render_export_mode").eval() == "2"
+        instance.data["splitRender"] = split_render
         export_prefix = None
         export_products = []
-        if export_job:
+        if split_render:
             export_prefix = evalParmNoFrame(
                 rop, "render_export_filepath", pad_character="0"
             )
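All three collectors follow the same pattern: evaluate the ROP's export toggle, store it under the renamed `splitRender` key, and let downstream consumers read it with `.get()` so instances that never set the key behave as plain render jobs. A stripped-down sketch of that contract, with plain dicts standing in for pyblish instances (an assumption for illustration only):

```python
def collect_split_render(instance_data, export_enabled):
    """Store whether the render job is split into export + render jobs."""
    instance_data["splitRender"] = bool(export_enabled)


def is_split_render(instance_data):
    # Reading with .get() keeps instances that never stored the key
    # (e.g. collected by older code) working: they are treated as
    # plain, non-split render jobs.
    return bool(instance_data.get("splitRender"))


instance = {}
collect_split_render(instance, export_enabled=1)
print(is_split_render(instance))  # -> True
print(is_split_render({}))        # -> False (no key at all)
```

This is why the submitter side of the diff switches from `instance.data["exportJob"]` to `instance.data.get("splitRender")`: a missing key no longer raises.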
@@ -6,6 +6,7 @@ from maya import cmds
 import pyblish.api

 from openpype.hosts.maya.api import lib
+from openpype.pipeline.publish import KnownPublishError


 SETTINGS = {"renderDensity",
@@ -116,7 +117,6 @@ class CollectYetiRig(pyblish.api.InstancePlugin):
         resources = []

         image_search_paths = cmds.getAttr("{}.imageSearchPath".format(node))
-        texture_filenames = []
         if image_search_paths:

             # TODO: Somehow this uses OS environment path separator, `:` vs `;`
@@ -127,9 +127,16 @@ class CollectYetiRig(pyblish.api.InstancePlugin):
             # find all ${TOKEN} tokens and replace them with $TOKEN env. variable
             image_search_paths = self._replace_tokens(image_search_paths)

-        # List all related textures
-        texture_filenames = cmds.pgYetiCommand(node, listTextures=True)
-        self.log.debug("Found %i texture(s)" % len(texture_filenames))
+        # List all related textures
+        texture_nodes = cmds.pgYetiGraph(
+            node, listNodes=True, type="texture")
+        texture_filenames = [
+            cmds.pgYetiGraph(
+                node, node=texture_node,
+                param="file_name", getParamValue=True)
+            for texture_node in texture_nodes
+        ]
+        self.log.debug("Found %i texture(s)" % len(texture_filenames))

         # Get all reference nodes
         reference_nodes = cmds.pgYetiGraph(node,
@@ -137,11 +144,6 @@ class CollectYetiRig(pyblish.api.InstancePlugin):
                                            type="reference")
         self.log.debug("Found %i reference node(s)" % len(reference_nodes))

-        if texture_filenames and not image_search_paths:
-            raise ValueError("pgYetiMaya node '%s' is missing the path to the "
-                             "files in the 'imageSearchPath "
-                             "atttribute'" % node)
-
         # Collect all texture files
         # find all ${TOKEN} tokens and replace them with $TOKEN env. variable
         texture_filenames = self._replace_tokens(texture_filenames)
@@ -161,7 +163,7 @@ class CollectYetiRig(pyblish.api.InstancePlugin):
                 break

         if not files:
-            self.log.warning(
+            raise KnownPublishError(
                 "No texture found for: %s "
                 "(searched: %s)" % (texture, image_search_paths))
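The `_replace_tokens` helper referenced in the hunks above is not shown in this diff; only its described behaviour (find all `${TOKEN}` tokens and replace them with the `$TOKEN` environment variable) appears in the comments. A plausible minimal implementation of that expansion, written as a free function for illustration (the body is an assumption, not the actual helper):

```python
import os
import re


def replace_tokens(paths):
    """Expand ${TOKEN} occurrences from the environment in each path.

    Tokens without a matching environment variable are left untouched,
    so the caller can still log or validate the unresolved path.
    """
    result = []
    for path in paths:
        for token in re.findall(r"\$\{(\w+)\}", path):
            value = os.environ.get(token)
            if value is not None:
                path = path.replace("${%s}" % token, value)
        result.append(path)
    return result


os.environ["YETI_HOME"] = "/mnt/yeti"
print(replace_tokens(["${YETI_HOME}/textures/fur.png"]))
# -> ['/mnt/yeti/textures/fur.png']
```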
@@ -6,9 +6,11 @@ from maya import cmds

 from openpype.hosts.maya.api.lib import maintained_selection
 from openpype.pipeline import AVALON_CONTAINER_ID, publish
+from openpype.pipeline.publish import OpenPypePyblishPluginMixin
+from openpype.lib import BoolDef


-class ExtractMayaSceneRaw(publish.Extractor):
+class ExtractMayaSceneRaw(publish.Extractor, OpenPypePyblishPluginMixin):
     """Extract as Maya Scene (raw).

     This will preserve all references, construction history, etc.
@@ -23,6 +25,22 @@ class ExtractMayaSceneRaw(publish.Extractor):
                 "camerarig"]
     scene_type = "ma"

+    @classmethod
+    def get_attribute_defs(cls):
+        return [
+            BoolDef(
+                "preserve_references",
+                label="Preserve References",
+                tooltip=(
+                    "When enabled references will still be references "
+                    "in the published file.\nWhen disabled the references "
+                    "are imported into the published file generating a "
+                    "file without references."
+                ),
+                default=True
+            )
+        ]
+
     def process(self, instance):
         """Plugin entry point."""
         ext_mapping = (
@@ -64,13 +82,18 @@ class ExtractMayaSceneRaw(publish.Extractor):

         # Perform extraction
         self.log.debug("Performing extraction ...")
+        attribute_values = self.get_attr_values_from_data(
+            instance.data
+        )
         with maintained_selection():
             cmds.select(selection, noExpand=True)
             cmds.file(path,
                       force=True,
                       typ="mayaAscii" if self.scene_type == "ma" else "mayaBinary",  # noqa: E501
                       exportSelected=True,
-                      preserveReferences=True,
+                      preserveReferences=attribute_values[
+                          "preserve_references"
+                      ],
                       constructionHistory=True,
                       shader=True,
                       constraints=True,
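The round trip in this extractor is: a `BoolDef` declares an artist-facing option with a default, the publisher stores the chosen value in `instance.data`, and `get_attr_values_from_data` reads it back at extract time. A reduced sketch of that flow with plain classes standing in for the OpenPype mixin (a simplification: the real mixin namespaces stored values per plugin, which this sketch omits):

```python
class BoolDef:
    # Simplified stand-in for openpype.lib.BoolDef: key + default only.
    def __init__(self, key, default, label=None, tooltip=None):
        self.key = key
        self.default = default


class AttributeDefsMixin:
    @classmethod
    def get_attribute_defs(cls):
        return []

    @classmethod
    def get_attr_values_from_data(cls, data):
        # Merge values stored by the publisher over declared defaults.
        values = {d.key: d.default for d in cls.get_attribute_defs()}
        values.update(data.get("publish_attributes", {}))
        return values


class ExtractSceneSketch(AttributeDefsMixin):
    @classmethod
    def get_attribute_defs(cls):
        return [BoolDef("preserve_references", default=True)]


# Default applies when the artist changed nothing:
print(ExtractSceneSketch.get_attr_values_from_data({}))
# An explicit publisher value wins over the default:
data = {"publish_attributes": {"preserve_references": False}}
print(ExtractSceneSketch.get_attr_values_from_data(data))
```

Defaulting to `True` keeps the previous hardcoded `preserveReferences=True` behaviour for anyone who never touches the new option.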
@@ -12,7 +12,7 @@ def set_context_favorites(favorites=None):
         favorites (dict): couples of {name:path}
     """
     favorites = favorites or {}
-    icon_path = resources.get_resource("icons", "folder-favorite3.png")
+    icon_path = resources.get_resource("icons", "folder-favorite.png")
     for name, path in favorites.items():
         nuke.addFavoriteDir(
             name,
@@ -34,6 +34,11 @@ class ExtractReviewIntermediates(publish.Extractor):
         nuke_publish = project_settings["nuke"]["publish"]
         deprecated_setting = nuke_publish["ExtractReviewDataMov"]
         current_setting = nuke_publish.get("ExtractReviewIntermediates")
+        if not deprecated_setting["enabled"] and (
+            not current_setting["enabled"]
+        ):
+            cls.enabled = False
+
         if deprecated_setting["enabled"]:
             # Use deprecated settings if they are still enabled
             cls.viewer_lut_raw = deprecated_setting["viewer_lut_raw"]
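The intent of the added guard can be captured as a small pure function: the plugin stays enabled if either the deprecated `ExtractReviewDataMov` settings or the current `ExtractReviewIntermediates` settings are enabled, and is disabled only when both are off. A sketch, with the dict layout assumed from the snippet above:

```python
def resolve_enabled(nuke_publish):
    """Disable only when both the deprecated and current settings are off."""
    deprecated = nuke_publish["ExtractReviewDataMov"]
    # .get() tolerates settings bundles that predate the new key.
    current = nuke_publish.get("ExtractReviewIntermediates") or {}
    return bool(deprecated["enabled"] or current.get("enabled"))


settings = {
    "ExtractReviewDataMov": {"enabled": False},
    "ExtractReviewIntermediates": {"enabled": False},
}
print(resolve_enabled(settings))  # -> False (both disabled)

settings["ExtractReviewIntermediates"]["enabled"] = True
print(resolve_enabled(settings))  # -> True
```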
@@ -15,7 +15,7 @@ class ValidateFilePath(pyblish.api.InstancePlugin):
     This is primarily created for Simple Creator instances.
     """

-    label = "Validate Workfile"
+    label = "Validate Filepaths"
     order = pyblish.api.ValidatorOrder - 0.49

     hosts = ["traypublisher"]
@@ -464,7 +464,7 @@ class AbstractSubmitDeadline(pyblish.api.InstancePlugin,
         self.log.info("Submitted job to Deadline: {}.".format(job_id))

         # TODO: Find a way that's more generic and not render type specific
-        if "exportJob" in instance.data:
+        if instance.data.get("splitRender"):
             self.log.info("Splitting export and render in two jobs")
             self.log.info("Export job id: %s", job_id)
             render_job_info = self.get_job_info(dependency_job_ids=[job_id])
@@ -124,7 +124,7 @@ class HoudiniSubmitDeadline(

         # Whether Deadline render submission is being split in two
         # (extract + render)
-        split_render_job = instance.data["exportJob"]
+        split_render_job = instance.data.get("splitRender")

         # If there's some dependency job ids we can assume this is a render job
         # and not an export job
@@ -132,18 +132,21 @@ class HoudiniSubmitDeadline(
         if dependency_job_ids:
             is_export_job = False

+        job_type = "[RENDER]"
         if split_render_job and not is_export_job:
             # Convert from family to Deadline plugin name
             # i.e., arnold_rop -> Arnold
             plugin = instance.data["family"].replace("_rop", "").capitalize()
         else:
             plugin = "Houdini"
+            if split_render_job:
+                job_type = "[EXPORT IFD]"

         job_info = DeadlineJobInfo(Plugin=plugin)

         filepath = context.data["currentFile"]
         filename = os.path.basename(filepath)
-        job_info.Name = "{} - {}".format(filename, instance.name)
+        job_info.Name = "{} - {} {}".format(filename, instance.name, job_type)
         job_info.BatchName = filename

         job_info.UserName = context.data.get(
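With split submissions, the export and render passes now get distinguishable Deadline job names. The naming rule added above can be sketched as a pure function (the `[EXPORT IFD]` tag applies exactly when the submission is split and this call is building the export job, which is equivalent to the nested condition in the diff):

```python
def deadline_job_name(filename, instance_name, split_render, is_export_job):
    """Build the job name with a job-type tag.

    Renders are tagged [RENDER]; when the submission is split in two,
    the export pass is tagged [EXPORT IFD] instead.
    """
    job_type = "[RENDER]"
    # Equivalent to the diff's else-branch: job_type flips only when
    # the job is both part of a split submission and the export pass.
    if split_render and is_export_job:
        job_type = "[EXPORT IFD]"
    return "{} - {} {}".format(filename, instance_name, job_type)


print(deadline_job_name("shot.hip", "karmaMain", False, False))
# -> shot.hip - karmaMain [RENDER]
print(deadline_job_name("shot.hip", "karmaMain", True, True))
# -> shot.hip - karmaMain [EXPORT IFD]
```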
@@ -231,7 +231,7 @@ class MayaSubmitDeadline(abstract_submit_deadline.AbstractSubmitDeadline,
         job_info.EnvironmentKeyValue["OPENPYPE_LOG_NO_COLORS"] = "1"

         # Adding file dependencies.
-        if self.asset_dependencies:
+        if not bool(os.environ.get("IS_TEST")) and self.asset_dependencies:
             dependencies = instance.context.data["fileDependencies"]
             for dependency in dependencies:
                 job_info.AssetDependency += dependency
@@ -570,7 +570,7 @@ class MayaSubmitDeadline(abstract_submit_deadline.AbstractSubmitDeadline,

         job_info = copy.deepcopy(self.job_info)

-        if self.asset_dependencies:
+        if not bool(os.environ.get("IS_TEST")) and self.asset_dependencies:
             # Asset dependency to wait for at least the scene file to sync.
             job_info.AssetDependency += self.scene_path
@@ -89,7 +89,7 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin,

     """

-    label = "Submit image sequence jobs to Deadline or Muster"
+    label = "Submit Image Publishing job to Deadline"
     order = pyblish.api.IntegratorOrder + 0.2
     icon = "tractor"
@@ -297,7 +297,9 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin,
                     job_index)] = assembly_id  # noqa: E501
                 job_index += 1
         elif instance.data.get("bakingSubmissionJobs"):
-            self.log.info("Adding baking submission jobs as dependencies...")
+            self.log.info(
+                "Adding baking submission jobs as dependencies..."
+            )
             job_index = 0
             for assembly_id in instance.data["bakingSubmissionJobs"]:
                 payload["JobInfo"]["JobDependency{}".format(
@@ -582,16 +584,7 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin,

         '''

-        render_job = None
-        submission_type = ""
-        if instance.data.get("toBeRenderedOn") == "deadline":
-            render_job = instance.data.pop("deadlineSubmissionJob", None)
-            submission_type = "deadline"
-
-        if instance.data.get("toBeRenderedOn") == "muster":
-            render_job = instance.data.pop("musterSubmissionJob", None)
-            submission_type = "muster"
-
+        render_job = instance.data.pop("deadlineSubmissionJob", None)
         if not render_job and instance.data.get("tileRendering") is False:
             raise AssertionError(("Cannot continue without valid Deadline "
                                   "or Muster submission."))
@@ -624,21 +617,19 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin,
             "FTRACK_SERVER": os.environ.get("FTRACK_SERVER"),
         }

-        deadline_publish_job_id = None
-        if submission_type == "deadline":
-            # get default deadline webservice url from deadline module
-            self.deadline_url = instance.context.data["defaultDeadline"]
-            # if custom one is set in instance, use that
-            if instance.data.get("deadlineUrl"):
-                self.deadline_url = instance.data.get("deadlineUrl")
-            assert self.deadline_url, "Requires Deadline Webservice URL"
+        # get default deadline webservice url from deadline module
+        self.deadline_url = instance.context.data["defaultDeadline"]
+        # if custom one is set in instance, use that
+        if instance.data.get("deadlineUrl"):
+            self.deadline_url = instance.data.get("deadlineUrl")
+        assert self.deadline_url, "Requires Deadline Webservice URL"

-            deadline_publish_job_id = \
-                self._submit_deadline_post_job(instance, render_job, instances)
+        deadline_publish_job_id = \
+            self._submit_deadline_post_job(instance, render_job, instances)

-            # Inject deadline url to instances.
-            for inst in instances:
-                inst["deadlineUrl"] = self.deadline_url
+        # Inject deadline url to instances.
+        for inst in instances:
+            inst["deadlineUrl"] = self.deadline_url

         # publish job file
         publish_job = {
@@ -664,15 +655,6 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin,
         if audio_file and os.path.isfile(audio_file):
             publish_job.update({"audio": audio_file})

-        # pass Ftrack credentials in case of Muster
-        if submission_type == "muster":
-            ftrack = {
-                "FTRACK_API_USER": os.environ.get("FTRACK_API_USER"),
-                "FTRACK_API_KEY": os.environ.get("FTRACK_API_KEY"),
-                "FTRACK_SERVER": os.environ.get("FTRACK_SERVER"),
-            }
-            publish_job.update({"ftrack": ftrack})
-
         metadata_path, rootless_metadata_path = \
             create_metadata_path(instance, anatomy)
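After dropping the Muster branch, looking up the render submission collapses to a single `pop` plus an early failure, as the hunks above show. The behaviour can be sketched in isolation (plain dicts standing in for pyblish instance data, an assumption for illustration):

```python
def pop_render_job(instance_data):
    """Deadline is now the only farm: pop its submission or fail early.

    Tile-rendering instances are allowed to continue without a
    submission job, mirroring the `tileRendering` check above.
    """
    render_job = instance_data.pop("deadlineSubmissionJob", None)
    if not render_job and instance_data.get("tileRendering") is False:
        raise AssertionError(
            "Cannot continue without valid Deadline submission.")
    return render_job


data = {"deadlineSubmissionJob": {"_id": "job-1"}, "tileRendering": False}
print(pop_render_job(data))  # -> {'_id': 'job-1'}
print("deadlineSubmissionJob" in data)  # -> False (consumed by pop)
```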
@@ -13,7 +13,7 @@ class DJVViewAction(BaseAction):
     description = "DJV View Launcher"
     icon = statics_icon("app_icons", "djvView.png")

-    type = 'Application'
+    type = "Application"

     allowed_types = [
         "cin", "dpx", "avi", "dv", "gif", "flv", "mkv", "mov", "mpg", "mpeg",
@@ -60,7 +60,7 @@ class DJVViewAction(BaseAction):
         return False

     def interface(self, session, entities, event):
-        if event['data'].get('values', {}):
+        if event["data"].get("values", {}):
             return

         entity = entities[0]
@@ -70,32 +70,32 @@ class DJVViewAction(BaseAction):
         if entity_type == "assetversion":
             if (
                 entity[
-                    'components'
-                ][0]['file_type'][1:] in self.allowed_types
+                    "components"
+                ][0]["file_type"][1:] in self.allowed_types
             ):
                 versions.append(entity)
         else:
             master_entity = entity
             if entity_type == "task":
-                master_entity = entity['parent']
+                master_entity = entity["parent"]

-            for asset in master_entity['assets']:
-                for version in asset['versions']:
+            for asset in master_entity["assets"]:
+                for version in asset["versions"]:
                     # Get only AssetVersion of selected task
                     if (
                         entity_type == "task" and
-                        version['task']['id'] != entity['id']
+                        version["task"]["id"] != entity["id"]
                     ):
                         continue
                     # Get only components with allowed type
-                    filetype = version['components'][0]['file_type']
+                    filetype = version["components"][0]["file_type"]
                     if filetype[1:] in self.allowed_types:
                         versions.append(version)

         if len(versions) < 1:
             return {
-                'success': False,
-                'message': 'There are no Asset Versions to open.'
+                "success": False,
+                "message": "There are no Asset Versions to open."
             }

         # TODO sort them (somehow?)
@@ -134,68 +134,68 @@ class DJVViewAction(BaseAction):
         last_available = None
         select_value = None
         for version in versions:
-            for component in version['components']:
+            for component in version["components"]:
                 label = base_label.format(
-                    str(version['version']).zfill(3),
-                    version['asset']['type']['name'],
-                    component['name']
+                    str(version["version"]).zfill(3),
+                    version["asset"]["type"]["name"],
+                    component["name"]
                 )
 
                 try:
                     location = component[
-                        'component_locations'
-                    ][0]['location']
+                        "component_locations"
+                    ][0]["location"]
                     file_path = location.get_filesystem_path(component)
                 except Exception:
                     file_path = component[
-                        'component_locations'
-                    ][0]['resource_identifier']
+                        "component_locations"
+                    ][0]["resource_identifier"]
 
                 if os.path.isdir(os.path.dirname(file_path)):
                     last_available = file_path
-                    if component['name'] == default_component:
+                    if component["name"] == default_component:
                         select_value = file_path
                     version_items.append(
-                        {'label': label, 'value': file_path}
+                        {"label": label, "value": file_path}
                     )
 
         if len(version_items) == 0:
             return {
-                'success': False,
-                'message': (
-                    'There are no Asset Versions with accessible path.'
+                "success": False,
+                "message": (
+                    "There are no Asset Versions with accessible path."
                 )
             }
 
         item = {
-            'label': 'Items to view',
-            'type': 'enumerator',
-            'name': 'path',
-            'data': sorted(
+            "label": "Items to view",
+            "type": "enumerator",
+            "name": "path",
+            "data": sorted(
                 version_items,
-                key=itemgetter('label'),
+                key=itemgetter("label"),
                 reverse=True
             )
         }
         if select_value is not None:
-            item['value'] = select_value
+            item["value"] = select_value
         else:
-            item['value'] = last_available
+            item["value"] = last_available
 
         items.append(item)
 
-        return {'items': items}
+        return {"items": items}
 
     def launch(self, session, entities, event):
         """Callback method for DJVView action."""
 
         # Launching application
-        event_data = event["data"]
-        if "values" not in event_data:
+        event_values = event["data"].get("values")
+        if not event_values:
             return
 
-        djv_app_name = event_data["djv_app_name"]
-        app = self.applicaion_manager.applications.get(djv_app_name)
+        djv_app_name = event_values["djv_app_name"]
+        app = self.application_manager.applications.get(djv_app_name)
         executable = None
         if app is not None:
             executable = app.find_executable()
@@ -206,18 +206,21 @@ class DJVViewAction(BaseAction):
                 "message": "Couldn't find DJV executable."
             }
 
-        filpath = os.path.normpath(event_data["values"]["path"])
+        filpath = os.path.normpath(event_values["path"])
 
         cmd = [
             # DJV path
-            executable,
+            str(executable),
             # PATH TO COMPONENT
             filpath
         ]
 
         try:
             # Run DJV with these commands
-            subprocess.Popen(cmd)
+            _process = subprocess.Popen(cmd)
+            # Keep process in memory for some time
+            time.sleep(0.1)
+
         except FileNotFoundError:
             return {
                 "success": False,
@@ -352,7 +352,8 @@ class IntegrateFtrackInstance(pyblish.api.InstancePlugin):
 
             # add extended name if any
             if (
-                not self.keep_first_subset_name_for_review
+                multiple_reviewable
+                and not self.keep_first_subset_name_for_review
                 and extended_asset_name
             ):
                 other_item["asset_data"]["name"] = extended_asset_name
@@ -354,7 +354,7 @@ class PythonInterpreterWidget(QtWidgets.QWidget):
     default_width = 1000
     default_height = 600
 
-    def __init__(self, parent=None):
+    def __init__(self, allow_save_registry=True, parent=None):
         super(PythonInterpreterWidget, self).__init__(parent)
 
         self.setWindowTitle("{} Console".format(

@@ -414,6 +414,8 @@ class PythonInterpreterWidget(QtWidgets.QWidget):
 
         self._first_show = True
         self._splitter_size_ratio = None
+        self._allow_save_registry = allow_save_registry
+        self._registry_saved = True
 
         self._init_from_registry()
 

@@ -457,6 +459,11 @@ class PythonInterpreterWidget(QtWidgets.QWidget):
             pass
 
     def save_registry(self):
+        # Window was not showed
+        if not self._allow_save_registry or self._registry_saved:
+            return
+
+        self._registry_saved = True
         setting_registry = PythonInterpreterRegistry()
 
         setting_registry.set_item("width", self.width())

@@ -650,6 +657,7 @@ class PythonInterpreterWidget(QtWidgets.QWidget):
 
     def showEvent(self, event):
        self._line_check_timer.start()
+        self._registry_saved = False
         super(PythonInterpreterWidget, self).showEvent(event)
         # First show setup
         if self._first_show:
Binary file not shown. (Before: 6.8 KiB, After: 9.8 KiB)
Binary file not shown. (Before: 22 KiB)
Binary file not shown. (Before: 7.8 KiB)
@@ -21,7 +21,7 @@ Providing functionality:
 
 import click
 import json
-from pathlib2 import Path
+from pathlib import Path
 import PyOpenColorIO as ocio
 
@@ -478,15 +478,6 @@ def _convert_maya_project_settings(ayon_settings, output):
         for item in ayon_maya["ext_mapping"]
     }
 
-    # Publish UI filters
-    new_filters = {}
-    for item in ayon_maya["filters"]:
-        new_filters[item["name"]] = {
-            subitem["name"]: subitem["value"]
-            for subitem in item["value"]
-        }
-    ayon_maya["filters"] = new_filters
-
     # Maya dirmap
     ayon_maya_dirmap = ayon_maya.pop("maya_dirmap")
     ayon_maya_dirmap_path = ayon_maya_dirmap["paths"]
@@ -743,16 +734,6 @@ def _convert_nuke_project_settings(ayon_settings, output):
         dirmap["paths"][dst_key] = dirmap["paths"].pop(src_key)
     ayon_nuke["nuke-dirmap"] = dirmap
 
-    # --- Filters ---
-    new_gui_filters = {}
-    for item in ayon_nuke.pop("filters"):
-        subvalue = {}
-        key = item["name"]
-        for subitem in item["value"]:
-            subvalue[subitem["name"]] = subitem["value"]
-        new_gui_filters[key] = subvalue
-    ayon_nuke["filters"] = new_gui_filters
-
     # --- Load ---
     ayon_load = ayon_nuke["load"]
     ayon_load["LoadClip"]["_representations"] = (
@@ -896,7 +877,7 @@ def _convert_hiero_project_settings(ayon_settings, output):
     _convert_host_imageio(ayon_hiero)
 
     new_gui_filters = {}
-    for item in ayon_hiero.pop("filters"):
+    for item in ayon_hiero.pop("filters", []):
         subvalue = {}
         key = item["name"]
         for subitem in item["value"]:
@@ -963,17 +944,6 @@ def _convert_tvpaint_project_settings(ayon_settings, output):
 
     _convert_host_imageio(ayon_tvpaint)
 
-    filters = {}
-    for item in ayon_tvpaint["filters"]:
-        value = item["value"]
-        try:
-            value = json.loads(value)
-
-        except ValueError:
-            value = {}
-        filters[item["name"]] = value
-    ayon_tvpaint["filters"] = filters
-
     ayon_publish_settings = ayon_tvpaint["publish"]
     for plugin_name in (
         "ValidateProjectSettings",
@@ -1609,14 +1609,5 @@
     },
     "templated_workfile_build": {
         "profiles": []
-    },
-    "filters": {
-        "preset 1": {
-            "ValidateNoAnimation": false,
-            "ValidateShapeDefaultNames": false
-        },
-        "preset 2": {
-            "ValidateNoAnimation": false
-        }
     }
 }
@@ -540,6 +540,5 @@
     },
     "templated_workfile_build": {
         "profiles": []
-    },
-    "filters": {}
+    }
 }
@@ -107,6 +107,5 @@
     "workfile_builder": {
         "create_first_version": false,
         "custom_templates": []
-    },
-    "filters": {}
+    }
 }
@@ -258,10 +258,6 @@
         {
             "type": "schema",
             "name": "schema_templated_workfile_build"
-        },
-        {
-            "type": "schema",
-            "name": "schema_publish_gui_filter"
         }
     ]
 }
@@ -291,10 +291,6 @@
         {
             "type": "schema",
             "name": "schema_templated_workfile_build"
-        },
-        {
-            "type": "schema",
-            "name": "schema_publish_gui_filter"
         }
     ]
 }
@@ -436,10 +436,6 @@
             "workfile_builder/builder_on_start",
             "workfile_builder/profiles"
         ]
-    },
-    {
-        "type": "schema",
-        "name": "schema_publish_gui_filter"
     }
 ]
 }
@@ -606,7 +606,7 @@ class PublishWorkfilesModel:
             print("Failed to format workfile path: {}".format(exc))
 
         dirpath, filename = os.path.split(workfile_path)
-        created_at = arrow.get(repre_entity["createdAt"].to("local"))
+        created_at = arrow.get(repre_entity["createdAt"]).to("local")
         return FileItem(
             dirpath,
             filename,
@@ -1,3 +1,3 @@
 # -*- coding: utf-8 -*-
 """Package declaring Pype version."""
-__version__ = "3.18.2-nightly.2"
+__version__ = "3.18.2"
@@ -1,6 +1,6 @@
 [tool.poetry]
 name = "OpenPype"
-version = "3.18.1" # OpenPype
+version = "3.18.2" # OpenPype
 description = "Open VFX and Animation pipeline with support."
 authors = ["OpenPype Team <info@openpype.io>"]
 license = "MIT License"
@@ -23,23 +23,6 @@ class ExtMappingItemModel(BaseSettingsModel):
     value: str = Field(title="Extension")
 
 
-class PublishGUIFilterItemModel(BaseSettingsModel):
-    _layout = "compact"
-    name: str = Field(title="Name")
-    value: bool = Field(True, title="Active")
-
-
-class PublishGUIFiltersModel(BaseSettingsModel):
-    _layout = "compact"
-    name: str = Field(title="Name")
-    value: list[PublishGUIFilterItemModel] = Field(default_factory=list)
-
-    @validator("value")
-    def validate_unique_outputs(cls, value):
-        ensure_unique_names(value)
-        return value
-
-
 class MayaSettings(BaseSettingsModel):
     """Maya Project Settings."""
 
@@ -76,11 +59,8 @@ class MayaSettings(BaseSettingsModel):
     templated_workfile_build: TemplatedProfilesModel = Field(
         default_factory=TemplatedProfilesModel,
         title="Templated Workfile Build Settings")
-    filters: list[PublishGUIFiltersModel] = Field(
-        default_factory=list,
-        title="Publish GUI Filters")
 
-    @validator("filters", "ext_mapping")
+    @validator("ext_mapping")
     def validate_unique_outputs(cls, value):
         ensure_unique_names(value)
         return value
@@ -123,20 +103,5 @@ DEFAULT_MAYA_SETTING = {
     "publish": DEFAULT_PUBLISH_SETTINGS,
     "load": DEFAULT_LOADERS_SETTING,
     "workfile_build": DEFAULT_WORKFILE_SETTING,
-    "templated_workfile_build": DEFAULT_TEMPLATED_WORKFILE_SETTINGS,
-    "filters": [
-        {
-            "name": "preset 1",
-            "value": [
-                {"name": "ValidateNoAnimation", "value": False},
-                {"name": "ValidateShapeDefaultNames", "value": False},
-            ]
-        },
-        {
-            "name": "preset 2",
-            "value": [
-                {"name": "ValidateNoAnimation", "value": False},
-            ]
-        },
-    ]
+    "templated_workfile_build": DEFAULT_TEMPLATED_WORKFILE_SETTINGS
 }
@@ -1,19 +0,0 @@
-from pydantic import Field, validator
-from ayon_server.settings import BaseSettingsModel, ensure_unique_names
-
-
-class PublishGUIFilterItemModel(BaseSettingsModel):
-    _layout = "compact"
-    name: str = Field(title="Name")
-    value: bool = Field(True, title="Active")
-
-
-class PublishGUIFiltersModel(BaseSettingsModel):
-    _layout = "compact"
-    name: str = Field(title="Name")
-    value: list[PublishGUIFilterItemModel] = Field(default_factory=list)
-
-    @validator("value")
-    def validate_unique_outputs(cls, value):
-        ensure_unique_names(value)
-        return value
@@ -44,7 +44,6 @@ from .workfile_builder import (
 from .templated_workfile_build import (
     TemplatedWorkfileBuildModel
 )
-from .filters import PublishGUIFilterItemModel
 
 
 class NukeSettings(BaseSettingsModel):
@@ -98,16 +97,6 @@ class NukeSettings(BaseSettingsModel):
         default_factory=TemplatedWorkfileBuildModel
     )
 
-    filters: list[PublishGUIFilterItemModel] = Field(
-        default_factory=list
-    )
-
-    @validator("filters")
-    def ensure_unique_names(cls, value):
-        """Ensure name fields within the lists have unique names."""
-        ensure_unique_names(value)
-        return value
-
 
 DEFAULT_VALUES = {
     "general": DEFAULT_GENERAL_SETTINGS,
@@ -121,6 +110,5 @@ DEFAULT_VALUES = {
     "workfile_builder": DEFAULT_WORKFILE_BUILDER_SETTINGS,
     "templated_workfile_build": {
         "profiles": []
-    },
-    "filters": []
+    }
 }
@@ -1 +1 @@
-__version__ = "0.1.7"
+__version__ = "0.1.8"
@@ -8,17 +8,11 @@ aiohttp_json_rpc = "*" # TVPaint server
 aiohttp-middlewares = "^2.0.0"
 wsrpc_aiohttp = "^3.1.1" # websocket server
 clique = "1.6.*"
-gazu = "^0.9.3"
-google-api-python-client = "^1.12.8" # sync server google support (should be separate?)
-jsonschema = "^2.6.0"
-pymongo = "^3.11.2"
-log4mongo = "^1.7"
-pathlib2= "^2.3.5" # deadline submit publish job only (single place, maybe not needed?)
 pyblish-base = "^1.8.11"
-pynput = "^1.7.2" # Timers manager - TODO replace
+pynput = "^1.7.2" # Timers manager - TODO remove
 "Qt.py" = "^1.3.3"
 qtawesome = "0.7.3"
 speedcopy = "^2.1"
 slack-sdk = "^3.6.0"
 pysftp = "^0.2.9"
 dropbox = "^11.20.0"
@@ -1,4 +1,4 @@
-from pydantic import Field, validator
+from pydantic import Field
 from ayon_server.settings import (
     BaseSettingsModel,
     ensure_unique_names,
@@ -14,23 +14,6 @@ from .publish_plugins import (
 )
 
 
-class PublishGUIFilterItemModel(BaseSettingsModel):
-    _layout = "compact"
-    name: str = Field(title="Name")
-    value: bool = Field(True, title="Active")
-
-
-class PublishGUIFiltersModel(BaseSettingsModel):
-    _layout = "compact"
-    name: str = Field(title="Name")
-    value: list[PublishGUIFilterItemModel] = Field(default_factory=list)
-
-    @validator("value")
-    def validate_unique_outputs(cls, value):
-        ensure_unique_names(value)
-        return value
-
-
 class TvpaintSettings(BaseSettingsModel):
     imageio: TVPaintImageIOModel = Field(
         default_factory=TVPaintImageIOModel,
@@ -52,14 +35,6 @@ class TvpaintSettings(BaseSettingsModel):
         default_factory=WorkfileBuilderPlugin,
         title="Workfile Builder"
     )
-    filters: list[PublishGUIFiltersModel] = Field(
-        default_factory=list,
-        title="Publish GUI Filters")
-
-    @validator("filters")
-    def validate_unique_outputs(cls, value):
-        ensure_unique_names(value)
-        return value
 
 
 DEFAULT_VALUES = {
@@ -1 +1 @@
-__version__ = "0.1.0"
+__version__ = "0.1.1"
@@ -60,7 +60,7 @@ class TestPublishInAfterEffects(AELocalPublishTestClass):
                 name="renderTest_taskMain"))
 
         failures.append(
-            DBAssert.count_of_types(dbcon, "representation", 2))
+            DBAssert.count_of_types(dbcon, "representation", 3))
 
         additional_args = {"context.subset": "workfileTest_task",
                            "context.ext": "aep"}
@@ -185,7 +185,7 @@ createNode objectSet -n "modelMain";
 	addAttr -ci true -sn "attrPrefix" -ln "attrPrefix" -dt "string";
 	addAttr -ci true -sn "publish_attributes" -ln "publish_attributes" -dt "string";
 	addAttr -ci true -sn "creator_attributes" -ln "creator_attributes" -dt "string";
-	addAttr -ci true -sn "__creator_attributes_keys" -ln "__creator_attributes_keys"
+	addAttr -ci true -sn "__creator_attributes_keys" -ln "__creator_attributes_keys"
 		-dt "string";
 	setAttr ".ihi" 0;
 	setAttr ".cbId" -type "string" "60df31e2be2b48bd3695c056:7364ea6776c9";

@@ -296,7 +296,7 @@ createNode objectSet -n "workfileMain";
 	addAttr -ci true -sn "task" -ln "task" -dt "string";
 	addAttr -ci true -sn "publish_attributes" -ln "publish_attributes" -dt "string";
 	addAttr -ci true -sn "creator_attributes" -ln "creator_attributes" -dt "string";
-	addAttr -ci true -sn "__creator_attributes_keys" -ln "__creator_attributes_keys"
+	addAttr -ci true -sn "__creator_attributes_keys" -ln "__creator_attributes_keys"
 		-dt "string";
 	setAttr ".ihi" 0;
 	setAttr ".hio" yes;

@@ -185,7 +185,7 @@ createNode objectSet -n "modelMain";
 	addAttr -ci true -sn "attrPrefix" -ln "attrPrefix" -dt "string";
 	addAttr -ci true -sn "publish_attributes" -ln "publish_attributes" -dt "string";
 	addAttr -ci true -sn "creator_attributes" -ln "creator_attributes" -dt "string";
-	addAttr -ci true -sn "__creator_attributes_keys" -ln "__creator_attributes_keys"
+	addAttr -ci true -sn "__creator_attributes_keys" -ln "__creator_attributes_keys"
 		-dt "string";
 	setAttr ".ihi" 0;
 	setAttr ".cbId" -type "string" "60df31e2be2b48bd3695c056:7364ea6776c9";

@@ -296,7 +296,7 @@ createNode objectSet -n "workfileMain";
 	addAttr -ci true -sn "task" -ln "task" -dt "string";
 	addAttr -ci true -sn "publish_attributes" -ln "publish_attributes" -dt "string";
 	addAttr -ci true -sn "creator_attributes" -ln "creator_attributes" -dt "string";
-	addAttr -ci true -sn "__creator_attributes_keys" -ln "__creator_attributes_keys"
+	addAttr -ci true -sn "__creator_attributes_keys" -ln "__creator_attributes_keys"
 		-dt "string";
 	setAttr ".ihi" 0;
 	setAttr ".hio" yes;

@@ -185,7 +185,7 @@ createNode objectSet -n "modelMain";
 	addAttr -ci true -sn "attrPrefix" -ln "attrPrefix" -dt "string";
 	addAttr -ci true -sn "publish_attributes" -ln "publish_attributes" -dt "string";
 	addAttr -ci true -sn "creator_attributes" -ln "creator_attributes" -dt "string";
-	addAttr -ci true -sn "__creator_attributes_keys" -ln "__creator_attributes_keys"
+	addAttr -ci true -sn "__creator_attributes_keys" -ln "__creator_attributes_keys"
 		-dt "string";
 	setAttr ".ihi" 0;
 	setAttr ".cbId" -type "string" "60df31e2be2b48bd3695c056:7364ea6776c9";

@@ -296,7 +296,7 @@ createNode objectSet -n "workfileMain";
 	addAttr -ci true -sn "task" -ln "task" -dt "string";
 	addAttr -ci true -sn "publish_attributes" -ln "publish_attributes" -dt "string";
 	addAttr -ci true -sn "creator_attributes" -ln "creator_attributes" -dt "string";
-	addAttr -ci true -sn "__creator_attributes_keys" -ln "__creator_attributes_keys"
+	addAttr -ci true -sn "__creator_attributes_keys" -ln "__creator_attributes_keys"
 		-dt "string";
 	setAttr ".ihi" 0;
 	setAttr ".hio" yes;