Mirror of https://github.com/ynput/ayon-core.git (synced 2026-01-02 08:54:53 +01:00)

Commit 6eb90c3562: Merge branch 'develop' into feature/template_schema_as_object_type

60 changed files with 2001 additions and 240 deletions
CHANGELOG.md (49 changes)

@@ -1,8 +1,35 @@
 # Changelog
 
-## [3.2.0-nightly.7](https://github.com/pypeclub/OpenPype/tree/HEAD)
+## [3.3.0-nightly.4](https://github.com/pypeclub/OpenPype/tree/HEAD)
 
-[Full Changelog](https://github.com/pypeclub/OpenPype/compare/2.18.4...HEAD)
+[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.2.0...HEAD)
 
+**🚀 Enhancements**
+
+- Ftrack push attributes action adds traceback to job [\#1843](https://github.com/pypeclub/OpenPype/pull/1843)
+- Prepare project action enhance [\#1838](https://github.com/pypeclub/OpenPype/pull/1838)
+- nuke: settings create missing default subsets [\#1829](https://github.com/pypeclub/OpenPype/pull/1829)
+- Update poetry lock [\#1823](https://github.com/pypeclub/OpenPype/pull/1823)
+- Settings: settings for plugins [\#1819](https://github.com/pypeclub/OpenPype/pull/1819)
+- Maya: Deadline custom settings [\#1797](https://github.com/pypeclub/OpenPype/pull/1797)
+
+**🐛 Bug fixes**
+
+- Nuke: updating effects subset fail [\#1841](https://github.com/pypeclub/OpenPype/pull/1841)
+- nuke: write render node skipped with crop [\#1836](https://github.com/pypeclub/OpenPype/pull/1836)
+- Project folder structure overrides [\#1813](https://github.com/pypeclub/OpenPype/pull/1813)
+- Maya: fix yeti settings path in extractor [\#1809](https://github.com/pypeclub/OpenPype/pull/1809)
+- Failsafe for cross project containers [\#1806](https://github.com/pypeclub/OpenPype/pull/1806)
+- Houdini collector formatting keys fix [\#1802](https://github.com/pypeclub/OpenPype/pull/1802)
+
+**Merged pull requests:**
+
+- Add support for pyenv-win on windows [\#1822](https://github.com/pypeclub/OpenPype/pull/1822)
+- PS, AE - send actual context when another webserver is running [\#1811](https://github.com/pypeclub/OpenPype/pull/1811)
+
+## [3.2.0](https://github.com/pypeclub/OpenPype/tree/3.2.0) (2021-07-13)
+
+[Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.2.0-nightly.7...3.2.0)
+
 **🚀 Enhancements**
 

@@ -11,6 +38,7 @@
 - Ftrack Multiple notes as server action [\#1795](https://github.com/pypeclub/OpenPype/pull/1795)
 - Settings conditional dict [\#1777](https://github.com/pypeclub/OpenPype/pull/1777)
 - Settings application use python 2 only where needed [\#1776](https://github.com/pypeclub/OpenPype/pull/1776)
 - Settings UI copy/paste [\#1769](https://github.com/pypeclub/OpenPype/pull/1769)
 - Workfile tool widths [\#1766](https://github.com/pypeclub/OpenPype/pull/1766)
 - Push hierarchical attributes care about task parent changes [\#1763](https://github.com/pypeclub/OpenPype/pull/1763)
 - Application executables with environment variables [\#1757](https://github.com/pypeclub/OpenPype/pull/1757)

@@ -42,8 +70,6 @@
 - StandalonePublisher: failing collector for editorial [\#1738](https://github.com/pypeclub/OpenPype/pull/1738)
 - Local settings UI crash on missing defaults [\#1737](https://github.com/pypeclub/OpenPype/pull/1737)
 - TVPaint white background on thumbnail [\#1735](https://github.com/pypeclub/OpenPype/pull/1735)
 - Application without executables [\#1679](https://github.com/pypeclub/OpenPype/pull/1679)
 - Unreal: launching on Linux [\#1672](https://github.com/pypeclub/OpenPype/pull/1672)
 
 **Merged pull requests:**
 

@@ -69,7 +95,7 @@
 
 - Tools names forwards compatibility [\#1727](https://github.com/pypeclub/OpenPype/pull/1727)
 
-**Merged pull requests:**
+**⚠️ Deprecations**
 
 - global: removing obsolete ftrack validator plugin [\#1710](https://github.com/pypeclub/OpenPype/pull/1710)
 

@@ -93,24 +119,11 @@
 - Log Viewer with OpenPype style [\#1703](https://github.com/pypeclub/OpenPype/pull/1703)
 - Scrolling in OpenPype info widget [\#1702](https://github.com/pypeclub/OpenPype/pull/1702)
 - OpenPype style in modules [\#1694](https://github.com/pypeclub/OpenPype/pull/1694)
 - Sort applications and tools alphabetically in Settings UI [\#1689](https://github.com/pypeclub/OpenPype/pull/1689)
 - \#683 - Validate Frame Range in Standalone Publisher [\#1683](https://github.com/pypeclub/OpenPype/pull/1683)
 - Hiero: old container versions identify with red color [\#1682](https://github.com/pypeclub/OpenPype/pull/1682)
 
 **🐛 Bug fixes**
 
 - Nuke: broken publishing rendered frames [\#1707](https://github.com/pypeclub/OpenPype/pull/1707)
 - Standalone publisher Thumbnail export args [\#1705](https://github.com/pypeclub/OpenPype/pull/1705)
 - Bad zip can break OpenPype start [\#1691](https://github.com/pypeclub/OpenPype/pull/1691)
 - Hiero: published whole edit mov [\#1687](https://github.com/pypeclub/OpenPype/pull/1687)
 - Ftrack subprocess handle of stdout/stderr [\#1675](https://github.com/pypeclub/OpenPype/pull/1675)
 - Settings list race condition and mutable dict list conversion [\#1671](https://github.com/pypeclub/OpenPype/pull/1671)
 
 **Merged pull requests:**
 
 - update dependencies [\#1697](https://github.com/pypeclub/OpenPype/pull/1697)
 - Bump normalize-url from 4.5.0 to 4.5.1 in /website [\#1686](https://github.com/pypeclub/OpenPype/pull/1686)
 
-# Changelog
openpype/hooks/pre_foundry_apps.py (new file, 28 lines)

@@ -0,0 +1,28 @@
+import subprocess
+from openpype.lib import PreLaunchHook
+
+
+class LaunchFoundryAppsWindows(PreLaunchHook):
+    """Foundry applications have a specific way of launching.
+
+    Nuke is executed "like" a python process, so it is required to pass the
+    `CREATE_NEW_CONSOLE` flag on Windows to trigger creation of a new
+    console. At the same time the newly created console won't create its own
+    stdout and stderr handlers, so they should not be redirected to DEVNULL.
+    """
+
+    # Should be the last hook because it must change launch arguments to string
+    order = 1000
+    app_groups = ["nuke", "nukex", "hiero", "nukestudio"]
+    platforms = ["windows"]
+
+    def execute(self):
+        # Change `creationflags` to CREATE_NEW_CONSOLE
+        # - on Windows, Nuke creates a new window using its console
+        # Set `stdout` and `stderr` to None so the newly created console
+        # does not have its output redirected to DEVNULL in builds
+        self.launch_context.kwargs.update({
+            "creationflags": subprocess.CREATE_NEW_CONSOLE,
+            "stdout": None,
+            "stderr": None
+        })
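As a rough illustration of where these kwargs end up, here is a minimal sketch assuming the launch context forwards them to `subprocess.Popen` (the executable is a stand-in, and `CREATE_NEW_CONSOLE` exists only on Windows):

```python
import subprocess

# Sketch only: the hook's kwargs are presumably forwarded to Popen by the
# launch machinery; "cmd.exe" stands in for the actual Nuke executable.
launch_args = ["cmd.exe"]
process = subprocess.Popen(
    launch_args,
    creationflags=subprocess.CREATE_NEW_CONSOLE,  # open a fresh console window
    stdout=None,  # let the child use the new console's own handles
    stderr=None,
)
```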
@@ -49,5 +49,7 @@ class NonPythonHostHook(PreLaunchHook):
         if remainders:
             self.launch_context.launch_args.extend(remainders)
 
+        # This must be set, otherwise it wouldn't be possible to catch
+        # output when a built OpenPype is used.
         self.launch_context.kwargs["stdout"] = subprocess.DEVNULL
-        self.launch_context.kwargs["stderr"] = subprocess.STDOUT
+        self.launch_context.kwargs["stderr"] = subprocess.DEVNULL
@ -1,44 +0,0 @@
|
|||
import os
|
||||
import subprocess
|
||||
from openpype.lib import PreLaunchHook
|
||||
|
||||
|
||||
class LaunchWithWindowsShell(PreLaunchHook):
|
||||
"""Add shell command before executable.
|
||||
|
||||
Some hosts have issues when are launched directly from python in that case
|
||||
it is possible to prepend shell executable which will trigger process
|
||||
instead.
|
||||
"""
|
||||
|
||||
# Should be as last hook because must change launch arguments to string
|
||||
order = 1000
|
||||
app_groups = ["nuke", "nukex", "hiero", "nukestudio"]
|
||||
platforms = ["windows"]
|
||||
|
||||
def execute(self):
|
||||
launch_args = self.launch_context.clear_launch_args(
|
||||
self.launch_context.launch_args)
|
||||
new_args = [
|
||||
# Get comspec which is cmd.exe in most cases.
|
||||
os.environ.get("COMSPEC", "cmd.exe"),
|
||||
# NOTE change to "/k" if want to keep console opened
|
||||
"/c",
|
||||
# Convert arguments to command line arguments (as string)
|
||||
"\"{}\"".format(
|
||||
subprocess.list2cmdline(launch_args)
|
||||
)
|
||||
]
|
||||
# Convert list to string
|
||||
# WARNING this only works if is used as string
|
||||
args_string = " ".join(new_args)
|
||||
self.log.info((
|
||||
"Modified launch arguments to be launched with shell \"{}\"."
|
||||
).format(args_string))
|
||||
|
||||
# Replace launch args with new one
|
||||
self.launch_context.launch_args = args_string
|
||||
# Change `creationflags` to CREATE_NEW_CONSOLE
|
||||
self.launch_context.kwargs["creationflags"] = (
|
||||
subprocess.CREATE_NEW_CONSOLE
|
||||
)
|
||||
|
|
@@ -0,0 +1,61 @@
+from avalon import api
+import pyblish.api
+import openpype.api
+from avalon import aftereffects
+
+
+class ValidateInstanceAssetRepair(pyblish.api.Action):
+    """Repair the instance asset with value from Context."""
+
+    label = "Repair"
+    icon = "wrench"
+    on = "failed"
+
+    def process(self, context, plugin):
+
+        # Get the errored instances
+        failed = []
+        for result in context.data["results"]:
+            if (result["error"] is not None and result["instance"] is not None
+                    and result["instance"] not in failed):
+                failed.append(result["instance"])
+
+        # Apply pyblish.logic to get the instances for the plug-in
+        instances = pyblish.api.instances_by_plugin(failed, plugin)
+        stub = aftereffects.stub()
+        for instance in instances:
+            data = stub.read(instance[0])
+
+            data["asset"] = api.Session["AVALON_ASSET"]
+            stub.imprint(instance[0], data)
+
+
+class ValidateInstanceAsset(pyblish.api.InstancePlugin):
+    """Validate the instance asset is the currently selected context asset.
+
+    As it might happen that multiple workfiles are opened at the same time,
+    switching between them would mess with the selected context (from
+    Launcher or Ftrack).
+
+    In that case outputs might be published under the wrong asset!
+
+    Repair action will use the Context asset value (from Workfiles or
+    Launcher). Closing and reopening with Workfiles will refresh the
+    Context value.
+    """
+
+    label = "Validate Instance Asset"
+    hosts = ["aftereffects"]
+    actions = [ValidateInstanceAssetRepair]
+    order = openpype.api.ValidateContentsOrder
+
+    def process(self, instance):
+        instance_asset = instance.data["asset"]
+        current_asset = api.Session["AVALON_ASSET"]
+        msg = (
+            f"Instance asset {instance_asset} is not the same "
+            f"as current context {current_asset}. PLEASE DO:\n"
+            f"Repair with 'A' action to use '{current_asset}'.\n"
+            f"If that's not the correct value, close the workfile and "
+            f"reopen via Workfiles!"
+        )
+        assert instance_asset == current_asset, msg
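The repair action above depends on the per-result records pyblish stores on the context. A minimal sketch of the record shape it reads — the fields are inferred from the access pattern above, not quoted from pyblish documentation:

```python
# Hypothetical shape of one entry in context.data["results"]; pyblish
# appends one such record per processed (plugin, instance) pair.
result = {
    "plugin": None,    # the plugin class that ran
    "instance": None,  # the instance it ran on (None for context plugins)
    "error": None,     # the raised exception, or None when it passed
}

# Gathering failed instances then reduces to a filter over those records:
failed = [
    r["instance"]
    for r in [result]  # stand-in for context.data["results"]
    if r["error"] is not None and r["instance"] is not None
]
```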
@@ -133,10 +133,10 @@ class ExtractYetiRig(openpype.api.Extractor):
         image_search_path = resources_dir = instance.data["resourcesDir"]
 
         settings = instance.data.get("rigsettings", None)
-        if settings:
-            settings["imageSearchPath"] = image_search_path
-            with open(settings_path, "w") as fp:
-                json.dump(settings, fp, ensure_ascii=False)
+        assert settings, "Yeti rig settings were not collected."
+        settings["imageSearchPath"] = image_search_path
+        with open(settings_path, "w") as fp:
+            json.dump(settings, fp, ensure_ascii=False)
 
         # add textures to transfers
         if 'transfers' not in instance.data:

@@ -192,12 +192,12 @@ class ExtractYetiRig(openpype.api.Extractor):
                 'stagingDir': dirname
             }
         )
-        self.log.info("settings file: {}".format(settings))
+        self.log.info("settings file: {}".format(settings_path))
         instance.data["representations"].append(
            {
                'name': 'rigsettings',
                'ext': 'rigsettings',
-               'files': os.path.basename(settings),
+               'files': os.path.basename(settings_path),
                'stagingDir': dirname
            }
        )
@@ -113,6 +113,14 @@ def check_inventory_versions():
         "_id": io.ObjectId(avalon_knob_data["representation"])
     })
 
+    # Failsafe for not finding the representation.
+    if not representation:
+        log.warning(
+            "Could not find the representation on "
+            "node \"{}\"".format(node.name())
+        )
+        continue
+
     # Get start frame from version data
     version = io.find_one({
         "type": "version",

@@ -391,13 +399,14 @@ def create_write_node(name, data, input=None, prenodes=None,
     if prenodes:
         for node in prenodes:
             # get attributes
-            name = node["name"]
+            pre_node_name = node["name"]
             klass = node["class"]
             knobs = node["knobs"]
             dependent = node["dependent"]
 
             # create node
-            now_node = nuke.createNode(klass, "name {}".format(name))
+            now_node = nuke.createNode(
+                klass, "name {}".format(pre_node_name))
             now_node.hideControlPanel()
 
             # add data to knob

@@ -476,27 +485,27 @@ def create_write_node(name, data, input=None, prenodes=None,
 
     linked_knob_names.append("Render")
 
-    for name in linked_knob_names:
-        if "_grp-start_" in name:
+    for _k_name in linked_knob_names:
+        if "_grp-start_" in _k_name:
             knob = nuke.Tab_Knob(
                 "rnd_attr", "Rendering attributes", nuke.TABBEGINCLOSEDGROUP)
             GN.addKnob(knob)
-        elif "_grp-end_" in name:
+        elif "_grp-end_" in _k_name:
             knob = nuke.Tab_Knob(
                 "rnd_attr_end", "Rendering attributes", nuke.TABENDGROUP)
             GN.addKnob(knob)
         else:
-            if "___" in name:
+            if "___" in _k_name:
                 # add divider
                 GN.addKnob(nuke.Text_Knob(""))
             else:
-                # add linked knob by name
+                # add linked knob by _k_name
                 link = nuke.Link_Knob("")
-                link.makeLink(write_node.name(), name)
-                link.setName(name)
+                link.makeLink(write_node.name(), _k_name)
+                link.setName(_k_name)
 
                 # make render
-                if "Render" in name:
+                if "Render" in _k_name:
                     link.setLabel("Render Local")
                     link.setFlag(0x1000)
                 GN.addKnob(link)
@@ -214,7 +214,7 @@ class LoadEffects(api.Loader):
                     self.log.warning(e)
                     continue
 
-                if isinstance(v, list) and len(v) > 3:
+                if isinstance(v, list) and len(v) > 4:
                     node[k].setAnimated()
                     for i, value in enumerate(v):
                         if isinstance(value, list):
@@ -217,7 +217,7 @@ class LoadEffectsInputProcess(api.Loader):
                     self.log.warning(e)
                     continue
 
-                if isinstance(v, list) and len(v) > 3:
+                if isinstance(v, list) and len(v) > 4:
                     node[k].setAnimated()
                     for i, value in enumerate(v):
                         if isinstance(value, list):

@@ -239,10 +239,10 @@ class LoadEffectsInputProcess(api.Loader):
         output = nuke.createNode("Output")
         output.setInput(0, pre_node)
 
-        # try to place it under Viewer1
-        if not self.connect_active_viewer(GN):
-            nuke.delete(GN)
-            return
+        # # try to place it under Viewer1
+        # if not self.connect_active_viewer(GN):
+        #     nuke.delete(GN)
+        #     return
 
         # get all versions in list
         versions = io.find({

@@ -298,7 +298,11 @@ class LoadEffectsInputProcess(api.Loader):
         viewer["input_process_node"].setValue(group_node_name)
 
         # put backdrop under
-        lib.create_backdrop(label="Input Process", layer=2, nodes=[viewer, group_node], color="0x7c7faaff")
+        lib.create_backdrop(
+            label="Input Process",
+            layer=2,
+            nodes=[viewer, group_node],
+            color="0x7c7faaff")
 
         return True
@@ -1,5 +1,4 @@
-import os
-
 from avalon import api
 import pyblish.api
 import openpype.api
 from avalon import photoshop

@@ -27,12 +26,20 @@ class ValidateInstanceAssetRepair(pyblish.api.Action):
         for instance in instances:
             data = stub.read(instance[0])
 
-            data["asset"] = os.environ["AVALON_ASSET"]
+            data["asset"] = api.Session["AVALON_ASSET"]
             stub.imprint(instance[0], data)
 
 
 class ValidateInstanceAsset(pyblish.api.InstancePlugin):
-    """Validate the instance asset is the current asset."""
+    """Validate the instance asset is the currently selected context asset.
+
+    As it might happen that multiple workfiles are opened, switching
+    between them would mess with the selected context.
+    In that case outputs might be published under the wrong asset!
+
+    Repair action will use the Context asset value (from Workfiles or
+    Launcher). Closing and reopening with Workfiles will refresh the
+    Context value.
+    """
 
     label = "Validate Instance Asset"
     hosts = ["photoshop"]

@@ -41,9 +48,12 @@ class ValidateInstanceAsset(pyblish.api.InstancePlugin):
 
     def process(self, instance):
         instance_asset = instance.data["asset"]
-        current_asset = os.environ["AVALON_ASSET"]
+        current_asset = api.Session["AVALON_ASSET"]
         msg = (
-            "Instance asset is not the same as current asset:"
-            f"\nInstance: {instance_asset}\nCurrent: {current_asset}"
+            f"Instance asset {instance_asset} is not the same "
+            f"as current context {current_asset}. PLEASE DO:\n"
+            f"Repair with 'A' action to use '{current_asset}'.\n"
+            f"If that's not the correct value, close the workfile and "
+            f"reopen via Workfiles!"
         )
         assert instance_asset == current_asset, msg
@@ -2,7 +2,7 @@
 Optional:
     presets -> extensions (
         example of use:
-            [".mov", ".mp4"]
+            ["mov", "mp4"]
     )
     presets -> source_dir (
         example of use:

@@ -11,6 +11,7 @@ Optional:
         "{root[work]}/{project[name]}/inputs"
         "./input"
         "../input"
+        ""
     )
 """
 

@@ -48,7 +49,7 @@ class CollectEditorial(pyblish.api.InstancePlugin):
     actions = []
 
     # presets
-    extensions = [".mov", ".mp4"]
+    extensions = ["mov", "mp4"]
     source_dir = None
 
     def process(self, instance):

@@ -72,7 +73,7 @@ class CollectEditorial(pyblish.api.InstancePlugin):
         video_path = None
         basename = os.path.splitext(os.path.basename(file_path))[0]
 
-        if self.source_dir:
+        if self.source_dir != "":
             source_dir = self.source_dir.replace("\\", "/")
             if ("./" in source_dir) or ("../" in source_dir):
                 # get current working dir

@@ -98,7 +99,7 @@ class CollectEditorial(pyblish.api.InstancePlugin):
             if os.path.splitext(f)[0] not in basename:
                 continue
             # filter out by respected extensions
-            if os.path.splitext(f)[1] not in self.extensions:
+            if os.path.splitext(f)[1][1:] not in self.extensions:
                 continue
             video_path = os.path.join(
                 staging_dir, f
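The `[1:]` slices in this and the following plugins all follow from one detail worth making explicit: `os.path.splitext` keeps the leading dot on the extension, while the updated presets store extensions without dots. A quick self-contained check:

```python
import os

name, ext = os.path.splitext("shot010_review.mov")
print(name)     # "shot010_review"
print(ext)      # ".mov" - splitext keeps the dot
print(ext[1:])  # "mov"  - matches the new dot-less presets
print(ext[1:] in ["mov", "mp4"])  # True
```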
@@ -17,16 +17,12 @@ class CollectInstances(pyblish.api.InstancePlugin):
         "referenceMain": {
             "family": "review",
             "families": ["clip"],
-            "extensions": [".mp4"]
+            "extensions": ["mp4"]
         },
         "audioMain": {
             "family": "audio",
             "families": ["clip"],
-            "extensions": [".wav"],
-        },
-        "shotMain": {
-            "family": "shot",
-            "families": []
+            "extensions": ["wav"],
         }
     }
     timeline_frame_start = 900000  # standard edl default (10:00:00:00)

@@ -178,7 +174,17 @@ class CollectInstances(pyblish.api.InstancePlugin):
                 data_key: instance.data.get(data_key)})
 
         # adding subsets to context as instances
+        self.subsets.update({
+            "shotMain": {
+                "family": "shot",
+                "families": []
+            }
+        })
         for subset, properities in self.subsets.items():
+            version = properities.get("version")
+            if version and version == 0:
+                properities.pop("version")
+
             # adding Review-able instance
             subset_instance_data = instance_data.copy()
             subset_instance_data.update(properities)
@@ -177,19 +177,23 @@ class CollectInstanceResources(pyblish.api.InstancePlugin):
         collection_head_name = None
         # loop through collections and create representations
         for _collection in collections:
-            ext = _collection.tail
+            ext = _collection.tail[1:]
             collection_head_name = _collection.head
             frame_start = list(_collection.indexes)[0]
             frame_end = list(_collection.indexes)[-1]
             repre_data = {
                 "frameStart": frame_start,
                 "frameEnd": frame_end,
-                "name": ext[1:],
-                "ext": ext[1:],
+                "name": ext,
+                "ext": ext,
                 "files": [item for item in _collection],
                 "stagingDir": staging_dir
             }
 
+            if instance_data.get("keepSequence"):
+                repre_data_keep = deepcopy(repre_data)
+                instance_data["representations"].append(repre_data_keep)
+
             if "review" in instance_data["families"]:
                 repre_data.update({
                     "thumbnail": True,

@@ -208,20 +212,20 @@ class CollectInstanceResources(pyblish.api.InstancePlugin):
 
         # loop through remainders and create representations
         for _reminding_file in remainder:
-            ext = os.path.splitext(_reminding_file)[-1]
+            ext = os.path.splitext(_reminding_file)[-1][1:]
             if ext not in instance_data["extensions"]:
                 continue
             if collection_head_name and (
-                (collection_head_name + ext[1:]) not in _reminding_file
-            ) and (ext in [".mp4", ".mov"]):
+                (collection_head_name + ext) not in _reminding_file
+            ) and (ext in ["mp4", "mov"]):
                 self.log.info(f"Skipping file: {_reminding_file}")
                 continue
             frame_start = 1
             frame_end = 1
 
             repre_data = {
-                "name": ext[1:],
-                "ext": ext[1:],
+                "name": ext,
+                "ext": ext,
                 "files": _reminding_file,
                 "stagingDir": staging_dir
             }
@@ -131,20 +131,21 @@ class CollectHierarchyInstance(pyblish.api.ContextPlugin):
             tasks_to_add = dict()
             project_tasks = io.find_one({"type": "project"})["config"]["tasks"]
             for task_name, task_data in self.shot_add_tasks.items():
-                try:
-                    if task_data["type"] in project_tasks.keys():
-                        tasks_to_add.update({task_name: task_data})
-                    else:
-                        raise KeyError(
-                            "Wrong FtrackTaskType `{}` for `{}` is not"
-                            " existing in `{}``".format(
-                                task_data["type"],
-                                task_name,
-                                list(project_tasks.keys())))
-                except KeyError as error:
-                    raise KeyError(
-                        "Wrong presets: `{0}`".format(error)
-                    )
+                _task_data = deepcopy(task_data)
+
+                # fixing enumerator from settings
+                _task_data["type"] = task_data["type"][0]
+
+                # check if task type is in project task types
+                if _task_data["type"] in project_tasks.keys():
+                    tasks_to_add.update({task_name: _task_data})
+                else:
+                    raise KeyError(
+                        "Wrong FtrackTaskType `{}` for `{}` is not"
+                        " existing in `{}``".format(
+                            _task_data["type"],
+                            task_name,
+                            list(project_tasks.keys())))
 
             instance.data["tasks"] = tasks_to_add
         else:
@@ -0,0 +1,456 @@
+import os
+import re
+import pyblish.api
+import json
+
+from avalon.api import format_template_with_optional_keys
+
+from openpype.lib import prepare_template_data
+
+
+class CollectTextures(pyblish.api.ContextPlugin):
+    """Collect workfile (and its resource_files) and textures.
+
+    Currently implements the use case with Mari and Substance Painter, where
+    one workfile is main (.mra - Mari) with possible additional workfiles
+    (.spp - Substance)
+
+    Provides:
+        1 instance per workfile (with 'resources' filled if needed)
+            (workfile family)
+        1 instance per group of textures
+            (textures family)
+    """
+
+    order = pyblish.api.CollectorOrder
+    label = "Collect Textures"
+    hosts = ["standalonepublisher"]
+    families = ["texture_batch"]
+    actions = []
+
+    # from presets
+    main_workfile_extensions = ['mra']
+    other_workfile_extensions = ['spp', 'psd']
+    texture_extensions = ["exr", "dpx", "jpg", "jpeg", "png", "tiff", "tga",
+                          "gif", "svg"]
+
+    # additional families (ftrack etc.)
+    workfile_families = []
+    textures_families = []
+
+    color_space = ["linsRGB", "raw", "acesg"]
+
+    # currently implemented placeholders ["color_space"]
+    # describing patterns in file names, split by regex groups
+    input_naming_patterns = {
+        # workfile: corridorMain_v001.mra >
+        # texture: corridorMain_aluminiumID_v001_baseColor_linsRGB_1001.exr
+        "workfile": r'^([^.]+)(_[^_.]*)?_v([0-9]{3,}).+',
+        "textures": r'^([^_.]+)_([^_.]+)_v([0-9]{3,})_([^_.]+)_({color_space})_(1[0-9]{3}).+',  # noqa
+    }
+    # matching regex group position to 'input_naming_patterns'
+    input_naming_groups = {
+        "workfile": ('asset', 'filler', 'version'),
+        "textures": ('asset', 'shader', 'version', 'channel', 'color_space',
+                     'udim')
+    }
+
+    workfile_subset_template = "textures{Subset}Workfile"
+    # implemented keys: ["color_space", "channel", "subset", "shader"]
+    texture_subset_template = "textures{Subset}_{Shader}_{Channel}"
+
+    def process(self, context):
+        self.context = context
+
+        resource_files = {}
+        workfile_files = {}
+        representations = {}
+        version_data = {}
+        asset_builds = set()
+        asset = None
+        for instance in context:
+            if not self.input_naming_patterns:
+                raise ValueError("Naming patterns are not configured. \n"
+                                 "Ask admin to provide naming conventions "
+                                 "for workfiles and textures.")
+
+            if not asset:
+                asset = instance.data["asset"]  # selected from SP
+
+            parsed_subset = instance.data["subset"].replace(
+                instance.data["family"], '')
+
+            fill_pairs = {
+                "subset": parsed_subset
+            }
+
+            fill_pairs = prepare_template_data(fill_pairs)
+            workfile_subset = format_template_with_optional_keys(
+                fill_pairs, self.workfile_subset_template)
+
+            processed_instance = False
+            for repre in instance.data["representations"]:
+                ext = repre["ext"].replace('.', '')
+                asset_build = version = None
+
+                if isinstance(repre["files"], list):
+                    repre_file = repre["files"][0]
+                else:
+                    repre_file = repre["files"]
+
+                if ext in self.main_workfile_extensions or \
+                        ext in self.other_workfile_extensions:
+
+                    asset_build = self._get_asset_build(
+                        repre_file,
+                        self.input_naming_patterns["workfile"],
+                        self.input_naming_groups["workfile"],
+                        self.color_space
+                    )
+                    version = self._get_version(
+                        repre_file,
+                        self.input_naming_patterns["workfile"],
+                        self.input_naming_groups["workfile"],
+                        self.color_space
+                    )
+                    asset_builds.add((asset_build, version,
+                                      workfile_subset, 'workfile'))
+                    processed_instance = True
+
+                    if not representations.get(workfile_subset):
+                        representations[workfile_subset] = []
+
+                if ext in self.main_workfile_extensions:
+                    # workfiles can have only single representation
+                    # currently OP is not supporting different extensions in
+                    # representation files
+                    representations[workfile_subset] = [repre]
+
+                    workfile_files[asset_build] = repre_file
+
+                if ext in self.other_workfile_extensions:
+                    # add only if not added already from main
+                    if not representations.get(workfile_subset):
+                        representations[workfile_subset] = [repre]
+
+                    # only overwrite if not present
+                    if not workfile_files.get(asset_build):
+                        workfile_files[asset_build] = repre_file
+
+                    if not resource_files.get(workfile_subset):
+                        resource_files[workfile_subset] = []
+                    item = {
+                        "files": [os.path.join(repre["stagingDir"],
+                                               repre["files"])],
+                        "source": "standalone publisher"
+                    }
+                    resource_files[workfile_subset].append(item)
+
+                if ext in self.texture_extensions:
+                    c_space = self._get_color_space(
+                        repre_file,
+                        self.color_space
+                    )
+
+                    channel = self._get_channel_name(
+                        repre_file,
+                        self.input_naming_patterns["textures"],
+                        self.input_naming_groups["textures"],
+                        self.color_space
+                    )
+
+                    shader = self._get_shader_name(
+                        repre_file,
+                        self.input_naming_patterns["textures"],
+                        self.input_naming_groups["textures"],
+                        self.color_space
+                    )
+
+                    formatting_data = {
+                        "color_space": c_space or '',  # None throws exception
+                        "channel": channel or '',
+                        "shader": shader or '',
+                        "subset": parsed_subset or ''
+                    }
+
+                    fill_pairs = prepare_template_data(formatting_data)
+                    subset = format_template_with_optional_keys(
+                        fill_pairs, self.texture_subset_template)
+
+                    asset_build = self._get_asset_build(
+                        repre_file,
+                        self.input_naming_patterns["textures"],
+                        self.input_naming_groups["textures"],
+                        self.color_space
+                    )
+                    version = self._get_version(
+                        repre_file,
+                        self.input_naming_patterns["textures"],
+                        self.input_naming_groups["textures"],
+                        self.color_space
+                    )
+                    if not representations.get(subset):
+                        representations[subset] = []
+                    representations[subset].append(repre)
+
+                    ver_data = {
+                        "color_space": c_space or '',
+                        "channel_name": channel or '',
+                        "shader_name": shader or ''
+                    }
+                    version_data[subset] = ver_data
+
+                    asset_builds.add(
+                        (asset_build, version, subset, "textures"))
+                    processed_instance = True
+
+            if processed_instance:
+                self.context.remove(instance)
+
+        self._create_new_instances(context,
+                                   asset,
+                                   asset_builds,
+                                   resource_files,
+                                   representations,
+                                   version_data,
+                                   workfile_files)
+
+    def _create_new_instances(self, context, asset, asset_builds,
+                              resource_files, representations,
+                              version_data, workfile_files):
+        """Prepare new instances from collected data.
+
+        Args:
+            context (ContextPlugin)
+            asset (string): selected asset from SP
+            asset_builds (set) of tuples
+                (asset_build, version, subset, family)
+            resource_files (list) of resource dicts - to store additional
+                files to main workfile
+            representations (list) of dicts - to store workfile info OR
+                all collected texture files, key is asset_build
+            version_data (dict) - prepared to store into version doc in DB
+            workfile_files (dict) - to store workfile to add to textures
+                key is asset_build
+        """
+        # sort workfile first
+        asset_builds = sorted(asset_builds,
+                              key=lambda tup: tup[3], reverse=True)
+
+        # workfile must have version, textures might
+        main_version = None
+        for asset_build, version, subset, family in asset_builds:
+            if not main_version:
+                main_version = version
+            new_instance = context.create_instance(subset)
+            new_instance.data.update(
+                {
+                    "subset": subset,
+                    "asset": asset,
+                    "label": subset,
+                    "name": subset,
+                    "family": family,
+                    "version": int(version or main_version or 1),
+                    "asset_build": asset_build  # remove in validator
+                }
+            )
+
+            workfile = workfile_files.get(asset_build)
+
+            if resource_files.get(subset):
+                # add resources only when workfile is main style
+                for ext in self.main_workfile_extensions:
+                    if ext in workfile:
+                        new_instance.data.update({
+                            "resources": resource_files.get(subset)
+                        })
+                        break
+
+            # store origin
+            if family == 'workfile':
+                families = self.workfile_families
+
+                new_instance.data["source"] = "standalone publisher"
+            else:
+                families = self.textures_families
+
+                repre = representations.get(subset)[0]
+                new_instance.context.data["currentFile"] = os.path.join(
+                    repre["stagingDir"], workfile or 'dummy.txt')
+
+            new_instance.data["families"] = families
+
+            # add data for version document
+            ver_data = version_data.get(subset)
+            if ver_data:
+                if workfile:
+                    ver_data['workfile'] = workfile
+
+                new_instance.data.update(
+                    {"versionData": ver_data}
+                )
+
+            upd_representations = representations.get(subset)
+            if upd_representations and family != 'workfile':
+                upd_representations = self._update_representations(
+                    upd_representations)
+
+            new_instance.data["representations"] = upd_representations
+
+            self.log.debug("new instance - {}:: {}".format(
+                family,
+                json.dumps(new_instance.data, indent=4)))
+
+    def _get_asset_build(self, name,
+                         input_naming_patterns, input_naming_groups,
+                         color_spaces):
+        """Loops through configured workfile patterns to find the asset name.
+
+        Asset name is used to bind a workfile and its textures.
+
+        Args:
+            name (str): workfile name
+            input_naming_patterns (list):
+                [workfile_pattern] or [texture_pattern]
+            input_naming_groups (list)
+                ordinal position of regex groups matching to input_naming..
+            color_spaces (list) - predefined color spaces
+        """
+        asset_name = "NOT_AVAIL"
+
+        return self._parse(name, input_naming_patterns, input_naming_groups,
+                           color_spaces, 'asset') or asset_name
+
+    def _get_version(self, name, input_naming_patterns, input_naming_groups,
+                     color_spaces):
+        found = self._parse(name, input_naming_patterns, input_naming_groups,
+                            color_spaces, 'version')
+
+        if found:
+            return found.replace('v', '')
+
+        self.log.info("No version found in the name {}".format(name))
+
+    def _get_udim(self, name, input_naming_patterns, input_naming_groups,
+                  color_spaces):
+        """Parses the udim value from 'name'."""
+        found = self._parse(name, input_naming_patterns, input_naming_groups,
+                            color_spaces, 'udim')
+        if found:
+            return found
+
+        self.log.warning("Didn't find UDIM in {}".format(name))
+
+    def _get_color_space(self, name, color_spaces):
+        """Looks for a color_space from a list in a file name.
+
+        Color space seems not to be recognizable by a regex pattern, so a
+        set of known color spaces must be provided.
+        """
+        color_space = None
+        found = [cs for cs in color_spaces if
+                 re.search("_{}_".format(cs), name)]
+
+        if not found:
+            self.log.warning("No color space found in {}".format(name))
+        else:
+            if len(found) > 1:
+                msg = "Multiple color spaces found in {}->{}".format(name,
+                                                                     found)
+                self.log.warning(msg)
+
+            color_space = found[0]
+
+        return color_space
+
+    def _get_shader_name(self, name, input_naming_patterns,
+                         input_naming_groups, color_spaces):
+        """Return parsed shader name.
+
+        Shader name is needed for overlapping udims (eg. udims might be
+        used for different materials; shader is needed to not overwrite).
+
+        Channel name has an unknown format, while color spaces are a known
+        list - 'color_space' is used as a placeholder
+        """
+        found = self._parse(name, input_naming_patterns, input_naming_groups,
+                            color_spaces, 'shader')
+        if found:
+            return found
+
+        self.log.warning("Didn't find shader in {}".format(name))
+
+    def _get_channel_name(self, name, input_naming_patterns,
+                          input_naming_groups, color_spaces):
+        """Return parsed channel name.
+
+        Channel name has an unknown format, while color spaces are a known
+        list - 'color_space' is used as a placeholder
+        """
+        found = self._parse(name, input_naming_patterns, input_naming_groups,
+                            color_spaces, 'channel')
+        if found:
+            return found
+
+        self.log.warning("Didn't find channel in {}".format(name))
+
+    def _parse(self, name, input_naming_patterns, input_naming_groups,
+               color_spaces, key):
+        """Universal way to parse 'name' with configurable regex groups.
+
+        Args:
+            name (str): workfile name
+            input_naming_patterns (list):
+                [workfile_pattern] or [texture_pattern]
+            input_naming_groups (list)
+                ordinal position of regex groups matching to input_naming..
+            color_spaces (list) - predefined color spaces
+
+        Raises:
+            ValueError - if broken 'input_naming_groups'
+        """
+        for input_pattern in input_naming_patterns:
+            for cs in color_spaces:
+                pattern = input_pattern.replace('{color_space}', cs)
+                regex_result = re.findall(pattern, name)
+                if regex_result:
+                    idx = list(input_naming_groups).index(key)
+                    if idx < 0:
+                        msg = "input_naming_groups must " +\
+                              "have '{}' key".format(key)
+                        raise ValueError(msg)
+
+                    try:
+                        parsed_value = regex_result[0][idx]
+                        return parsed_value
+                    except IndexError:
+                        self.log.warning("Wrong index, probably "
+                                         "wrong name {}".format(name))
+
+    def _update_representations(self, upd_representations):
+        """Frames don't make sense for textures; add collected udims instead."""
+        udims = []
+        for repre in upd_representations:
+            repre.pop("frameStart", None)
+            repre.pop("frameEnd", None)
+            repre.pop("fps", None)
+
+            # ignore unique name from SP, use extension instead
+            # SP enforces unique name, here different subsets >> unique repres
+            repre["name"] = repre["ext"].replace('.', '')
+
+            files = repre.get("files", [])
+            if not isinstance(files, list):
+                files = [files]
+
+            for file_name in files:
+                udim = self._get_udim(file_name,
+                                      self.input_naming_patterns["textures"],
+                                      self.input_naming_groups["textures"],
+                                      self.color_space)
+                udims.append(udim)
+
+            repre["udim"] = udims  # must be this way, used for filling path
+
+        return upd_representations
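To make the naming convention above concrete, here is a worked example of the `textures` pattern against the file name from the comment, with the `{color_space}` placeholder substituted by one of the known color spaces (which is what `_parse` does):

```python
import re

pattern = r'^([^_.]+)_([^_.]+)_v([0-9]{3,})_([^_.]+)_(linsRGB)_(1[0-9]{3}).+'
name = "corridorMain_aluminiumID_v001_baseColor_linsRGB_1001.exr"
groups = ('asset', 'shader', 'version', 'channel', 'color_space', 'udim')

match = re.findall(pattern, name)[0]  # findall returns a list of group tuples
print(dict(zip(groups, match)))
# {'asset': 'corridorMain', 'shader': 'aluminiumID', 'version': '001',
#  'channel': 'baseColor', 'color_space': 'linsRGB', 'udim': '1001'}
```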
@@ -0,0 +1,42 @@
+import os
+import pyblish.api
+
+
+class ExtractResources(pyblish.api.InstancePlugin):
+    """
+    Extracts files from instance.data["resources"].
+
+    These files are additional (textures etc.), currently not stored in
+    representations!
+
+    Expects collected 'resourcesDir'. (list of dicts with 'files' key and
+    list of source urls)
+
+    Provides filled 'transfers' (list of tuples (source_url, target_url))
+    """
+
+    label = "Extract Resources SP"
+    hosts = ["standalonepublisher"]
+    order = pyblish.api.ExtractorOrder
+
+    families = ["workfile"]
+
+    def process(self, instance):
+        if not instance.data.get("resources"):
+            self.log.info("No resources")
+            return
+
+        if not instance.data.get("transfers"):
+            instance.data["transfers"] = []
+
+        publish_dir = instance.data["resourcesDir"]
+
+        transfers = []
+        for resource in instance.data["resources"]:
+            for file_url in resource.get("files", []):
+                file_name = os.path.basename(file_url)
+                dest_url = os.path.join(publish_dir, file_name)
+                transfers.append((file_url, dest_url))
+
+        self.log.info("transfers:: {}".format(transfers))
+        instance.data["transfers"].extend(transfers)
@@ -0,0 +1,43 @@
+import os
+import pyblish.api
+
+
+class ExtractWorkfileUrl(pyblish.api.ContextPlugin):
+    """
+    Modifies the 'workfile' field to contain a link to the published workfile.
+
+    Expects that the batch contains only a single workfile and matching
+    (multiple) textures.
+    """
+
+    label = "Extract Workfile Url SP"
+    hosts = ["standalonepublisher"]
+    order = pyblish.api.ExtractorOrder
+
+    families = ["textures"]
+
+    def process(self, context):
+        filepath = None
+
+        # first loop for workfile
+        for instance in context:
+            if instance.data["family"] == 'workfile':
+                anatomy = context.data['anatomy']
+                template_data = instance.data.get("anatomyData")
+                rep_name = instance.data.get("representations")[0].get("name")
+                template_data["representation"] = rep_name
+                template_data["ext"] = rep_name
+                anatomy_filled = anatomy.format(template_data)
+                template_filled = anatomy_filled["publish"]["path"]
+                filepath = os.path.normpath(template_filled)
+                self.log.info("Using published scene for render {}".format(
+                    filepath))
+
+        if not filepath:
+            self.log.info("Texture batch doesn't contain workfile.")
+            return
+
+        # then apply to all textures
+        for instance in context:
+            if instance.data["family"] == 'textures':
+                instance.data["versionData"]["workfile"] = filepath
@@ -0,0 +1,22 @@
+import pyblish.api
+import openpype.api
+
+
+class ValidateTextureBatch(pyblish.api.InstancePlugin):
+    """Validates that some texture files are present."""
+
+    label = "Validate Texture Presence"
+    hosts = ["standalonepublisher"]
+    order = openpype.api.ValidateContentsOrder
+    families = ["workfile"]
+    optional = False
+
+    def process(self, instance):
+        present = False
+        for instance in instance.context:
+            if instance.data["family"] == "textures":
+                self.log.info("Some textures present.")
+
+                return
+
+        assert present, "No textures found in published batch!"
@@ -0,0 +1,20 @@
+import pyblish.api
+import openpype.api
+
+
+class ValidateTextureHasWorkfile(pyblish.api.InstancePlugin):
+    """Validates that textures have an appropriate workfile attached.
+
+    Workfile is optional; disable this Validator after Refresh if you are
+    sure it is not needed.
+    """
+    label = "Validate Texture Has Workfile"
+    hosts = ["standalonepublisher"]
+    order = openpype.api.ValidateContentsOrder
+    families = ["textures"]
+    optional = True
+
+    def process(self, instance):
+        wfile = instance.data["versionData"].get("workfile")
+
+        assert wfile, "Textures are missing attached workfile"
@@ -0,0 +1,50 @@
+import pyblish.api
+import openpype.api
+
+
+class ValidateTextureBatchNaming(pyblish.api.InstancePlugin):
+    """Validates that all instances have a properly formatted name."""
+
+    label = "Validate Texture Batch Naming"
+    hosts = ["standalonepublisher"]
+    order = openpype.api.ValidateContentsOrder
+    families = ["workfile", "textures"]
+    optional = False
+
+    def process(self, instance):
+        file_name = instance.data["representations"][0]["files"]
+        if isinstance(file_name, list):
+            file_name = file_name[0]
+
+        msg = "Couldn't find asset name in '{}'\n".format(file_name) + \
+              "File name doesn't follow configured pattern.\n" + \
+              "Please rename the file."
+        assert "NOT_AVAIL" not in instance.data["asset_build"], msg
+
+        instance.data.pop("asset_build")
+
+        if instance.data["family"] == "textures":
+            file_name = instance.data["representations"][0]["files"][0]
+            self._check_proper_collected(instance.data["versionData"],
+                                         file_name)
+
+    def _check_proper_collected(self, versionData, file_name):
+        """
+        Loop through collected versionData to check if name parsing was OK.
+        Args:
+            versionData: (dict)
+
+        Returns:
+            raises AssertionException
+        """
+        missing_key_values = []
+        for key, value in versionData.items():
+            if not value:
+                missing_key_values.append(key)
+
+        msg = "Collected data {} doesn't contain values for {}".format(
+            versionData, missing_key_values) + "\n" + \
+            "Name of the texture file doesn't match expected pattern.\n" + \
+            "Please rename file(s) {}".format(file_name)
+
+        assert not missing_key_values, msg
@@ -0,0 +1,38 @@
+import pyblish.api
+import openpype.api
+
+
+class ValidateTextureBatchVersions(pyblish.api.InstancePlugin):
+    """Validates that versions match in workfile and textures.
+
+    Workfile is optional, so if you are sure, you can disable this
+    validator after Refresh.
+
+    Validates that only a single version is published at a time.
+    """
+    label = "Validate Texture Batch Versions"
+    hosts = ["standalonepublisher"]
+    order = openpype.api.ValidateContentsOrder
+    families = ["textures"]
+    optional = False
+
+    def process(self, instance):
+        wfile = instance.data["versionData"].get("workfile")
+
+        version_str = "v{:03d}".format(instance.data["version"])
+
+        if not wfile:  # no matching workfile, do not check versions
+            self.log.info("No workfile present for textures")
+            return
+
+        msg = "Not matching version: texture v{:03d} - workfile {}"
+        assert version_str in wfile, \
+            msg.format(
+                instance.data["version"], wfile
+            )
+
+        present_versions = set()
+        for instance in instance.context:
+            present_versions.add(instance.data["version"])
+
+        assert len(present_versions) == 1, "Too many versions in a batch!"
@@ -0,0 +1,29 @@
+import pyblish.api
+import openpype.api
+
+
+class ValidateTextureBatchWorkfiles(pyblish.api.InstancePlugin):
+    """Validates that the textures workfile has collected resources (optional).
+
+    Collected resources means secondary workfiles (in most cases).
+    """
+
+    label = "Validate Texture Workfile Has Resources"
+    hosts = ["standalonepublisher"]
+    order = openpype.api.ValidateContentsOrder
+    families = ["workfile"]
+    optional = True
+
+    # from presets
+    main_workfile_extensions = ['mra']
+
+    def process(self, instance):
+        if instance.data["family"] == "workfile":
+            ext = instance.data["representations"][0]["ext"]
+            if ext not in self.main_workfile_extensions:
+                self.log.warning("Only secondary workfile present!")
+                return
+
+            msg = "No secondary workfiles present for workfile {}".\
+                format(instance.data["name"])
+            assert instance.data.get("resources"), msg
@@ -155,6 +155,7 @@ class CollectWorkfileData(pyblish.api.ContextPlugin):
             "sceneMarkInState": mark_in_state == "set",
             "sceneMarkOut": int(mark_out_frame),
             "sceneMarkOutState": mark_out_state == "set",
+            "sceneStartFrame": int(lib.execute_george("tv_startframe")),
             "sceneBgColor": self._get_bg_color()
         }
         self.log.debug(
@@ -49,6 +49,14 @@ class ExtractSequence(pyblish.api.Extractor):
         family_lowered = instance.data["family"].lower()
         mark_in = instance.context.data["sceneMarkIn"]
         mark_out = instance.context.data["sceneMarkOut"]
 
+        # Scene start frame offsets the output files, so we need to offset
+        # the marks.
+        scene_start_frame = instance.context.data["sceneStartFrame"]
+        difference = scene_start_frame - mark_in
+        mark_in += difference
+        mark_out += difference
+
         # Frame start/end may be stored as float
         frame_start = int(instance.data["frameStart"])
         frame_end = int(instance.data["frameEnd"])
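A small worked example of the mark offset above (the numbers are illustrative):

```python
scene_start_frame = 10      # scene starts at frame 10
mark_in, mark_out = 12, 20  # marks as stored in the scene

difference = scene_start_frame - mark_in  # -2
mark_in += difference   # 10: output now aligns with the scene start
mark_out += difference  # 18: the range length (9 frames) is preserved
```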
@@ -98,7 +106,7 @@ class ExtractSequence(pyblish.api.Extractor):
             self.log.warning((
                 "Lowering representation range to {} frames."
                 " Changed frame end {} -> {}"
-            ).format(output_range + 1, mark_out, new_mark_out))
+            ).format(output_range + 1, mark_out, new_output_frame_end))
             output_frame_end = new_output_frame_end
 
             # -------------------------------------------------------------------
@@ -0,0 +1,27 @@
+import pyblish.api
+from avalon.tvpaint import lib
+
+
+class RepairStartFrame(pyblish.api.Action):
+    """Repair start frame."""
+
+    label = "Repair"
+    icon = "wrench"
+    on = "failed"
+
+    def process(self, context, plugin):
+        lib.execute_george("tv_startframe 0")
+
+
+class ValidateStartFrame(pyblish.api.ContextPlugin):
+    """Validate that the start frame is at frame 0."""
+
+    label = "Validate Start Frame"
+    order = pyblish.api.ValidatorOrder
+    hosts = ["tvpaint"]
+    actions = [RepairStartFrame]
+    optional = True
+
+    def process(self, context):
+        start_frame = lib.execute_george("tv_startframe")
+        assert int(start_frame) == 0, "Start frame has to be frame 0."
@@ -1,4 +1,5 @@
 import os
+import sys
 import re
 import copy
 import json

@@ -708,6 +709,10 @@ class ApplicationLaunchContext:
             )
             self.kwargs["creationflags"] = flags
 
+        if not sys.stdout:
+            self.kwargs["stdout"] = subprocess.DEVNULL
+            self.kwargs["stderr"] = subprocess.DEVNULL
+
         self.prelaunch_hooks = None
         self.postlaunch_hooks = None
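Background for the `sys.stdout` check: in a windowed or frozen build with no attached console, `sys.stdout` and `sys.stderr` can be `None`, and a child process inheriting those invalid handles may fail. A minimal, illustrative guard (not the OpenPype implementation):

```python
import subprocess
import sys

def safe_popen_kwargs():
    # When the interpreter has no console (sys.stdout is None, as in
    # windowed/frozen builds), route child output to DEVNULL explicitly.
    kwargs = {}
    if not sys.stdout:
        kwargs["stdout"] = subprocess.DEVNULL
        kwargs["stderr"] = subprocess.DEVNULL
    return kwargs
```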
@@ -1,6 +1,8 @@
 import json
 
+from avalon.api import AvalonMongoDB
 from openpype.api import ProjectSettings
+from openpype.lib import create_project
 
 from openpype.modules.ftrack.lib import (
     ServerAction,

@@ -21,8 +23,24 @@ class PrepareProjectServer(ServerAction):
 
     role_list = ["Pypeclub", "Administrator", "Project Manager"]
 
     # Key to store info about triggering create folder structure
     settings_key = "prepare_project"
 
+    item_splitter = {"type": "label", "value": "---"}
+    _keys_order = (
+        "fps",
+        "frameStart",
+        "frameEnd",
+        "handleStart",
+        "handleEnd",
+        "clipIn",
+        "clipOut",
+        "resolutionHeight",
+        "resolutionWidth",
+        "pixelAspect",
+        "applications",
+        "tools_env",
+        "library_project",
+    )
 
     def discover(self, session, entities, event):
         """Show only on project."""

@@ -47,13 +65,7 @@ class PrepareProjectServer(ServerAction):
         project_entity = entities[0]
         project_name = project_entity["full_name"]
 
-        try:
-            project_settings = ProjectSettings(project_name)
-        except ValueError:
-            return {
-                "message": "Project is not synchronized yet",
-                "success": False
-            }
+        project_settings = ProjectSettings(project_name)
 
         project_anatom_settings = project_settings["project_anatomy"]
         root_items = self.prepare_root_items(project_anatom_settings)

@@ -78,14 +90,13 @@ class PrepareProjectServer(ServerAction):
 
         items.extend(ca_items)
 
-        # This item will be last (before enumerators)
-        # - sets value of auto synchronization
-        auto_sync_name = "avalon_auto_sync"
+        # This item will be last before enumerators
+        # Set value of auto synchronization
         auto_sync_value = project_entity["custom_attributes"].get(
             CUST_ATTR_AUTO_SYNC, False
         )
         auto_sync_item = {
-            "name": auto_sync_name,
+            "name": CUST_ATTR_AUTO_SYNC,
             "type": "boolean",
             "value": auto_sync_value,
             "label": "AutoSync to Avalon"

@@ -199,7 +210,18 @@ class PrepareProjectServer(ServerAction):
             str([key for key in attributes_to_set])
         ))
 
-        for key, in_data in attributes_to_set.items():
+        attribute_keys = set(attributes_to_set.keys())
+        keys_order = []
+        for key in self._keys_order:
+            if key in attribute_keys:
+                keys_order.append(key)
+
+        attribute_keys = attribute_keys - set(keys_order)
+        for key in sorted(attribute_keys):
+            keys_order.append(key)
+
+        for key in keys_order:
+            in_data = attributes_to_set[key]
             attr = in_data["object"]
 
             # initial item definition
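The ordering logic above generalizes to a small reusable pattern — preferred keys first, in a fixed order, then any remaining keys alphabetically (self-contained sketch with made-up data):

```python
preferred = ("fps", "frameStart", "frameEnd")
data = {"tools_env": 1, "frameStart": 1001, "fps": 25.0, "applications": 2}

# Preferred keys in their fixed order, then the rest sorted.
ordered = [k for k in preferred if k in data]
ordered += sorted(set(data) - set(ordered))
print(ordered)  # ['fps', 'frameStart', 'applications', 'tools_env']
```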
@@ -225,7 +247,7 @@ class PrepareProjectServer(ServerAction):
             multiselect_enumerators.append(self.item_splitter)
             multiselect_enumerators.append({
                 "type": "label",
-                "value": in_data["label"]
+                "value": "<h3>{}</h3>".format(in_data["label"])
             })
 
             default = in_data["default"]

@@ -286,10 +308,10 @@ class PrepareProjectServer(ServerAction):
         return items, multiselect_enumerators
 
     def launch(self, session, entities, event):
-        if not event['data'].get('values', {}):
+        in_data = event["data"].get("values")
+        if not in_data:
             return
 
-        in_data = event['data']['values']
-
         root_values = {}
         root_key = "__root__"

@@ -337,7 +359,27 @@ class PrepareProjectServer(ServerAction):
 
         self.log.debug("Setting Custom Attribute values")
 
-        project_name = entities[0]["full_name"]
+        project_entity = entities[0]
+        project_name = project_entity["full_name"]
+
+        # Try to find project document
+        dbcon = AvalonMongoDB()
+        dbcon.install()
+        dbcon.Session["AVALON_PROJECT"] = project_name
+        project_doc = dbcon.find_one({
+            "type": "project"
+        })
+        # Create project if it is not available
+        # - creation is required to be able to set project anatomy and
+        #   attributes
+        if not project_doc:
+            project_code = project_entity["name"]
+            self.log.info("Creating project \"{} [{}]\"".format(
+                project_name, project_code
+            ))
+            create_project(project_name, project_code, dbcon=dbcon)
+
+        dbcon.uninstall()
 
         project_settings = ProjectSettings(project_name)
         project_anatomy_settings = project_settings["project_anatomy"]
         project_anatomy_settings["roots"] = root_data

@@ -352,10 +394,12 @@ class PrepareProjectServer(ServerAction):
 
         project_settings.save()
 
-        entity = entities[0]
-        for key, value in custom_attribute_values.items():
-            entity["custom_attributes"][key] = value
-            self.log.debug("- Key \"{}\" set to \"{}\"".format(key, value))
+        # Change custom attributes on project
+        if custom_attribute_values:
+            for key, value in custom_attribute_values.items():
+                project_entity["custom_attributes"][key] = value
+                self.log.debug("- Key \"{}\" set to \"{}\"".format(key, value))
+            session.commit()
 
         return True
@@ -1,3 +1,4 @@
+import sys
 import json
 import collections
 import ftrack_api

@@ -90,27 +91,28 @@ class PushHierValuesToNonHier(ServerAction):
 
         try:
             result = self.propagate_values(session, event, entities)
+            job["status"] = "done"
+            session.commit()
 
-        except Exception:
-            session.rollback()
-            job["status"] = "failed"
-            session.commit()
-
+            return result
+
+        except Exception as exc:
             msg = "Pushing Custom attribute values to task Failed"
-
             self.log.warning(msg, exc_info=True)
-
+            session.rollback()
+
+            description = "{} (Download traceback)".format(msg)
+            self.add_traceback_to_job(
+                job, session, sys.exc_info(), description
+            )
+
             return {
                 "success": False,
-                "message": msg
+                "message": "Error: {}".format(str(exc))
             }
-
-        job["status"] = "done"
-        session.commit()
-
-        return result
+        finally:
+            if job["status"] == "running":
+                job["status"] = "failed"
+                session.commit()
 
     def attrs_configurations(self, session, object_ids, interest_attributes):
         attrs = session.query(self.cust_attrs_query.format(
|
|
|||
|
|
@@ -1259,7 +1259,7 @@ class SyncToAvalonEvent(BaseEvent):
            self.process_session,
            entity,
            hier_attrs,
            self.cust_attr_types_by_id
            self.cust_attr_types_by_id.values()
        )
        for key, val in hier_values.items():
            output[key] = val
@@ -1,5 +1,6 @@
import os
import re
import json

from openpype.modules.ftrack.lib import BaseAction, statics_icon
from openpype.api import Anatomy, get_project_settings

@@ -84,6 +85,9 @@ class CreateProjectFolders(BaseAction):
            }

        try:
            if isinstance(project_folder_structure, str):
                project_folder_structure = json.loads(project_folder_structure)

            # Get paths based on presets
            basic_paths = self.get_path_items(project_folder_structure)
            self.create_folders(basic_paths, project_entity)
@@ -1,6 +1,8 @@
import json

from avalon.api import AvalonMongoDB
from openpype.api import ProjectSettings
from openpype.lib import create_project

from openpype.modules.ftrack.lib import (
    BaseAction,

@@ -23,7 +25,24 @@ class PrepareProjectLocal(BaseAction):
    settings_key = "prepare_project"

    # Key to store info about trigerring create folder structure
    create_project_structure_key = "create_folder_structure"
    create_project_structure_identifier = "create.project.structure"
    item_splitter = {"type": "label", "value": "---"}
    _keys_order = (
        "fps",
        "frameStart",
        "frameEnd",
        "handleStart",
        "handleEnd",
        "clipIn",
        "clipOut",
        "resolutionHeight",
        "resolutionWidth",
        "pixelAspect",
        "applications",
        "tools_env",
        "library_project",
    )

    def discover(self, session, entities, event):
        """Show only on project."""

@@ -48,13 +67,7 @@ class PrepareProjectLocal(BaseAction):
        project_entity = entities[0]
        project_name = project_entity["full_name"]

        try:
            project_settings = ProjectSettings(project_name)
        except ValueError:
            return {
                "message": "Project is not synchronized yet",
                "success": False
            }
        project_settings = ProjectSettings(project_name)

        project_anatom_settings = project_settings["project_anatomy"]
        root_items = self.prepare_root_items(project_anatom_settings)

@@ -79,14 +92,12 @@ class PrepareProjectLocal(BaseAction):

        items.extend(ca_items)

        # This item will be last (before enumerators)
        # - sets value of auto synchronization
        auto_sync_name = "avalon_auto_sync"
        # Set value of auto synchronization
        auto_sync_value = project_entity["custom_attributes"].get(
            CUST_ATTR_AUTO_SYNC, False
        )
        auto_sync_item = {
            "name": auto_sync_name,
            "name": CUST_ATTR_AUTO_SYNC,
            "type": "boolean",
            "value": auto_sync_value,
            "label": "AutoSync to Avalon"

@@ -94,6 +105,27 @@ class PrepareProjectLocal(BaseAction):
        # Add autosync attribute
        items.append(auto_sync_item)

        # This item will be last before enumerators
        # Ask if want to trigger Action Create Folder Structure
        create_project_structure_checked = (
            project_settings
            ["project_settings"]
            ["ftrack"]
            ["user_handlers"]
            ["prepare_project"]
            ["create_project_structure_checked"]
        ).value
        items.append({
            "type": "label",
            "value": "<h3>Want to create basic Folder Structure?</h3>"
        })
        items.append({
            "name": self.create_project_structure_key,
            "type": "boolean",
            "value": create_project_structure_checked,
            "label": "Check if Yes"
        })

        # Add enumerator items at the end
        for item in multiselect_enumerators:
            items.append(item)

@@ -200,7 +232,18 @@ class PrepareProjectLocal(BaseAction):
            str([key for key in attributes_to_set])
        ))

        for key, in_data in attributes_to_set.items():
        attribute_keys = set(attributes_to_set.keys())
        keys_order = []
        for key in self._keys_order:
            if key in attribute_keys:
                keys_order.append(key)

        attribute_keys = attribute_keys - set(keys_order)
        for key in sorted(attribute_keys):
            keys_order.append(key)

        for key in keys_order:
            in_data = attributes_to_set[key]
            attr = in_data["object"]

            # initial item definition

@@ -226,7 +269,7 @@ class PrepareProjectLocal(BaseAction):
                multiselect_enumerators.append(self.item_splitter)
                multiselect_enumerators.append({
                    "type": "label",
                    "value": in_data["label"]
                    "value": "<h3>{}</h3>".format(in_data["label"])
                })

                default = in_data["default"]

@@ -287,10 +330,13 @@ class PrepareProjectLocal(BaseAction):
        return items, multiselect_enumerators

    def launch(self, session, entities, event):
        if not event['data'].get('values', {}):
        in_data = event["data"].get("values")
        if not in_data:
            return

        in_data = event['data']['values']
        create_project_structure_checked = in_data.pop(
            self.create_project_structure_key
        )

        root_values = {}
        root_key = "__root__"

@@ -338,7 +384,27 @@ class PrepareProjectLocal(BaseAction):

        self.log.debug("Setting Custom Attribute values")

        project_name = entities[0]["full_name"]
        project_entity = entities[0]
        project_name = project_entity["full_name"]

        # Try to find project document
        dbcon = AvalonMongoDB()
        dbcon.install()
        dbcon.Session["AVALON_PROJECT"] = project_name
        project_doc = dbcon.find_one({
            "type": "project"
        })
        # Create project if is not available
        # - creation is required to be able set project anatomy and attributes
        if not project_doc:
            project_code = project_entity["name"]
            self.log.info("Creating project \"{} [{}]\"".format(
                project_name, project_code
            ))
            create_project(project_name, project_code, dbcon=dbcon)

        dbcon.uninstall()

        project_settings = ProjectSettings(project_name)
        project_anatomy_settings = project_settings["project_anatomy"]
        project_anatomy_settings["roots"] = root_data

@@ -353,11 +419,18 @@ class PrepareProjectLocal(BaseAction):

        project_settings.save()

        entity = entities[0]
        for key, value in custom_attribute_values.items():
            entity["custom_attributes"][key] = value
            self.log.debug("- Key \"{}\" set to \"{}\"".format(key, value))
        # Change custom attributes on project
        if custom_attribute_values:
            for key, value in custom_attribute_values.items():
                project_entity["custom_attributes"][key] = value
                self.log.debug("- Key \"{}\" set to \"{}\"".format(key, value))
            session.commit()

        # Trigger create project structure action
        if create_project_structure_checked:
            self.trigger_action(
                self.create_project_structure_identifier, event
            )
        return True
@@ -1,4 +1,9 @@
import os
import tempfile
import json
import functools
import datetime
import traceback
import time
from openpype.api import Logger
from openpype.settings import get_project_settings

@@ -583,3 +588,105 @@ class BaseHandler(object):
        return "/".join(
            [ent["name"] for ent in entity["link"]]
        )

    @classmethod
    def add_traceback_to_job(
        cls, job, session, exc_info,
        description=None,
        component_name=None,
        job_status=None
    ):
        """Add traceback file to a job.

        Args:
            job (JobEntity): Entity of job where file should be able to
                download (Created or queried with passed session).
            session (Session): Ftrack session which was used to query/create
                entered job.
            exc_info (tuple): Exception info (e.g. from `sys.exc_info()`).
            description (str): Change job description to describe what
                happened. Job description won't change if not passed.
            component_name (str): Name of component and default name of
                downloaded file. Class name and current date time are used if
                not specified.
            job_status (str): Status of job which will be set. By default is
                set to 'failed'.
        """
        if description:
            job_data = {
                "description": description
            }
            job["data"] = json.dumps(job_data)

        if not job_status:
            job_status = "failed"

        job["status"] = job_status

        # Create temp file where traceback will be stored
        temp_obj = tempfile.NamedTemporaryFile(
            mode="w", prefix="openpype_ftrack_", suffix=".txt", delete=False
        )
        temp_obj.close()
        temp_filepath = temp_obj.name

        # Store traceback to file
        result = traceback.format_exception(*exc_info)
        with open(temp_filepath, "w") as temp_file:
            temp_file.write("".join(result))

        # Upload file with traceback to ftrack server and add it to job
        if not component_name:
            component_name = "{}_{}".format(
                cls.__name__,
                datetime.datetime.now().strftime("%y-%m-%d-%H%M")
            )
        cls.add_file_component_to_job(
            job, session, temp_filepath, component_name
        )
        # Delete temp file
        os.remove(temp_filepath)

    @staticmethod
    def add_file_component_to_job(job, session, filepath, basename=None):
        """Add filepath as downloadable component to job.

        Args:
            job (JobEntity): Entity of job where file should be able to
                download (Created or queried with passed session).
            session (Session): Ftrack session which was used to query/create
                entered job.
            filepath (str): Path to file which should be added to job.
            basename (str): Defines name of file which will be downloaded on
                user's side. Must be without extension otherwise extension will
                be duplicated in downloaded name. Basename from entered path
                used when not entered.
        """
        # Make sure session's locations are configured
        # - they can be deconfigured e.g. using `rollback` method
        session._configure_locations()

        # Query `ftrack.server` location where component will be stored
        location = session.query(
            "Location where name is \"ftrack.server\""
        ).one()

        # Use filename as basename if not entered (must be without extension)
        if basename is None:
            basename = os.path.splitext(
                os.path.basename(filepath)
            )[0]

        component = session.create_component(
            filepath,
            data={"name": basename},
            location=location
        )
        session.create(
            "JobComponent",
            {
                "component_id": component["id"],
                "job_id": job["id"]
            }
        )
        session.commit()
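
The new `add_traceback_to_job` helper pairs naturally with the job pattern the ftrack actions above use. A minimal usage sketch follows; the action class, job description and `do_the_work` call are illustrative assumptions, not taken from this commit:

```python
import json
import sys

from openpype.modules.ftrack.lib import BaseAction


class ExampleAction(BaseAction):
    """Hypothetical action demonstrating the traceback helper."""

    def launch(self, session, entities, event):
        # Usual ftrack pattern: create a job so the user sees progress.
        user = session.query(
            "User where id is \"{}\"".format(event["source"]["user"]["id"])
        ).one()
        job = session.create("Job", {
            "user": user,
            "status": "running",
            "data": json.dumps({"description": "Doing the work"})
        })
        session.commit()

        try:
            result = self.do_the_work(session, entities)  # illustrative
            job["status"] = "done"
            session.commit()
            return result
        except Exception:
            session.rollback()
            # Uploads a traceback text file to the job and marks it "failed".
            self.add_traceback_to_job(
                job, session, sys.exc_info(),
                description="Work failed (Download traceback)"
            )
            return {"success": False, "message": "Work failed"}
```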

@@ -380,7 +380,12 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin):

            test_dest_files = list()
            for i in [1, 2]:
                template_data["frame"] = src_padding_exp % i
                template_data["representation"] = repre['ext']
                if not repre.get("udim"):
                    template_data["frame"] = src_padding_exp % i
                else:
                    template_data["udim"] = src_padding_exp % i

                anatomy_filled = anatomy.format(template_data)
                template_filled = anatomy_filled[template_name]["path"]
                if repre_context is None:

@@ -388,7 +393,10 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin):
                test_dest_files.append(
                    os.path.normpath(template_filled)
                )
            template_data["frame"] = repre_context["frame"]
            if not repre.get("udim"):
                template_data["frame"] = repre_context["frame"]
            else:
                template_data["udim"] = repre_context["udim"]

            self.log.debug(
                "test_dest_files: {}".format(str(test_dest_files)))

@@ -453,7 +461,9 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin):
                dst_start_frame = dst_padding

            # Store used frame value to template data
            template_data["frame"] = dst_start_frame
            if repre.get("frame"):
                template_data["frame"] = dst_start_frame

            dst = "{0}{1}{2}".format(
                dst_head,
                dst_start_frame,

@@ -476,6 +486,10 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin):
                    "Given file name is a full path"
                )

            template_data["representation"] = repre['ext']
            # Store used frame value to template data
            if repre.get("udim"):
                template_data["udim"] = repre["udim"][0]
            src = os.path.join(stagingdir, fname)
            anatomy_filled = anatomy.format(template_data)
            template_filled = anatomy_filled[template_name]["path"]

@@ -488,6 +502,9 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin):
            repre['published_path'] = dst
            self.log.debug("__ dst: {}".format(dst))

        if repre.get("udim"):
            repre_context["udim"] = repre.get("udim")  # store list

        repre["publishedFiles"] = published_files

        for key in self.db_representation_context_keys:

@@ -1045,6 +1062,7 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin):
                )
            )
            shutil.copy(file_url, new_name)
            os.remove(file_url)
        else:
            self.log.debug(
                "Renaming file {} to {}".format(
@@ -17,7 +17,7 @@
    },
    "publish": {
        "folder": "{root[work]}/{project[name]}/{hierarchy}/{asset}/publish/{family}/{subset}/{@version}",
        "file": "{project[code]}_{asset}_{subset}_{@version}<_{output}><.{@frame}>.{ext}",
        "file": "{project[code]}_{asset}_{subset}_{@version}<_{output}><.{@frame}><_{udim}>.{ext}",
        "path": "{@folder}/{@file}",
        "thumbnail": "{thumbnail_root}/{project[name]}/{_id}_{thumbnail_type}.{ext}"
    },
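
A hedged illustration of why `<_{udim}>` was added as an optional block: angle-bracket sections of anatomy templates are rendered only when their key is present. This is a simplified stand-in for the template formatting, not OpenPype's actual Anatomy code, and the keys used are hypothetical:

```python
import re

def fill_template(template, data):
    # Optional "<...>" blocks are dropped when their key is missing;
    # simplified stand-in for anatomy template formatting.
    def fill_optional(match):
        try:
            return match.group(1).format(**data)
        except KeyError:
            return ""
    template = re.sub(r"<([^<>]*)>", fill_optional, template)
    return template.format(**data)

template = "{code}_{subset}_v{version:0>3}<_{output}><.{frame}><_{udim}>.{ext}"
print(fill_template(template, {
    "code": "prj", "subset": "textureMain", "version": 1,
    "udim": "1001", "ext": "exr"
}))
# -> prj_textureMain_v001_1001.exr  (frame/output blocks dropped)
```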

@@ -136,7 +136,8 @@
            "Pypeclub",
            "Administrator",
            "Project manager"
        ]
        ],
        "create_project_structure_checked": false
    },
    "clean_hierarchical_attr": {
        "enabled": true,

@@ -172,6 +172,8 @@
        "deadline_group": "",
        "deadline_chunk_size": 1,
        "deadline_priority": 50,
        "publishing_script": "",
        "skip_integration_repre_list": [],
        "aov_filter": {
            "maya": [
                ".+(?:\\.|_)([Bb]eauty)(?:\\.|_).*"

@@ -184,6 +186,10 @@
                ".*"
            ]
        }
    },
    "CleanUp": {
        "paterns": [],
        "remove_temp_renders": false
    }
},
"tools": {

@@ -271,28 +277,7 @@
        }
    }
},
"project_folder_structure": {
    "__project_root__": {
        "prod": {},
        "resources": {
            "footage": {
                "plates": {},
                "offline": {}
            },
            "audio": {},
            "art_dept": {}
        },
        "editorial": {},
        "assets[ftrack.Library]": {
            "characters[ftrack]": {},
            "locations[ftrack]": {}
        },
        "shots[ftrack.Sequence]": {
            "scripts": {},
            "editorial[ftrack.Folder]": {}
        }
    }
},
"project_folder_structure": "{\"__project_root__\": {\"prod\": {}, \"resources\": {\"footage\": {\"plates\": {}, \"offline\": {}}, \"audio\": {}, \"art_dept\": {}}, \"editorial\": {}, \"assets[ftrack.Library]\": {\"characters[ftrack]\": {}, \"locations[ftrack]\": {}}, \"shots[ftrack.Sequence]\": {\"scripts\": {}, \"editorial[ftrack.Folder]\": {}}}}",
"sync_server": {
    "enabled": true,
    "config": {
@@ -10,11 +10,22 @@
    },
    "create": {
        "CreateWriteRender": {
            "fpath_template": "{work}/renders/nuke/{subset}/{subset}.{frame}.{ext}"
            "fpath_template": "{work}/renders/nuke/{subset}/{subset}.{frame}.{ext}",
            "defaults": [
                "Main",
                "Mask"
            ]
        },
        "CreateWritePrerender": {
            "fpath_template": "{work}/prerenders/nuke/{subset}/{subset}.{frame}.{ext}",
            "use_range_limit": true
            "use_range_limit": true,
            "defaults": [
                "Key01",
                "Bg01",
                "Fg01",
                "Branch01",
                "Part01"
            ]
        }
    },
    "publish": {
@@ -123,6 +123,16 @@
        ],
        "help": "Process multiple Mov files and publish them for layout and comp."
    },
    "create_texture_batch": {
        "name": "texture_batch",
        "label": "Texture Batch",
        "family": "texture_batch",
        "icon": "image",
        "defaults": [
            "Main"
        ],
        "help": "Texture files with UDIM together with worfile"
    },
    "__dynamic_keys_labels__": {
        "create_workfile": "Workfile",
        "create_model": "Model",

@@ -134,10 +144,65 @@
        "create_image": "Image",
        "create_matchmove": "Matchmove",
        "create_render": "Render",
        "create_mov_batch": "Batch Mov"
        "create_mov_batch": "Batch Mov",
        "create_texture_batch": "Batch Texture"
    }
},
"publish": {
    "CollectTextures": {
        "enabled": true,
        "active": true,
        "main_workfile_extensions": [
            "mra"
        ],
        "other_workfile_extensions": [
            "spp",
            "psd"
        ],
        "texture_extensions": [
            "exr",
            "dpx",
            "jpg",
            "jpeg",
            "png",
            "tiff",
            "tga",
            "gif",
            "svg"
        ],
        "workfile_families": [],
        "texture_families": [],
        "color_space": [
            "linsRGB",
            "raw",
            "acesg"
        ],
        "input_naming_patterns": {
            "workfile": [
                "^([^.]+)(_[^_.]*)?_v([0-9]{3,}).+"
            ],
            "textures": [
                "^([^_.]+)_([^_.]+)_v([0-9]{3,})_([^_.]+)_({color_space})_(1[0-9]{3}).+"
            ]
        },
        "input_naming_groups": {
            "workfile": [
                "asset",
                "filler",
                "version"
            ],
            "textures": [
                "asset",
                "shader",
                "version",
                "channel",
                "color_space",
                "udim"
            ]
        },
        "workfile_subset_template": "textures{Subset}Workfile",
        "texture_subset_template": "textures{Subset}_{Shader}_{Channel}"
    },
    "ValidateSceneSettings": {
        "enabled": true,
        "optional": true,

@@ -165,6 +230,58 @@
            ],
            "output": []
        }
    },
    "CollectEditorial": {
        "source_dir": "",
        "extensions": [
            "mov",
            "mp4"
        ]
    },
    "CollectHierarchyInstance": {
        "shot_rename_template": "{project[code]}_{_sequence_}_{_shot_}",
        "shot_rename_search_patterns": {
            "_sequence_": "(\\d{4})(?=_\\d{4})",
            "_shot_": "(\\d{4})(?!_\\d{4})"
        },
        "shot_add_hierarchy": {
            "parents_path": "{project}/{folder}/{sequence}",
            "parents": {
                "project": "{project[name]}",
                "sequence": "{_sequence_}",
                "folder": "shots"
            }
        },
        "shot_add_tasks": {}
    },
    "shot_add_tasks": {
        "custom_start_frame": 0,
        "timeline_frame_start": 900000,
        "timeline_frame_offset": 0,
        "subsets": {
            "referenceMain": {
                "family": "review",
                "families": [
                    "clip"
                ],
                "extensions": [
                    "mp4"
                ],
                "version": 0,
                "keepSequence": false
            },
            "audioMain": {
                "family": "audio",
                "families": [
                    "clip"
                ],
                "extensions": [
                    "wav"
                ],
                "version": 0,
                "keepSequence": false
            }
        }
    }
}
}

@@ -18,6 +18,11 @@
        "optional": true,
        "active": true
    },
    "ValidateStartFrame": {
        "enabled": false,
        "optional": true,
        "active": true
    },
    "ValidateAssetName": {
        "enabled": true,
        "optional": true,
@@ -139,7 +139,8 @@ class HostsEnumEntity(BaseEnumEntity):
            "photoshop",
            "resolve",
            "tvpaint",
            "unreal"
            "unreal",
            "standalonepublisher"
        ]
        if self.use_empty_value:
            host_names.insert(0, "")
@@ -1,5 +1,6 @@
import re
import copy
import json
from abc import abstractmethod

from .base_entity import ItemEntity

@@ -440,6 +441,7 @@ class RawJsonEntity(InputEntity):

    def _item_initalization(self):
        # Schema must define if valid value is dict or list
        store_as_string = self.schema_data.get("store_as_string", False)
        is_list = self.schema_data.get("is_list", False)
        if is_list:
            valid_value_types = (list, )

@@ -448,6 +450,8 @@ class RawJsonEntity(InputEntity):
            valid_value_types = (dict, )
            value_on_not_set = {}

        self.store_as_string = store_as_string

        self._is_list = is_list
        self.valid_value_types = valid_value_types
        self.value_on_not_set = value_on_not_set

@@ -491,6 +495,23 @@ class RawJsonEntity(InputEntity):
        result = self.metadata != self._metadata_for_current_state()
        return result

    def schema_validations(self):
        if self.store_as_string and self.is_env_group:
            reason = (
                "RawJson entity can't store environment group metadata"
                " as string."
            )
            raise EntitySchemaError(self, reason)
        super(RawJsonEntity, self).schema_validations()

    def _convert_to_valid_type(self, value):
        if isinstance(value, STRING_TYPE):
            try:
                return json.loads(value)
            except Exception:
                pass
        return super(RawJsonEntity, self)._convert_to_valid_type(value)

    def _metadata_for_current_state(self):
        if (
            self._override_state is OverrideState.PROJECT

@@ -510,6 +531,9 @@ class RawJsonEntity(InputEntity):
        value = super(RawJsonEntity, self)._settings_value()
        if self.is_env_group and isinstance(value, dict):
            value.update(self.metadata)

        if self.store_as_string:
            return json.dumps(value)
        return value

    def _prepare_value(self, value):
@@ -337,6 +337,11 @@ How output of the schema could look like on save:
- schema also defines valid value type
  - by default it is dictionary
  - to be able to use a list it is required to set `is_list` to `true`
- output can be stored as string
  - this is to allow any keys in the dictionary
  - set key `store_as_string` to `true`
  - code using that setting must expect that the value is a string and use the json module to convert it to python types (see the sketch below)
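
A minimal sketch of what consuming code has to do when `store_as_string` is enabled; the example settings value and its keys are hypothetical:

```python
import json

# With "store_as_string" the saved settings value arrives as a string,
# not as a dict/list, so consuming code decodes it itself.
raw_value = '{"__project_root__": {"prod": {}, "editorial": {}}}'

if isinstance(raw_value, str):
    folder_structure = json.loads(raw_value)

print(list(folder_structure["__project_root__"]))
# -> ['prod', 'editorial']
```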

```
{
    "type": "raw-json",
@@ -441,6 +441,18 @@
        "key": "role_list",
        "label": "Roles",
        "object_type": "text"
    },
    {
        "type": "separator"
    },
    {
        "type": "label",
        "label": "Check \"Create project structure\" by default"
    },
    {
        "type": "boolean",
        "key": "create_project_structure_checked",
        "label": "Checked"
    }
]
},
@@ -17,7 +17,8 @@
    "type": "raw-json",
    "label": "Project Folder Structure",
    "key": "project_folder_structure",
    "use_label_wrap": true
    "use_label_wrap": true,
    "store_as_string": true
},
{
    "type": "schema",
@@ -63,6 +63,14 @@
    "type": "text",
    "key": "fpath_template",
    "label": "Path template"
},
{
    "type": "list",
    "key": "defaults",
    "label": "Subset name defaults",
    "object_type": {
        "type": "text"
    }
}
]
},

@@ -82,6 +90,14 @@
    "type": "boolean",
    "key": "use_range_limit",
    "label": "Use Frame range limit by default"
},
{
    "type": "list",
    "key": "defaults",
    "label": "Subset name defaults",
    "object_type": {
        "type": "text"
    }
}
]
}
@@ -56,6 +56,119 @@
    "key": "publish",
    "label": "Publish plugins",
    "children": [
        {
            "type": "dict",
            "collapsible": true,
            "key": "CollectTextures",
            "label": "Collect Textures",
            "checkbox_key": "enabled",
            "children": [
                {
                    "type": "boolean",
                    "key": "enabled",
                    "label": "Enabled"
                },
                {
                    "type": "boolean",
                    "key": "active",
                    "label": "Active"
                },
                {
                    "type": "list",
                    "key": "main_workfile_extensions",
                    "object_type": "text",
                    "label": "Main workfile extensions"
                },
                {
                    "key": "other_workfile_extensions",
                    "label": "Support workfile extensions",
                    "type": "list",
                    "object_type": "text"
                },
                {
                    "type": "list",
                    "key": "texture_extensions",
                    "object_type": "text",
                    "label": "Texture extensions"
                },
                {
                    "type": "list",
                    "key": "workfile_families",
                    "object_type": "text",
                    "label": "Additional families for workfile"
                },
                {
                    "type": "list",
                    "key": "texture_families",
                    "object_type": "text",
                    "label": "Additional families for textures"
                },
                {
                    "type": "list",
                    "key": "color_space",
                    "object_type": "text",
                    "label": "Color spaces"
                },
                {
                    "type": "dict",
                    "collapsible": false,
                    "key": "input_naming_patterns",
                    "label": "Regex patterns for naming conventions",
                    "children": [
                        {
                            "type": "label",
                            "label": "Add regex groups matching expected name"
                        },
                        {
                            "type": "list",
                            "object_type": "text",
                            "key": "workfile",
                            "label": "Workfile naming pattern"
                        },
                        {
                            "type": "list",
                            "object_type": "text",
                            "key": "textures",
                            "label": "Textures naming pattern"
                        }
                    ]
                },
                {
                    "type": "dict",
                    "collapsible": false,
                    "key": "input_naming_groups",
                    "label": "Group order for regex patterns",
                    "children": [
                        {
                            "type": "label",
                            "label": "Add names of matched groups in correct order. Available values: ('filler', 'asset', 'shader', 'version', 'channel', 'color_space', 'udim')"
                        },
                        {
                            "type": "list",
                            "object_type": "text",
                            "key": "workfile",
                            "label": "Workfile group positions"
                        },
                        {
                            "type": "list",
                            "object_type": "text",
                            "key": "textures",
                            "label": "Textures group positions"
                        }
                    ]
                },
                {
                    "type": "text",
                    "key": "workfile_subset_template",
                    "label": "Subset name template for workfile"
                },
                {
                    "type": "text",
                    "key": "texture_subset_template",
                    "label": "Subset name template for textures"
                }
            ]
        },
        {
            "type": "dict",
            "collapsible": true,
@@ -130,6 +243,165 @@
            ]
        }
    ]
},
{
    "type": "dict",
    "collapsible": true,
    "key": "CollectEditorial",
    "label": "Collect Editorial",
    "is_group": true,
    "children": [
        {
            "type": "text",
            "key": "source_dir",
            "label": "Editorial resources pointer"
        },
        {
            "type": "list",
            "key": "extensions",
            "label": "Accepted extensions",
            "object_type": "text"
        }
    ]
},
{
    "type": "dict",
    "collapsible": true,
    "key": "CollectHierarchyInstance",
    "label": "Collect Instance Hierarchy",
    "is_group": true,
    "children": [
        {
            "type": "text",
            "key": "shot_rename_template",
            "label": "Shot rename template"
        },
        {
            "key": "shot_rename_search_patterns",
            "label": "Shot renaming paterns search",
            "type": "dict-modifiable",
            "highlight_content": true,
            "object_type": {
                "type": "text"
            }
        },
        {
            "type": "dict",
            "key": "shot_add_hierarchy",
            "label": "Shot hierarchy",
            "children": [
                {
                    "type": "text",
                    "key": "parents_path",
                    "label": "Parents path template"
                },
                {
                    "key": "parents",
                    "label": "Parents",
                    "type": "dict-modifiable",
                    "highlight_content": true,
                    "object_type": {
                        "type": "text"
                    }
                }
            ]
        },
        {
            "key": "shot_add_tasks",
            "label": "Add tasks to shot",
            "type": "dict-modifiable",
            "highlight_content": true,
            "object_type": {
                "type": "dict",
                "children": [
                    {
                        "type": "task-types-enum",
                        "key": "type",
                        "label": "Task type"
                    }
                ]
            }
        }
    ]
},
{
    "type": "dict",
    "collapsible": true,
    "key": "shot_add_tasks",
    "label": "Collect Clip Instances",
    "is_group": true,
    "children": [
        {
            "type": "number",
            "key": "custom_start_frame",
            "label": "Custom start frame",
            "default": 0,
            "minimum": 1,
            "maximum": 100000
        },
        {
            "type": "number",
            "key": "timeline_frame_start",
            "label": "Timeline start frame",
            "default": 900000,
            "minimum": 1,
            "maximum": 10000000
        },
        {
            "type": "number",
            "key": "timeline_frame_offset",
            "label": "Timeline frame offset",
            "default": 0,
            "minimum": -1000000,
            "maximum": 1000000
        },
        {
            "key": "subsets",
            "label": "Subsets",
            "type": "dict-modifiable",
            "highlight_content": true,
            "object_type": {
                "type": "dict",
                "children": [
                    {
                        "type": "text",
                        "key": "family",
                        "label": "Family"
                    },
                    {
                        "type": "list",
                        "key": "families",
                        "label": "Families",
                        "object_type": "text"
                    },
                    {
                        "type": "splitter"
                    },
                    {
                        "type": "list",
                        "key": "extensions",
                        "label": "Extensions",
                        "object_type": "text"
                    },
                    {
                        "key": "version",
                        "label": "Version lock",
                        "type": "number",
                        "default": 0,
                        "minimum": 0,
                        "maximum": 10
                    },
                    {
                        "type": "boolean",
                        "key": "keepSequence",
                        "label": "Keep sequence if used for review",
                        "default": false
                    }
                ]
            }
        }
    ]
}
]
}
@@ -52,6 +52,17 @@
        }
    ]
},
{
    "type": "schema_template",
    "name": "template_publish_plugin",
    "template_data": [
        {
            "key": "ValidateStartFrame",
            "label": "Validate Scene Start Frame",
            "docstring": "Validate first frame of scene is set to '0'."
        }
    ]
},
{
    "type": "schema_template",
    "name": "template_publish_plugin",
@@ -3,6 +3,7 @@
    "key": "imageio",
    "label": "Color Management and Output Formats",
    "is_file": true,
    "is_group": true,
    "children": [
        {
            "key": "hiero",

@@ -14,7 +15,6 @@
    "type": "dict",
    "label": "Workfile",
    "collapsible": false,
    "is_group": true,
    "children": [
        {
            "type": "form",

@@ -89,7 +89,6 @@
    "type": "dict",
    "label": "Colorspace on Inputs by regex detection",
    "collapsible": true,
    "is_group": true,
    "children": [
        {
            "type": "list",

@@ -124,7 +123,6 @@
    "type": "dict",
    "label": "Viewer",
    "collapsible": false,
    "is_group": true,
    "children": [
        {
            "type": "text",

@@ -138,7 +136,6 @@
    "type": "dict",
    "label": "Workfile",
    "collapsible": false,
    "is_group": true,
    "children": [
        {
            "type": "form",

@@ -236,7 +233,6 @@
    "type": "dict",
    "label": "Nodes",
    "collapsible": true,
    "is_group": true,
    "children": [
        {
            "key": "requiredNodes",

@@ -339,7 +335,6 @@
    "type": "dict",
    "label": "Colorspace on Inputs by regex detection",
    "collapsible": true,
    "is_group": true,
    "children": [
        {
            "type": "list",
@@ -554,6 +554,22 @@
    "key": "deadline_priority",
    "label": "Deadline Priotity"
},
{
    "type": "splitter"
},
{
    "type": "text",
    "key": "publishing_script",
    "label": "Publishing script path"
},
{
    "type": "list",
    "key": "skip_integration_repre_list",
    "label": "Skip integration of representation with ext",
    "object_type": {
        "type": "text"
    }
},
{
    "type": "dict",
    "key": "aov_filter",

@@ -594,6 +610,30 @@
            ]
        }
    ]
},
{
    "type": "dict",
    "collapsible": true,
    "key": "CleanUp",
    "label": "Clean Up",
    "is_group": true,
    "children": [
        {
            "type": "list",
            "key": "paterns",
            "label": "Paterrns (regex)",
            "object_type": {
                "type": "text"
            }
        },
        {
            "type": "boolean",
            "key": "remove_temp_renders",
            "label": "Remove Temp renders",
            "default": false
        }
    ]
}
]
}
@@ -315,14 +315,28 @@ class DuplicatedEnvGroups(Exception):
        super(DuplicatedEnvGroups, self).__init__(msg)


def load_openpype_default_settings():
    """Load openpype default settings."""
    return load_jsons_from_dir(DEFAULTS_DIR)


def reset_default_settings():
    """Reset cache of default settings. Can't be used now."""
    global _DEFAULT_SETTINGS
    _DEFAULT_SETTINGS = None


def get_default_settings():
    """Get default settings.

    Todo:
        Cache loaded defaults.

    Returns:
        dict: Loaded default settings.
    """
    # TODO add cacher
    return load_jsons_from_dir(DEFAULTS_DIR)
    return load_openpype_default_settings()
    # global _DEFAULT_SETTINGS
    # if _DEFAULT_SETTINGS is None:
    #     _DEFAULT_SETTINGS = load_jsons_from_dir(DEFAULTS_DIR)

@@ -868,6 +882,25 @@ def get_environments():
    return find_environments(get_system_settings(False))


def get_general_environments():
    """Get general environments.

    Function is implemented to be able load general environments without using
    `get_default_settings`.
    """
    # Use only openpype defaults.
    # - prevent to use `get_system_settings` where `get_default_settings`
    #   is used
    default_values = load_openpype_default_settings()
    studio_overrides = get_studio_system_settings_overrides()
    result = apply_overrides(default_values, studio_overrides)
    environments = result["general"]["environment"]

    clear_metadata_from_settings(environments)

    return environments


def clear_metadata_from_settings(values):
    """Remove all metadata keys from loaded settings."""
    if isinstance(values, dict):
@@ -211,7 +211,8 @@ class DropDataFrame(QtWidgets.QFrame):
        folder_path = os.path.dirname(collection.head)
        if file_base[-1] in ['.', '_']:
            file_base = file_base[:-1]
        file_ext = collection.tail
        file_ext = os.path.splitext(
            collection.format('{head}{padding}{tail}'))[1]
        repr_name = file_ext.replace('.', '')
        range = collection.format('{ranges}')
@@ -1,3 +1,3 @@
# -*- coding: utf-8 -*-
"""Package declaring Pype version."""
__version__ = "3.2.0-nightly.7"
__version__ = "3.3.0-nightly.4"

poetry.lock (generated, 2 lines changed)
@@ -11,7 +11,7 @@ develop = false
type = "git"
url = "https://github.com/pypeclub/acre.git"
reference = "master"
resolved_reference = "68784b7eb5b7bb5f409b61ab31d4403878a3e1b7"
resolved_reference = "5a812c6dcfd3aada87adb49be98c548c894d6566"

[[package]]
name = "aiohttp"

start.py (17 lines changed)
@@ -208,14 +208,21 @@ def set_openpype_global_environments() -> None:
    """Set global OpenPype's environments."""
    import acre

    from openpype.settings import get_environments
    try:
        from openpype.settings import get_general_environments

        all_env = get_environments()
        general_env = get_general_environments()

    except Exception:
        # Backwards compatibility for OpenPype versions where
        # `get_general_environments` does not exists yet
        from openpype.settings import get_environments

        all_env = get_environments()
        general_env = all_env["global"]

    # TODO Global environments will be stored in "general" settings so loading
    # will be modified and can be done in igniter.
    env = acre.merge(
        acre.parse(all_env["global"]),
        acre.parse(general_env),
        dict(os.environ)
    )
    os.environ.clear()
@@ -83,8 +83,12 @@ function Show-PSWarning() {
function Install-Poetry() {
    Write-Host ">>> " -NoNewline -ForegroundColor Green
    Write-Host "Installing Poetry ... "
    $python = "python"
    if (Get-Command "pyenv" -ErrorAction SilentlyContinue) {
        $python = & pyenv which python
    }
    $env:POETRY_HOME="$openpype_root\.poetry"
    (Invoke-WebRequest -Uri https://raw.githubusercontent.com/python-poetry/poetry/master/install-poetry.py -UseBasicParsing).Content | python -
    (Invoke-WebRequest -Uri https://raw.githubusercontent.com/python-poetry/poetry/master/install-poetry.py -UseBasicParsing).Content | & $($python) -
}

$art = @"
@@ -48,15 +48,23 @@ function Show-PSWarning() {
function Install-Poetry() {
    Write-Host ">>> " -NoNewline -ForegroundColor Green
    Write-Host "Installing Poetry ... "
    $python = "python"
    if (Get-Command "pyenv" -ErrorAction SilentlyContinue) {
        $python = & pyenv which python
    }
    $env:POETRY_HOME="$openpype_root\.poetry"
    (Invoke-WebRequest -Uri https://raw.githubusercontent.com/python-poetry/poetry/master/install-poetry.py -UseBasicParsing).Content | python -
    (Invoke-WebRequest -Uri https://raw.githubusercontent.com/python-poetry/poetry/master/install-poetry.py -UseBasicParsing).Content | & $($python) -
}


function Test-Python() {
    Write-Host ">>> " -NoNewline -ForegroundColor green
    Write-Host "Detecting host Python ... " -NoNewline
    if (-not (Get-Command "python" -ErrorAction SilentlyContinue)) {
    $python = "python"
    if (Get-Command "pyenv" -ErrorAction SilentlyContinue) {
        $python = & pyenv which python
    }
    if (-not (Get-Command "python3" -ErrorAction SilentlyContinue)) {
        Write-Host "!!! Python not detected" -ForegroundColor red
        Set-Location -Path $current_dir
        Exit-WithCode 1

@@ -66,7 +74,7 @@ import sys
print('{0}.{1}'.format(sys.version_info[0], sys.version_info[1]))
'@

$p = & python -c $version_command
$p = & $python -c $version_command
$env:PYTHON_VERSION = $p
$m = $p -match '(\d+)\.(\d+)'
if(-not $m) {
@@ -41,22 +41,40 @@ function Exit-WithCode($exitcode) {
}


function Find-Mongo {
function Find-Mongo ($preferred_version) {
    $defaultPath = "C:\Program Files\MongoDB\Server"
    Write-Host ">>> " -NoNewLine -ForegroundColor Green
    Write-Host "Detecting MongoDB ... " -NoNewline
    if (-not (Get-Command "mongod" -ErrorAction SilentlyContinue)) {
        if(Test-Path "$($defaultPath)\*\bin\mongod.exe" -PathType Leaf) {
            # we have mongo server installed on standard Windows location
            # so we can inject it to the PATH. We'll use latest version available.
            # so we can inject it to the PATH. We'll use latest version available, or the one defined by
            # $preferred_version.
            $mongoVersions = Get-ChildItem -Directory 'C:\Program Files\MongoDB\Server' | Sort-Object -Property {$_.Name -as [int]}
            if(Test-Path "$($mongoVersions[-1])\bin\mongod.exe" -PathType Leaf) {
                $env:PATH = "$($env:PATH);$($mongoVersions[-1])\bin\"
                Write-Host "OK" -ForegroundColor Green
                $use_version = $mongoVersions[-1]
                foreach ($v in $mongoVersions) {
                    Write-Host " - found [ " -NoNewline
                    Write-Host $v -NoNewLine -ForegroundColor Cyan
                    Write-Host " ]" -NoNewLine

                    $version = Split-Path $v -Leaf

                    if ($preferred_version -eq $version) {
                        Write-Host " *" -ForegroundColor Green
                        $use_version = $v
                    } else {
                        Write-Host ""
                    }
                }

                $env:PATH = "$($env:PATH);$($use_version)\bin\"

                Write-Host " - auto-added from [ " -NoNewline
                Write-Host "$($mongoVersions[-1])\bin\mongod.exe" -NoNewLine -ForegroundColor Cyan
                Write-Host "$($use_version)\bin\mongod.exe" -NoNewLine -ForegroundColor Cyan
                Write-Host " ]"
                return "$($mongoVersions[-1])\bin\mongod.exe"
                return "$($use_version)\bin\mongod.exe"
            } else {
                Write-Host "FAILED " -NoNewLine -ForegroundColor Red
                Write-Host "MongoDB not detected" -ForegroundColor Yellow

@@ -95,7 +113,18 @@ $port = 2707
# path to database
$dbpath = (Get-Item $openpype_root).parent.FullName + "\mongo_db_data"

$mongoPath = Find-Mongo
Start-Process -FilePath $mongopath "--dbpath $($dbpath) --port $($port)" -PassThru
$preferred_version = "4.0"

$mongoPath = Find-Mongo $preferred_version
Write-Host ">>> " -NoNewLine -ForegroundColor Green
Write-Host "Using DB path: " -NoNewLine
Write-Host " [ " -NoNewline -ForegroundColor Cyan
Write-Host "$($dbpath)" -NoNewline -ForegroundColor White
Write-Host " ] " -ForegroundColor Cyan
Write-Host ">>> " -NoNewLine -ForegroundColor Green
Write-Host "Port: " -NoNewLine
Write-Host " [ " -NoNewline -ForegroundColor Cyan
Write-Host "$($port)" -NoNewline -ForegroundColor White
Write-Host " ] " -ForegroundColor Cyan
Start-Process -FilePath $mongopath "--dbpath $($dbpath) --port $($port)" -PassThru | Out-Null
website/docs/project_settings/assets/standalone_creators.png (new binary file, 14 KiB; not shown)

website/docs/project_settings/settings_project_standalone.md (new file, 98 lines)
@@ -0,0 +1,98 @@
---
id: settings_project_standalone
title: Project Standalone Publisher Settings
sidebar_label: Standalone Publisher
---

import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';

Project settings can have project specific values. Each new project uses the studio values defined in the **default** project, but these values can be modified or overridden per project.

:::warning Default studio values
Projects always use default project values unless they have a [project override](../admin_settings#project-overrides) (orange colour). Any changes in the default project may affect all existing projects.
:::

## Creator Plugins

Contains a list of implemented families to show in the middle menu in Standalone Publisher. Each plugin must contain:
- name
- label
- family
- icon
- default subset(s)
- help (additional short information about the family)

![](assets/standalone_creators.png)

## Publish plugins

### Collect Textures

Serves to collect all needed information about workfiles and the textures created from them. Allows publishing a
main workfile (for example from Mari), additional workfiles (from Substance Painter) and exported textures.

Available configuration:
- Main workfile extension - only a single workfile can be the "main" one
- Support workfile extensions - additional workfiles will be published to the same folder as the "main" one, just under a `resources` subfolder
- Texture extension - what kind of formats are expected for textures
- Additional families for workfile - whether any family ('ftrack', 'review') should be added to the published workfile
- Additional families for textures - whether any family ('ftrack', 'review') should be added to the published textures

#### Naming conventions

The implementation tries to be flexible and cover multiple naming conventions for workfiles and textures.

##### Workfile naming pattern

Provide a regex matching pattern containing regex groups used to parse the workfile name for needed information (for example
the build name).

Example:

- pattern: ```^([^.]+)(_[^_.]*)?_v([0-9]{3,}).+```
- with groups: ```["asset", "filler", "version"]```

parses `corridorMain_v001` into three groups:
- asset build (`corridorMain`)
- filler (in this case empty)
- version (`001`)

Advanced example (for texture files):

- pattern: ```^([^_.]+)_([^_.]+)_v([0-9]{3,})_([^_.]+)_({color_space})_(1[0-9]{3}).+```
- with groups: ```["asset", "shader", "version", "channel", "color_space", "udim"]```

parses `corridorMain_aluminiumID_v001_baseColor_linsRGB_1001.exr`:
- asset build (`corridorMain`)
- shader (`aluminiumID`)
- version (`001`)
- channel (`baseColor`)
- color_space (`linsRGB`)
- udim (`1001`)


In case of a different naming pattern, additional groups can be added or removed. The number of matching groups (`(...)`) must be the same as the number of items in `Group order for regex patterns`.
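
To make the pattern/group mechanics concrete, here is a small sketch (not the plugin's actual code) of how the advanced pattern and group order above could be applied, with the `{color_space}` placeholder expanded from the configured colour spaces:

```python
import re

# Values taken from the documented example configuration above.
color_spaces = ["linsRGB", "raw", "acesg"]
pattern = (
    r"^([^_.]+)_([^_.]+)_v([0-9]{3,})_([^_.]+)"
    r"_({color_space})_(1[0-9]{3}).+"
).replace("{color_space}", "|".join(color_spaces))
groups = ["asset", "shader", "version", "channel", "color_space", "udim"]

name = "corridorMain_aluminiumID_v001_baseColor_linsRGB_1001.exr"
match = re.match(pattern, name)
parsed = dict(zip(groups, match.groups()))
# -> {'asset': 'corridorMain', 'shader': 'aluminiumID', 'version': '001',
#     'channel': 'baseColor', 'color_space': 'linsRGB', 'udim': '1001'}
```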

##### Workfile group positions

For each matching regex group set in the previous paragraph, its ordinal position is required (in case new groups need to be added, etc.).

The number of groups added here must match the number of parsing groups from `Workfile naming pattern`.

##### Output names

Output names of published workfiles and textures can be configured separately:
- Subset name template for workfile
- Subset name template for textures (implemented keys: ["color_space", "channel", "subset", "shader"])


### Validate Scene Settings

#### Check Frame Range for Extensions

Configure families, file extension and task to validate that the DB setting (frame range) matches the currently published values.

### ExtractThumbnailSP

Plugin responsible for generating thumbnails; configure appropriate values for your version of ffmpeg.
@@ -65,7 +65,8 @@ module.exports = {
    label: "Project Settings",
    items: [
        "project_settings/settings_project_global",
        "project_settings/settings_project_nuke"
        "project_settings/settings_project_nuke",
        "project_settings/settings_project_standalone"
    ],
},