Merge branch 'develop' into feature/OP-3828_Workfile-template-build-enhancements

This commit is contained in:
Jakub Trllo 2022-09-13 16:24:44 +02:00
commit e7f16228ba
33 changed files with 1455 additions and 1493 deletions


@ -1,14 +1,13 @@
# Changelog
## [3.14.2-nightly.4](https://github.com/pypeclub/OpenPype/tree/HEAD)
## [3.14.2](https://github.com/pypeclub/OpenPype/tree/3.14.2) (2022-09-12)
[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.14.1...HEAD)
[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.14.1...3.14.2)
**🆕 New features**
- Nuke: Build workfile by template [\#3763](https://github.com/pypeclub/OpenPype/pull/3763)
- Houdini: Publishing workfiles [\#3697](https://github.com/pypeclub/OpenPype/pull/3697)
- Global: making collect audio plugin global [\#3679](https://github.com/pypeclub/OpenPype/pull/3679)
**🚀 Enhancements**
@ -18,6 +17,7 @@
- Photoshop: attempt to speed up ExtractImage [\#3793](https://github.com/pypeclub/OpenPype/pull/3793)
- SyncServer: Added cli commands for sync server [\#3765](https://github.com/pypeclub/OpenPype/pull/3765)
- Kitsu: Drop 'entities root' setting. [\#3739](https://github.com/pypeclub/OpenPype/pull/3739)
- git: update gitignore [\#3722](https://github.com/pypeclub/OpenPype/pull/3722)
**🐛 Bug fixes**
@ -42,11 +42,14 @@
- General: Move queries of asset and representation links [\#3770](https://github.com/pypeclub/OpenPype/pull/3770)
- General: Move create project folders to pipeline [\#3768](https://github.com/pypeclub/OpenPype/pull/3768)
- General: Create project function moved to client code [\#3766](https://github.com/pypeclub/OpenPype/pull/3766)
- Maya: Refactor submit deadline to use AbstractSubmitDeadline [\#3759](https://github.com/pypeclub/OpenPype/pull/3759)
- General: Change publish template settings location [\#3755](https://github.com/pypeclub/OpenPype/pull/3755)
- General: Move hostdirname functionality into host [\#3749](https://github.com/pypeclub/OpenPype/pull/3749)
- General: Move publish utils to pipeline [\#3745](https://github.com/pypeclub/OpenPype/pull/3745)
- Photoshop: Defined photoshop as addon [\#3736](https://github.com/pypeclub/OpenPype/pull/3736)
- Houdini: Define houdini as addon [\#3735](https://github.com/pypeclub/OpenPype/pull/3735)
- Fusion: Defined fusion as addon [\#3733](https://github.com/pypeclub/OpenPype/pull/3733)
- Flame: Defined flame as addon [\#3732](https://github.com/pypeclub/OpenPype/pull/3732)
- Blender: Define blender as module [\#3729](https://github.com/pypeclub/OpenPype/pull/3729)
- Resolve: Define resolve as addon [\#3727](https://github.com/pypeclub/OpenPype/pull/3727)
**Merged pull requests:**
@ -65,7 +68,6 @@
**🚀 Enhancements**
- General: Thumbnail can use project roots [\#3750](https://github.com/pypeclub/OpenPype/pull/3750)
- git: update gitignore [\#3722](https://github.com/pypeclub/OpenPype/pull/3722)
- Settings: Remove settings lock on tray exit [\#3720](https://github.com/pypeclub/OpenPype/pull/3720)
- General: Added helper getters to modules manager [\#3712](https://github.com/pypeclub/OpenPype/pull/3712)
- Unreal: Define unreal as module and use host class [\#3701](https://github.com/pypeclub/OpenPype/pull/3701)
@ -84,18 +86,17 @@
- Settings: Fix project overrides save [\#3708](https://github.com/pypeclub/OpenPype/pull/3708)
- Workfiles tool: Fix published workfile filtering [\#3704](https://github.com/pypeclub/OpenPype/pull/3704)
- PS, AE: Provide default variant value for workfile subset [\#3703](https://github.com/pypeclub/OpenPype/pull/3703)
- Webpublisher: added check for empty context [\#3682](https://github.com/pypeclub/OpenPype/pull/3682)
- Flame: retime is working on clip publishing [\#3684](https://github.com/pypeclub/OpenPype/pull/3684)
**🔀 Refactored code**
- General: Move delivery logic to pipeline [\#3751](https://github.com/pypeclub/OpenPype/pull/3751)
- General: Move publish utils to pipeline [\#3745](https://github.com/pypeclub/OpenPype/pull/3745)
- General: Host addons cleanup [\#3744](https://github.com/pypeclub/OpenPype/pull/3744)
- Webpublisher: Webpublisher is used as addon [\#3740](https://github.com/pypeclub/OpenPype/pull/3740)
- Photoshop: Defined photoshop as addon [\#3736](https://github.com/pypeclub/OpenPype/pull/3736)
- Harmony: Defined harmony as addon [\#3734](https://github.com/pypeclub/OpenPype/pull/3734)
- General: Module interfaces cleanup [\#3731](https://github.com/pypeclub/OpenPype/pull/3731)
- AfterEffects: Move AE functions from general lib [\#3730](https://github.com/pypeclub/OpenPype/pull/3730)
- Blender: Define blender as module [\#3729](https://github.com/pypeclub/OpenPype/pull/3729)
- AfterEffects: Define AfterEffects as module [\#3728](https://github.com/pypeclub/OpenPype/pull/3728)
- General: Replace PypeLogger with Logger [\#3725](https://github.com/pypeclub/OpenPype/pull/3725)
- Nuke: Define nuke as module [\#3724](https://github.com/pypeclub/OpenPype/pull/3724)
@ -120,12 +121,10 @@
**🚀 Enhancements**
- Ftrack: Additional component metadata [\#3685](https://github.com/pypeclub/OpenPype/pull/3685)
- Ftrack: Set task status on farm publishing [\#3680](https://github.com/pypeclub/OpenPype/pull/3680)
**🐛 Bug fixes**
- General: Switch from hero version to versioned works [\#3691](https://github.com/pypeclub/OpenPype/pull/3691)
- Flame: retime is working on clip publishing [\#3684](https://github.com/pypeclub/OpenPype/pull/3684)
## [3.13.0](https://github.com/pypeclub/OpenPype/tree/3.13.0) (2022-08-09)


@ -32,7 +32,7 @@ from openpype.pipeline import (
)
from openpype.pipeline.load import any_outdated_containers
from openpype.hosts.maya import MAYA_ROOT_DIR
from openpype.hosts.maya.lib import copy_workspace_mel
from openpype.hosts.maya.lib import create_workspace_mel
from . import menu, lib
from .workfile_template_builder import MayaPlaceholderLoadPlugin
@ -64,7 +64,7 @@ class MayaHost(HostBase, IWorkfileHost, ILoadHost):
self._op_events = {}
def install(self):
project_name = os.getenv("AVALON_PROJECT")
project_name = legacy_io.active_project()
project_settings = get_project_settings(project_name)
# process path mapping
dirmap_processor = MayaDirmap("maya", project_name, project_settings)
@ -539,7 +539,7 @@ def on_task_changed():
lib.update_content_on_context_change()
msg = " project: {}\n asset: {}\n task:{}".format(
legacy_io.Session["AVALON_PROJECT"],
legacy_io.active_project(),
legacy_io.Session["AVALON_ASSET"],
legacy_io.Session["AVALON_TASK"]
)
@ -551,9 +551,10 @@ def on_task_changed():
def before_workfile_save(event):
project_name = legacy_io.active_project()
workdir_path = event["workdir_path"]
if workdir_path:
copy_workspace_mel(workdir_path)
create_workspace_mel(workdir_path, project_name)
class MayaDirmap(HostDirmap):


@ -1,5 +1,5 @@
from openpype.lib import PreLaunchHook
from openpype.hosts.maya.lib import copy_workspace_mel
from openpype.hosts.maya.lib import create_workspace_mel
class PreCopyMel(PreLaunchHook):
@ -10,9 +10,10 @@ class PreCopyMel(PreLaunchHook):
app_groups = ["maya"]
def execute(self):
project_name = self.launch_context.env.get("AVALON_PROJECT")
workdir = self.launch_context.env.get("AVALON_WORKDIR")
if not workdir:
self.log.warning("BUG: Workdir is not filled.")
return
copy_workspace_mel(workdir)
create_workspace_mel(workdir, project_name)


@ -1,26 +1,24 @@
import os
import shutil
from openpype.settings import get_project_settings
from openpype.lib import Logger
def copy_workspace_mel(workdir):
# Check that source mel exists
current_dir = os.path.dirname(os.path.abspath(__file__))
src_filepath = os.path.join(current_dir, "resources", "workspace.mel")
if not os.path.exists(src_filepath):
print("Source mel file does not exist. {}".format(src_filepath))
return
# Skip if workspace.mel already exists
def create_workspace_mel(workdir, project_name):
dst_filepath = os.path.join(workdir, "workspace.mel")
if os.path.exists(dst_filepath):
return
# Create workdir if it does not exist yet
if not os.path.exists(workdir):
os.makedirs(workdir)
# Copy file
print("Copying workspace mel \"{}\" -> \"{}\"".format(
src_filepath, dst_filepath
))
shutil.copy(src_filepath, dst_filepath)
project_setting = get_project_settings(project_name)
mel_script = project_setting["maya"].get("mel_workspace")
# Skip if mel script in settings is empty
if not mel_script:
log = Logger.get_logger("create_workspace_mel")
log.debug("File 'workspace.mel' not created. Settings value is empty.")
return
with open(dst_filepath, "w") as mel_file:
mel_file.write(mel_script)
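The rewritten helper no longer copies a bundled `workspace.mel`; it writes whatever MEL script the project settings carry, skipping empty settings and existing files. A standalone sketch of that logic, with the settings lookup replaced by a plain `mel_script` argument (the OpenPype settings and logger calls are stubbed out here):

```python
import os
import tempfile


def create_workspace_mel(workdir, mel_script):
    """Write workspace.mel into workdir; return True when a file was written."""
    dst_filepath = os.path.join(workdir, "workspace.mel")
    if os.path.exists(dst_filepath):
        # Keep an existing workspace.mel untouched, as the real helper does.
        return False
    if not mel_script:
        # Empty settings value: nothing to write.
        return False
    if not os.path.exists(workdir):
        os.makedirs(workdir)
    with open(dst_filepath, "w") as mel_file:
        mel_file.write(mel_script)
    return True


workdir = os.path.join(tempfile.mkdtemp(), "work")
written = create_workspace_mel(workdir, 'workspace -fr "images" "renders";')
```

Unlike the old `copy_workspace_mel`, there is no bundled resource file involved at all, which is why the `resources/workspace.mel` file is deleted later in this commit.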


@ -293,6 +293,7 @@ class CollectMayaRender(pyblish.api.ContextPlugin):
"source": filepath,
"expectedFiles": full_exp_files,
"publishRenderMetadataFolder": common_publish_meta_path,
"renderProducts": layer_render_products,
"resolutionWidth": lib.get_attr_in_layer(
"defaultResolution.width", layer=layer_name
),
@ -359,7 +360,6 @@ class CollectMayaRender(pyblish.api.ContextPlugin):
instance.data["label"] = label
instance.data["farm"] = True
instance.data.update(data)
self.log.debug("data: {}".format(json.dumps(data, indent=4)))
def parse_options(self, render_globals):
"""Get all overrides with a value, skip those without.


@ -1,11 +0,0 @@
//Maya 2018 Project Definition
workspace -fr "shaders" "renderData/shaders";
workspace -fr "alembicCache" "cache/alembic";
workspace -fr "mayaAscii" "";
workspace -fr "mayaBinary" "";
workspace -fr "renderData" "renderData";
workspace -fr "fileCache" "cache/nCache";
workspace -fr "scene" "";
workspace -fr "sourceImages" "sourceimages";
workspace -fr "images" "renders";


@ -9,6 +9,7 @@ import os
from abc import abstractmethod
import platform
import getpass
from functools import partial
from collections import OrderedDict
import six
@ -66,6 +67,96 @@ def requests_get(*args, **kwargs):
return requests.get(*args, **kwargs)
class DeadlineKeyValueVar(dict):
"""
Serializes dictionary key values as "{key}={value}" like Deadline uses
for EnvironmentKeyValue.
As an example:
EnvironmentKeyValue0="A_KEY=VALUE_A"
EnvironmentKeyValue1="OTHER_KEY=VALUE_B"
The keys are serialized in alphabetical order (sorted).
Example:
>>> var = DeadlineKeyValueVar("EnvironmentKeyValue")
>>> var["my_var"] = "hello"
>>> var["my_other_var"] = "hello2"
>>> var.serialize()
"""
def __init__(self, key):
super(DeadlineKeyValueVar, self).__init__()
self.__key = key
def serialize(self):
key = self.__key
# Allow custom location for index in serialized string
if "{}" not in key:
key = key + "{}"
return {
key.format(index): "{}={}".format(var_key, var_value)
for index, (var_key, var_value) in enumerate(sorted(self.items()))
}
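The docstring's example stops short of showing what `serialize()` returns. With the class body copied from the hunk above, the items come out in sorted-key order with sequential indices:

```python
class DeadlineKeyValueVar(dict):
    """Serializes items as Deadline "KeyValue" entries, e.g. EnvironmentKeyValue0."""

    def __init__(self, key):
        super(DeadlineKeyValueVar, self).__init__()
        self.__key = key

    def serialize(self):
        key = self.__key
        # Allow custom location for index in serialized string
        if "{}" not in key:
            key = key + "{}"
        return {
            key.format(index): "{}={}".format(var_key, var_value)
            for index, (var_key, var_value) in enumerate(sorted(self.items()))
        }


var = DeadlineKeyValueVar("EnvironmentKeyValue")
var["my_var"] = "hello"
var["my_other_var"] = "hello2"
result = var.serialize()
# -> {'EnvironmentKeyValue0': 'my_other_var=hello2',
#     'EnvironmentKeyValue1': 'my_var=hello'}
```

Note that the indices follow the sorted key order, not insertion order, so the serialized job info is deterministic regardless of how the dict was filled.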
class DeadlineIndexedVar(dict):
"""
Allows setting and querying values by integer indices:
Query: var[1] or var.get(1)
Set: var[1] = "my_value"
Append: var += "value"
Note: Iterating the instance is not guaranteed to follow the order of the
indices. To iterate in order, use `sorted()`.
"""
def __init__(self, key):
super(DeadlineIndexedVar, self).__init__()
self.__key = key
def serialize(self):
key = self.__key
# Allow custom location for index in serialized string
if "{}" not in key:
key = key + "{}"
return {
key.format(index): value for index, value in sorted(self.items())
}
def next_available_index(self):
# Add as first unused entry
i = 0
while i in self.keys():
i += 1
return i
def update(self, data):
# Force the integer key check
for key, value in data.items():
self.__setitem__(key, value)
def __iadd__(self, other):
index = self.next_available_index()
self[index] = other
return self
def __setitem__(self, key, value):
if not isinstance(key, int):
raise TypeError("Key must be an integer: {}".format(key))
if key < 0:
raise ValueError("Negative index can't be set: {}".format(key))
dict.__setitem__(self, key, value)
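The indexed variant supports `+=` appending and the `{}` placeholder for keys where the index sits mid-string (as `OutputFilename{}Tile` does). A quick check with the class body copied from the hunk above:

```python
class DeadlineIndexedVar(dict):
    """Serializes items as indexed Deadline entries, e.g. OutputFilename0."""

    def __init__(self, key):
        super(DeadlineIndexedVar, self).__init__()
        self.__key = key

    def serialize(self):
        key = self.__key
        # Allow custom location for index in serialized string
        if "{}" not in key:
            key = key + "{}"
        return {key.format(index): value for index, value in sorted(self.items())}

    def next_available_index(self):
        # Add as first unused entry
        i = 0
        while i in self.keys():
            i += 1
        return i

    def __iadd__(self, other):
        self[self.next_available_index()] = other
        return self

    def __setitem__(self, key, value):
        if not isinstance(key, int):
            raise TypeError("Key must be an integer: {}".format(key))
        if key < 0:
            raise ValueError("Negative index can't be set: {}".format(key))
        dict.__setitem__(self, key, value)


out = DeadlineIndexedVar("OutputFilename")
out += "beauty.exr"   # appended at index 0
out += "depth.exr"    # appended at index 1

tiles = DeadlineIndexedVar("OutputFilename{}Tile")
tiles[0] = "tile_a.exr"  # index lands mid-key: OutputFilename0Tile
```

This is what lets the AfterEffects and Harmony submitters later in this commit switch from property-setter appends to plain `job_info.OutputFilename += ...`.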
@attr.s
class DeadlineJobInfo(object):
"""Mapping of all Deadline *JobInfo* attributes.
@ -218,24 +309,8 @@ class DeadlineJobInfo(object):
# Environment
# ----------------------------------------------
_environmentKeyValue = attr.ib(factory=list)
@property
def EnvironmentKeyValue(self): # noqa: N802
"""Return all environment key values formatted for Deadline.
Returns:
dict: as `{'EnvironmentKeyValue0', 'key=value'}`
"""
out = {}
for index, v in enumerate(self._environmentKeyValue):
out["EnvironmentKeyValue{}".format(index)] = v
return out
@EnvironmentKeyValue.setter
def EnvironmentKeyValue(self, val): # noqa: N802
self._environmentKeyValue.append(val)
EnvironmentKeyValue = attr.ib(factory=partial(DeadlineKeyValueVar,
"EnvironmentKeyValue"))
IncludeEnvironment = attr.ib(default=None) # Default: false
UseJobEnvironmentOnly = attr.ib(default=None) # Default: false
@ -243,121 +318,29 @@ class DeadlineJobInfo(object):
# Job Extra Info
# ----------------------------------------------
_extraInfos = attr.ib(factory=list)
_extraInfoKeyValues = attr.ib(factory=list)
@property
def ExtraInfo(self): # noqa: N802
"""Return all ExtraInfo values formatted for Deadline.
Returns:
dict: as `{'ExtraInfo0': 'value'}`
"""
out = {}
for index, v in enumerate(self._extraInfos):
out["ExtraInfo{}".format(index)] = v
return out
@ExtraInfo.setter
def ExtraInfo(self, val): # noqa: N802
self._extraInfos.append(val)
@property
def ExtraInfoKeyValue(self): # noqa: N802
"""Return all ExtraInfoKeyValue values formatted for Deadline.
Returns:
dict: as {'ExtraInfoKeyValue0': 'key=value'}`
"""
out = {}
for index, v in enumerate(self._extraInfoKeyValues):
out["ExtraInfoKeyValue{}".format(index)] = v
return out
@ExtraInfoKeyValue.setter
def ExtraInfoKeyValue(self, val): # noqa: N802
self._extraInfoKeyValues.append(val)
ExtraInfo = attr.ib(factory=partial(DeadlineIndexedVar, "ExtraInfo"))
ExtraInfoKeyValue = attr.ib(factory=partial(DeadlineKeyValueVar,
"ExtraInfoKeyValue"))
# Task Extra Info Names
# ----------------------------------------------
OverrideTaskExtraInfoNames = attr.ib(default=None) # Default: false
_taskExtraInfos = attr.ib(factory=list)
@property
def TaskExtraInfoName(self): # noqa: N802
"""Return all TaskExtraInfoName values formatted for Deadline.
Returns:
dict: as `{'TaskExtraInfoName0': 'value'}`
"""
out = {}
for index, v in enumerate(self._taskExtraInfos):
out["TaskExtraInfoName{}".format(index)] = v
return out
@TaskExtraInfoName.setter
def TaskExtraInfoName(self, val): # noqa: N802
self._taskExtraInfos.append(val)
TaskExtraInfoName = attr.ib(factory=partial(DeadlineIndexedVar,
"TaskExtraInfoName"))
# Output
# ----------------------------------------------
_outputFilename = attr.ib(factory=list)
_outputFilenameTile = attr.ib(factory=list)
_outputDirectory = attr.ib(factory=list)
OutputFilename = attr.ib(factory=partial(DeadlineIndexedVar,
"OutputFilename"))
OutputFilenameTile = attr.ib(factory=partial(DeadlineIndexedVar,
"OutputFilename{}Tile"))
OutputDirectory = attr.ib(factory=partial(DeadlineIndexedVar,
"OutputDirectory"))
@property
def OutputFilename(self): # noqa: N802
"""Return all OutputFilename values formatted for Deadline.
Returns:
dict: as `{'OutputFilename0': 'filename'}`
"""
out = {}
for index, v in enumerate(self._outputFilename):
out["OutputFilename{}".format(index)] = v
return out
@OutputFilename.setter
def OutputFilename(self, val): # noqa: N802
self._outputFilename.append(val)
@property
def OutputFilenameTile(self): # noqa: N802
"""Return all OutputFilename#Tile values formatted for Deadline.
Returns:
dict: as `{'OutputFilenme#Tile': 'tile'}`
"""
out = {}
for index, v in enumerate(self._outputFilenameTile):
out["OutputFilename{}Tile".format(index)] = v
return out
@OutputFilenameTile.setter
def OutputFilenameTile(self, val): # noqa: N802
self._outputFilenameTile.append(val)
@property
def OutputDirectory(self): # noqa: N802
"""Return all OutputDirectory values formatted for Deadline.
Returns:
dict: as `{'OutputDirectory0': 'dir'}`
"""
out = {}
for index, v in enumerate(self._outputDirectory):
out["OutputDirectory{}".format(index)] = v
return out
@OutputDirectory.setter
def OutputDirectory(self, val): # noqa: N802
self._outputDirectory.append(val)
# Asset Dependency
# ----------------------------------------------
AssetDependency = attr.ib(factory=partial(DeadlineIndexedVar,
"AssetDependency"))
# Tile Job
# ----------------------------------------------
@ -381,7 +364,7 @@ class DeadlineJobInfo(object):
"""
def filter_data(a, v):
if a.name.startswith("_"):
if isinstance(v, (DeadlineIndexedVar, DeadlineKeyValueVar)):
return False
if v is None:
return False
@ -389,15 +372,27 @@ class DeadlineJobInfo(object):
serialized = attr.asdict(
self, dict_factory=OrderedDict, filter=filter_data)
serialized.update(self.EnvironmentKeyValue)
serialized.update(self.ExtraInfo)
serialized.update(self.ExtraInfoKeyValue)
serialized.update(self.TaskExtraInfoName)
serialized.update(self.OutputFilename)
serialized.update(self.OutputFilenameTile)
serialized.update(self.OutputDirectory)
# Custom serialize these attributes
for attribute in [
self.EnvironmentKeyValue,
self.ExtraInfo,
self.ExtraInfoKeyValue,
self.TaskExtraInfoName,
self.OutputFilename,
self.OutputFilenameTile,
self.OutputDirectory,
self.AssetDependency
]:
serialized.update(attribute.serialize())
return serialized
def update(self, data):
"""Update instance with data dict"""
for key, value in data.items():
setattr(self, key, value)
@six.add_metaclass(AbstractMetaInstancePlugin)
class AbstractSubmitDeadline(pyblish.api.InstancePlugin):
@ -521,68 +516,72 @@ class AbstractSubmitDeadline(pyblish.api.InstancePlugin):
published.
"""
anatomy = self._instance.context.data['anatomy']
file_path = None
for i in self._instance.context:
if "workfile" in i.data["families"] \
or i.data["family"] == "workfile":
# test if there is instance of workfile waiting
# to be published.
assert i.data["publish"] is True, (
"Workfile (scene) must be published along")
# determine published path from Anatomy.
template_data = i.data.get("anatomyData")
rep = i.data.get("representations")[0].get("ext")
template_data["representation"] = rep
template_data["ext"] = rep
template_data["comment"] = None
anatomy_filled = anatomy.format(template_data)
template_filled = anatomy_filled["publish"]["path"]
file_path = os.path.normpath(template_filled)
self.log.info("Using published scene for render {}".format(
file_path))
instance = self._instance
workfile_instance = self._get_workfile_instance(instance.context)
if workfile_instance is None:
return
if not os.path.exists(file_path):
self.log.error("published scene does not exist!")
raise
# determine published path from Anatomy.
template_data = workfile_instance.data.get("anatomyData")
rep = workfile_instance.data.get("representations")[0]
template_data["representation"] = rep.get("name")
template_data["ext"] = rep.get("ext")
template_data["comment"] = None
if not replace_in_path:
return file_path
anatomy = instance.context.data['anatomy']
anatomy_filled = anatomy.format(template_data)
template_filled = anatomy_filled["publish"]["path"]
file_path = os.path.normpath(template_filled)
# now we need to switch scene in expected files
# because <scene> token will now point to published
# scene file and that might differ from current one
new_scene = os.path.splitext(
os.path.basename(file_path))[0]
orig_scene = os.path.splitext(
os.path.basename(
self._instance.context.data["currentFile"]))[0]
exp = self._instance.data.get("expectedFiles")
self.log.info("Using published scene for render {}".format(file_path))
if isinstance(exp[0], dict):
# we have aovs and we need to iterate over them
new_exp = {}
for aov, files in exp[0].items():
replaced_files = []
for f in files:
replaced_files.append(
str(f).replace(orig_scene, new_scene)
)
new_exp[aov] = replaced_files
# [] might be too much here, TODO
self._instance.data["expectedFiles"] = [new_exp]
else:
new_exp = []
for f in exp:
new_exp.append(
str(f).replace(orig_scene, new_scene)
)
self._instance.data["expectedFiles"] = new_exp
if not os.path.exists(file_path):
self.log.error("published scene does not exist!")
raise
self.log.info("Scene name was switched {} -> {}".format(
orig_scene, new_scene
))
if not replace_in_path:
return file_path
# now we need to switch scene in expected files
# because <scene> token will now point to published
# scene file and that might differ from current one
def _clean_name(path):
return os.path.splitext(os.path.basename(path))[0]
new_scene = _clean_name(file_path)
orig_scene = _clean_name(instance.context.data["currentFile"])
expected_files = instance.data.get("expectedFiles")
if isinstance(expected_files[0], dict):
# we have aovs and we need to iterate over them
new_exp = {}
for aov, files in expected_files[0].items():
replaced_files = []
for f in files:
replaced_files.append(
str(f).replace(orig_scene, new_scene)
)
new_exp[aov] = replaced_files
# [] might be too much here, TODO
instance.data["expectedFiles"] = [new_exp]
else:
new_exp = []
for f in expected_files:
new_exp.append(
str(f).replace(orig_scene, new_scene)
)
instance.data["expectedFiles"] = new_exp
metadata_folder = instance.data.get("publishRenderMetadataFolder")
if metadata_folder:
metadata_folder = metadata_folder.replace(orig_scene,
new_scene)
instance.data["publishRenderMetadataFolder"] = metadata_folder
self.log.info("Scene name was switched {} -> {}".format(
orig_scene, new_scene
))
return file_path
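The scene-name swap above handles two shapes of `expectedFiles`: a flat list of paths, or a one-element list holding an AOV-to-paths dict. The replacement logic can be exercised in isolation; the function name below is hypothetical, not part of the plugin:

```python
import os


def _clean_name(path):
    return os.path.splitext(os.path.basename(path))[0]


def replace_scene_in_expected_files(expected_files, current_file, published_file):
    # Mirrors the plugin: rewrite the scene-name token in every expected
    # output path so it matches the published workfile, not the local one.
    new_scene = _clean_name(published_file)
    orig_scene = _clean_name(current_file)
    if isinstance(expected_files[0], dict):
        # AOV case: dict of aov name -> list of file paths
        new_exp = {
            aov: [str(f).replace(orig_scene, new_scene) for f in files]
            for aov, files in expected_files[0].items()
        }
        return [new_exp]
    return [str(f).replace(orig_scene, new_scene) for f in expected_files]
```

The same `orig_scene -> new_scene` replacement is also applied to `publishRenderMetadataFolder`, which is why that key was added to the Maya render collector earlier in this commit.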
@ -645,3 +644,22 @@ class AbstractSubmitDeadline(pyblish.api.InstancePlugin):
self._instance.data["deadlineSubmissionJob"] = result
return result["_id"]
@staticmethod
def _get_workfile_instance(context):
"""Find workfile instance in context"""
for i in context:
is_workfile = (
"workfile" in i.data.get("families", []) or
i.data["family"] == "workfile"
)
if not is_workfile:
continue
# test if there is instance of workfile waiting
# to be published.
assert i.data["publish"] is True, (
"Workfile (scene) must be published along")
return i


@ -13,7 +13,7 @@ class CollectDeadlineServerFromInstance(pyblish.api.InstancePlugin):
order = pyblish.api.CollectorOrder + 0.415
label = "Deadline Webservice from the Instance"
families = ["rendering"]
families = ["rendering", "renderlayer"]
def process(self, instance):
instance.data["deadlineUrl"] = self._collect_deadline_url(instance)


@ -67,9 +67,9 @@ class AfterEffectsSubmitDeadline(
dln_job_info.Group = self.group
dln_job_info.Department = self.department
dln_job_info.ChunkSize = self.chunk_size
dln_job_info.OutputFilename = \
dln_job_info.OutputFilename += \
os.path.basename(self._instance.data["expectedFiles"][0])
dln_job_info.OutputDirectory = \
dln_job_info.OutputDirectory += \
os.path.dirname(self._instance.data["expectedFiles"][0])
dln_job_info.JobDelay = "00:00:00"
@ -92,13 +92,12 @@ class AfterEffectsSubmitDeadline(
environment = dict({key: os.environ[key] for key in keys
if key in os.environ}, **legacy_io.Session)
for key in keys:
val = environment.get(key)
if val:
dln_job_info.EnvironmentKeyValue = "{key}={value}".format(
key=key,
value=val)
value = environment.get(key)
if value:
dln_job_info.EnvironmentKeyValue[key] = value
# to recognize job from PYPE for turning Event On/Off
dln_job_info.EnvironmentKeyValue = "OPENPYPE_RENDER_JOB=1"
dln_job_info.EnvironmentKeyValue["OPENPYPE_RENDER_JOB"] = "1"
return dln_job_info


@ -284,14 +284,12 @@ class HarmonySubmitDeadline(
environment = dict({key: os.environ[key] for key in keys
if key in os.environ}, **legacy_io.Session)
for key in keys:
val = environment.get(key)
if val:
job_info.EnvironmentKeyValue = "{key}={value}".format(
key=key,
value=val)
value = environment.get(key)
if value:
job_info.EnvironmentKeyValue[key] = value
# to recognize job from PYPE for turning Event On/Off
job_info.EnvironmentKeyValue = "OPENPYPE_RENDER_JOB=1"
job_info.EnvironmentKeyValue["OPENPYPE_RENDER_JOB"] = "1"
return job_info


@ -1,9 +1,13 @@
from .ftrack_module import (
FtrackModule,
FTRACK_MODULE_DIR
FTRACK_MODULE_DIR,
resolve_ftrack_url,
)
__all__ = (
"FtrackModule",
"FTRACK_MODULE_DIR"
"FTRACK_MODULE_DIR",
"resolve_ftrack_url",
)


@ -6,14 +6,16 @@ import platform
import click
from openpype.modules import OpenPypeModule
from openpype_interfaces import (
from openpype.modules.interfaces import (
ITrayModule,
IPluginPaths,
ISettingsChangeListener
)
from openpype.settings import SaveWarningExc
from openpype.lib import Logger
FTRACK_MODULE_DIR = os.path.dirname(os.path.abspath(__file__))
_URL_NOT_SET = object()
class FtrackModule(
@ -28,17 +30,8 @@ class FtrackModule(
ftrack_settings = settings[self.name]
self.enabled = ftrack_settings["enabled"]
# Add http schema
ftrack_url = ftrack_settings["ftrack_server"].strip("/ ")
if ftrack_url:
if "http" not in ftrack_url:
ftrack_url = "https://" + ftrack_url
# Check if "ftrack.app" is part os url
if "ftrackapp.com" not in ftrack_url:
ftrack_url = ftrack_url + ".ftrackapp.com"
self.ftrack_url = ftrack_url
self._settings_ftrack_url = ftrack_settings["ftrack_server"]
self._ftrack_url = _URL_NOT_SET
current_dir = os.path.dirname(os.path.abspath(__file__))
low_platform = platform.system().lower()
@ -70,6 +63,16 @@ class FtrackModule(
self.timers_manager_connector = None
self._timers_manager_module = None
def get_ftrack_url(self):
if self._ftrack_url is _URL_NOT_SET:
self._ftrack_url = resolve_ftrack_url(
self._settings_ftrack_url,
logger=self.log
)
return self._ftrack_url
ftrack_url = property(get_ftrack_url)
def get_global_environments(self):
"""Ftrack's global environments."""
return {
@ -479,6 +482,51 @@ class FtrackModule(
click_group.add_command(cli_main)
def _check_ftrack_url(url):
import requests
try:
result = requests.get(url, allow_redirects=False)
except requests.exceptions.RequestException:
return False
if (result.status_code != 200 or "FTRACK_VERSION" not in result.headers):
return False
return True
def resolve_ftrack_url(url, logger=None):
"""Checks if Ftrack server is responding."""
if logger is None:
logger = Logger.get_logger(__name__)
url = url.strip("/ ")
if not url:
logger.error("Ftrack URL is not set!")
return None
if not url.startswith("http"):
url = "https://" + url
ftrack_url = None
if not url.endswith("ftrackapp.com"):
ftrackapp_url = url + ".ftrackapp.com"
if _check_ftrack_url(ftrackapp_url):
ftrack_url = ftrackapp_url
if not ftrack_url and _check_ftrack_url(url):
ftrack_url = url
if ftrack_url:
logger.debug("Ftrack server \"{}\" is accessible.".format(ftrack_url))
else:
logger.error("Ftrack server \"{}\" is not accessible!".format(url))
return ftrack_url
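`resolve_ftrack_url` tries the `.ftrackapp.com` variant before falling back to the raw URL. The candidate-building part can be separated from the network probe; the helper below is a sketch (not in the module), with `_check_ftrack_url` left out:

```python
def ftrack_url_candidates(url):
    # Mirrors resolve_ftrack_url's normalization: strip slashes and spaces,
    # force an https scheme, then try the ".ftrackapp.com" variant first.
    url = url.strip("/ ")
    if not url:
        return []
    if not url.startswith("http"):
        url = "https://" + url
    candidates = []
    if not url.endswith("ftrackapp.com"):
        candidates.append(url + ".ftrackapp.com")
    candidates.append(url)
    return candidates
```

Compared to the deleted `check_ftrack_url`, the scheme test is now `startswith("http")` rather than `"http" not in url`, so a bare hostname that merely contains "http" no longer skips normalization.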
@click.group(FtrackModule.name, help="Ftrack module related commands.")
def cli_main():
pass


@ -1,8 +1,6 @@
from .ftrack_server import FtrackServer
from .lib import check_ftrack_url
__all__ = (
"FtrackServer",
"check_ftrack_url"
)


@ -20,9 +20,11 @@ from openpype.lib import (
get_openpype_version,
get_build_version,
)
from openpype_modules.ftrack import FTRACK_MODULE_DIR
from openpype_modules.ftrack import (
FTRACK_MODULE_DIR,
resolve_ftrack_url,
)
from openpype_modules.ftrack.lib import credentials
from openpype_modules.ftrack.ftrack_server.lib import check_ftrack_url
from openpype_modules.ftrack.ftrack_server import socket_thread
@ -114,7 +116,7 @@ def legacy_server(ftrack_url):
while True:
if not ftrack_accessible:
ftrack_accessible = check_ftrack_url(ftrack_url)
ftrack_accessible = resolve_ftrack_url(ftrack_url)
# Run threads only if Ftrack is accessible
if not ftrack_accessible and not printed_ftrack_error:
@ -257,7 +259,7 @@ def main_loop(ftrack_url):
while True:
# Check if accessible Ftrack and Mongo url
if not ftrack_accessible:
ftrack_accessible = check_ftrack_url(ftrack_url)
ftrack_accessible = resolve_ftrack_url(ftrack_url)
if not mongo_accessible:
mongo_accessible = check_mongo_url(mongo_uri)
@ -441,7 +443,7 @@ def run_event_server(
os.environ["CLOCKIFY_API_KEY"] = clockify_api_key
# Check url regex and accessibility
ftrack_url = check_ftrack_url(ftrack_url)
ftrack_url = resolve_ftrack_url(ftrack_url)
if not ftrack_url:
print('Exiting! < Please enter Ftrack server url >')
return 1


@ -26,45 +26,12 @@ except ImportError:
from openpype_modules.ftrack.lib import get_ftrack_event_mongo_info
from openpype.client import OpenPypeMongoConnection
from openpype.api import Logger
from openpype.lib import Logger
TOPIC_STATUS_SERVER = "openpype.event.server.status"
TOPIC_STATUS_SERVER_RESULT = "openpype.event.server.status.result"
def check_ftrack_url(url, log_errors=True, logger=None):
"""Checks if Ftrack server is responding"""
if logger is None:
logger = Logger.get_logger(__name__)
if not url:
logger.error("Ftrack URL is not set!")
return None
url = url.strip('/ ')
if 'http' not in url:
if url.endswith('ftrackapp.com'):
url = 'https://' + url
else:
url = 'https://{0}.ftrackapp.com'.format(url)
try:
result = requests.get(url, allow_redirects=False)
except requests.exceptions.RequestException:
if log_errors:
logger.error("Entered Ftrack URL is not accesible!")
return False
if (result.status_code != 200 or 'FTRACK_VERSION' not in result.headers):
if log_errors:
logger.error("Entered Ftrack URL is not accesible!")
return False
logger.debug("Ftrack server {} is accessible.".format(url))
return url
class SocketBaseEventHub(ftrack_api.event.hub.EventHub):
hearbeat_msg = b"hearbeat"


@ -19,11 +19,8 @@ from openpype.client.operations import (
CURRENT_PROJECT_SCHEMA,
CURRENT_PROJECT_CONFIG_SCHEMA,
)
from openpype.api import (
Logger,
get_anatomy_settings
)
from openpype.lib import ApplicationManager
from openpype.settings import get_anatomy_settings
from openpype.lib import ApplicationManager, Logger
from openpype.pipeline import AvalonMongoDB, schema
from .constants import CUST_ATTR_ID_KEY, FPS_KEYS


@ -35,7 +35,7 @@ class IntegrateFtrackInstance(pyblish.api.InstancePlugin):
family_mapping = {
"camera": "cam",
"look": "look",
"mayaascii": "scene",
"mayaAscii": "scene",
"model": "geo",
"rig": "rig",
"setdress": "setdress",
@ -74,11 +74,15 @@ class IntegrateFtrackInstance(pyblish.api.InstancePlugin):
version_number = int(instance_version)
family = instance.data["family"]
family_low = family.lower()
# Perform case-insensitive family mapping
family_low = family.lower()
asset_type = instance.data.get("ftrackFamily")
if not asset_type and family_low in self.family_mapping:
asset_type = self.family_mapping[family_low]
if not asset_type:
for map_family, map_value in self.family_mapping.items():
if map_family.lower() == family_low:
asset_type = map_value
break
if not asset_type:
asset_type = "upload"
@ -86,15 +90,6 @@ class IntegrateFtrackInstance(pyblish.api.InstancePlugin):
self.log.debug(
"Family: {}\nMapping: {}".format(family_low, self.family_mapping)
)
# Ignore this instance if neither "ftrackFamily" or a family mapping is
# found.
if not asset_type:
self.log.info((
"Family \"{}\" does not match any asset type mapping"
).format(family))
return
status_name = self._get_asset_version_status_name(instance)
# Base of component item data
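The change above makes the family-to-asset-type lookup case-insensitive and always falls back to `"upload"` instead of skipping the instance. Sketched standalone (the function name is hypothetical, not part of the plugin):

```python
def map_family_to_asset_type(family, family_mapping, ftrack_family=None):
    # Mirrors the new plugin behavior: an explicit "ftrackFamily" wins,
    # then a case-insensitive lookup in the mapping, then "upload".
    if ftrack_family:
        return ftrack_family
    family_low = family.lower()
    for map_family, map_value in family_mapping.items():
        if map_family.lower() == family_low:
            return map_value
    return "upload"
```

This is why the mapping key could be corrected to `"mayaAscii"` in this same hunk: the lookup no longer depends on the mapping keys being pre-lowercased.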


@ -6,22 +6,18 @@ import threading
from Qt import QtCore, QtWidgets, QtGui
import ftrack_api
from ..ftrack_server.lib import check_ftrack_url
from ..ftrack_server import socket_thread
from ..lib import credentials
from ..ftrack_module import FTRACK_MODULE_DIR
from . import login_dialog
from openpype import resources
from openpype.lib import Logger
log = Logger.get_logger("FtrackModule")
from openpype_modules.ftrack import resolve_ftrack_url, FTRACK_MODULE_DIR
from openpype_modules.ftrack.ftrack_server import socket_thread
from openpype_modules.ftrack.lib import credentials
from . import login_dialog
class FtrackTrayWrapper:
def __init__(self, module):
self.module = module
self.log = Logger.get_logger(self.__class__.__name__)
self.thread_action_server = None
self.thread_socket_server = None
@ -62,19 +58,19 @@ class FtrackTrayWrapper:
if validation:
self.widget_login.set_credentials(ft_user, ft_api_key)
self.module.set_credentials_to_env(ft_user, ft_api_key)
log.info("Connected to Ftrack successfully")
self.log.info("Connected to Ftrack successfully")
self.on_login_change()
return validation
if not validation and ft_user and ft_api_key:
log.warning(
self.log.warning(
"Current Ftrack credentials are not valid. {}: {} - {}".format(
str(os.environ.get("FTRACK_SERVER")), ft_user, ft_api_key
)
)
log.info("Please sign in to Ftrack")
self.log.info("Please sign in to Ftrack")
self.bool_logged = False
self.show_login_widget()
self.set_menu_visibility()
@ -104,7 +100,7 @@ class FtrackTrayWrapper:
self.action_credentials.setIcon(self.icon_not_logged)
self.action_credentials.setToolTip("Logged out")
log.info("Logged out of Ftrack")
self.log.info("Logged out of Ftrack")
self.bool_logged = False
self.set_menu_visibility()
@ -126,10 +122,6 @@ class FtrackTrayWrapper:
ftrack_url = self.module.ftrack_url
os.environ["FTRACK_SERVER"] = ftrack_url
parent_file_path = os.path.dirname(
os.path.dirname(os.path.realpath(__file__))
)
min_fail_seconds = 5
max_fail_count = 3
wait_time_after_max_fail = 10
@ -154,17 +146,19 @@ class FtrackTrayWrapper:
# Main loop
while True:
if not self.bool_action_server_running:
log.debug("Action server was pushed to stop.")
self.log.debug("Action server was pushed to stop.")
break
# Check if accessible Ftrack and Mongo url
if not ftrack_accessible:
ftrack_accessible = check_ftrack_url(ftrack_url)
ftrack_accessible = resolve_ftrack_url(ftrack_url)
# Run threads only if Ftrack is accessible
if not ftrack_accessible:
if not printed_ftrack_error:
log.warning("Can't access Ftrack {}".format(ftrack_url))
self.log.warning(
"Can't access Ftrack {}".format(ftrack_url)
)
if self.thread_socket_server is not None:
self.thread_socket_server.stop()
@ -191,7 +185,7 @@ class FtrackTrayWrapper:
self.set_menu_visibility()
elif failed_count == max_fail_count:
log.warning((
self.log.warning((
"Action server failed {} times."
" I'll try to run again {}s later"
).format(
@ -243,10 +237,10 @@ class FtrackTrayWrapper:
self.thread_action_server.join()
self.thread_action_server = None
log.info("Ftrack action server was forced to stop")
self.log.info("Ftrack action server was forced to stop")
except Exception:
log.warning(
self.log.warning(
"Error has happened during Killing action server",
exc_info=True
)
@ -343,7 +337,7 @@ class FtrackTrayWrapper:
self.thread_timer = None
except Exception as e:
log.error("During Killing Timer event server: {0}".format(e))
self.log.error("During Killing Timer event server: {0}".format(e))
def changed_user(self):
self.stop_action_server()


@ -22,6 +22,8 @@ from .publish_plugins import (
)
from .lib import (
get_publish_template_name,
DiscoverResult,
publish_plugins_discover,
load_help_content_from_plugin,
@ -62,6 +64,8 @@ __all__ = (
"Extractor",
"get_publish_template_name",
"DiscoverResult",
"publish_plugins_discover",
"load_help_content_from_plugin",


@ -0,0 +1,2 @@
DEFAULT_PUBLISH_TEMPLATE = "publish"
DEFAULT_HERO_PUBLISH_TEMPLATE = "hero"


@ -2,6 +2,7 @@ import os
import sys
import types
import inspect
import copy
import tempfile
import xml.etree.ElementTree
@ -9,8 +10,190 @@ import six
import pyblish.plugin
import pyblish.api
from openpype.lib import Logger
from openpype.settings import get_project_settings, get_system_settings
from openpype.lib import Logger, filter_profiles
from openpype.settings import (
get_project_settings,
get_system_settings,
)
from .contants import (
DEFAULT_PUBLISH_TEMPLATE,
DEFAULT_HERO_PUBLISH_TEMPLATE,
)
def get_template_name_profiles(
project_name, project_settings=None, logger=None
):
"""Receive profiles for publish template keys.
At least one of the arguments must be passed.
Args:
project_name (str): Name of project where to look for templates.
project_settings (Dict[str, Any]): Prepared project settings.
Returns:
List[Dict[str, Any]]: Publish template profiles.
"""
if not project_name and not project_settings:
raise ValueError((
"Both project name and project settings are missing."
" At least one must be entered."
))
if not project_settings:
project_settings = get_project_settings(project_name)
profiles = (
project_settings
["global"]
["tools"]
["publish"]
["template_name_profiles"]
)
if profiles:
return copy.deepcopy(profiles)
# Use legacy approach for cases when the new settings are not yet filled
# for the project
legacy_profiles = (
project_settings
["global"]
["publish"]
["IntegrateAssetNew"]
["template_name_profiles"]
)
if legacy_profiles:
if not logger:
logger = Logger.get_logger("get_template_name_profiles")
logger.warning((
"Project \"{}\" is using legacy access to publish template."
" It is recommended to move settings to new location"
" 'project_settings/global/tools/publish/template_name_profiles'."
).format(project_name))
# Replace "tasks" key with "task_names"
profiles = []
for profile in copy.deepcopy(legacy_profiles):
profile["task_names"] = profile.pop("tasks", [])
profiles.append(profile)
return profiles
def get_hero_template_name_profiles(
project_name, project_settings=None, logger=None
):
"""Receive profiles for hero publish template keys.
At least one of the arguments must be passed.
Args:
project_name (str): Name of project where to look for templates.
project_settings (Dict[str, Any]): Prepared project settings.
Returns:
List[Dict[str, Any]]: Publish template profiles.
"""
if not project_name and not project_settings:
raise ValueError((
"Both project name and project settings are missing."
" At least one must be entered."
))
if not project_settings:
project_settings = get_project_settings(project_name)
profiles = (
project_settings
["global"]
["tools"]
["publish"]
["hero_template_name_profiles"]
)
if profiles:
return copy.deepcopy(profiles)
# Use legacy approach for cases when the new settings are not yet filled
# for the project
legacy_profiles = copy.deepcopy(
project_settings
["global"]
["publish"]
["IntegrateHeroVersion"]
["template_name_profiles"]
)
if legacy_profiles:
if not logger:
logger = Logger.get_logger("get_hero_template_name_profiles")
logger.warning((
"Project \"{}\" is using legacy access to hero publish template."
" It is recommended to move settings to new location"
" 'project_settings/global/tools/publish/"
"hero_template_name_profiles'."
).format(project_name))
return legacy_profiles
def get_publish_template_name(
project_name,
host_name,
family,
task_name,
task_type,
project_settings=None,
hero=False,
logger=None
):
"""Get template name which should be used for passed context.
Publish templates are filtered by host name, family, task name and
task type.
The default template, used when profiles are not available or the matched
profile has an empty value, is defined by the 'DEFAULT_PUBLISH_TEMPLATE'
constant.
Args:
project_name (str): Name of project where to look for settings.
host_name (str): Name of host integration.
family (str): Family for which the template should be found.
task_name (str): Task name on which the instance is working.
task_type (str): Task type on which the instance is working.
project_settings (Dict[str, Any]): Prepared project settings.
hero (bool): Use hero publish template profiles.
logger (logging.Logger): Custom logger used for 'filter_profiles'
function.
Returns:
str: Template name which should be used for integration.
"""
template = None
filter_criteria = {
"hosts": host_name,
"families": family,
"task_names": task_name,
"task_types": task_type,
}
if hero:
default_template = DEFAULT_HERO_PUBLISH_TEMPLATE
profiles = get_hero_template_name_profiles(
project_name, project_settings, logger
)
else:
profiles = get_template_name_profiles(
project_name, project_settings, logger
)
default_template = DEFAULT_PUBLISH_TEMPLATE
profile = filter_profiles(profiles, filter_criteria, logger=logger)
if profile:
template = profile["template_name"]
return template or default_template
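The selection semantics above can be approximated by a standalone sketch; this is a first-match simplification (OpenPype's actual `filter_profiles` additionally ranks matches by specificity), and the function name is hypothetical:

```python
def pick_template_name(profiles, host_name, family, task_name, task_type,
                       default="publish"):
    # Each profile filters on lists of values; an empty list matches anything.
    criteria = {
        "hosts": host_name,
        "families": family,
        "task_names": task_name,
        "task_types": task_type,
    }
    for profile in profiles:
        if all(
            not profile.get(key) or value in profile[key]
            for key, value in criteria.items()
        ):
            return profile.get("template_name") or default
    # No profile matched: fall back to the default template key.
    return default
```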
class DiscoverResult:


@ -5,6 +5,9 @@ import copy
import clique
import six
from bson.objectid import ObjectId
import pyblish.api
from openpype.client.operations import (
OperationsSession,
new_subset_document,
@ -14,8 +17,6 @@ from openpype.client.operations import (
prepare_version_update_data,
prepare_representation_update_data,
)
from bson.objectid import ObjectId
import pyblish.api
from openpype.client import (
get_representations,
@ -23,10 +24,12 @@ from openpype.client import (
get_version_by_name,
)
from openpype.lib import source_hash
from openpype.lib.profiles_filtering import filter_profiles
from openpype.lib.file_transaction import FileTransaction
from openpype.pipeline import legacy_io
from openpype.pipeline.publish import KnownPublishError
from openpype.pipeline.publish import (
KnownPublishError,
get_publish_template_name,
)
log = logging.getLogger(__name__)
@ -792,52 +795,26 @@ class IntegrateAsset(pyblish.api.InstancePlugin):
def get_template_name(self, instance):
"""Return anatomy template name to use for integration"""
# Define publish template name from profiles
filter_criteria = self.get_profile_filter_criteria(instance)
template_name_profiles = self._get_template_name_profiles(instance)
profile = filter_profiles(
template_name_profiles,
filter_criteria,
logger=self.log
)
if profile:
return profile["template_name"]
return self.default_template_name
def _get_template_name_profiles(self, instance):
"""Receive profiles for publish template keys.
Reuse template name profiles from the legacy integrator. The goal is to
move the profile settings out of plugin settings, but until that happens
we want to be able to set it in one place and not break backwards
compatibility (more than once).
"""
return (
instance.context.data["project_settings"]
["global"]
["publish"]
["IntegrateAssetNew"]
["template_name_profiles"]
)
def get_profile_filter_criteria(self, instance):
"""Return filter criteria for `filter_profiles`"""
# Anatomy data is pre-filled by Collectors
anatomy_data = instance.data["anatomyData"]
project_name = legacy_io.active_project()
# Task can be optional in anatomy data
task = anatomy_data.get("task", {})
host_name = instance.context.data["hostName"]
anatomy_data = instance.data["anatomyData"]
family = anatomy_data["family"]
task_info = anatomy_data.get("task") or {}
# Return filter criteria
return {
"families": anatomy_data["family"],
"tasks": task.get("name"),
"task_types": task.get("type"),
"hosts": instance.context.data["hostName"],
}
return get_publish_template_name(
project_name,
host_name,
family,
task_name=task_info.get("name"),
task_type=task_info.get("type"),
project_settings=instance.context.data["project_settings"],
logger=self.log
)
def get_rootless_path(self, anatomy, path):
"""Returns, if possible, path without absolute portion from root


@ -14,14 +14,12 @@ from openpype.client import (
get_archived_representations,
get_representations,
)
from openpype.lib import (
create_hard_link,
filter_profiles
)
from openpype.lib import create_hard_link
from openpype.pipeline import (
schema,
legacy_io,
)
from openpype.pipeline.publish import get_publish_template_name
class IntegrateHeroVersion(pyblish.api.InstancePlugin):
@ -68,10 +66,11 @@ class IntegrateHeroVersion(pyblish.api.InstancePlugin):
)
return
template_key = self._get_template_key(instance)
anatomy = instance.context.data["anatomy"]
project_name = anatomy.project_name
template_key = self._get_template_key(project_name, instance)
if template_key not in anatomy.templates:
self.log.warning((
"!!! Anatomy of project \"{}\" does not have set"
@ -527,30 +526,24 @@ class IntegrateHeroVersion(pyblish.api.InstancePlugin):
return publish_folder
def _get_template_key(self, instance):
def _get_template_key(self, project_name, instance):
anatomy_data = instance.data["anatomyData"]
task_data = anatomy_data.get("task") or {}
task_name = task_data.get("name")
task_type = task_data.get("type")
task_info = anatomy_data.get("task") or {}
host_name = instance.context.data["hostName"]
# TODO raise error if Hero not set?
family = self.main_family_from_instance(instance)
key_values = {
"families": family,
"task_names": task_name,
"task_types": task_type,
"hosts": host_name
}
profile = filter_profiles(
self.template_name_profiles,
key_values,
return get_publish_template_name(
project_name,
host_name,
family,
task_info.get("name"),
task_info.get("type"),
project_settings=instance.context.data["project_settings"],
hero=True,
logger=self.log
)
if profile:
template_name = profile["template_name"]
else:
template_name = self._default_template_name
return template_name
def main_family_from_instance(self, instance):
"""Returns main family of entered instance."""


@ -15,7 +15,6 @@ from bson.objectid import ObjectId
from pymongo import DeleteOne, InsertOne
import pyblish.api
import openpype.api
from openpype.client import (
get_asset_by_name,
get_subset_by_id,
@ -25,14 +24,17 @@ from openpype.client import (
get_representations,
get_archived_representations,
)
from openpype.lib.profiles_filtering import filter_profiles
from openpype.lib import (
prepare_template_data,
create_hard_link,
StringTemplate,
TemplateUnsolved
TemplateUnsolved,
source_hash,
filter_profiles,
get_local_site_id,
)
from openpype.pipeline import legacy_io
from openpype.pipeline.publish import get_publish_template_name
# this is needed until speedcopy for linux is fixed
if sys.platform == "win32":
@ -138,7 +140,6 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin):
integrated_file_sizes = {}
# Attributes set by settings
template_name_profiles = None
subset_grouping_profiles = None
def process(self, instance):
@ -388,22 +389,16 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin):
family = self.main_family_from_instance(instance)
key_values = {
"families": family,
"tasks": task_name,
"hosts": instance.context.data["hostName"],
"task_types": task_type
}
profile = filter_profiles(
self.template_name_profiles,
key_values,
template_name = get_publish_template_name(
project_name,
instance.context.data["hostName"],
family,
task_name=task_info.get("name"),
task_type=task_info.get("type"),
project_settings=instance.context.data["project_settings"],
logger=self.log
)
template_name = "publish"
if profile:
template_name = profile["template_name"]
published_representations = {}
for idx, repre in enumerate(repres):
published_files = []
@ -1058,7 +1053,7 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin):
for _src, dest in resources:
path = self.get_rootless_path(anatomy, dest)
dest = self.get_dest_temp_url(dest)
file_hash = openpype.api.source_hash(dest)
file_hash = source_hash(dest)
if self.TMP_FILE_EXT and \
',{}'.format(self.TMP_FILE_EXT) in file_hash:
file_hash = file_hash.replace(',{}'.format(self.TMP_FILE_EXT),
@ -1168,7 +1163,7 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin):
def _get_sites(self, sync_project_presets):
"""Returns tuple (local_site, remote_site)"""
local_site_id = openpype.api.get_local_site_id()
local_site_id = get_local_site_id()
local_site = sync_project_presets["config"]. \
get("active_site", "studio").strip()


@ -455,7 +455,7 @@
"family_mapping": {
"camera": "cam",
"look": "look",
"mayaascii": "scene",
"mayaAscii": "scene",
"model": "geo",
"rig": "rig",
"setdress": "setdress",


@ -418,6 +418,10 @@
"filter_families": []
}
]
},
"publish": {
"template_name_profiles": [],
"hero_template_name_profiles": []
}
},
"project_folder_structure": "{\"__project_root__\": {\"prod\": {}, \"resources\": {\"footage\": {\"plates\": {}, \"offline\": {}}, \"audio\": {}, \"art_dept\": {}}, \"editorial\": {}, \"assets\": {\"characters\": {}, \"locations\": {}}, \"shots\": {}}}",


@ -1,4 +1,5 @@
{
"mel_workspace": "workspace -fr \"shaders\" \"renderData/shaders\";\nworkspace -fr \"images\" \"renders\";\nworkspace -fr \"particles\" \"particles\";\nworkspace -fr \"mayaAscii\" \"\";\nworkspace -fr \"mayaBinary\" \"\";\nworkspace -fr \"scene\" \"\";\nworkspace -fr \"alembicCache\" \"cache/alembic\";\nworkspace -fr \"renderData\" \"renderData\";\nworkspace -fr \"sourceImages\" \"sourceimages\";\nworkspace -fr \"fileCache\" \"cache/nCache\";\n",
"ext_mapping": {
"model": "ma",
"mayaAscii": "ma",


@ -5,6 +5,13 @@
"label": "Maya",
"is_file": true,
"children": [
{
"type": "text",
"multiline" : true,
"use_label_wrap": true,
"key": "mel_workspace",
"label": "Maya MEL Workspace"
},
{
"type": "dict-modifiable",
"key": "ext_mapping",


@ -663,10 +663,14 @@
]
}
},
{
"type": "label",
"label": "<b>NOTE:</b> Publish template profile settings were moved to <a href=\"settings://project_settings/global/tools/publish/template_name_profiles\"><b>Tools/Publish/Template name profiles</b></a>. Please move the values there."
},
{
"type": "list",
"key": "template_name_profiles",
"label": "Template name profiles",
"label": "Template name profiles (DEPRECATED)",
"use_label_wrap": true,
"object_type": {
"type": "dict",
@ -771,10 +775,14 @@
"type": "list",
"object_type": "text"
},
{
"type": "label",
"label": "<b>NOTE:</b> Hero publish template profile settings were moved to <a href=\"settings://project_settings/global/tools/publish/hero_template_name_profiles\"><b>Tools/Publish/Hero template name profiles</b></a>. Please move the values there."
},
{
"type": "list",
"key": "template_name_profiles",
"label": "Template name profiles",
"label": "Template name profiles (DEPRECATED)",
"use_label_wrap": true,
"object_type": {
"type": "dict",


@ -284,6 +284,102 @@
}
}
]
},
{
"type": "dict",
"key": "publish",
"label": "Publish",
"children": [
{
"type": "label",
"label": "<b>NOTE:</b> For backwards compatibility the value can be empty, in which case the values from <a href=\"settings://project_settings/global/publish/IntegrateAssetNew\"><b>IntegrateAssetNew</b></a> are used. This will change in the future, so please move all values here as soon as possible."
},
{
"type": "list",
"key": "template_name_profiles",
"label": "Template name profiles",
"use_label_wrap": true,
"object_type": {
"type": "dict",
"children": [
{
"key": "families",
"label": "Families",
"type": "list",
"object_type": "text"
},
{
"type": "hosts-enum",
"key": "hosts",
"label": "Hosts",
"multiselection": true
},
{
"key": "task_types",
"label": "Task types",
"type": "task-types-enum"
},
{
"key": "task_names",
"label": "Task names",
"type": "list",
"object_type": "text"
},
{
"type": "separator"
},
{
"type": "text",
"key": "template_name",
"label": "Template name"
}
]
}
},
{
"type": "list",
"key": "hero_template_name_profiles",
"label": "Hero template name profiles",
"use_label_wrap": true,
"object_type": {
"type": "dict",
"children": [
{
"key": "families",
"label": "Families",
"type": "list",
"object_type": "text"
},
{
"type": "hosts-enum",
"key": "hosts",
"label": "Hosts",
"multiselection": true
},
{
"key": "task_types",
"label": "Task types",
"type": "task-types-enum"
},
{
"key": "task_names",
"label": "Task names",
"type": "list",
"object_type": "text"
},
{
"type": "separator"
},
{
"type": "text",
"key": "template_name",
"label": "Template name",
"tooltip": "Name of template from Anatomy templates"
}
]
}
}
]
}
]
}


@ -32,6 +32,7 @@ class HostToolsHelper:
self._workfiles_tool = None
self._loader_tool = None
self._creator_tool = None
self._publisher_tool = None
self._subset_manager_tool = None
self._scene_inventory_tool = None
self._library_loader_tool = None
@ -193,7 +194,6 @@ class HostToolsHelper:
library_loader_tool.showNormal()
library_loader_tool.refresh()
def show_publish(self, parent=None):
"""Try showing the most desirable publish GUI
@ -269,6 +269,31 @@ class HostToolsHelper:
dialog.activateWindow()
dialog.showNormal()
def get_publisher_tool(self, parent):
"""Create, cache and return publisher window."""
if self._publisher_tool is None:
from openpype.tools.publisher import PublisherWindow
host = registered_host()
ILoadHost.validate_load_methods(host)
publisher_window = PublisherWindow(
parent=parent or self._parent
)
self._publisher_tool = publisher_window
return self._publisher_tool
def show_publisher_tool(self, parent=None):
with qt_app_context():
dialog = self.get_publisher_tool(parent)
dialog.show()
dialog.raise_()
dialog.activateWindow()
dialog.showNormal()
def get_tool_by_name(self, tool_name, parent=None, *args, **kwargs):
"""Show tool by its name.
@ -298,6 +323,10 @@ class HostToolsHelper:
elif tool_name == "publish":
self.log.info("Can't return publish tool window.")
# "new" publisher
elif tool_name == "publisher":
return self.get_publisher_tool(parent, *args, **kwargs)
elif tool_name == "experimental_tools":
return self.get_experimental_tools_dialog(parent, *args, **kwargs)
@ -335,6 +364,9 @@ class HostToolsHelper:
elif tool_name == "publish":
self.show_publish(parent, *args, **kwargs)
elif tool_name == "publisher":
self.show_publisher_tool(parent, *args, **kwargs)
elif tool_name == "experimental_tools":
self.show_experimental_tools_dialog(parent, *args, **kwargs)
@ -414,6 +446,10 @@ def show_publish(parent=None):
_SingletonPoint.show_tool_by_name("publish", parent)
def show_publisher(parent=None):
_SingletonPoint.show_tool_by_name("publisher", parent)
def show_experimental_tools_dialog(parent=None):
_SingletonPoint.show_tool_by_name("experimental_tools", parent)


@ -1,3 +1,3 @@
# -*- coding: utf-8 -*-
"""Package declaring Pype version."""
__version__ = "3.14.2-nightly.4"
__version__ = "3.14.2"