Mirror of https://github.com/ynput/ayon-core.git, synced 2025-12-24 21:04:40 +01:00

Merge branch 'develop' into feature/OP-3733_Ftrack-set-version-status-on-integration

Commit de0e5cf8f9: 68 changed files with 2401 additions and 575 deletions
CHANGELOG.md (19 changes)

@@ -1,12 +1,15 @@
 # Changelog

-## [3.13.1-nightly.2](https://github.com/pypeclub/OpenPype/tree/HEAD)
+## [3.13.1-nightly.3](https://github.com/pypeclub/OpenPype/tree/HEAD)

 [Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.13.0...HEAD)

 **🐛 Bug fixes**

 - General: Extract Review can scale with pixel aspect ratio [\#3644](https://github.com/pypeclub/OpenPype/pull/3644)
 - Maya: Refactor moved usage of CreateRender settings [\#3643](https://github.com/pypeclub/OpenPype/pull/3643)
 - General: Hero version representations have full context [\#3638](https://github.com/pypeclub/OpenPype/pull/3638)
 - Nuke: color settings for render write node is working now [\#3632](https://github.com/pypeclub/OpenPype/pull/3632)
 - Maya: FBX support for update in reference loader [\#3631](https://github.com/pypeclub/OpenPype/pull/3631)

 **🔀 Refactored code**
@@ -16,7 +19,10 @@
 **Merged pull requests:**

 - Deadline: Global job pre load is not Pype 2 compatible [\#3666](https://github.com/pypeclub/OpenPype/pull/3666)
 - Maya: Remove unused get current renderer logic [\#3645](https://github.com/pypeclub/OpenPype/pull/3645)
 - Kitsu|Fix: Movie project type fails & first loop children names [\#3636](https://github.com/pypeclub/OpenPype/pull/3636)
 - fix the bug of failing to extract look when UDIMs format used in AiImage [\#3628](https://github.com/pypeclub/OpenPype/pull/3628)

 ## [3.13.0](https://github.com/pypeclub/OpenPype/tree/3.13.0) (2022-08-09)
@@ -55,7 +61,6 @@
 - General: Update imports in start script [\#3579](https://github.com/pypeclub/OpenPype/pull/3579)
 - Nuke: render family integration consistency [\#3576](https://github.com/pypeclub/OpenPype/pull/3576)
 - Ftrack: Handle missing published path in integrator [\#3570](https://github.com/pypeclub/OpenPype/pull/3570)
 - Maya: fix Review image plane attribute [\#3569](https://github.com/pypeclub/OpenPype/pull/3569)
 - Nuke: publish existing frames with slate with correct range [\#3555](https://github.com/pypeclub/OpenPype/pull/3555)

 **🔀 Refactored code**
@@ -85,11 +90,10 @@
 - General: Global thumbnail extractor is ready for more cases [\#3561](https://github.com/pypeclub/OpenPype/pull/3561)
 - Maya: add additional validators to Settings [\#3540](https://github.com/pypeclub/OpenPype/pull/3540)
 - General: Interactive console in cli [\#3526](https://github.com/pypeclub/OpenPype/pull/3526)
 - Ftrack: Automatic daily review session creation can define trigger hour [\#3516](https://github.com/pypeclub/OpenPype/pull/3516)

 **🐛 Bug fixes**

 - Maya: fix Review image plane attribute [\#3569](https://github.com/pypeclub/OpenPype/pull/3569)
 - Maya: Fix animated attributes \(ie. overscan\) on loaded cameras breaking review publishing. [\#3562](https://github.com/pypeclub/OpenPype/pull/3562)
 - NewPublisher: Python 2 compatible html escape [\#3559](https://github.com/pypeclub/OpenPype/pull/3559)
 - Remove invalid submodules from `/vendor` [\#3557](https://github.com/pypeclub/OpenPype/pull/3557)
@@ -98,12 +102,6 @@
 - Module interfaces: Fix import error [\#3547](https://github.com/pypeclub/OpenPype/pull/3547)
 - Workfiles tool: Show of tool and it's flags [\#3539](https://github.com/pypeclub/OpenPype/pull/3539)
 - General: Create workfile documents works again [\#3538](https://github.com/pypeclub/OpenPype/pull/3538)
 - Additional fixes for powershell scripts [\#3525](https://github.com/pypeclub/OpenPype/pull/3525)
 - Maya: Added wrapper around cmds.setAttr [\#3523](https://github.com/pypeclub/OpenPype/pull/3523)
 - Nuke: double slate [\#3521](https://github.com/pypeclub/OpenPype/pull/3521)
 - General: Fix hash of centos oiio archive [\#3519](https://github.com/pypeclub/OpenPype/pull/3519)
 - Maya: Renderman display output fix [\#3514](https://github.com/pypeclub/OpenPype/pull/3514)
 - TrayPublisher: Simple creation enhancements and fixes [\#3513](https://github.com/pypeclub/OpenPype/pull/3513)

 **🔀 Refactored code**
@@ -112,7 +110,6 @@
 - Refactor Integrate Asset [\#3530](https://github.com/pypeclub/OpenPype/pull/3530)
 - General: Client docstrings cleanup [\#3529](https://github.com/pypeclub/OpenPype/pull/3529)
 - General: Move load related functions into pipeline [\#3527](https://github.com/pypeclub/OpenPype/pull/3527)
 - General: Get current context document functions [\#3522](https://github.com/pypeclub/OpenPype/pull/3522)

 **Merged pull requests:**
@@ -6,6 +6,7 @@ that has project name as a context (e.g. on 'ProjectEntity'?).
+ We will need more specific functions doing very specific queries really fast.
 """

 import re
 import collections

 import six
@@ -1009,17 +1010,70 @@ def get_representation_by_name(
    return conn.find_one(query_filter, _prepare_fields(fields))


def _flatten_dict(data):
    flatten_queue = collections.deque()
    flatten_queue.append(data)
    output = {}
    while flatten_queue:
        item = flatten_queue.popleft()
        for key, value in item.items():
            if not isinstance(value, dict):
                output[key] = value
                continue

            tmp = {}
            for subkey, subvalue in value.items():
                new_key = "{}.{}".format(key, subkey)
                tmp[new_key] = subvalue
            flatten_queue.append(tmp)
    return output


def _regex_filters(filters):
    output = []
    for key, value in filters.items():
        regexes = []
        a_values = []
        if isinstance(value, re.Pattern):
            regexes.append(value)
        elif isinstance(value, (list, tuple, set)):
            for item in value:
                if isinstance(item, re.Pattern):
                    regexes.append(item)
                else:
                    a_values.append(item)
        else:
            a_values.append(value)

        key_filters = []
        if len(a_values) == 1:
            key_filters.append({key: a_values[0]})
        elif a_values:
            key_filters.append({key: {"$in": a_values}})

        for regex in regexes:
            key_filters.append({key: {"$regex": regex}})

        if len(key_filters) == 1:
            output.append(key_filters[0])
        else:
            output.append({"$or": key_filters})

    return output


def _get_representations(
    project_name,
    representation_ids,
    representation_names,
    version_ids,
    extensions,
    context_filters,
    names_by_version_ids,
    standard,
    archived,
    fields
):
    default_output = []
    repre_types = []
    if standard:
        repre_types.append("representation")
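For orientation, the behaviour of the new `_flatten_dict` helper can be exercised standalone (the function is re-typed here for illustration; in the codebase it lives next to `_get_representations`):

```python
import collections


def flatten_dict(data):
    # Breadth-first traversal that rewrites nested dict keys as
    # dot-separated paths, matching MongoDB's dotted field notation.
    flatten_queue = collections.deque()
    flatten_queue.append(data)
    output = {}
    while flatten_queue:
        item = flatten_queue.popleft()
        for key, value in item.items():
            if not isinstance(value, dict):
                output[key] = value
                continue
            tmp = {}
            for subkey, subvalue in value.items():
                tmp["{}.{}".format(key, subkey)] = subvalue
            flatten_queue.append(tmp)
    return output


print(flatten_dict({"asset": "bob", "task": {"name": "modeling", "type": "Modeling"}}))
# {'asset': 'bob', 'task.name': 'modeling', 'task.type': 'Modeling'}
```

The dot-path form is what MongoDB expects when filtering on embedded documents, which is why the context filters below are flattened before being turned into a query.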
@@ -1027,7 +1081,7 @@ def _get_representations(
        repre_types.append("archived_representation")

    if not repre_types:
-        return []
+        return default_output

    if len(repre_types) == 1:
        query_filter = {"type": repre_types[0]}
@@ -1037,25 +1091,21 @@ def _get_representations(
    if representation_ids is not None:
        representation_ids = _convert_ids(representation_ids)
        if not representation_ids:
-            return []
+            return default_output
        query_filter["_id"] = {"$in": representation_ids}

    if representation_names is not None:
        if not representation_names:
-            return []
+            return default_output
        query_filter["name"] = {"$in": list(representation_names)}

    if version_ids is not None:
        version_ids = _convert_ids(version_ids)
        if not version_ids:
-            return []
+            return default_output
        query_filter["parent"] = {"$in": version_ids}

    if extensions is not None:
        if not extensions:
            return []
        query_filter["context.ext"] = {"$in": list(extensions)}

    or_queries = []
    if names_by_version_ids is not None:
        or_query = []
        for version_id, names in names_by_version_ids.items():
@@ -1065,8 +1115,36 @@ def _get_representations(
                "name": {"$in": list(names)}
            })
        if not or_query:
            return default_output
        or_queries.append(or_query)

    if context_filters is not None:
        if not context_filters:
            return []
-        query_filter["$or"] = or_query
        _flatten_filters = _flatten_dict(context_filters)
        flatten_filters = {}
        for key, value in _flatten_filters.items():
            if not key.startswith("context"):
                key = "context.{}".format(key)
            flatten_filters[key] = value

        for item in _regex_filters(flatten_filters):
            for key, value in item.items():
                if key != "$or":
                    query_filter[key] = value
                elif value:
                    or_queries.append(value)

    if len(or_queries) == 1:
        query_filter["$or"] = or_queries[0]
    elif or_queries:
        and_query = []
        for or_query in or_queries:
            if isinstance(or_query, list):
                or_query = {"$or": or_query}
            and_query.append(or_query)
        query_filter["$and"] = and_query

    conn = get_project_connection(project_name)
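The shape of the filters `_regex_filters` emits can be sketched standalone (the function is re-typed here for illustration, independent of the openpype package):

```python
import re


def regex_filters(filters):
    # Split each key's values into plain values (equality / $in) and
    # compiled patterns ($regex); join mixed kinds for one key with $or.
    output = []
    for key, value in filters.items():
        regexes = []
        a_values = []
        if isinstance(value, re.Pattern):
            regexes.append(value)
        elif isinstance(value, (list, tuple, set)):
            for item in value:
                if isinstance(item, re.Pattern):
                    regexes.append(item)
                else:
                    a_values.append(item)
        else:
            a_values.append(value)

        key_filters = []
        if len(a_values) == 1:
            key_filters.append({key: a_values[0]})
        elif a_values:
            key_filters.append({key: {"$in": a_values}})
        for regex in regexes:
            key_filters.append({key: {"$regex": regex}})

        if len(key_filters) == 1:
            output.append(key_filters[0])
        else:
            output.append({"$or": key_filters})
    return output


# A plain value and a pattern on the same key become a $or of two conditions.
print(regex_filters({"context.asset": ["bob", re.compile("^chr_")]}))
```

Each entry in the returned list is one MongoDB condition; `_get_representations` then decides per key whether it can go straight into `query_filter` or must be collected into `or_queries` and combined with `$and`.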
@@ -1078,7 +1156,7 @@ def get_representations(
    representation_ids=None,
    representation_names=None,
    version_ids=None,
    extensions=None,
    context_filters=None,
    names_by_version_ids=None,
    archived=False,
    standard=True,
@@ -1096,8 +1174,8 @@ def get_representations(
            as filter. Filter ignored if 'None' is passed.
        version_ids (Iterable[str]): Subset ids used as parent filter. Filter
            ignored if 'None' is passed.
        extensions (Iterable[str]): Filter by extension of main representation
            file (without dot).
        context_filters (Dict[str, List[str, re.Pattern]]): Filter by
            representation context fields.
        names_by_version_ids (dict[ObjectId, list[str]]): Complex filtering
            using version ids and list of names under the version.
        archived (bool): Output will also contain archived representations.
@@ -1113,7 +1191,7 @@ def get_representations(
        representation_ids=representation_ids,
        representation_names=representation_names,
        version_ids=version_ids,
        extensions=extensions,
        context_filters=context_filters,
        names_by_version_ids=names_by_version_ids,
        standard=True,
        archived=archived,
@@ -1126,7 +1204,7 @@ def get_archived_representations(
    representation_ids=None,
    representation_names=None,
    version_ids=None,
    extensions=None,
    context_filters=None,
    names_by_version_ids=None,
    fields=None
):
@@ -1142,8 +1220,8 @@ def get_archived_representations(
            as filter. Filter ignored if 'None' is passed.
        version_ids (Iterable[str]): Subset ids used as parent filter. Filter
            ignored if 'None' is passed.
        extensions (Iterable[str]): Filter by extension of main representation
            file (without dot).
        context_filters (Dict[str, List[str, re.Pattern]]): Filter by
            representation context fields.
        names_by_version_ids (dict[ObjectId, List[str]]): Complex filtering
            using version ids and list of names under the version.
        fields (Iterable[str]): Fields that should be returned. All fields are
@@ -1158,7 +1236,7 @@ def get_archived_representations(
        representation_ids=representation_ids,
        representation_names=representation_names,
        version_ids=version_ids,
        extensions=extensions,
        context_filters=context_filters,
        names_by_version_ids=names_by_version_ids,
        standard=False,
        archived=True,
@@ -19,8 +19,15 @@ class MissingMethodsError(ValueError):
        joined_missing = ", ".join(
            ['"{}"'.format(item) for item in missing_methods]
        )
+        if isinstance(host, HostBase):
+            host_name = host.name
+        else:
+            try:
+                host_name = host.__file__.replace("\\", "/").split("/")[-3]
+            except Exception:
+                host_name = str(host)
        message = (
-            "Host \"{}\" miss methods {}".format(host.name, joined_missing)
+            "Host \"{}\" miss methods {}".format(host_name, joined_missing)
        )
        super(MissingMethodsError, self).__init__(message)
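The `__file__`-based fallback above picks the third path segment from the end, which for a host-integration layout like `.../hosts/<host>/api/...` resolves to the host folder name. A standalone sketch (the example paths are illustrative, not taken from the diff):

```python
def guess_host_name(module_file):
    # Same heuristic as the fallback branch of MissingMethodsError:
    # normalize separators, then take the third-from-last path segment.
    return module_file.replace("\\", "/").split("/")[-3]


print(guess_host_name("C:\\openpype\\hosts\\maya\\startup\\userSetup.py"))  # maya
print(guess_host_name("/opt/openpype/hosts/nuke/api/lib.py"))  # nuke
```

Note the heuristic only holds when the module sits exactly two directories below the host folder; anything else falls through to `str(host)` via the `except` branch.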
@@ -1,27 +1,6 @@
-import os
+from .module import OpenPypeMaya
-
-
-def add_implementation_envs(env, _app):
-    # Add requirements to PYTHONPATH
-    pype_root = os.environ["OPENPYPE_REPOS_ROOT"]
-    new_python_paths = [
-        os.path.join(pype_root, "openpype", "hosts", "maya", "startup")
-    ]
-    old_python_path = env.get("PYTHONPATH") or ""
-    for path in old_python_path.split(os.pathsep):
-        if not path:
-            continue
-
-        norm_path = os.path.normpath(path)
-        if norm_path not in new_python_paths:
-            new_python_paths.append(norm_path)
-
-    env["PYTHONPATH"] = os.pathsep.join(new_python_paths)
-
-    # Set default values if are not already set via settings
-    defaults = {
-        "OPENPYPE_LOG_NO_COLORS": "Yes"
-    }
-    for key, value in defaults.items():
-        if not env.get(key):
-            env[key] = value
+
+__all__ = (
+    "OpenPypeMaya",
+)
openpype/hosts/maya/api/lib_template_builder.py (new file, 253 lines)

@@ -0,0 +1,253 @@
import json
from collections import OrderedDict
import maya.cmds as cmds

import qargparse
from openpype.tools.utils.widgets import OptionDialog
from .lib import get_main_window, imprint

# To change as enum
build_types = ["context_asset", "linked_asset", "all_assets"]


def get_placeholder_attributes(node):
    return {
        attr: cmds.getAttr("{}.{}".format(node, attr))
        for attr in cmds.listAttr(node, userDefined=True)}


def delete_placeholder_attributes(node):
    '''Delete all extra placeholder attributes.'''
    extra_attributes = get_placeholder_attributes(node)
    for attribute in extra_attributes:
        cmds.deleteAttr(node + '.' + attribute)


def create_placeholder():
    args = placeholder_window()

    if not args:
        return  # operation canceled, no locator created

    # Custom arg parse to force empty data query,
    # still imprint them on the placeholder
    # and get items when arg is of type Enumerator
    options = create_options(args)

    # Create placeholder name dynamically from args and options
    placeholder_name = create_placeholder_name(args, options)

    selection = cmds.ls(selection=True)
    if not selection:
        raise ValueError("Nothing is selected")

    placeholder = cmds.spaceLocator(name=placeholder_name)[0]

    # Get the long name of the placeholder (with the groups)
    placeholder_full_name = (
        cmds.ls(selection[0], long=True)[0]
        + '|'
        + placeholder.replace('|', ''))

    if selection:
        cmds.parent(placeholder, selection[0])

    imprint(placeholder_full_name, options)

    # Some tweaks because imprint forces enums to the default value, so we
    # read the args back and force them onto the attributes
    imprint_enum(placeholder_full_name, args)

    # Add helper attributes to keep placeholder info
    cmds.addAttr(
        placeholder_full_name,
        longName="parent",
        hidden=True,
        dataType="string"
    )
    cmds.addAttr(
        placeholder_full_name,
        longName="index",
        hidden=True,
        attributeType="short",
        defaultValue=-1
    )

    cmds.setAttr(placeholder_full_name + '.parent', "", type="string")


def create_placeholder_name(args, options):
    placeholder_builder_type = [
        arg.read() for arg in args if 'builder_type' in str(arg)
    ][0]
    placeholder_family = options['family']
    placeholder_name = placeholder_builder_type.split('_')

    # Add family if any
    if placeholder_family:
        placeholder_name.insert(1, placeholder_family)

    # Add loader arguments if any
    if options['loader_args']:
        pos = 2
        loader_args = options['loader_args'].replace('\'', '\"')
        loader_args = json.loads(loader_args)
        values = [v for v in loader_args.values()]
        for i in range(len(values)):
            placeholder_name.insert(i + pos, values[i])

    placeholder_name = '_'.join(placeholder_name)

    return placeholder_name.capitalize()
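The naming logic above can be exercised without Maya by feeding it the parsed values directly (a standalone re-typing for illustration; the `args` handling is replaced by a plain `builder_type` string):

```python
import json


def create_placeholder_name(builder_type, options):
    # Start from the builder type split on underscores, e.g. "context_asset".
    placeholder_name = builder_type.split('_')

    # Add family, if any, after the first token.
    if options.get('family'):
        placeholder_name.insert(1, options['family'])

    # Add loader argument values, if any, starting at position 2.
    if options.get('loader_args'):
        loader_args = json.loads(options['loader_args'].replace('\'', '"'))
        values = list(loader_args.values())
        for i in range(len(values)):
            placeholder_name.insert(i + 2, values[i])

    return '_'.join(placeholder_name).capitalize()


print(create_placeholder_name(
    "context_asset",
    {"family": "model", "loader_args": "{'camera': 'persp'}"}))
# Context_model_persp_asset
```

So a context-asset placeholder for the `model` family loading with `{"camera": "persp"}` ends up named `Context_model_persp_asset` (`str.capitalize` lowercases everything after the first letter).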
def update_placeholder():
    placeholder = cmds.ls(selection=True)
    if len(placeholder) == 0:
        raise ValueError("No node selected")
    if len(placeholder) > 1:
        raise ValueError("Too many selected nodes")
    placeholder = placeholder[0]

    args = placeholder_window(get_placeholder_attributes(placeholder))

    if not args:
        return  # operation canceled

    # Delete placeholder attributes
    delete_placeholder_attributes(placeholder)

    options = create_options(args)

    imprint(placeholder, options)
    imprint_enum(placeholder, args)

    cmds.addAttr(
        placeholder,
        longName="parent",
        hidden=True,
        dataType="string"
    )
    cmds.addAttr(
        placeholder,
        longName="index",
        hidden=True,
        attributeType="short",
        defaultValue=-1
    )

    cmds.setAttr(placeholder + '.parent', '', type="string")


def create_options(args):
    options = OrderedDict()
    for arg in args:
        if not type(arg) == qargparse.Separator:
            options[str(arg)] = arg._data.get("items") or arg.read()
    return options


def imprint_enum(placeholder, args):
    """Imprint method doesn't act properly with enums.

    Replacing the functionality with this for now.
    """
    enum_values = {str(arg): arg.read()
                   for arg in args if arg._data.get("items")}
    string_to_value_enum_table = {
        build: i for i, build
        in enumerate(build_types)}
    for key, value in enum_values.items():
        cmds.setAttr(
            placeholder + "." + key,
            string_to_value_enum_table[value])


def placeholder_window(options=None):
    options = options or dict()
    dialog = OptionDialog(parent=get_main_window())
    dialog.setWindowTitle("Create Placeholder")

    args = [
        qargparse.Separator("Main attributes"),
        qargparse.Enum(
            "builder_type",
            label="Asset Builder Type",
            default=options.get("builder_type", 0),
            items=build_types,
            help="""Asset Builder Type
Builder type describes what the template loader will look for.
context_asset : Template loader will look for subsets of
current context asset (Asset bob will find asset)
linked_asset : Template loader will look for assets linked
to current context asset.
Linked assets are looked up in the avalon database under field "inputLinks"
"""
        ),
        qargparse.String(
            "family",
            default=options.get("family", ""),
            label="OpenPype Family",
            placeholder="ex: model, look ..."),
        qargparse.String(
            "representation",
            default=options.get("representation", ""),
            label="OpenPype Representation",
            placeholder="ex: ma, abc ..."),
        qargparse.String(
            "loader",
            default=options.get("loader", ""),
            label="Loader",
            placeholder="ex: ReferenceLoader, LightLoader ...",
            help="""Loader
Defines what OpenPype loader will be used to load assets.
Usable loaders depend on the current host's loader list.
Field is case sensitive.
"""),
        qargparse.String(
            "loader_args",
            default=options.get("loader_args", ""),
            label="Loader Arguments",
            placeholder='ex: {"camera":"persp", "lights":True}',
            help="""Loader Arguments
Defines a dictionary of arguments used to load assets.
Usable arguments depend on the current placeholder Loader.
Field should be a valid python dict. Anything else will be ignored.
"""),
        qargparse.Integer(
            "order",
            default=options.get("order", 0),
            min=0,
            max=999,
            label="Order",
            placeholder="ex: 0, 100 ... (smallest order loaded first)",
            help="""Order
Order defines asset loading priority (0 to 999).
Priority rule is: "lowest is first to load"."""),
        qargparse.Separator(
            "Optional attributes"),
        qargparse.String(
            "asset",
            default=options.get("asset", ""),
            label="Asset filter",
            placeholder="regex filtering by asset name",
            help="Filter assets by matching this regex against the asset's name"),
        qargparse.String(
            "subset",
            default=options.get("subset", ""),
            label="Subset filter",
            placeholder="regex filtering by subset name",
            help="Filter assets by matching this regex against the subset's name"),
        qargparse.String(
            "hierarchy",
            default=options.get("hierarchy", ""),
            label="Hierarchy filter",
            placeholder="regex filtering by asset's hierarchy",
            help="Filter assets by matching this regex against the asset's hierarchy")
    ]
    dialog.create(args)

    if not dialog.exec_():
        return None

    return args
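`imprint_enum` writes enum attributes by integer index rather than by label; the lookup table it builds from `build_types` is simply:

```python
build_types = ["context_asset", "linked_asset", "all_assets"]

# Maya enum attributes store an integer index, so map each label
# to its position in the enum definition.
string_to_value_enum_table = {
    build: i for i, build in enumerate(build_types)}

print(string_to_value_enum_table)
# {'context_asset': 0, 'linked_asset': 1, 'all_assets': 2}
```

Reading the attribute back with `asString=True` later returns the label again, so the index form is only an on-disk detail.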
@@ -9,10 +9,15 @@ import maya.cmds as cmds
 from openpype.settings import get_project_settings
 from openpype.pipeline import legacy_io
 from openpype.pipeline.workfile import BuildWorkfile
+from openpype.pipeline.workfile.build_template import (
+    build_workfile_template,
+    update_workfile_template
+)
 from openpype.tools.utils import host_tools
 from openpype.hosts.maya.api import lib, lib_rendersettings
 from .lib import get_main_window, IS_HEADLESS
 from .commands import reset_frame_range
+from .lib_template_builder import create_placeholder, update_placeholder

 log = logging.getLogger(__name__)
@@ -147,6 +152,34 @@ def install():
                parent_widget
            )
        )

+    builder_menu = cmds.menuItem(
+        "Template Builder",
+        subMenu=True,
+        tearOff=True,
+        parent=MENU_NAME
+    )
+    cmds.menuItem(
+        "Create Placeholder",
+        parent=builder_menu,
+        command=lambda *args: create_placeholder()
+    )
+    cmds.menuItem(
+        "Update Placeholder",
+        parent=builder_menu,
+        command=lambda *args: update_placeholder()
+    )
+    cmds.menuItem(
+        "Build Workfile from template",
+        parent=builder_menu,
+        command=build_workfile_template
+    )
+    cmds.menuItem(
+        "Update Workfile from template",
+        parent=builder_menu,
+        command=update_workfile_template
+    )
+
    cmds.setParent(MENU_NAME, menu=True)


def add_scripts_menu():
openpype/hosts/maya/api/template_loader.py (new file, 252 lines)

@@ -0,0 +1,252 @@
import re
|
||||
from maya import cmds
|
||||
|
||||
from openpype.client import get_representations
|
||||
from openpype.pipeline import legacy_io
|
||||
from openpype.pipeline.workfile.abstract_template_loader import (
|
||||
AbstractPlaceholder,
|
||||
AbstractTemplateLoader
|
||||
)
|
||||
from openpype.pipeline.workfile.build_template_exceptions import (
|
||||
TemplateAlreadyImported
|
||||
)
|
||||
|
||||
PLACEHOLDER_SET = 'PLACEHOLDERS_SET'
|
||||
|
||||
|
||||
class MayaTemplateLoader(AbstractTemplateLoader):
|
||||
"""Concrete implementation of AbstractTemplateLoader for maya
|
||||
"""
|
||||
|
||||
def import_template(self, path):
|
||||
"""Import template into current scene.
|
||||
Block if a template is already loaded.
|
||||
Args:
|
||||
path (str): A path to current template (usually given by
|
||||
get_template_path implementation)
|
||||
Returns:
|
||||
bool: Wether the template was succesfully imported or not
|
||||
"""
|
||||
if cmds.objExists(PLACEHOLDER_SET):
|
||||
raise TemplateAlreadyImported(
|
||||
"Build template already loaded\n"
|
||||
"Clean scene if needed (File > New Scene)")
|
||||
|
||||
cmds.sets(name=PLACEHOLDER_SET, empty=True)
|
||||
self.new_nodes = cmds.file(path, i=True, returnNewNodes=True)
|
||||
cmds.setAttr(PLACEHOLDER_SET + '.hiddenInOutliner', True)
|
||||
|
||||
for set in cmds.listSets(allSets=True):
|
||||
if (cmds.objExists(set) and
|
||||
cmds.attributeQuery('id', node=set, exists=True) and
|
||||
cmds.getAttr(set + '.id') == 'pyblish.avalon.instance'):
|
||||
if cmds.attributeQuery('asset', node=set, exists=True):
|
||||
cmds.setAttr(
|
||||
set + '.asset',
|
||||
legacy_io.Session['AVALON_ASSET'], type='string'
|
||||
)
|
||||
|
||||
return True
|
||||
|
||||
def template_already_imported(self, err_msg):
|
||||
clearButton = "Clear scene and build"
|
||||
updateButton = "Update template"
|
||||
abortButton = "Abort"
|
||||
|
||||
title = "Scene already builded"
|
||||
message = (
|
||||
"It's seems a template was already build for this scene.\n"
|
||||
"Error message reveived :\n\n\"{}\"".format(err_msg))
|
||||
buttons = [clearButton, updateButton, abortButton]
|
||||
defaultButton = clearButton
|
||||
cancelButton = abortButton
|
||||
dismissString = abortButton
|
||||
answer = cmds.confirmDialog(
|
||||
t=title,
|
||||
m=message,
|
||||
b=buttons,
|
||||
db=defaultButton,
|
||||
cb=cancelButton,
|
||||
ds=dismissString)
|
||||
|
||||
if answer == clearButton:
|
||||
cmds.file(newFile=True, force=True)
|
||||
self.import_template(self.template_path)
|
||||
self.populate_template()
|
||||
elif answer == updateButton:
|
||||
self.update_missing_containers()
|
||||
elif answer == abortButton:
|
||||
return
|
||||
|
||||
@staticmethod
|
||||
def get_template_nodes():
|
||||
attributes = cmds.ls('*.builder_type', long=True)
|
||||
return [attribute.rpartition('.')[0] for attribute in attributes]
|
||||
|
||||
def get_loaded_containers_by_id(self):
|
||||
try:
|
||||
containers = cmds.sets("AVALON_CONTAINERS", q=True)
|
||||
except ValueError:
|
||||
return None
|
||||
|
||||
return [
|
||||
cmds.getAttr(container + '.representation')
|
||||
for container in containers]
|
||||
|
||||
|
||||
class MayaPlaceholder(AbstractPlaceholder):
|
||||
"""Concrete implementation of AbstractPlaceholder for maya
|
||||
"""
|
||||
|
||||
optional_keys = {'asset', 'subset', 'hierarchy'}
|
||||
|
||||
def get_data(self, node):
|
||||
user_data = dict()
|
||||
for attr in self.required_keys.union(self.optional_keys):
|
||||
attribute_name = '{}.{}'.format(node, attr)
|
||||
if not cmds.attributeQuery(attr, node=node, exists=True):
|
||||
print("{} not found".format(attribute_name))
|
||||
continue
|
||||
user_data[attr] = cmds.getAttr(
|
||||
attribute_name,
|
||||
asString=True)
|
||||
user_data['parent'] = (
|
||||
cmds.getAttr(node + '.parent', asString=True)
|
||||
or node.rpartition('|')[0]
|
||||
or ""
|
||||
)
|
||||
user_data['node'] = node
|
||||
if user_data['parent']:
|
||||
siblings = cmds.listRelatives(user_data['parent'], children=True)
|
||||
else:
|
||||
siblings = cmds.ls(assemblies=True)
|
||||
node_shortname = user_data['node'].rpartition('|')[2]
|
||||
current_index = cmds.getAttr(node + '.index', asString=True)
|
||||
user_data['index'] = (
|
||||
current_index if current_index >= 0
|
||||
else siblings.index(node_shortname))
|
||||
|
||||
self.data = user_data
|
||||
|
||||
def parent_in_hierarchy(self, containers):
|
||||
"""Parent loaded container to placeholder's parent
|
||||
ie : Set loaded content as placeholder's sibling
|
||||
Args:
|
||||
containers (String): Placeholder loaded containers
|
||||
"""
|
||||
if not containers:
|
||||
return
|
||||
|
||||
roots = cmds.sets(containers, q=True)
|
||||
nodes_to_parent = []
|
||||
for root in roots:
|
||||
if root.endswith("_RN"):
|
||||
refRoot = cmds.referenceQuery(root, n=True)[0]
|
||||
refRoot = cmds.listRelatives(refRoot, parent=True) or [refRoot]
|
||||
nodes_to_parent.extend(refRoot)
|
||||
elif root in cmds.listSets(allSets=True):
|
||||
if not cmds.sets(root, q=True):
|
||||
return
|
||||
else:
|
||||
continue
|
||||
else:
|
||||
nodes_to_parent.append(root)
|
||||
|
||||
if self.data['parent']:
|
||||
cmds.parent(nodes_to_parent, self.data['parent'])
|
||||
# Move loaded nodes to correct index in outliner hierarchy
|
||||
placeholder_node = self.data['node']
|
||||
placeholder_form = cmds.xform(
|
||||
placeholder_node,
|
||||
q=True,
|
||||
matrix=True,
|
||||
worldSpace=True
|
||||
)
|
||||
for node in set(nodes_to_parent):
|
||||
cmds.reorder(node, front=True)
|
||||
cmds.reorder(node, relative=self.data['index'])
|
||||
cmds.xform(node, matrix=placeholder_form, ws=True)
|
||||
|
||||
holding_sets = cmds.listSets(object=placeholder_node)
|
||||
if not holding_sets:
|
||||
return
|
||||
for holding_set in holding_sets:
|
||||
        cmds.sets(roots, forceElement=holding_set)

    def clean(self):
        """Hide placeholder, parent them to root,
        add them to placeholder set and register placeholder's parent
        to keep placeholder info available for future use
        """
        node = self.data['node']
        if self.data['parent']:
            cmds.setAttr(node + '.parent', self.data['parent'], type='string')
        if cmds.getAttr(node + '.index') < 0:
            cmds.setAttr(node + '.index', self.data['index'])

        holding_sets = cmds.listSets(object=node)
        if holding_sets:
            for set in holding_sets:
                cmds.sets(node, remove=set)

        if cmds.listRelatives(node, p=True):
            node = cmds.parent(node, world=True)[0]
        cmds.sets(node, addElement=PLACEHOLDER_SET)
        cmds.hide(node)
        cmds.setAttr(node + '.hiddenInOutliner', True)

    def get_representations(self, current_asset_doc, linked_asset_docs):
        project_name = legacy_io.active_project()

        builder_type = self.data["builder_type"]
        if builder_type == "context_asset":
            context_filters = {
                "asset": [current_asset_doc["name"]],
                "subset": [re.compile(self.data["subset"])],
                "hierarchy": [re.compile(self.data["hierarchy"])],
                "representations": [self.data["representation"]],
                "family": [self.data["family"]]
            }

        elif builder_type != "linked_asset":
            context_filters = {
                "asset": [re.compile(self.data["asset"])],
                "subset": [re.compile(self.data["subset"])],
                "hierarchy": [re.compile(self.data["hierarchy"])],
                "representation": [self.data["representation"]],
                "family": [self.data["family"]]
            }

        else:
            asset_regex = re.compile(self.data["asset"])
            linked_asset_names = []
            for asset_doc in linked_asset_docs:
                asset_name = asset_doc["name"]
                if asset_regex.match(asset_name):
                    linked_asset_names.append(asset_name)

            context_filters = {
                "asset": linked_asset_names,
                "subset": [re.compile(self.data["subset"])],
                "hierarchy": [re.compile(self.data["hierarchy"])],
                "representation": [self.data["representation"]],
                "family": [self.data["family"]],
            }

        return list(get_representations(
            project_name,
            context_filters=context_filters
        ))

    def err_message(self):
        return (
            "Error while trying to load a representation.\n"
            "Either the subset wasn't published or the template is malformed."
            "\n\n"
            "Builder was looking for:\n{attributes}".format(
                attributes="\n".join([
                    "{}: {}".format(key.title(), value)
                    for key, value in self.data.items()]
                )
            )
        )
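The builder above keys its filter construction on `builder_type`; for linked assets it keeps only names matching a configured regex. A standalone sketch of that matching step (hypothetical asset names, independent of the OpenPype database API):

```python
import re

def filter_linked_asset_names(asset_pattern, linked_asset_docs):
    # Mirror of the linked-asset loop: keep only assets whose name
    # matches the configured regex (re.match anchors at the start).
    asset_regex = re.compile(asset_pattern)
    linked_asset_names = []
    for asset_doc in linked_asset_docs:
        asset_name = asset_doc["name"]
        if asset_regex.match(asset_name):
            linked_asset_names.append(asset_name)
    return linked_asset_names

docs = [{"name": "chr_hero"}, {"name": "prp_sword"}, {"name": "chr_villain"}]
print(filter_linked_asset_names("chr_.*", docs))  # → ['chr_hero', 'chr_villain']
```

Note that `re.match` only anchors at the start of the string; a pattern like `"chr_.*"` therefore behaves as a prefix filter here.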
@@ -2,11 +2,9 @@
import os
from maya import cmds

from openpype.pipeline import HOST_WORKFILE_EXTENSIONS


def file_extensions():
    return HOST_WORKFILE_EXTENSIONS["maya"]
    return [".ma", ".mb"]


def has_unsaved_changes():
openpype/hosts/maya/module.py (new file, 47 lines)
@@ -0,0 +1,47 @@
import os
from openpype.modules import OpenPypeModule
from openpype.modules.interfaces import IHostModule

MAYA_ROOT_DIR = os.path.dirname(os.path.abspath(__file__))


class OpenPypeMaya(OpenPypeModule, IHostModule):
    name = "openpype_maya"
    host_name = "maya"

    def initialize(self, module_settings):
        self.enabled = True

    def add_implementation_envs(self, env, _app):
        # Add requirements to PYTHONPATH
        new_python_paths = [
            os.path.join(MAYA_ROOT_DIR, "startup")
        ]
        old_python_path = env.get("PYTHONPATH") or ""
        for path in old_python_path.split(os.pathsep):
            if not path:
                continue

            norm_path = os.path.normpath(path)
            if norm_path not in new_python_paths:
                new_python_paths.append(norm_path)

        env["PYTHONPATH"] = os.pathsep.join(new_python_paths)

        # Set default values if they are not already set via settings
        defaults = {
            "OPENPYPE_LOG_NO_COLORS": "Yes"
        }
        for key, value in defaults.items():
            if not env.get(key):
                env[key] = value

    def get_launch_hook_paths(self, app):
        if app.host_name != self.host_name:
            return []
        return [
            os.path.join(MAYA_ROOT_DIR, "hooks")
        ]

    def get_workfile_extensions(self):
        return [".ma", ".mb"]
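The `add_implementation_envs` method above prepends the host's `startup` directory to `PYTHONPATH` while normalizing and deduplicating the existing entries. The same pattern in isolation (the path names are illustrative):

```python
import os

def prepend_python_path(new_paths, old_python_path):
    # Keep the new entries first, then append each existing PYTHONPATH
    # entry once, normalized, skipping empty segments and duplicates.
    result = list(new_paths)
    for path in old_python_path.split(os.pathsep):
        if not path:
            continue
        norm_path = os.path.normpath(path)
        if norm_path not in result:
            result.append(norm_path)
    return os.pathsep.join(result)
```

Putting the new entries first matters: Python imports resolve in `sys.path` order, so the host's `startup` scripts win over any stale copies already on the user's `PYTHONPATH`.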
@@ -154,12 +154,6 @@ class CollectMayaRender(pyblish.api.ContextPlugin):
        layer_name = "rs_{}".format(expected_layer_name)

        # collect all frames we are expecting to be rendered
        renderer = self.get_render_attribute("currentRenderer",
                                             layer=layer_name)
        # handle various renderman names
        if renderer.startswith("renderman"):
            renderer = "renderman"

        # return all expected files for all cameras and aovs in given
        # frame range
        layer_render_products = get_layer_render_products(layer_name)
@@ -360,6 +354,7 @@ class CollectMayaRender(pyblish.api.ContextPlugin):
        instance = context.create_instance(expected_layer_name)
        instance.data["label"] = label
        instance.data["farm"] = True
        instance.data.update(data)
        self.log.debug("data: {}".format(json.dumps(data, indent=4)))
@@ -1,129 +0,0 @@
from .api.utils import (
    setup,
    get_resolve_module
)

from .api.pipeline import (
    install,
    uninstall,
    ls,
    containerise,
    update_container,
    publish,
    launch_workfiles_app,
    maintained_selection,
    remove_instance,
    list_instances
)

from .api.lib import (
    maintain_current_timeline,
    publish_clip_color,
    get_project_manager,
    get_current_project,
    get_current_timeline,
    create_bin,
    get_media_pool_item,
    create_media_pool_item,
    create_timeline_item,
    get_timeline_item,
    get_video_track_names,
    get_current_timeline_items,
    get_pype_timeline_item_by_name,
    get_timeline_item_pype_tag,
    set_timeline_item_pype_tag,
    imprint,
    set_publish_attribute,
    get_publish_attribute,
    create_compound_clip,
    swap_clips,
    get_pype_clip_metadata,
    set_project_manager_to_folder_name,
    get_otio_clip_instance_data,
    get_reformated_path
)

from .api.menu import launch_pype_menu

from .api.plugin import (
    ClipLoader,
    TimelineItemLoader,
    Creator,
    PublishClip
)

from .api.workio import (
    open_file,
    save_file,
    current_file,
    has_unsaved_changes,
    file_extensions,
    work_root
)

from .api.testing_utils import TestGUI


__all__ = [
    # pipeline
    "install",
    "uninstall",
    "ls",
    "containerise",
    "update_container",
    "reload_pipeline",
    "publish",
    "launch_workfiles_app",
    "maintained_selection",
    "remove_instance",
    "list_instances",

    # utils
    "setup",
    "get_resolve_module",

    # lib
    "maintain_current_timeline",
    "publish_clip_color",
    "get_project_manager",
    "get_current_project",
    "get_current_timeline",
    "create_bin",
    "get_media_pool_item",
    "create_media_pool_item",
    "create_timeline_item",
    "get_timeline_item",
    "get_video_track_names",
    "get_current_timeline_items",
    "get_pype_timeline_item_by_name",
    "get_timeline_item_pype_tag",
    "set_timeline_item_pype_tag",
    "imprint",
    "set_publish_attribute",
    "get_publish_attribute",
    "create_compound_clip",
    "swap_clips",
    "get_pype_clip_metadata",
    "set_project_manager_to_folder_name",
    "get_otio_clip_instance_data",
    "get_reformated_path",

    # menu
    "launch_pype_menu",

    # plugin
    "ClipLoader",
    "TimelineItemLoader",
    "Creator",
    "PublishClip",

    # workio
    "open_file",
    "save_file",
    "current_file",
    "has_unsaved_changes",
    "file_extensions",
    "work_root",

    "TestGUI"
]
@@ -1,11 +1,137 @@
"""
resolve api
"""
import os

bmdvr = None
bmdvf = None

API_DIR = os.path.dirname(os.path.abspath(__file__))
HOST_DIR = os.path.dirname(API_DIR)
PLUGINS_DIR = os.path.join(HOST_DIR, "plugins")
from .utils import (
    get_resolve_module
)

from .pipeline import (
    install,
    uninstall,
    ls,
    containerise,
    update_container,
    publish,
    launch_workfiles_app,
    maintained_selection,
    remove_instance,
    list_instances
)

from .lib import (
    maintain_current_timeline,
    publish_clip_color,
    get_project_manager,
    get_current_project,
    get_current_timeline,
    create_bin,
    get_media_pool_item,
    create_media_pool_item,
    create_timeline_item,
    get_timeline_item,
    get_video_track_names,
    get_current_timeline_items,
    get_pype_timeline_item_by_name,
    get_timeline_item_pype_tag,
    set_timeline_item_pype_tag,
    imprint,
    set_publish_attribute,
    get_publish_attribute,
    create_compound_clip,
    swap_clips,
    get_pype_clip_metadata,
    set_project_manager_to_folder_name,
    get_otio_clip_instance_data,
    get_reformated_path
)

from .menu import launch_pype_menu

from .plugin import (
    ClipLoader,
    TimelineItemLoader,
    Creator,
    PublishClip
)

from .workio import (
    open_file,
    save_file,
    current_file,
    has_unsaved_changes,
    file_extensions,
    work_root
)

from .testing_utils import TestGUI


__all__ = [
    "bmdvr",
    "bmdvf",

    # pipeline
    "install",
    "uninstall",
    "ls",
    "containerise",
    "update_container",
    "reload_pipeline",
    "publish",
    "launch_workfiles_app",
    "maintained_selection",
    "remove_instance",
    "list_instances",

    # utils
    "get_resolve_module",

    # lib
    "maintain_current_timeline",
    "publish_clip_color",
    "get_project_manager",
    "get_current_project",
    "get_current_timeline",
    "create_bin",
    "get_media_pool_item",
    "create_media_pool_item",
    "create_timeline_item",
    "get_timeline_item",
    "get_video_track_names",
    "get_current_timeline_items",
    "get_pype_timeline_item_by_name",
    "get_timeline_item_pype_tag",
    "set_timeline_item_pype_tag",
    "imprint",
    "set_publish_attribute",
    "get_publish_attribute",
    "create_compound_clip",
    "swap_clips",
    "get_pype_clip_metadata",
    "set_project_manager_to_folder_name",
    "get_otio_clip_instance_data",
    "get_reformated_path",

    # menu
    "launch_pype_menu",

    # plugin
    "ClipLoader",
    "TimelineItemLoader",
    "Creator",
    "PublishClip",

    # workio
    "open_file",
    "save_file",
    "current_file",
    "has_unsaved_changes",
    "file_extensions",
    "work_root",

    "TestGUI"
]
@@ -4,7 +4,7 @@ from __future__ import absolute_import
import pyblish.api


from ...action import get_errored_instances_from_context
from openpype.action import get_errored_instances_from_context


class SelectInvalidAction(pyblish.api.Action):
@@ -4,13 +4,13 @@ import re
import os
import contextlib
from opentimelineio import opentime

from openpype.lib import Logger
from openpype.pipeline.editorial import is_overlapping_otio_ranges

from ..otio import davinci_export as otio_export

from openpype.api import Logger

log = Logger().get_logger(__name__)
log = Logger.get_logger(__name__)

self = sys.modules[__name__]
self.project_manager = None
@@ -3,13 +3,13 @@ import sys

from Qt import QtWidgets, QtCore

from openpype.tools.utils import host_tools

from .pipeline import (
    publish,
    launch_workfiles_app
)

from openpype.tools.utils import host_tools


def load_stylesheet():
    path = os.path.join(os.path.dirname(__file__), "menu_style.qss")
@@ -7,7 +7,7 @@ from collections import OrderedDict

from pyblish import api as pyblish

from openpype.api import Logger
from openpype.lib import Logger
from openpype.pipeline import (
    schema,
    register_loader_plugin_path,
@@ -16,11 +16,15 @@ from openpype.pipeline import (
    deregister_creator_plugin_path,
    AVALON_CONTAINER_ID,
)
from . import lib
from . import PLUGINS_DIR
from openpype.tools.utils import host_tools
log = Logger().get_logger(__name__)

from . import lib
from .utils import get_resolve_module

log = Logger.get_logger(__name__)

HOST_DIR = os.path.dirname(os.path.abspath(os.path.dirname(__file__)))
PLUGINS_DIR = os.path.join(HOST_DIR, "plugins")
PUBLISH_PATH = os.path.join(PLUGINS_DIR, "publish")
LOAD_PATH = os.path.join(PLUGINS_DIR, "load")
CREATE_PATH = os.path.join(PLUGINS_DIR, "create")
@@ -39,7 +43,6 @@ def install():
    See the Maya equivalent for inspiration on how to implement this.

    """
    from .. import get_resolve_module

    log.info("openpype.hosts.resolve installed")
@@ -1,9 +1,9 @@
#!/usr/bin/env python
import time
from openpype.hosts.resolve.utils import get_resolve_module
from openpype.api import Logger
from openpype.lib import Logger

log = Logger().get_logger(__name__)
log = Logger.get_logger(__name__)

wait_delay = 2.5
wait = 0.00
@@ -4,21 +4,21 @@
Resolve's tools for setting environment
"""

import sys
import os
import shutil
from . import HOST_DIR
from openpype.api import Logger
log = Logger().get_logger(__name__)
import sys

from openpype.lib import Logger

log = Logger.get_logger(__name__)


def get_resolve_module():
    from openpype.hosts import resolve
    from openpype.hosts.resolve import api
    # dont run if already loaded
    if resolve.api.bmdvr:
    if api.bmdvr:
        log.info(("resolve module is assigned to "
                  f"`pype.hosts.resolve.api.bmdvr`: {resolve.api.bmdvr}"))
        return resolve.api.bmdvr
                  f"`pype.hosts.resolve.api.bmdvr`: {api.bmdvr}"))
        return api.bmdvr
    try:
        """
        The PYTHONPATH needs to be set correctly for this import
@@ -71,79 +71,9 @@ def get_resolve_module():
    # assign global var and return
    bmdvr = bmd.scriptapp("Resolve")
    bmdvf = bmd.scriptapp("Fusion")
    resolve.api.bmdvr = bmdvr
    resolve.api.bmdvf = bmdvf
    api.bmdvr = bmdvr
    api.bmdvf = bmdvf
    log.info(("Assigning resolve module to "
              f"`pype.hosts.resolve.api.bmdvr`: {resolve.api.bmdvr}"))
              f"`pype.hosts.resolve.api.bmdvr`: {api.bmdvr}"))
    log.info(("Assigning resolve module to "
              f"`pype.hosts.resolve.api.bmdvf`: {resolve.api.bmdvf}"))


def _sync_utility_scripts(env=None):
    """ Synchronizing basic utility scripts for resolve.

    To be able to run scripts from inside `Resolve/Workspace/Scripts` menu
    all scripts have to be accessible from a defined folder.
    """
    if not env:
        env = os.environ

    # initiate inputs
    scripts = {}
    us_env = env.get("RESOLVE_UTILITY_SCRIPTS_SOURCE_DIR")
    us_dir = env.get("RESOLVE_UTILITY_SCRIPTS_DIR", "")
    us_paths = [os.path.join(
        HOST_DIR,
        "utility_scripts"
    )]

    # collect script dirs
    if us_env:
        log.info(f"Utility Scripts Env: `{us_env}`")
        us_paths = us_env.split(
            os.pathsep) + us_paths

    # collect scripts from dirs
    for path in us_paths:
        scripts.update({path: os.listdir(path)})

    log.info(f"Utility Scripts Dir: `{us_paths}`")
    log.info(f"Utility Scripts: `{scripts}`")

    # make sure no script file is in folder
    if next((s for s in os.listdir(us_dir)), None):
        for s in os.listdir(us_dir):
            path = os.path.join(us_dir, s)
            log.info(f"Removing `{path}`...")
            if os.path.isdir(path):
                shutil.rmtree(path, onerror=None)
            else:
                os.remove(path)

    # copy scripts into Resolve's utility scripts dir
    for d, sl in scripts.items():
        # directory and scripts list
        for s in sl:
            # script in script list
            src = os.path.join(d, s)
            dst = os.path.join(us_dir, s)
            log.info(f"Copying `{src}` to `{dst}`...")
            if os.path.isdir(src):
                shutil.copytree(
                    src, dst, symlinks=False,
                    ignore=None, ignore_dangling_symlinks=False
                )
            else:
                shutil.copy2(src, dst)


def setup(env=None):
    """ Wrapper installer started from pype.hooks.resolve.ResolvePrelaunch()
    """
    if not env:
        env = os.environ

    # synchronize resolve utility scripts
    _sync_utility_scripts(env)

    log.info("Resolve OpenPype wrapper has been installed")
              f"`pype.hosts.resolve.api.bmdvf`: {api.bmdvf}"))
@@ -2,14 +2,14 @@

import os
from openpype.api import Logger
from .. import (
from .lib import (
    get_project_manager,
    get_current_project,
    set_project_manager_to_folder_name
)


log = Logger().get_logger(__name__)
log = Logger.get_logger(__name__)

exported_projet_ext = ".drp"
@@ -60,7 +60,7 @@ def open_file(filepath):
        # load project from input path
        project = pm.LoadProject(fname)
        log.info(f"Project {project.GetName()} opened...")
        return True

    except AttributeError:
        log.warning((f"Project with name `{fname}` does not exist! It will "
                     f"be imported from {filepath} and then loaded..."))
@@ -69,9 +69,8 @@ def open_file(filepath):
        project = pm.LoadProject(fname)
        log.info(f"Project imported/loaded {project.GetName()}...")
        return True
    else:
        return False

    return False
    return True


def current_file():
    pm = get_project_manager()
@@ -80,13 +79,9 @@ def current_file():
    name = project.GetName()
    fname = name + exported_projet_ext
    current_file = os.path.join(current_dir, fname)
    normalised = os.path.normpath(current_file)

    # Unsaved current file
    if normalised == "":
    if not current_file:
        return None

    return normalised
    return os.path.normpath(current_file)


def work_root(session):
@@ -1,7 +1,7 @@
import os
import importlib

from openpype.lib import PreLaunchHook
from openpype.hosts.resolve.api import utils
from openpype.hosts.resolve.utils import setup


class ResolvePrelaunch(PreLaunchHook):
@@ -43,18 +43,6 @@ class ResolvePrelaunch(PreLaunchHook):
            self.launch_context.env.get("PRE_PYTHON_SCRIPT", ""))
        self.launch_context.env["PRE_PYTHON_SCRIPT"] = pre_py_sc
        self.log.debug(f"-- pre_py_sc: `{pre_py_sc}`...")
        try:
            __import__("openpype.hosts.resolve")
            __import__("pyblish")

        except ImportError:
            self.log.warning(
                "pyblish: Could not load Resolve integration.",
                exc_info=True
            )

        else:
            # Resolve Setup integration
            importlib.reload(utils)
            self.log.debug(f"-- utils.__file__: `{utils.__file__}`")
            utils.setup(self.launch_context.env)
        # Resolve Setup integration
        setup(self.launch_context.env)
@@ -1,9 +1,12 @@
# from pprint import pformat
from openpype.hosts import resolve
from openpype.hosts.resolve.api import lib
from openpype.hosts.resolve.api import plugin, lib
from openpype.hosts.resolve.api.lib import (
    get_video_track_names,
    create_bin,
)


class CreateShotClip(resolve.Creator):
class CreateShotClip(plugin.Creator):
    """Publishable clip"""

    label = "Create Publishable Clip"
@@ -11,7 +14,7 @@ class CreateShotClip(resolve.Creator):
    icon = "film"
    defaults = ["Main"]

    gui_tracks = resolve.get_video_track_names()
    gui_tracks = get_video_track_names()
    gui_name = "OpenPype publish attributes creator"
    gui_info = "Define sequential rename and fill hierarchy data."
    gui_inputs = {
@@ -250,7 +253,7 @@ class CreateShotClip(resolve.Creator):
        sq_markers = self.timeline.GetMarkers()

        # create media bin for compound clips (trackItems)
        mp_folder = resolve.create_bin(self.timeline.GetName())
        mp_folder = create_bin(self.timeline.GetName())

        kwargs = {
            "ui_inputs": widget.result,
@@ -264,6 +267,6 @@ class CreateShotClip(resolve.Creator):
            self.rename_index = i
            self.log.info(track_item_data)
            # convert track item to timeline media pool item
            track_item = resolve.PublishClip(
            track_item = plugin.PublishClip(
                self, track_item_data, **kwargs).convert()
            track_item.SetClipColor(lib.publish_clip_color)
@@ -1,21 +1,22 @@
from copy import deepcopy
from importlib import reload

from openpype.client import (
    get_version_by_id,
    get_last_version_by_subset_id,
)
from openpype.hosts import resolve
# from openpype.hosts import resolve
from openpype.pipeline import (
    get_representation_path,
    legacy_io,
)
from openpype.hosts.resolve.api import lib, plugin
reload(plugin)
reload(lib)
from openpype.hosts.resolve.api.pipeline import (
    containerise,
    update_container,
)


class LoadClip(resolve.TimelineItemLoader):
class LoadClip(plugin.TimelineItemLoader):
    """Load a subset to timeline as clip

    Place clip to timeline on its asset origin timings collected
@@ -46,7 +47,7 @@ class LoadClip(resolve.TimelineItemLoader):
        })

        # load clip to timeline and get main variables
        timeline_item = resolve.ClipLoader(
        timeline_item = plugin.ClipLoader(
            self, context, **options).load()
        namespace = namespace or timeline_item.GetName()
        version = context['version']
@@ -80,7 +81,7 @@ class LoadClip(resolve.TimelineItemLoader):

        self.log.info("Loader done: `{}`".format(name))

        return resolve.containerise(
        return containerise(
            timeline_item,
            name, namespace, context,
            self.__class__.__name__,
@@ -98,7 +99,7 @@ class LoadClip(resolve.TimelineItemLoader):
        context.update({"representation": representation})
        name = container['name']
        namespace = container['namespace']
        timeline_item_data = resolve.get_pype_timeline_item_by_name(namespace)
        timeline_item_data = lib.get_pype_timeline_item_by_name(namespace)
        timeline_item = timeline_item_data["clip"]["item"]
        project_name = legacy_io.active_project()
        version = get_version_by_id(project_name, representation["parent"])
@@ -109,7 +110,7 @@ class LoadClip(resolve.TimelineItemLoader):
        self.fname = get_representation_path(representation)
        context["version"] = {"data": version_data}

        loader = resolve.ClipLoader(self, context)
        loader = plugin.ClipLoader(self, context)
        timeline_item = loader.update(timeline_item)

        # add additional metadata from the version to imprint Avalon knob
@@ -136,7 +137,7 @@ class LoadClip(resolve.TimelineItemLoader):
        # update color of clip regarding the version order
        self.set_item_color(timeline_item, version)

        return resolve.update_container(timeline_item, data_imprint)
        return update_container(timeline_item, data_imprint)

    @classmethod
    def set_item_color(cls, timeline_item, version):
@@ -1,7 +1,7 @@
import os
import pyblish.api
import openpype.api
from openpype.hosts import resolve
from openpype.hosts.resolve.api.lib import get_project_manager


class ExtractWorkfile(openpype.api.Extractor):
@@ -29,7 +29,7 @@ class ExtractWorkfile(openpype.api.Extractor):
            os.path.join(staging_dir, drp_file_name))

        # write out the drp workfile
        resolve.get_project_manager().ExportProject(
        get_project_manager().ExportProject(
            project.GetName(), drp_file_path)

        # create drp workfile representation
@@ -1,9 +1,15 @@
import pyblish
from openpype.hosts import resolve

# # developer reload modules
from pprint import pformat

import pyblish

from openpype.hosts.resolve.api.lib import (
    get_current_timeline_items,
    get_timeline_item_pype_tag,
    publish_clip_color,
    get_publish_attribute,
    get_otio_clip_instance_data,
)


class PrecollectInstances(pyblish.api.ContextPlugin):
    """Collect all Track items selection."""
@@ -14,8 +20,8 @@ class PrecollectInstances(pyblish.api.ContextPlugin):

    def process(self, context):
        otio_timeline = context.data["otioTimeline"]
        selected_timeline_items = resolve.get_current_timeline_items(
            filter=True, selecting_color=resolve.publish_clip_color)
        selected_timeline_items = get_current_timeline_items(
            filter=True, selecting_color=publish_clip_color)

        self.log.info(
            "Processing enabled track items: {}".format(
@@ -27,7 +33,7 @@ class PrecollectInstances(pyblish.api.ContextPlugin):
            timeline_item = timeline_item_data["clip"]["item"]

            # get pype tag data
            tag_data = resolve.get_timeline_item_pype_tag(timeline_item)
            tag_data = get_timeline_item_pype_tag(timeline_item)
            self.log.debug(f"__ tag_data: {pformat(tag_data)}")

            if not tag_data:
@@ -67,7 +73,7 @@ class PrecollectInstances(pyblish.api.ContextPlugin):
                "asset": asset,
                "item": timeline_item,
                "families": families,
                "publish": resolve.get_publish_attribute(timeline_item),
                "publish": get_publish_attribute(timeline_item),
                "fps": context.data["fps"],
                "handleStart": handle_start,
                "handleEnd": handle_end,
@@ -75,7 +81,7 @@ class PrecollectInstances(pyblish.api.ContextPlugin):
            })

            # otio clip data
            otio_data = resolve.get_otio_clip_instance_data(
            otio_data = get_otio_clip_instance_data(
                otio_timeline, timeline_item_data) or {}
            data.update(otio_data)
@@ -134,7 +140,7 @@ class PrecollectInstances(pyblish.api.ContextPlugin):
            "asset": asset,
            "family": family,
            "families": [],
            "publish": resolve.get_publish_attribute(timeline_item)
            "publish": get_publish_attribute(timeline_item)
        })

        context.create_instance(**data)
@@ -1,11 +1,9 @@
import pyblish.api
from pprint import pformat
from importlib import reload

from openpype.hosts import resolve
from openpype.hosts.resolve import api as rapi
from openpype.pipeline import legacy_io
from openpype.hosts.resolve.otio import davinci_export
reload(davinci_export)


class PrecollectWorkfile(pyblish.api.ContextPlugin):
@@ -18,9 +16,9 @@ class PrecollectWorkfile(pyblish.api.ContextPlugin):

        asset = legacy_io.Session["AVALON_ASSET"]
        subset = "workfile"
        project = resolve.get_current_project()
        project = rapi.get_current_project()
        fps = project.GetSetting("timelineFrameRate")
        video_tracks = resolve.get_video_track_names()
        video_tracks = rapi.get_video_track_names()

        # adding otio timeline to context
        otio_timeline = davinci_export.create_otio_timeline(project)
@@ -6,10 +6,11 @@ from openpype.pipeline import install_host


def main(env):
    import openpype.hosts.resolve as bmdvr
    from openpype.hosts.resolve.utils import setup
    import openpype.hosts.resolve.api as bmdvr
    # Registers openpype's Global pyblish plugins
    install_host(bmdvr)
    bmdvr.setup(env)
    setup(env)


if __name__ == "__main__":
@@ -2,13 +2,13 @@ import os
import sys

from openpype.pipeline import install_host
from openpype.api import Logger
from openpype.lib import Logger

log = Logger().get_logger(__name__)
log = Logger.get_logger(__name__)


def main(env):
    import openpype.hosts.resolve as bmdvr
    import openpype.hosts.resolve.api as bmdvr

    # activate resolve from openpype
    install_host(bmdvr)
|
|||
|
||||
from openpype.pipeline import install_host
|
||||
|
||||
from openpype.hosts.resolve import TestGUI
|
||||
import openpype.hosts.resolve as bmdvr
|
||||
import openpype.hosts.resolve.api as bmdvr
|
||||
from openpype.hosts.resolve.api.testing_utils import TestGUI
|
||||
from openpype.hosts.resolve.otio import davinci_export as otio_export
|
||||
|
||||
|
||||
|
|
|
|||
|
|
@@ -2,11 +2,16 @@
import os
import sys

from openpype.pipeline import install_host
from openpype.hosts.resolve import TestGUI
import openpype.hosts.resolve as bmdvr
import clique

from openpype.pipeline import install_host
from openpype.hosts.resolve.api.testing_utils import TestGUI
import openpype.hosts.resolve.api as bmdvr
from openpype.hosts.resolve.api.lib import (
    create_media_pool_item,
    create_timeline_item,
)


class ThisTestGUI(TestGUI):
    extensions = [".exr", ".jpg", ".mov", ".png", ".mp4", ".ari", ".arx"]
@@ -55,10 +60,10 @@ class ThisTestGUI(TestGUI):
        # skip if unwanted extension
        if ext not in self.extensions:
            return
        media_pool_item = bmdvr.create_media_pool_item(fpath)
        media_pool_item = create_media_pool_item(fpath)
        print(media_pool_item)

        track_item = bmdvr.create_timeline_item(media_pool_item)
        track_item = create_timeline_item(media_pool_item)
        print(track_item)
@@ -1,13 +1,17 @@
#! python3
from openpype.pipeline import install_host
import openpype.hosts.resolve as bmdvr
from openpype.hosts.resolve import api as bmdvr
from openpype.hosts.resolve.api.lib import (
    create_media_pool_item,
    create_timeline_item,
)


def file_processing(fpath):
    media_pool_item = bmdvr.create_media_pool_item(fpath)
    media_pool_item = create_media_pool_item(fpath)
    print(media_pool_item)

    track_item = bmdvr.create_timeline_item(media_pool_item)
    track_item = create_timeline_item(media_pool_item)
    print(track_item)
openpype/hosts/resolve/utils.py (new file, 54 lines)
@@ -0,0 +1,54 @@
import os
import shutil
from openpype.lib import Logger

RESOLVE_ROOT_DIR = os.path.dirname(os.path.abspath(__file__))


def setup(env):
    log = Logger.get_logger("ResolveSetup")
    scripts = {}
    us_env = env.get("RESOLVE_UTILITY_SCRIPTS_SOURCE_DIR")
    us_dir = env.get("RESOLVE_UTILITY_SCRIPTS_DIR", "")
    us_paths = [os.path.join(
        RESOLVE_ROOT_DIR,
        "utility_scripts"
    )]

    # collect script dirs
    if us_env:
        log.info(f"Utility Scripts Env: `{us_env}`")
        us_paths = us_env.split(
            os.pathsep) + us_paths

    # collect scripts from dirs
    for path in us_paths:
        scripts.update({path: os.listdir(path)})

    log.info(f"Utility Scripts Dir: `{us_paths}`")
    log.info(f"Utility Scripts: `{scripts}`")

    # make sure no script file is in folder
    for s in os.listdir(us_dir):
        path = os.path.join(us_dir, s)
        log.info(f"Removing `{path}`...")
        if os.path.isdir(path):
            shutil.rmtree(path, onerror=None)
        else:
            os.remove(path)

    # copy scripts into Resolve's utility scripts dir
    for d, sl in scripts.items():
        # directory and scripts list
        for s in sl:
            # script in script list
            src = os.path.join(d, s)
            dst = os.path.join(us_dir, s)
            log.info(f"Copying `{src}` to `{dst}`...")
            if os.path.isdir(src):
                shutil.copytree(
                    src, dst, symlinks=False,
                    ignore=None, ignore_dangling_symlinks=False
                )
            else:
                shutil.copy2(src, dst)
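The `setup` function above prepends user-defined script directories (split on `os.pathsep`) before the bundled defaults. A minimal standalone sketch of that collection pattern, using the same environment variable name:

```python
import os

def collect_script_dirs(env, default_dir):
    """Env-provided script dirs take priority; the bundled dir comes last."""
    paths = [default_dir]
    source_dirs = env.get("RESOLVE_UTILITY_SCRIPTS_SOURCE_DIR")
    if source_dirs:
        # the env var may hold several directories joined by os.pathsep
        paths = source_dirs.split(os.pathsep) + paths
    return paths

env = {"RESOLVE_UTILITY_SCRIPTS_SOURCE_DIR": os.pathsep.join(["/studio/a", "/studio/b"])}
print(collect_script_dirs(env, "/bundled"))
```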
@@ -957,32 +957,24 @@ class ApplicationLaunchContext:

        # TODO load additional studio paths from settings
        import openpype
-        pype_dir = os.path.dirname(os.path.abspath(openpype.__file__))
+        openpype_dir = os.path.dirname(os.path.abspath(openpype.__file__))

-        # --- START: Backwards compatibility ---
-        hooks_dir = os.path.join(pype_dir, "hooks")
+        global_hooks_dir = os.path.join(openpype_dir, "hooks")

-        subfolder_names = ["global"]
-        if self.host_name:
-            subfolder_names.append(self.host_name)
-        for subfolder_name in subfolder_names:
-            path = os.path.join(hooks_dir, subfolder_name)
-            if (
-                os.path.exists(path)
-                and os.path.isdir(path)
-                and path not in paths
-            ):
-                paths.append(path)
-        # --- END: Backwards compatibility ---

-        subfolders_list = [
-            ["hooks"]
-        ]
+        hooks_dirs = [
+            global_hooks_dir
+        ]
        if self.host_name:
-            subfolders_list.append(["hosts", self.host_name, "hooks"])
+            # If host requires launch hooks and is module then launch hooks
+            # should be collected using 'collect_launch_hook_paths'
+            # - module have to implement 'get_launch_hook_paths'
+            host_module = self.modules_manager.get_host_module(self.host_name)
+            if not host_module:
+                hooks_dirs.append(os.path.join(
+                    openpype_dir, "hosts", self.host_name, "hooks"
+                ))

-        for subfolders in subfolders_list:
-            path = os.path.join(pype_dir, *subfolders)
+        for path in hooks_dirs:
            if (
                os.path.exists(path)
                and os.path.isdir(path)
@@ -991,7 +983,9 @@ class ApplicationLaunchContext:
                paths.append(path)

        # Load modules paths
-        paths.extend(self.modules_manager.collect_launch_hook_paths())
+        paths.extend(
+            self.modules_manager.collect_launch_hook_paths(self.application)
+        )

        return paths
@@ -1304,6 +1298,7 @@ def get_app_environments_for_context(
        dict: Environments for passed context and application.
    """

+    from openpype.modules import ModulesManager
    from openpype.pipeline import AvalonMongoDB, Anatomy

    # Avalon database connection
@@ -1316,8 +1311,6 @@ def get_app_environments_for_context(
    asset_doc = get_asset_by_name(project_name, asset_name)

    if modules_manager is None:
-        from openpype.modules import ModulesManager
-
        modules_manager = ModulesManager()

    # Prepare app object which can be obtained only from ApplicationManager
@@ -1344,7 +1337,7 @@ def get_app_environments_for_context(
    })

    prepare_app_environments(data, env_group, modules_manager)
-    prepare_context_environments(data, env_group)
+    prepare_context_environments(data, env_group, modules_manager)

    # Discard avalon connection
    dbcon.uninstall()
@@ -1503,8 +1496,10 @@ def prepare_app_environments(
    final_env = None
    # Add host specific environments
    if app.host_name and implementation_envs:
-        module = __import__("openpype.hosts", fromlist=[app.host_name])
-        host_module = getattr(module, app.host_name, None)
+        host_module = modules_manager.get_host_module(app.host_name)
+        if not host_module:
+            module = __import__("openpype.hosts", fromlist=[app.host_name])
+            host_module = getattr(module, app.host_name, None)
        add_implementation_envs = None
        if host_module:
            add_implementation_envs = getattr(
@@ -1563,7 +1558,7 @@ def apply_project_environments_value(
    return env


-def prepare_context_environments(data, env_group=None):
+def prepare_context_environments(data, env_group=None, modules_manager=None):
    """Modify launch environments with context data for launched host.

    Args:
@@ -1658,10 +1653,10 @@ def prepare_context_environments(data, env_group=None):
    data["env"]["AVALON_APP"] = app.host_name
    data["env"]["AVALON_WORKDIR"] = workdir

-    _prepare_last_workfile(data, workdir)
+    _prepare_last_workfile(data, workdir, modules_manager)


-def _prepare_last_workfile(data, workdir):
+def _prepare_last_workfile(data, workdir, modules_manager):
    """Last workfile workflow preparation.

    Function checks if it should care about last workfile workflow and tries
@@ -1676,8 +1671,13 @@ def _prepare_last_workfile(data, workdir):
        result will be stored.
        workdir (str): Path to folder where workfiles should be stored.
    """

+    from openpype.modules import ModulesManager
    from openpype.pipeline import HOST_WORKFILE_EXTENSIONS

+    if not modules_manager:
+        modules_manager = ModulesManager()
+
    log = data["log"]

    _workdir_data = data.get("workdir_data")
@@ -1725,7 +1725,12 @@ def _prepare_last_workfile(data, workdir):
    # Last workfile path
    last_workfile_path = data.get("last_workfile_path") or ""
    if not last_workfile_path:
-        extensions = HOST_WORKFILE_EXTENSIONS.get(app.host_name)
+        host_module = modules_manager.get_host_module(app.host_name)
+        if host_module:
+            extensions = host_module.get_workfile_extensions()
+        else:
+            extensions = HOST_WORKFILE_EXTENSIONS.get(app.host_name)

        if extensions:
            from openpype.pipeline.workfile import (
                get_workfile_template_key,
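The hunk above prefers a host module's answer over the static extension table. The fallback shape can be sketched on its own; the class and table here are hypothetical stand-ins, not OpenPype's real objects:

```python
# Hypothetical stand-ins for a host module and the static extension table.
HOST_WORKFILE_EXTENSIONS = {"maya": [".ma", ".mb"], "nuke": [".nk"]}

class FakeHostModule:
    def get_workfile_extensions(self):
        return [".hip", ".hiplc"]

def workfile_extensions(host_module, host_name):
    """Prefer the host module's answer, fall back to the static table."""
    if host_module:
        return host_module.get_workfile_extensions()
    return HOST_WORKFILE_EXTENSIONS.get(host_name)

print(workfile_extensions(FakeHostModule(), "houdini"))  # module wins
print(workfile_extensions(None, "maya"))                 # fallback table
```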
@@ -1,11 +1,9 @@
"""Should be used only inside of hosts."""
import os
import json
import re
import copy
import platform
import logging
import collections
import functools
import warnings
@@ -13,13 +11,9 @@ from openpype.client import (
    get_project,
    get_assets,
    get_asset_by_name,
    get_subsets,
    get_last_versions,
    get_last_version_by_subset_name,
    get_representations,
    get_workfile_info,
)
from openpype.settings import get_project_settings
from .profiles_filtering import filter_profiles
from .events import emit_event
from .path_templates import StringTemplate
@@ -140,7 +140,7 @@ class _LoadCache:
def get_default_modules_dir():
    """Path to default OpenPype modules."""

-    current_dir = os.path.abspath(os.path.dirname(__file__))
+    current_dir = os.path.dirname(os.path.abspath(__file__))

    output = []
    for folder_name in ("default_modules", ):
@@ -298,6 +298,8 @@ def _load_modules():
    # Add current directory at first place
    # - has small differences in import logic
    current_dir = os.path.abspath(os.path.dirname(__file__))
+    hosts_dir = os.path.join(os.path.dirname(current_dir), "hosts")
+    module_dirs.insert(0, hosts_dir)
    module_dirs.insert(0, current_dir)

    processed_paths = set()
@@ -314,6 +316,7 @@ def _load_modules():
            continue

        is_in_current_dir = dirpath == current_dir
+        is_in_host_dir = dirpath == hosts_dir
        for filename in os.listdir(dirpath):
            # Ignore filenames
            if filename in IGNORED_FILENAMES:
@@ -353,6 +356,24 @@ def _load_modules():
                    sys.modules[new_import_str] = default_module
                    setattr(openpype_modules, basename, default_module)

+                elif is_in_host_dir:
+                    import_str = "openpype.hosts.{}".format(basename)
+                    new_import_str = "{}.{}".format(modules_key, basename)
+                    # Until all hosts are converted to usable modules
+                    # this error check is needed
+                    try:
+                        default_module = __import__(
+                            import_str, fromlist=("", )
+                        )
+                        sys.modules[new_import_str] = default_module
+                        setattr(openpype_modules, basename, default_module)
+
+                    except Exception:
+                        log.warning(
+                            "Failed to import host folder {}".format(basename),
+                            exc_info=True
+                        )
+
                elif os.path.isdir(fullpath):
                    import_module_from_dirpath(dirpath, filename, modules_key)
@@ -768,24 +789,50 @@ class ModulesManager:
            output.extend(paths)
        return output

-    def collect_launch_hook_paths(self):
-        """Helper to collect hooks from modules inherited ILaunchHookPaths.
+    def collect_launch_hook_paths(self, app):
+        """Helper to collect application launch hooks.
+
+        It used to be based on 'ILaunchHookPaths' which is not true anymore.
+        Modules just have to have the 'get_launch_hook_paths' method
+        implemented.
+
+        Args:
+            app (Application): Application object which can be used for
+                filtering of which launch hook paths are returned.

        Returns:
            list: Paths to launch hook directories.
        """
-        from openpype_interfaces import ILaunchHookPaths

        str_type = type("")
        expected_types = (list, tuple, set)

        output = []
        for module in self.get_enabled_modules():
-            # Skip module that do not inherit from `ILaunchHookPaths`
-            if not isinstance(module, ILaunchHookPaths):
+            # Skip module if it does not have 'get_launch_hook_paths'
+            func = getattr(module, "get_launch_hook_paths", None)
+            if func is None:
                continue

+            func = module.get_launch_hook_paths
+            if hasattr(inspect, "signature"):
+                sig = inspect.signature(func)
+                expect_args = len(sig.parameters) > 0
+            else:
+                expect_args = len(inspect.getargspec(func)[0]) > 0
+
+            # Pass application argument if the method expects it.
+            try:
+                if expect_args:
+                    hook_paths = func(app)
+                else:
+                    hook_paths = func()
+            except Exception:
+                self.log.warning(
+                    "Failed to call 'get_launch_hook_paths'",
+                    exc_info=True
+                )
+                continue

-            hook_paths = module.get_launch_hook_paths()
            if not hook_paths:
                continue
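The signature inspection in the hunk above lets old zero-argument and new one-argument `get_launch_hook_paths` implementations coexist. A self-contained sketch of the dispatch idea:

```python
import inspect

def call_with_optional_arg(func, arg):
    """Call func(arg) if it accepts a parameter, else func()."""
    sig = inspect.signature(func)
    if len(sig.parameters) > 0:
        return func(arg)
    return func()

def old_style():
    return ["hooks/old"]

def new_style(app):
    return ["hooks/{}".format(app)]

print(call_with_optional_arg(old_style, "maya"))  # → ['hooks/old']
print(call_with_optional_arg(new_style, "maya"))  # → ['hooks/maya']
```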
@@ -804,6 +851,45 @@ class ModulesManager:
            output.extend(hook_paths)
        return output

+    def get_host_module(self, host_name):
+        """Find host module by host name.
+
+        Args:
+            host_name (str): Host name for which a host module is searched.
+
+        Returns:
+            OpenPypeModule: Found host module by name.
+            None: No enabled module inheriting 'IHostModule' has host name
+                set to passed 'host_name'.
+        """
+
+        from openpype_interfaces import IHostModule
+
+        for module in self.get_enabled_modules():
+            if (
+                isinstance(module, IHostModule)
+                and module.host_name == host_name
+            ):
+                return module
+        return None
+
+    def get_host_names(self):
+        """List of available host names based on host modules.
+
+        Returns:
+            Iterable[str]: All available host names based on enabled modules
+                inheriting 'IHostModule'.
+        """
+
+        from openpype_interfaces import IHostModule
+
+        host_names = {
+            module.host_name
+            for module in self.get_enabled_modules()
+            if isinstance(module, IHostModule)
+        }
+        return host_names
+
    def print_report(self):
        """Print out report of time spent on modules initialization parts.
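The two new helpers above are plain isinstance filters over the enabled modules. A self-contained sketch, where the interface and module classes are simplified stand-ins rather than the real OpenPype classes:

```python
class IHostModule:
    """Stand-in marker interface with a 'host_name' attribute."""
    host_name = None

class MayaModule(IHostModule):
    host_name = "maya"

class FtrackModule:  # not a host module
    pass

def get_host_module(modules, host_name):
    """Return the first host module whose name matches, or None."""
    for module in modules:
        if isinstance(module, IHostModule) and module.host_name == host_name:
            return module
    return None

modules = [MayaModule(), FtrackModule()]
print(get_host_module(modules, "maya").host_name)  # → maya
print(get_host_module(modules, "nuke"))            # → None
```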
@@ -105,11 +105,17 @@ class CollectFtrackApi(pyblish.api.ContextPlugin):
        context.data["ftrackEntity"] = asset_entity
        context.data["ftrackTask"] = task_entity

-        self.per_instance_process(context, asset_name, task_name)
+        self.per_instance_process(context, asset_entity, task_entity)

    def per_instance_process(
-        self, context, context_asset_name, context_task_name
+        self, context, context_asset_entity, context_task_entity
    ):
+        context_task_name = None
+        context_asset_name = None
+        if context_asset_entity:
+            context_asset_name = context_asset_entity["name"]
+        if context_task_entity:
+            context_task_name = context_task_entity["name"]
        instance_by_asset_and_task = {}
        for instance in context:
            self.log.debug(

@@ -120,6 +126,8 @@ class CollectFtrackApi(pyblish.api.ContextPlugin):

            if not instance_asset_name and not instance_task_name:
                self.log.debug("Instance does not have set context keys.")
+                instance.data["ftrackEntity"] = context_asset_entity
+                instance.data["ftrackTask"] = context_task_entity
                continue

            elif instance_asset_name and instance_task_name:

@@ -131,6 +139,8 @@ class CollectFtrackApi(pyblish.api.ContextPlugin):
                        "Instance's context is same as in publish context."
                        " Asset: {} | Task: {}"
                    ).format(context_asset_name, context_task_name))
+                    instance.data["ftrackEntity"] = context_asset_entity
+                    instance.data["ftrackTask"] = context_task_entity
                    continue
                asset_name = instance_asset_name
                task_name = instance_task_name

@@ -141,6 +151,8 @@ class CollectFtrackApi(pyblish.api.ContextPlugin):
                        "Instance's context task is same as in publish"
                        " context. Task: {}"
                    ).format(context_task_name))
+                    instance.data["ftrackEntity"] = context_asset_entity
+                    instance.data["ftrackTask"] = context_task_entity
                    continue

                asset_name = context_asset_name

@@ -152,6 +164,8 @@ class CollectFtrackApi(pyblish.api.ContextPlugin):
                        "Instance's context asset is same as in publish"
                        " context. Asset: {}"
                    ).format(context_asset_name))
+                    instance.data["ftrackEntity"] = context_asset_entity
+                    instance.data["ftrackTask"] = context_task_entity
                    continue

            # Do not use context's task name
@@ -0,0 +1,150 @@
import pyblish.api
from openpype.lib import filter_profiles


class IntegrateFtrackFarmStatus(pyblish.api.ContextPlugin):
    """Change task status when it should be published on farm.

    An instance which has the "farm" key in data set to 'True' is considered
    to be rendered on farm, thus its task status should be changed.
    """

    order = pyblish.api.IntegratorOrder + 0.48
    label = "Integrate Ftrack Farm Status"

    farm_status_profiles = []

    def process(self, context):
        # Quick end
        if not self.farm_status_profiles:
            project_name = context.data["projectName"]
            self.log.info((
                "Status profiles are not filled for project \"{}\". Skipping"
            ).format(project_name))
            return

        filtered_instances = self.filter_instances(context)
        instances_with_status_names = self.get_instances_with_statuse_names(
            context, filtered_instances
        )
        if instances_with_status_names:
            self.fill_statuses(context, instances_with_status_names)

    def filter_instances(self, context):
        filtered_instances = []
        for instance in context:
            # Skip disabled instances
            if instance.data.get("publish") is False:
                continue
            subset_name = instance.data["subset"]
            msg_start = "Skipping instance {}.".format(subset_name)
            if not instance.data.get("farm"):
                self.log.debug(
                    "{} Won't be rendered on farm.".format(msg_start)
                )
                continue

            task_entity = instance.data.get("ftrackTask")
            if not task_entity:
                self.log.debug(
                    "{} Does not have filled task".format(msg_start)
                )
                continue

            filtered_instances.append(instance)
        return filtered_instances

    def get_instances_with_statuse_names(self, context, instances):
        instances_with_status_names = []
        for instance in instances:
            family = instance.data["family"]
            subset_name = instance.data["subset"]
            task_entity = instance.data["ftrackTask"]
            host_name = context.data["hostName"]
            task_name = task_entity["name"]
            task_type = task_entity["type"]["name"]
            status_profile = filter_profiles(
                self.farm_status_profiles,
                {
                    "hosts": host_name,
                    "task_types": task_type,
                    "task_names": task_name,
                    "families": family,
                    "subsets": subset_name,
                },
                logger=self.log
            )
            if not status_profile:
                # There already is log in 'filter_profiles'
                continue

            status_name = status_profile["status_name"]
            if status_name:
                instances_with_status_names.append((instance, status_name))
        return instances_with_status_names

    def fill_statuses(self, context, instances_with_status_names):
        # Prepare available task statuses on the project
        project_name = context.data["projectName"]
        session = context.data["ftrackSession"]
        project_entity = session.query((
            "select project_schema from Project where full_name is \"{}\""
        ).format(project_name)).one()
        project_schema = project_entity["project_schema"]

        task_type_ids = set()
        for item in instances_with_status_names:
            instance, _ = item
            task_entity = instance.data["ftrackTask"]
            task_type_ids.add(task_entity["type"]["id"])

        task_statuses_by_type_id = {
            task_type_id: project_schema.get_statuses("Task", task_type_id)
            for task_type_id in task_type_ids
        }

        # Keep track if anything has changed
        skipped_status_names = set()
        status_changed = False
        for item in instances_with_status_names:
            instance, status_name = item
            task_entity = instance.data["ftrackTask"]
            task_statuses = task_statuses_by_type_id[task_entity["type"]["id"]]
            status_name_low = status_name.lower()

            status_id = None
            status_name = None
            # Skip if status name was already tried to be found
            for status in task_statuses:
                if status["name"].lower() == status_name_low:
                    status_id = status["id"]
                    status_name = status["name"]
                    break

            if status_id is None:
                if status_name_low not in skipped_status_names:
                    skipped_status_names.add(status_name_low)
                    joined_status_names = ", ".join({
                        '"{}"'.format(status["name"])
                        for status in task_statuses
                    })
                    self.log.warning((
                        "Status \"{}\" is not available on project \"{}\"."
                        " Available statuses are {}"
                    ).format(status_name, project_name, joined_status_names))
                continue

            # Change task status id
            if status_id != task_entity["status_id"]:
                task_entity["status_id"] = status_id
                status_changed = True
                path = "/".join([
                    item["name"]
                    for item in task_entity["link"]
                ])
                self.log.debug("Set status \"{}\" to \"{}\"".format(
                    status_name, path
                ))

        if status_changed:
            session.commit()
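The plugin above leans on OpenPype's `filter_profiles` to pick the profile whose filters match the instance context. A simplified stand-in of that matching idea (this is not the real `filter_profiles` implementation, just the rule it roughly encodes: an empty filter list matches everything):

```python
def pick_profile(profiles, key_values):
    """Return the first profile whose every filter list is empty
    or contains the queried value (a simplified matching rule)."""
    for profile in profiles:
        matched = True
        for key, value in key_values.items():
            allowed = profile.get(key) or []
            if allowed and value not in allowed:
                matched = False
                break
        if matched:
            return profile
    return None

profiles = [
    {"hosts": ["maya"], "families": ["render"], "status_name": "Rendering"},
    {"hosts": [], "families": [], "status_name": "In Progress"},
]
print(pick_profile(profiles, {"hosts": "maya", "families": "render"}))
print(pick_profile(profiles, {"hosts": "nuke", "families": "plate"}))
```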
@@ -3,6 +3,7 @@ import json
import copy
import pyblish.api

+from openpype.lib.openpype_version import get_openpype_version
from openpype.lib.transcoding import (
    get_ffprobe_streams,
    convert_ffprobe_fps_to_float,
@@ -20,6 +21,17 @@ class IntegrateFtrackInstance(pyblish.api.InstancePlugin):
    label = "Integrate Ftrack Component"
    families = ["ftrack"]

+    metadata_keys_to_label = {
+        "openpype_version": "OpenPype version",
+        "frame_start": "Frame start",
+        "frame_end": "Frame end",
+        "duration": "Duration",
+        "width": "Resolution width",
+        "height": "Resolution height",
+        "fps": "FPS",
+        "codec": "Codec"
+    }
+
    family_mapping = {
        "camera": "cam",
        "look": "look",
@@ -43,6 +55,7 @@ class IntegrateFtrackInstance(pyblish.api.InstancePlugin):
    }
    keep_first_subset_name_for_review = True
    asset_versions_status_profiles = []
+    additional_metadata_keys = []

    def process(self, instance):
        self.log.debug("instance {}".format(instance))
@@ -105,7 +118,8 @@ class IntegrateFtrackInstance(pyblish.api.InstancePlugin):
            "component_data": None,
            "component_path": None,
            "component_location": None,
-            "component_location_name": None
+            "component_location_name": None,
+            "additional_data": {}
        }

        # Filter types of representations
@@ -152,6 +166,7 @@ class IntegrateFtrackInstance(pyblish.api.InstancePlugin):
                "name": "thumbnail"
            }
+            thumbnail_item["thumbnail"] = True

            # Create copy of item before setting location
            src_components_to_add.append(copy.deepcopy(thumbnail_item))
            # Create copy of first thumbnail
@@ -248,19 +263,15 @@ class IntegrateFtrackInstance(pyblish.api.InstancePlugin):
                first_thumbnail_component[
                    "asset_data"]["name"] = extended_asset_name

-            component_meta = self._prepare_component_metadata(
-                instance, repre, repre_path, True
-            )
-
            # Change location
            review_item["component_path"] = repre_path
            # Change component data
            review_item["component_data"] = {
                # Default component name is "main".
                "name": "ftrackreview-mp4",
-                "metadata": {
-                    "ftr_meta": json.dumps(component_meta)
-                }
+                "metadata": self._prepare_component_metadata(
+                    instance, repre, repre_path, True
+                )
            }

            if is_first_review_repre:
@@ -302,13 +313,9 @@ class IntegrateFtrackInstance(pyblish.api.InstancePlugin):
            component_data = copy_src_item["component_data"]
            component_name = component_data["name"]
            component_data["name"] = component_name + "_src"
-            component_meta = self._prepare_component_metadata(
+            component_data["metadata"] = self._prepare_component_metadata(
                instance, repre, copy_src_item["component_path"], False
            )
-            if component_meta:
-                component_data["metadata"] = {
-                    "ftr_meta": json.dumps(component_meta)
-                }
            component_list.append(copy_src_item)

        # Add others representations as component
@@ -326,16 +333,12 @@ class IntegrateFtrackInstance(pyblish.api.InstancePlugin):
            ):
                other_item["asset_data"]["name"] = extended_asset_name

-            component_meta = self._prepare_component_metadata(
-                instance, repre, published_path, False
-            )
            component_data = {
-                "name": repre["name"]
+                "name": repre["name"],
+                "metadata": self._prepare_component_metadata(
+                    instance, repre, published_path, False
+                )
            }
-            if component_meta:
-                component_data["metadata"] = {
-                    "ftr_meta": json.dumps(component_meta)
-                }
            other_item["component_data"] = component_data
            other_item["component_location_name"] = unmanaged_location_name
            other_item["component_path"] = published_path
@@ -354,6 +357,9 @@ class IntegrateFtrackInstance(pyblish.api.InstancePlugin):
        ))
        instance.data["ftrackComponentsList"] = component_list

+    def _collect_additional_metadata(self, streams):
+        pass
+
    def _get_repre_path(self, instance, repre, only_published):
        """Get representation path that can be used for integration.
@@ -423,6 +429,11 @@ class IntegrateFtrackInstance(pyblish.api.InstancePlugin):
    def _prepare_component_metadata(
        self, instance, repre, component_path, is_review
    ):
+        metadata = {}
+        if "openpype_version" in self.additional_metadata_keys:
+            label = self.metadata_keys_to_label["openpype_version"]
+            metadata[label] = get_openpype_version()
+
        extension = os.path.splitext(component_path)[-1]
        streams = []
        try:
@@ -442,13 +453,23 @@ class IntegrateFtrackInstance(pyblish.api.InstancePlugin):
        # - exr is special case which can have issues with reading through
        # ffmpeg but we want to set fps for it
        if not video_streams and extension not in [".exr"]:
-            return {}
+            return metadata

        stream_width = None
        stream_height = None
        stream_fps = None
        frame_out = None
+        codec_label = None
        for video_stream in video_streams:
+            codec_label = video_stream.get("codec_long_name")
+            if not codec_label:
+                codec_label = video_stream.get("codec")
+
+            if codec_label:
+                pix_fmt = video_stream.get("pix_fmt")
+                if pix_fmt:
+                    codec_label += " ({})".format(pix_fmt)
+
            tmp_width = video_stream.get("width")
            tmp_height = video_stream.get("height")
            if tmp_width and tmp_height:
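The codec-label assembly added above reads ffprobe-style stream fields, preferring the long name and appending the pixel format. A standalone sketch of the same fallback chain (the stream dict mimics ffprobe output):

```python
def build_codec_label(video_stream):
    """Prefer the long codec name, fall back to the short one,
    and append the pixel format when available."""
    codec_label = video_stream.get("codec_long_name")
    if not codec_label:
        codec_label = video_stream.get("codec")

    if codec_label:
        pix_fmt = video_stream.get("pix_fmt")
        if pix_fmt:
            codec_label += " ({})".format(pix_fmt)
    return codec_label

stream = {"codec_long_name": "H.264 / AVC", "pix_fmt": "yuv420p"}
print(build_codec_label(stream))  # → H.264 / AVC (yuv420p)
```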
@@ -456,8 +477,8 @@ class IntegrateFtrackInstance(pyblish.api.InstancePlugin):
                stream_height = tmp_height

            input_framerate = video_stream.get("r_frame_rate")
-            duration = video_stream.get("duration")
-            if input_framerate is None or duration is None:
+            stream_duration = video_stream.get("duration")
+            if input_framerate is None or stream_duration is None:
                continue
            try:
                stream_fps = convert_ffprobe_fps_to_float(
@@ -473,9 +494,9 @@ class IntegrateFtrackInstance(pyblish.api.InstancePlugin):
                stream_height = tmp_height

            self.log.debug("FPS from stream is {} and duration is {}".format(
-                input_framerate, duration
+                input_framerate, stream_duration
            ))
-            frame_out = float(duration) * stream_fps
+            frame_out = float(stream_duration) * stream_fps
            break

        # Prepare FPS
@@ -483,43 +504,58 @@ class IntegrateFtrackInstance(pyblish.api.InstancePlugin):
        if instance_fps is None:
            instance_fps = instance.context.data["fps"]

-        if not is_review:
-            output = {}
-            fps = stream_fps or instance_fps
-            if fps:
-                output["frameRate"] = fps
-
-            if stream_width and stream_height:
-                output["width"] = int(stream_width)
-                output["height"] = int(stream_height)
-            return output
-
-        frame_start = repre.get("frameStartFtrack")
-        frame_end = repre.get("frameEndFtrack")
-        if frame_start is None or frame_end is None:
-            frame_start = instance.data["frameStart"]
-            frame_end = instance.data["frameEnd"]
-
-        fps = None
        repre_fps = repre.get("fps")
        if repre_fps is not None:
            repre_fps = float(repre_fps)

        fps = stream_fps or repre_fps or instance_fps

+        # Prepare frame ranges
+        frame_start = repre.get("frameStartFtrack")
+        frame_end = repre.get("frameEndFtrack")
+        if frame_start is None or frame_end is None:
+            frame_start = instance.data["frameStart"]
+            frame_end = instance.data["frameEnd"]
+        duration = (frame_end - frame_start) + 1
+
+        for key, value in [
+            ("fps", fps),
+            ("frame_start", frame_start),
+            ("frame_end", frame_end),
+            ("duration", duration),
+            ("width", stream_width),
+            ("height", stream_height),
+            ("codec", codec_label)
+        ]:
+            if not value or key not in self.additional_metadata_keys:
+                continue
+            label = self.metadata_keys_to_label[key]
+            metadata[label] = value
+
+        if not is_review:
+            ftr_meta = {}
+            if fps:
+                ftr_meta["frameRate"] = fps
+
+            if stream_width and stream_height:
+                ftr_meta["width"] = int(stream_width)
+                ftr_meta["height"] = int(stream_height)
+            metadata["ftr_meta"] = json.dumps(ftr_meta)
+            return metadata

        # Frame end of uploaded video file should be duration in frames
        # - frame start is always 0
        # - frame end is duration in frames
        if not frame_out:
-            frame_out = frame_end - frame_start + 1
+            frame_out = duration

        # Ftrack documentation says that it is required to have
        # 'width' and 'height' in review component. But with those values
        # review video does not play.
-        component_meta = {
+        metadata["ftr_meta"] = json.dumps({
            "frameIn": 0,
            "frameOut": frame_out,
            "frameRate": float(fps)
-        }
-
-        return component_meta
+        })
+        return metadata
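The loop above turns internal keys into human-readable Ftrack metadata labels while keeping only the keys enabled in settings. A standalone sketch of that pattern, with a key table mirroring part of the one added to the plugin:

```python
METADATA_KEYS_TO_LABEL = {
    "fps": "FPS",
    "frame_start": "Frame start",
    "frame_end": "Frame end",
}

def build_metadata(values, enabled_keys):
    """Keep only enabled, truthy values, renamed to their labels."""
    metadata = {}
    for key, value in values.items():
        if not value or key not in enabled_keys:
            continue
        metadata[METADATA_KEYS_TO_LABEL[key]] = value
    return metadata

values = {"fps": 25.0, "frame_start": 1001, "frame_end": None}
print(build_metadata(values, ["fps", "frame_end"]))  # → {'FPS': 25.0}
```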
@@ -1,9 +1,12 @@
import sys
import collections
import six
-import pyblish.api
from copy import deepcopy

+import pyblish.api
+
+from openpype.client import get_asset_by_id
+from openpype.lib import filter_profiles

# Copy of constant `openpype_modules.ftrack.lib.avalon_sync.CUST_ATTR_AUTO_SYNC`
@@ -73,6 +76,7 @@ class IntegrateHierarchyToFtrack(pyblish.api.ContextPlugin):
        "traypublisher"
    ]
    optional = False
+    create_task_status_profiles = []

    def process(self, context):
        self.context = context
@@ -82,14 +86,16 @@ class IntegrateHierarchyToFtrack(pyblish.api.ContextPlugin):
        hierarchy_context = self._get_active_assets(context)
        self.log.debug("__ hierarchy_context: {}".format(hierarchy_context))

-        self.session = self.context.data["ftrackSession"]
+        session = self.context.data["ftrackSession"]
        project_name = self.context.data["projectEntity"]["name"]
        query = 'Project where full_name is "{}"'.format(project_name)
-        project = self.session.query(query).one()
-        auto_sync_state = project[
-            "custom_attributes"][CUST_ATTR_AUTO_SYNC]
+        project = session.query(query).one()
+        auto_sync_state = project["custom_attributes"][CUST_ATTR_AUTO_SYNC]

-        self.ft_project = None
+        self.session = session
+        self.ft_project = project
+        self.task_types = self.get_all_task_types(project)
+        self.task_statuses = self.get_task_statuses(project)

        # temporarily disable ftrack project's autosyncing
        if auto_sync_state:
@@ -121,10 +127,7 @@ class IntegrateHierarchyToFtrack(pyblish.api.ContextPlugin):
        self.log.debug(entity_type)

        if entity_type.lower() == 'project':
-            query = 'Project where full_name is "{}"'.format(entity_name)
-            entity = self.session.query(query).one()
-            self.ft_project = entity
-            self.task_types = self.get_all_task_types(entity)
+            entity = self.ft_project

        elif self.ft_project is None or parent is None:
            raise AssertionError(
@@ -229,13 +232,6 @@ class IntegrateHierarchyToFtrack(pyblish.api.ContextPlugin):
                task_type=task_type,
                parent=entity
            )
-            try:
-                self.session.commit()
-            except Exception:
-                tp, value, tb = sys.exc_info()
-                self.session.rollback()
-                self.session._configure_locations()
-                six.reraise(tp, value, tb)

            for instance in instances_by_task_name[task_name.lower()]:
                instance.data["ftrackTask"] = task_entity
@ -318,7 +314,37 @@ class IntegrateHierarchyToFtrack(pyblish.api.ContextPlugin):
|
|||
|
||||
return tasks
|
||||
|
||||
def get_task_statuses(self, project_entity):
|
||||
project_schema = project_entity["project_schema"]
|
||||
task_workflow_statuses = project_schema["_task_workflow"]["statuses"]
|
||||
return {
|
||||
status["id"]: status
|
||||
for status in task_workflow_statuses
|
||||
}
|
||||
|
||||
def create_task(self, name, task_type, parent):
|
||||
filter_data = {
|
||||
"task_names": name,
|
||||
"task_types": task_type
|
||||
}
|
||||
profile = filter_profiles(
|
||||
self.create_task_status_profiles,
|
||||
filter_data
|
||||
)
|
||||
status_id = None
|
||||
if profile:
|
||||
status_name = profile["status_name"]
|
||||
status_name_low = status_name.lower()
|
||||
for _status_id, status in self.task_statuses.items():
|
||||
if status["name"].lower() == status_name_low:
|
||||
status_id = _status_id
|
||||
break
|
||||
|
||||
if status_id is None:
|
||||
self.log.warning(
|
||||
"Task status \"{}\" was not found".format(status_name)
|
||||
)
|
||||
|
||||
task = self.session.create('Task', {
|
||||
'name': name,
|
||||
'parent': parent
|
||||
@@ -327,6 +353,8 @@ class IntegrateHierarchyToFtrack(pyblish.api.ContextPlugin):
        self.log.info(task_type)
        self.log.info(self.task_types)
        task['type'] = self.task_types[task_type]
        if status_id is not None:
            task["status_id"] = status_id

        try:
            self.session.commit()
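The `create_task` hunk above resolves a status via `filter_profiles` over the `create_task_status_profiles` setting. A simplified matcher illustrating the assumed semantics (an empty filter list matches everything; this is a sketch, not OpenPype's exact `filter_profiles`):

```python
def filter_profiles(profiles, key_values):
    # Return the first profile whose filters all match.
    # An empty or missing filter list is treated as "match all".
    for profile in profiles:
        if all(
            not profile.get(key) or value in profile[key]
            for key, value in key_values.items()
        ):
            return profile
    return None


profiles = [
    {"task_types": ["Animation"], "task_names": [], "status_name": "Ready"},
    {"task_types": [], "task_names": [], "status_name": "In progress"},
]
profile = filter_profiles(profiles, {"task_types": "Animation", "task_names": "anim"})
print(profile["status_name"])
# -> Ready
```

A non-matching task type falls through to the catch-all second profile.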
@@ -1,4 +1,4 @@
from abc import abstractmethod
from abc import abstractmethod, abstractproperty

from openpype import resources

@@ -50,12 +50,32 @@ class IPluginPaths(OpenPypeInterface):
class ILaunchHookPaths(OpenPypeInterface):
    """Module has launch hook paths to return.

    Modules do not have to inherit from this interface (changed 8.11.2022).
    A module just has to implement 'get_launch_hook_paths' to be able to use
    the advantage.

    Expected result is a list of paths.
    ["path/to/launch_hooks_dir"]
    """

    @abstractmethod
    def get_launch_hook_paths(self):
    def get_launch_hook_paths(self, app):
        """Paths to directory with application launch hooks.

        The method can also be defined without arguments.
        ```python
        def get_launch_hook_paths(self):
            return []
        ```

        Args:
            app (Application): Application object which can be used for
                filtering of which launch hook paths are returned.

        Returns:
            Iterable[str]: Paths to directories where launch hooks can be
                found.
        """

        pass

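As the docstring notes, a module only needs to expose `get_launch_hook_paths` to be picked up. A minimal hypothetical addon sketch (the class name, paths, and `host_name` filter are illustrative assumptions, not OpenPype API):

```python
class ExampleAddon(object):
    """Hypothetical addon exposing launch hook paths without inheriting
    the interface (allowed since the 8.11.2022 change described above)."""

    def get_launch_hook_paths(self, app=None):
        hooks_dir = "path/to/launch_hooks_dir"
        # Optionally filter by the application the hooks are collected for.
        if app is not None and getattr(app, "host_name", None) == "maya":
            return [hooks_dir + "/maya"]
        return [hooks_dir]


print(ExampleAddon().get_launch_hook_paths())
# -> ['path/to/launch_hooks_dir']
```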
@@ -66,6 +86,7 @@ class ITrayModule(OpenPypeInterface):
    The module still must be usable if it is not used in tray, even if it
    would do nothing.
    """

    tray_initialized = False
    _tray_manager = None

@@ -78,16 +99,19 @@ class ITrayModule(OpenPypeInterface):
        This is where GUIs should be loaded or tray specific parts should be
        prepared.
        """

        pass

    @abstractmethod
    def tray_menu(self, tray_menu):
        """Add module's action to tray menu."""

        pass

    @abstractmethod
    def tray_start(self):
        """Start procedure in Pype tray."""

        pass

    @abstractmethod
@@ -96,6 +120,7 @@ class ITrayModule(OpenPypeInterface):

        This is the place where all threads should be shut down.
        """

        pass

    def execute_in_main_thread(self, callback):
@@ -104,6 +129,7 @@ class ITrayModule(OpenPypeInterface):
        Some callbacks need to be processed on main thread (menu actions
        must be added on main thread or they won't get triggered etc.)
        """

        if not self.tray_initialized:
            # TODO Called without initialized tray, still main thread needed
            try:
@@ -128,6 +154,7 @@ class ITrayModule(OpenPypeInterface):
            msecs (int): Duration of message visibility in milliseconds.
                Default is 10000 msecs, may differ by Qt version.
        """

        if self._tray_manager:
            self._tray_manager.show_tray_message(title, message, icon, msecs)

@@ -280,16 +307,19 @@ class ITrayService(ITrayModule):

    def set_service_running_icon(self):
        """Change icon of a QAction to green circle."""

        if self.menu_action:
            self.menu_action.setIcon(self.get_icon_running())

    def set_service_failed_icon(self):
        """Change icon of a QAction to red circle."""

        if self.menu_action:
            self.menu_action.setIcon(self.get_icon_failed())

    def set_service_idle_icon(self):
        """Change icon of a QAction to orange circle."""

        if self.menu_action:
            self.menu_action.setIcon(self.get_icon_idle())

@@ -303,6 +333,7 @@ class ISettingsChangeListener(OpenPypeInterface):
        "publish": ["path/to/publish_plugins"]
    }
    """

    @abstractmethod
    def on_system_settings_save(
        self, old_value, new_value, changes, new_value_metadata
@@ -320,3 +351,24 @@ class ISettingsChangeListener(OpenPypeInterface):
        self, old_value, new_value, changes, project_name, new_value_metadata
    ):
        pass


class IHostModule(OpenPypeInterface):
    """Module which also contains a host implementation."""

    @abstractproperty
    def host_name(self):
        """Name of host which module represents."""

        pass

    def get_workfile_extensions(self):
        """Define workfile extensions for host.

        Not all hosts support workfiles, thus this is an optional
        implementation.

        Returns:
            List[str]: Extensions used for workfiles, with leading dot.
        """

        return []
@@ -6,7 +6,11 @@ from typing import List
import gazu
from pymongo import UpdateOne

from openpype.client import get_project, get_assets
from openpype.client import (
    get_projects,
    get_project,
    get_assets,
)
from openpype.pipeline import AvalonMongoDB
from openpype.api import get_project_settings
from openpype.modules.kitsu.utils.credentials import validate_credentials
@@ -37,7 +41,7 @@ def sync_zou(login: str, password: str):
    dbcon = AvalonMongoDB()
    dbcon.install()

    op_projects = [p for p in dbcon.projects()]
    op_projects = list(get_projects())
    for project_doc in op_projects:
        sync_zou_from_op_project(project_doc["name"], dbcon, project_doc)

@@ -6,7 +6,7 @@ import platform
import copy
from collections import deque, defaultdict


from openpype.client import get_projects
from openpype.modules import OpenPypeModule
from openpype_interfaces import ITrayModule
from openpype.settings import (
@@ -913,7 +913,7 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
        enabled_projects = []

        if self.enabled:
            for project in self.connection.projects(projection={"name": 1}):
            for project in get_projects(fields=["name"]):
                project_name = project["name"]
                if self.is_project_enabled(project_name):
                    enabled_projects.append(project_name)
@@ -1242,10 +1242,7 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
    def _prepare_sync_project_settings(self, exclude_locals):
        sync_project_settings = {}
        system_sites = self.get_all_site_configs()
        project_docs = self.connection.projects(
            projection={"name": 1},
            only_active=True
        )
        project_docs = get_projects(fields=["name"])
        for project_doc in project_docs:
            project_name = project_doc["name"]
            sites = copy.deepcopy(system_sites)  # get all configured sites
@@ -16,6 +16,7 @@ from .utils import (
    switch_container,

    get_loader_identifier,
    get_loaders_by_name,

    get_representation_path_from_context,
    get_representation_path,
@@ -61,6 +62,7 @@ __all__ = (
    "switch_container",

    "get_loader_identifier",
    "get_loaders_by_name",

    "get_representation_path_from_context",
    "get_representation_path",
@@ -369,6 +369,20 @@ def get_loader_identifier(loader):
    return loader.__name__


def get_loaders_by_name():
    from .plugins import discover_loader_plugins

    loaders_by_name = {}
    for loader in discover_loader_plugins():
        loader_name = loader.__name__
        if loader_name in loaders_by_name:
            raise KeyError(
                "Duplicated loader name {} !".format(loader_name)
            )
        loaders_by_name[loader_name] = loader
    return loaders_by_name


def _get_container_loader(container):
    """Return the Loader corresponding to the container"""
    from .plugins import discover_loader_plugins
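The dict-building pattern in `get_loaders_by_name` — key classes by `__name__` and fail loudly on duplicates — works for any class list. A standalone sketch (the toy loader classes are illustrative):

```python
def classes_by_name(classes):
    # Map classes by their __name__; duplicate names are an error,
    # mirroring get_loaders_by_name above.
    by_name = {}
    for cls in classes:
        name = cls.__name__
        if name in by_name:
            raise KeyError("Duplicated loader name {} !".format(name))
        by_name[name] = cls
    return by_name


class FbxLoader(object):
    pass


class AbcLoader(object):
    pass


print(sorted(classes_by_name([FbxLoader, AbcLoader])))
# -> ['AbcLoader', 'FbxLoader']
```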
526	openpype/pipeline/workfile/abstract_template_loader.py	Normal file
@@ -0,0 +1,526 @@
import os
from abc import ABCMeta, abstractmethod

import six
import logging
from functools import reduce

from openpype.client import get_asset_by_name
from openpype.settings import get_project_settings
from openpype.lib import (
    StringTemplate,
    Logger,
    filter_profiles,
    get_linked_assets,
)
from openpype.pipeline import legacy_io, Anatomy
from openpype.pipeline.load import (
    get_loaders_by_name,
    get_representation_context,
    load_with_repre_context,
)

from .build_template_exceptions import (
    TemplateAlreadyImported,
    TemplateLoadingFailed,
    TemplateProfileNotFound,
    TemplateNotFound
)

log = logging.getLogger(__name__)


def update_representations(entities, entity):
    if entity['context']['subset'] not in entities:
        entities[entity['context']['subset']] = entity
    else:
        current = entities[entity['context']['subset']]
        incoming = entity
        entities[entity['context']['subset']] = max(
            current, incoming,
            key=lambda entity: entity["context"].get("version", -1))

    return entities
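Folded over a list of representation documents with `functools.reduce`, `update_representations` keeps only the highest-versioned document per subset. A self-contained sketch (the dicts mimic the `context` shape used above):

```python
from functools import reduce


def update_representations(entities, entity):
    # Keep the entity with the highest version per subset (as defined above).
    subset = entity['context']['subset']
    if subset not in entities:
        entities[subset] = entity
    else:
        entities[subset] = max(
            entities[subset], entity,
            key=lambda e: e["context"].get("version", -1))
    return entities


repres = [
    {"context": {"subset": "modelMain", "version": 1}},
    {"context": {"subset": "modelMain", "version": 3}},
    {"context": {"subset": "rigMain", "version": 2}},
]
latest = reduce(update_representations, repres, {})
print({k: v["context"]["version"] for k, v in latest.items()})
# -> {'modelMain': 3, 'rigMain': 2}
```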
def parse_loader_args(loader_args):
    if not loader_args:
        return dict()
    try:
        parsed_args = eval(loader_args)
        if not isinstance(parsed_args, dict):
            return dict()
        else:
            return parsed_args
    except Exception as err:
        print(
            "Error while parsing loader arguments '{}'.\n{}: {}\n\n"
            "Continuing with default arguments. . .".format(
                loader_args,
                err.__class__.__name__,
                err))
        return dict()


@six.add_metaclass(ABCMeta)
class AbstractTemplateLoader:
    """
    Abstraction of Template Loader.
    Properties:
        template_path : property to get current template path
    Methods:
        import_template : Abstract Method. Used to load template,
            depending on current host
        get_template_nodes : Abstract Method. Used to query nodes acting
            as placeholders. Depending on current host
    """

    _log = None

    def __init__(self, placeholder_class):
        # TODO template loader should expect host as an argument
        # - host have all responsibility for most of code (also provide
        #   placeholder class)
        # - also have responsibility for current context
        # - this won't work in DCCs where multiple workfiles with
        #   different contexts can be opened at single time
        # - template loader should have ability to change context
        project_name = legacy_io.active_project()
        asset_name = legacy_io.Session["AVALON_ASSET"]

        self.loaders_by_name = get_loaders_by_name()
        self.current_asset = asset_name
        self.project_name = project_name
        self.host_name = legacy_io.Session["AVALON_APP"]
        self.task_name = legacy_io.Session["AVALON_TASK"]
        self.placeholder_class = placeholder_class
        self.current_asset_doc = get_asset_by_name(project_name, asset_name)
        self.task_type = (
            self.current_asset_doc
            .get("data", {})
            .get("tasks", {})
            .get(self.task_name, {})
            .get("type")
        )

        self.log.info(
            "BUILDING ASSET FROM TEMPLATE :\n"
            "Starting templated build for {asset} in {project}\n\n"
            "Asset : {asset}\n"
            "Task : {task_name} ({task_type})\n"
            "Host : {host}\n"
            "Project : {project}\n".format(
                asset=self.current_asset,
                host=self.host_name,
                project=self.project_name,
                task_name=self.task_name,
                task_type=self.task_type
            ))
        # Skip if there is no loader
        if not self.loaders_by_name:
            self.log.warning(
                "There are no registered loaders. No assets will be loaded")
            return

    @property
    def log(self):
        if self._log is None:
            self._log = Logger.get_logger(self.__class__.__name__)
        return self._log

    def template_already_imported(self, err_msg):
        """In case template was already loaded.
        Raise the error as a default action.
        Override this method in your template loader implementation
        to manage this case."""
        self.log.error("{}: {}".format(
            err_msg.__class__.__name__,
            err_msg))
        raise TemplateAlreadyImported(err_msg)

    def template_loading_failed(self, err_msg):
        """In case template loading failed.
        Raise the error as a default action.
        Override this method in your template loader implementation
        to manage this case.
        """
        self.log.error("{}: {}".format(
            err_msg.__class__.__name__,
            err_msg))
        raise TemplateLoadingFailed(err_msg)

    @property
    def template_path(self):
        """
        Property returning template path. Avoiding setter.
        Getting template path from OpenPype settings based on current avalon
        session and solving the path variables if needed.
        Returns:
            str: Solved template path
        Raises:
            TemplateProfileNotFound: No profile found from settings for
                current avalon session
            KeyError: Could not solve path because a key does not exist
                in avalon context
            TemplateNotFound: Solved path does not exist on current filesystem
        """
        project_name = self.project_name
        host_name = self.host_name
        task_name = self.task_name
        task_type = self.task_type

        anatomy = Anatomy(project_name)
        project_settings = get_project_settings(project_name)

        build_info = project_settings[host_name]["templated_workfile_build"]
        profile = filter_profiles(
            build_info["profiles"],
            {
                "task_types": task_type,
                "tasks": task_name
            }
        )

        if not profile:
            raise TemplateProfileNotFound(
                "No matching profile found for task '{}' of type '{}' "
                "with host '{}'".format(task_name, task_type, host_name)
            )

        path = profile["path"]
        if not path:
            raise TemplateLoadingFailed(
                "Template path is not set.\n"
                "Path needs to be set in {}\\Template Workfile Build "
                "Settings\\Profiles".format(host_name.title()))

        # Try to fill path with environments and anatomy roots
        fill_data = {
            key: value
            for key, value in os.environ.items()
        }
        fill_data["root"] = anatomy.roots
        result = StringTemplate.format_template(path, fill_data)
        if result.solved:
            path = result.normalized()

        if path and os.path.exists(path):
            self.log.info("Found template at: '{}'".format(path))
            return path

        solved_path = None
        while True:
            try:
                solved_path = anatomy.path_remapper(path)
            except KeyError as missing_key:
                raise KeyError(
                    "Could not solve key '{}' in template path '{}'".format(
                        missing_key, path))

            if solved_path is None:
                solved_path = path
            if solved_path == path:
                break
            path = solved_path

        solved_path = os.path.normpath(solved_path)
        if not os.path.exists(solved_path):
            raise TemplateNotFound(
                "Template found in OpenPype settings for task '{}' with host "
                "'{}' does not exist. (Not found : {})".format(
                    task_name, host_name, solved_path))

        self.log.info("Found template at: '{}'".format(solved_path))

        return solved_path

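The resolve loop above repeatedly remaps the path until it stops changing, i.e. it iterates to a fixed point. The same pattern can be sketched with a plain substitution function (the `remap` rules here are an illustrative stand-in, not OpenPype's `Anatomy.path_remapper`):

```python
def remap(path, rules):
    # One substitution pass; hypothetical stand-in for anatomy.path_remapper.
    for old, new in rules.items():
        path = path.replace(old, new)
    return path


def resolve(path, rules):
    # Keep remapping until the path reaches a fixed point.
    while True:
        solved = remap(path, rules)
        if solved == path:
            return solved
        path = solved


rules = {"{root}": "/mnt/projects", "/mnt/projects/old": "/mnt/projects/new"}
print(resolve("{root}/old/template.ma", rules))
# -> /mnt/projects/new/template.ma
```

A path with no matching rule is returned unchanged after a single pass.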
    def populate_template(self, ignored_ids=None):
        """
        Use template placeholders to load assets and parent them in hierarchy
        Arguments :
            ignored_ids : Representation ids to skip
        Returns:
            None
        """

        loaders_by_name = self.loaders_by_name
        current_asset_doc = self.current_asset_doc
        linked_assets = get_linked_assets(current_asset_doc)

        ignored_ids = ignored_ids or []
        placeholders = self.get_placeholders()
        self.log.debug("Placeholders found in template: {}".format(
            [placeholder.name for placeholder in placeholders]
        ))
        for placeholder in placeholders:
            self.log.debug("Start processing placeholder {}".format(
                placeholder.name
            ))
            placeholder_representations = self.get_placeholder_representations(
                placeholder,
                current_asset_doc,
                linked_assets
            )

            if not placeholder_representations:
                self.log.info(
                    "There's no representation for this placeholder: "
                    "{}".format(placeholder.name)
                )
                continue

            for representation in placeholder_representations:
                self.preload(placeholder, loaders_by_name, representation)

                if self.load_data_is_incorrect(
                        placeholder,
                        representation,
                        ignored_ids):
                    continue

                self.log.info(
                    "Loading {}_{} with loader {}\n"
                    "Loader arguments used : {}".format(
                        representation['context']['asset'],
                        representation['context']['subset'],
                        placeholder.loader_name,
                        placeholder.loader_args))

                try:
                    container = self.load(
                        placeholder, loaders_by_name, representation)
                except Exception:
                    self.load_failed(placeholder, representation)
                else:
                    self.load_succeed(placeholder, container)
                finally:
                    self.postload(placeholder)

    def get_placeholder_representations(
        self, placeholder, current_asset_doc, linked_asset_docs
    ):
        placeholder_representations = placeholder.get_representations(
            current_asset_doc,
            linked_asset_docs
        )
        for repre_doc in reduce(
            update_representations,
            placeholder_representations,
            dict()
        ).values():
            yield repre_doc

    def load_data_is_incorrect(
            self, placeholder, last_representation, ignored_ids):
        if not last_representation:
            self.log.warning(placeholder.err_message())
            return True
        if str(last_representation['_id']) in ignored_ids:
            print("Ignoring : ", last_representation['_id'])
            return True
        return False

    def preload(self, placeholder, loaders_by_name, last_representation):
        pass

    def load(self, placeholder, loaders_by_name, last_representation):
        repre = get_representation_context(last_representation)
        return load_with_repre_context(
            loaders_by_name[placeholder.loader_name],
            repre,
            options=parse_loader_args(placeholder.loader_args))

    def load_succeed(self, placeholder, container):
        placeholder.parent_in_hierarchy(container)

    def load_failed(self, placeholder, last_representation):
        self.log.warning(
            "Got error trying to load {}:{} with {}".format(
                last_representation['context']['asset'],
                last_representation['context']['subset'],
                placeholder.loader_name
            ),
            exc_info=True
        )

    def postload(self, placeholder):
        placeholder.clean()

    def update_missing_containers(self):
        loaded_containers_ids = self.get_loaded_containers_by_id()
        self.populate_template(ignored_ids=loaded_containers_ids)

    def get_placeholders(self):
        placeholders = map(self.placeholder_class, self.get_template_nodes())
        valid_placeholders = filter(
            lambda i: i.is_valid,
            placeholders
        )
        sorted_placeholders = list(sorted(
            valid_placeholders,
            key=lambda i: i.order
        ))
        return sorted_placeholders

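`get_placeholders` above is a map/filter/sort pipeline: wrap template nodes, drop invalid ones, order by `order`. A standalone sketch with a toy placeholder class (names and tuple shape are illustrative):

```python
class ToyPlaceholder(object):
    # Minimal stand-in for a host placeholder implementation.
    def __init__(self, node):
        self.name, self.order, self.is_valid = node


nodes = [("second", 2, True), ("first", 1, True), ("broken", 0, False)]
placeholders = map(ToyPlaceholder, nodes)
valid_placeholders = filter(lambda p: p.is_valid, placeholders)
ordered = sorted(valid_placeholders, key=lambda p: p.order)
print([p.name for p in ordered])
# -> ['first', 'second']
```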
    @abstractmethod
    def get_loaded_containers_by_id(self):
        """
        Collect already loaded containers for updating scene
        Return:
            dict (string, node): A dictionary with id as key
                and containers as value
        """
        pass

    @abstractmethod
    def import_template(self, template_path):
        """
        Import template in current host
        Args:
            template_path (str): fullpath to current task and
                host's template file
        Return:
            None
        """
        pass

    @abstractmethod
    def get_template_nodes(self):
        """
        Returning a list of nodes acting as host placeholders for
        templating. The data representation is defined by the user.
        AbstractLoadTemplate (and LoadTemplate) won't directly manipulate nodes
        Args :
            None
        Returns:
            list(AnyNode): Nodes acting as placeholders
        """
        pass


@six.add_metaclass(ABCMeta)
class AbstractPlaceholder:
    """Abstraction of placeholders logic.

    Properties:
        required_keys: A list of mandatory keys to describe placeholder
            and assets to load.
        optional_keys: A list of optional keys to describe
            placeholder and assets to load
        loader_name: Name of linked loader to use while loading assets

    Args:
        identifier (str): Placeholder identifier. Should be possible to be
            used as identifier in "a scene" (e.g. unique node name).
    """

    required_keys = {
        "builder_type",
        "family",
        "representation",
        "order",
        "loader",
        "loader_args"
    }
    optional_keys = {}

    def __init__(self, identifier):
        self._log = None
        self._name = identifier
        self.get_data(identifier)

    @property
    def log(self):
        if self._log is None:
            self._log = Logger.get_logger(repr(self))
        return self._log

    def __repr__(self):
        return "< {} {} >".format(self.__class__.__name__, self.name)

    @property
    def name(self):
        return self._name

    @property
    def loader_args(self):
        return self.data["loader_args"]

    @property
    def builder_type(self):
        return self.data["builder_type"]

    @property
    def order(self):
        return self.data["order"]

    @property
    def loader_name(self):
        """Return placeholder loader name.

        Returns:
            str: Loader name that will be used to load placeholder
                representations.
        """

        return self.data["loader"]

    @property
    def is_valid(self):
        """Test validity of placeholder.

        i.e.: every required key exists in placeholder data

        Returns:
            bool: True if every key is in data
        """

        if set(self.required_keys).issubset(self.data.keys()):
            self.log.debug("Valid placeholder : {}".format(self.name))
            return True
        self.log.info("Placeholder is not valid : {}".format(self.name))
        return False

    @abstractmethod
    def parent_in_hierarchy(self, container):
        """Place loaded container in correct hierarchy given by placeholder

        Args:
            container (Dict[str, Any]): Loaded container created by loader.
        """

        pass

    @abstractmethod
    def clean(self):
        """Clean placeholder from hierarchy after loading assets."""

        pass

    @abstractmethod
    def get_representations(self, current_asset_doc, linked_asset_docs):
        """Query representations based on placeholder data.

        Args:
            current_asset_doc (Dict[str, Any]): Document of current
                context asset.
            linked_asset_docs (List[Dict[str, Any]]): Documents of assets
                linked to current context asset.

        Returns:
            Iterable[Dict[str, Any]]: Representations that are matching
                placeholder filters.
        """

        pass

    @abstractmethod
    def get_data(self, identifier):
        """Collect information about placeholder by identifier.

        Args:
            identifier (str): A unique placeholder identifier defined by
                implementation.
        """

        pass
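The `is_valid` property above reduces to a set-subset test over the placeholder's data keys. A minimal standalone illustration (the data values are illustrative):

```python
required_keys = {
    "builder_type", "family", "representation",
    "order", "loader", "loader_args",
}


def is_valid(data):
    # Valid when every required key is present in the placeholder data.
    return required_keys.issubset(data.keys())


print(is_valid({
    "builder_type": "context_asset", "family": "model",
    "representation": "abc", "order": 0,
    "loader": "ReferenceLoader", "loader_args": "",
}))
# -> True
print(is_valid({"family": "model"}))
# -> False
```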
68	openpype/pipeline/workfile/build_template.py	Normal file
@@ -0,0 +1,68 @@
from importlib import import_module
from openpype.lib import classes_from_module
from openpype.host import HostBase
from openpype.pipeline import registered_host

from .abstract_template_loader import (
    AbstractPlaceholder,
    AbstractTemplateLoader)

from .build_template_exceptions import (
    TemplateLoadingFailed,
    TemplateAlreadyImported,
    MissingHostTemplateModule,
    MissingTemplatePlaceholderClass,
    MissingTemplateLoaderClass
)

_module_path_format = 'openpype.hosts.{host}.api.template_loader'


def build_workfile_template(*args):
    template_loader = build_template_loader()
    try:
        template_loader.import_template(template_loader.template_path)
    except TemplateAlreadyImported as err:
        template_loader.template_already_imported(err)
    except TemplateLoadingFailed as err:
        template_loader.template_loading_failed(err)
    else:
        template_loader.populate_template()


def update_workfile_template(args):
    template_loader = build_template_loader()
    template_loader.update_missing_containers()


def build_template_loader():
    # TODO refactor to use advantage of 'HostBase' and don't import dynamically
    # - hosts should have methods that gives option to return builders
    host = registered_host()
    if isinstance(host, HostBase):
        host_name = host.name
    else:
        host_name = host.__name__.partition('.')[2]
    module_path = _module_path_format.format(host=host_name)
    module = import_module(module_path)
    if not module:
        raise MissingHostTemplateModule(
            "No template loader found for host {}".format(host_name))

    template_loader_class = classes_from_module(
        AbstractTemplateLoader,
        module
    )
    template_placeholder_class = classes_from_module(
        AbstractPlaceholder,
        module
    )

    if not template_loader_class:
        raise MissingTemplateLoaderClass()
    template_loader_class = template_loader_class[0]

    if not template_placeholder_class:
        raise MissingTemplatePlaceholderClass()
    template_placeholder_class = template_placeholder_class[0]
    return template_loader_class(template_placeholder_class)
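`classes_from_module` above scans a dynamically imported module for subclasses of a base class. The idea can be sketched with `inspect` (a simplified stand-in under assumed semantics, not OpenPype's implementation; the fake module and class names are illustrative):

```python
import inspect
import types


def classes_from_module(superclass, module):
    # Collect classes exposed by 'module' that subclass 'superclass'
    # (excluding the superclass itself).
    return [
        obj for _, obj in inspect.getmembers(module, inspect.isclass)
        if issubclass(obj, superclass) and obj is not superclass
    ]


mod = types.ModuleType("fake_host_template_loader")


class Base(object):
    pass


class ConcreteLoader(Base):
    pass


mod.ConcreteLoader = ConcreteLoader
print([cls.__name__ for cls in classes_from_module(Base, mod)])
# -> ['ConcreteLoader']
```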
35	openpype/pipeline/workfile/build_template_exceptions.py	Normal file
@@ -0,0 +1,35 @@
class MissingHostTemplateModule(Exception):
    """Error raised when expected module does not exist"""
    pass


class MissingTemplatePlaceholderClass(Exception):
    """Error raised when module doesn't implement a placeholder class"""
    pass


class MissingTemplateLoaderClass(Exception):
    """Error raised when module doesn't implement a template loader class"""
    pass


class TemplateNotFound(Exception):
    """Exception raised when template does not exist."""
    pass


class TemplateProfileNotFound(Exception):
    """Exception raised when current profile
    doesn't match any template profile"""
    pass


class TemplateAlreadyImported(Exception):
    """Error raised when Template was already imported by host for
    this session"""
    pass


class TemplateLoadingFailed(Exception):
    """Error raised when Template loader was unable to load the template"""
    pass
@@ -430,6 +430,9 @@
"enabled": false,
"custom_attribute_keys": []
},
"IntegrateHierarchyToFtrack": {
"create_task_status_profiles": []
},
"IntegrateFtrackNote": {
"enabled": true,
"note_template": "{intent}: {comment}",
@@ -484,7 +487,11 @@
"usd": "usd"
},
"keep_first_subset_name_for_review": true,
"asset_versions_status_profiles": []
"asset_versions_status_profiles": [],
"additional_metadata_keys": []
},
"IntegrateFtrackFarmStatus": {
"farm_status_profiles": []
}
}
}
@@ -967,6 +967,9 @@
}
]
},
"templated_workfile_build": {
"profiles": []
},
"filters": {
"preset 1": {
"ValidateNoAnimation": false,
@@ -976,4 +979,4 @@
"ValidateNoAnimation": false
}
}
}
}
@@ -19,4 +19,4 @@
"step": "step"
}
}
}
}
@@ -823,6 +823,44 @@
}
]
},
{
"type": "dict",
"key": "IntegrateHierarchyToFtrack",
"label": "Integrate Hierarchy to ftrack",
"is_group": true,
"collapsible": true,
"children": [
{
"type": "label",
"label": "Set task status on new task creation. Ftrack's default status is used otherwise."
},
{
"type": "list",
"key": "create_task_status_profiles",
"object_type": {
"type": "dict",
"children": [
{
"key": "task_types",
"label": "Task types",
"type": "task-types-enum"
},
{
"key": "task_names",
"label": "Task names",
"type": "list",
"object_type": "text"
},
{
"type": "text",
"key": "status_name",
"label": "Status name"
}
]
}
}
]
},
{
"type": "dict",
"collapsible": true,
@@ -983,6 +1021,82 @@
}
]
}
},
{
"key": "additional_metadata_keys",
"label": "Additional metadata keys on components",
"type": "enum",
"multiselection": true,
"enum_items": [
{"openpype_version": "OpenPype version"},
{"frame_start": "Frame start"},
{"frame_end": "Frame end"},
{"duration": "Duration"},
{"width": "Resolution width"},
{"height": "Resolution height"},
{"fps": "FPS"},
{"code": "Codec"}
]
}
]
},
{
"type": "dict",
"key": "IntegrateFtrackFarmStatus",
"label": "Integrate Ftrack Farm Status",
"children": [
{
"type": "label",
"label": "Change status of task when its subset is submitted to farm"
},
{
"type": "list",
"collapsible": true,
"key": "farm_status_profiles",
"label": "Farm status profiles",
"use_label_wrap": true,
"object_type": {
"type": "dict",
"children": [
{
"key": "hosts",
"label": "Host names",
"type": "hosts-enum",
"multiselection": true
},
{
"key": "task_types",
"label": "Task types",
"type": "task-types-enum"
},
{
"key": "task_names",
"label": "Task names",
"type": "list",
"object_type": "text"
},
{
"key": "families",
"label": "Families",
"type": "list",
"object_type": "text"
},
{
"key": "subsets",
"label": "Subset names",
"type": "list",
"object_type": "text"
},
{
"type": "separator"
},
{
"key": "status_name",
"label": "Status name",
"type": "text"
}
]
}
}
]
}
@ -77,6 +77,10 @@
    "type": "schema",
    "name": "schema_workfile_build"
},
{
    "type": "schema",
    "name": "schema_templated_workfile_build"
},
{
    "type": "schema",
    "name": "schema_publish_gui_filter"

@ -0,0 +1,35 @@
{
    "type": "dict",
    "collapsible": true,
    "key": "templated_workfile_build",
    "label": "Templated Workfile Build Settings",
    "children": [
        {
            "type": "list",
            "key": "profiles",
            "label": "Profiles",
            "object_type": {
                "type": "dict",
                "children": [
                    {
                        "key": "task_types",
                        "label": "Task types",
                        "type": "task-types-enum"
                    },
                    {
                        "key": "tasks",
                        "label": "Task names",
                        "type": "list",
                        "object_type": "text"
                    },
                    {
                        "key": "path",
                        "label": "Path to template",
                        "type": "text",
                        "object_type": "text"
                    }
                ]
            }
        }
    ]
}

@ -10,6 +10,7 @@ from Qt import QtCore, QtGui
import qtawesome

from openpype.client import (
    get_projects,
    get_project,
    get_assets,
)

@ -527,7 +528,7 @@ class LauncherModel(QtCore.QObject):
        current_project = self.project_name
        project_names = set()
        project_docs_by_name = {}
        for project_doc in self._dbcon.projects(only_active=True):
        for project_doc in get_projects():
            project_name = project_doc["name"]
            project_names.add(project_name)
            project_docs_by_name[project_name] = project_doc

@ -3,7 +3,7 @@ import sys
from Qt import QtWidgets, QtCore, QtGui

from openpype import style
from openpype.client import get_project
from openpype.client import get_projects, get_project
from openpype.pipeline import AvalonMongoDB
from openpype.tools.utils import lib as tools_lib
from openpype.tools.loader.widgets import (

@ -239,7 +239,7 @@ class LibraryLoaderWindow(QtWidgets.QDialog):

    def get_filtered_projects(self):
        projects = list()
        for project in self.dbcon.projects():
        for project in get_projects(fields=["name", "data.library_project"]):
            is_library = project.get("data", {}).get("library_project", False)
            if (
                (is_library and self.show_libraries) or

@ -8,6 +8,7 @@ from pymongo import UpdateOne, DeleteOne
from Qt import QtCore, QtGui

from openpype.client import (
    get_projects,
    get_project,
    get_assets,
    get_asset_ids_with_subsets,

@ -54,12 +55,8 @@ class ProjectModel(QtGui.QStandardItemModel):
        self._items_by_name[None] = none_project
        new_project_items.append(none_project)

        project_docs = self.dbcon.projects(
            projection={"name": 1},
            only_active=True
        )
        project_names = set()
        for project_doc in project_docs:
        for project_doc in get_projects(fields=["name"]):
            project_name = project_doc.get("name")
            if not project_name:
                continue

@ -4,8 +4,9 @@ import sys
from Qt import QtWidgets, QtCore
import qtawesome

from openpype.pipeline import legacy_io
from openpype import style
from openpype.client import get_projects
from openpype.pipeline import legacy_io
from openpype.tools.utils.delegates import VersionDelegate
from openpype.tools.utils.lib import (
    qt_app_context,

@ -195,8 +196,7 @@ def show(root=None, debug=False, parent=None, items=None):

    if not os.environ.get("AVALON_PROJECT"):
        any_project = next(
            project for project in legacy_io.projects()
            if project.get("active", True) is not False
            project for project in get_projects()
        )

        project_name = any_project["name"]

@ -3,6 +3,7 @@ import uuid
from Qt import QtWidgets, QtCore, QtGui
import qtawesome

from openpype.client import get_projects
from openpype.pipeline import AvalonMongoDB
from openpype.style import get_objected_colors
from openpype.tools.utils.widgets import ImageButton

@ -783,8 +784,6 @@ class ProjectModel(QtGui.QStandardItemModel):

        self.setColumnCount(2)

        self.dbcon = None

        self._only_active = only_active
        self._default_item = None
        self._items_by_name = {}

@ -828,9 +827,6 @@ class ProjectModel(QtGui.QStandardItemModel):
            index = self.index(index.row(), 0, index.parent())
        return super(ProjectModel, self).flags(index)

    def set_dbcon(self, dbcon):
        self.dbcon = dbcon

    def refresh(self):
        # Change id of versions refresh
        self._version_refresh_id = uuid.uuid4()

@ -846,31 +842,30 @@ class ProjectModel(QtGui.QStandardItemModel):

        self._default_item.setData("", PROJECT_VERSION_ROLE)
        project_names = set()
        if self.dbcon is not None:
            for project_doc in self.dbcon.projects(
                projection={"name": 1, "data.active": 1},
                only_active=self._only_active
            ):
                project_name = project_doc["name"]
                project_names.add(project_name)
                if project_name in self._items_by_name:
                    item = self._items_by_name[project_name]
                else:
                    item = QtGui.QStandardItem(project_name)
        for project_doc in get_projects(
            inactive=not self._only_active,
            fields=["name", "data.active"]
        ):
            project_name = project_doc["name"]
            project_names.add(project_name)
            if project_name in self._items_by_name:
                item = self._items_by_name[project_name]
            else:
                item = QtGui.QStandardItem(project_name)

                    self._items_by_name[project_name] = item
                    new_items.append(item)
                self._items_by_name[project_name] = item
                new_items.append(item)

                is_active = project_doc.get("data", {}).get("active", True)
                item.setData(project_name, PROJECT_NAME_ROLE)
                item.setData(is_active, PROJECT_IS_ACTIVE_ROLE)
                item.setData("", PROJECT_VERSION_ROLE)
                item.setData(False, PROJECT_IS_SELECTED_ROLE)
            is_active = project_doc.get("data", {}).get("active", True)
            item.setData(project_name, PROJECT_NAME_ROLE)
            item.setData(is_active, PROJECT_IS_ACTIVE_ROLE)
            item.setData("", PROJECT_VERSION_ROLE)
            item.setData(False, PROJECT_IS_SELECTED_ROLE)

            if not is_active:
                font = item.font()
                font.setItalic(True)
                item.setFont(font)
            if not is_active:
                font = item.font()
                font.setItalic(True)
                item.setFont(font)

        root_item = self.invisibleRootItem()
        for project_name in tuple(self._items_by_name.keys()):

@ -1067,8 +1062,6 @@ class ProjectListWidget(QtWidgets.QWidget):
        self.project_model = project_model
        self.inactive_chk = inactive_chk

        self.dbcon = None

    def set_entity(self, entity):
        self._entity = entity

@ -1211,15 +1204,6 @@ class ProjectListWidget(QtWidgets.QWidget):
                selected_project = index.data(PROJECT_NAME_ROLE)
                break

        if not self.dbcon:
            try:
                self.dbcon = AvalonMongoDB()
                self.dbcon.install()
            except Exception:
                self.dbcon = None
                self.current_project = None

        self.project_model.set_dbcon(self.dbcon)
        self.project_model.refresh()

        self.project_proxy.sort(0)

@ -3,6 +3,7 @@ from Qt import QtWidgets, QtCore
import qtawesome

from openpype.client import (
    get_projects,
    get_project,
    get_asset_by_id,
)

@ -291,9 +292,7 @@ class AssetWidget(QtWidgets.QWidget):
    def _set_projects(self):
        project_names = list()

        for doc in self.dbcon.projects(projection={"name": 1},
                                       only_active=True):

        for doc in get_projects(fields=["name"]):
            project_name = doc.get("name")
            if project_name:
                project_names.append(project_name)

@ -320,8 +319,7 @@ class AssetWidget(QtWidgets.QWidget):
    def on_project_change(self):
        projects = list()

        for project in self.dbcon.projects(projection={"name": 1},
                                           only_active=True):
        for project in get_projects(fields=["name"]):
            projects.append(project['name'])
        project_name = self.combo_projects.currentText()
        if project_name in projects:

@ -443,10 +443,6 @@ class FamilyConfigCache:
        if profiles:
            # Make sure connection is installed
            # - accessing attribute which does not have auto-install
            self.dbcon.install()
            database = getattr(self.dbcon, "database", None)
            if database is None:
                database = self.dbcon._database
            asset_doc = get_asset_by_name(
                project_name, asset_name, fields=["data.tasks"]
            ) or {}

@ -3,6 +3,7 @@ import logging

import Qt
from Qt import QtCore, QtGui
from openpype.client import get_projects
from .constants import (
    PROJECT_IS_ACTIVE_ROLE,
    PROJECT_NAME_ROLE,

@ -296,29 +297,29 @@ class ProjectModel(QtGui.QStandardItemModel):
            self._default_item = item

        project_names = set()
        if self.dbcon is not None:
            for project_doc in self.dbcon.projects(
                projection={"name": 1, "data.active": 1},
                only_active=self._only_active
            ):
                project_name = project_doc["name"]
                project_names.add(project_name)
                if project_name in self._items_by_name:
                    item = self._items_by_name[project_name]
                else:
                    item = QtGui.QStandardItem(project_name)
        project_docs = get_projects(
            inactive=not self._only_active,
            fields=["name", "data.active"]
        )
        for project_doc in project_docs:
            project_name = project_doc["name"]
            project_names.add(project_name)
            if project_name in self._items_by_name:
                item = self._items_by_name[project_name]
            else:
                item = QtGui.QStandardItem(project_name)

                    self._items_by_name[project_name] = item
                    new_items.append(item)
                self._items_by_name[project_name] = item
                new_items.append(item)

                is_active = project_doc.get("data", {}).get("active", True)
                item.setData(project_name, PROJECT_NAME_ROLE)
                item.setData(is_active, PROJECT_IS_ACTIVE_ROLE)
            is_active = project_doc.get("data", {}).get("active", True)
            item.setData(project_name, PROJECT_NAME_ROLE)
            item.setData(is_active, PROJECT_IS_ACTIVE_ROLE)

            if not is_active:
                font = item.font()
                font.setItalic(True)
                item.setFont(font)
            if not is_active:
                font = item.font()
                font.setItalic(True)
                item.setFont(font)

        root_item = self.invisibleRootItem()
        for project_name in tuple(self._items_by_name.keys()):

@ -1,3 +1,3 @@
# -*- coding: utf-8 -*-
"""Package declaring Pype version."""
__version__ = "3.13.1-nightly.2"
__version__ = "3.13.1-nightly.3"

@ -1,6 +1,6 @@
[tool.poetry]
name = "OpenPype"
version = "3.13.1-nightly.2" # OpenPype
version = "3.13.1-nightly.3" # OpenPype
description = "Open VFX and Animation pipeline with support."
authors = ["OpenPype Team <info@openpype.io>"]
license = "MIT License"

@ -120,3 +120,54 @@ raw json.
You can configure path mapping using the Maya `dirmap` command. This will add bi-directional mapping between the lists of paths specified in **Settings**. You can find it in **Settings -> Project Settings -> Maya -> Maya Directory Mapping**.

![](assets/maya-admin_directory_mapping.png)

## Templated Build Workfile

Workfiles can be built from a template designed by users, which helps ensure a homogeneous subset hierarchy and consistent imports. The template is stored as a file that is easy to define, change, and customize for production needs.

**1. Make a template**

Make your template and add the families and everything else needed for your tasks. Here is an example template for the modeling task, using a placeholder to import a gauge.

![](assets/maya-workfile-outliner.png)

If needed, you can add placeholders wherever the template needs to load assets: **OpenPype > Template Builder > Create Placeholder**

![](assets/maya-create_placeholder.png)

- **Configure placeholders**

Fill in the necessary fields (the optional fields are regex filters).

![](assets/maya-placeholder_new.png)

- Builder type: Whether the placeholder should load representations of the current asset or of linked assets
- Representation: Representation that will be loaded (e.g. ma, abc, png)
- Family: Family of the representation to load (main, look, image, etc.)
- Loader: Name of the placeholder loader that will be used to load the corresponding representations
- Order: Priority of the current placeholder loader (lowest priority first, highest last)

- **Save your template**

**2. Configure the template**

- **Go to Studio settings > Project > Your DCC > Templated Build Settings**
- Add a profile for your task and enter the path to your template

![](assets/settings/template_build_workfile.png)

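Each profile maps onto the `templated_workfile_build` settings schema added in this PR (`task_types`, `tasks`, `path`). As a minimal sketch, a stored profile might look like the following JSON; the task type, task name, and template path are illustrative examples only, not values from this repository:

```json
{
    "templated_workfile_build": {
        "profiles": [
            {
                "task_types": ["Modeling"],
                "tasks": ["modeling"],
                "path": "/studio/templates/maya/modeling_template.ma"
            }
        ]
    }
}
```

The `task_types` and `tasks` lists filter which tasks the profile applies to, and `path` points at the template file used for the build.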
**3. Build your workfile**

- Open Maya

- Build your workfile

![](assets/maya-build_workfile_from_template.png)

BIN  website/docs/assets/maya-build_workfile_from_template.png  (new file, 29 KiB; binary file not shown)
BIN  website/docs/assets/maya-create_placeholder.png  (new file, 31 KiB; binary file not shown)
BIN  website/docs/assets/maya-placeholder_new.png  (new file, 27 KiB; binary file not shown)
BIN  website/docs/assets/maya-workfile-outliner.png  (new file, 4.7 KiB; binary file not shown)
BIN  website/docs/assets/settings/template_build_workfile.png  (new file, 12 KiB; binary file not shown)