Mirror of https://github.com/ynput/ayon-core.git, synced 2026-01-01 16:34:53 +01:00
* General: Connect to AYON server (base) (#3924)
* implemented 'get_workfile_info' in entities
* removed 'prepare_asset_update_data' which is not used
* disable settings and project manager if in v4 mode
* prepared conversion helper functions for v4 entities
* prepared conversion functions for hero versions
* fix hero versions
* implemented get_archived_representations
* fix get latest versions
* return prepared changes
* handle archived representation
* raise exception on failed json conversion
* map archived to active properly
* make sure default fields are added
* fix conversion of hero version entity
* fix conversion of archived representations
* fix some conversions of representations and versions
* changed active behavior in queries
* fixed hero versions
* implemented basic thumbnail caching
* added raw variants of crud methods
* implemented methods to get and create thumbnail
* fix from flat dict
* implemented some basic folder conversion for updates
* fix thumbnail updates for version
* implemented v4 thumbnail integrator
* simplified data mapping
* 'get_thumbnail' function also expects the entity type and entity id for which the thumbnail is received
* implemented 'get_thumbnail' for server
* fix how thumbnail id is received from entity
* removed unnecessary method 'get_thumbnail_id_from_source'
* implemented thumbnail resolver for v4
* removed unnecessary print
* move create and delete project directly to server api
* disable local settings action too on v4
* OP-3521 - added method to check and download updated addons from v4 server
* OP-3521 - added more descriptive error message for missing source
* OP-3521 - added default implementation of addon downloader to import
* OP-3521 - added check for dependency package zips (WIP - the server doesn't contain the required endpoint; testing only with mockup data for now)
* OP-3521 - fixed parsing of DependencyItem; added Server Url type and ServerAddonDownloader - the v4 server doesn't know its own DNS for static files, so it sends a unique name and the url must be created at runtime
* OP-3521 - fixed creation of target directories
* changed env keys to look for and don't set them automatically
* fix task type conversion
* implemented base of loading v4 addons in v3
* Refactored argument name in Downloaders
* Updated parsing of DependencyItem according to current schema
* Implemented downloading of package from server
* Updated resolving of failures; uses Enum items
* Introduced passing of authorization token; better to inject it than to read it from an env var
* Removed weird parsing of server_url; not necessary, endpoints have the same prefix
* Fix doubling asset version name in addons folder; the zip file should already contain `addonName_addonVersion` as the first subfolder
* Fix doubling asset version name in addons folder; the zip file should already contain `addonName_addonVersion` as the first subfolder
* Made server_endpoint optional; the argument should be better for testing, but for calls from separate methods it is better to encapsulate it. Removed unwanted temporary productionPackage value
* Use existing method to pull addon info from the server to load the v4 version of an addon
* Raise exception when server doesn't have any production dependency package
* added ability to specify v3 alias of addon name
* expect v3_alias as an uppercased constant
* Re-implemented method to get addon info; the previous implementation wouldn't work in Python 2 hosts. Will be refactored in the future.
* fix '__getattr__'
* added ayon api to pyproject.toml and lock file
* use ayon api in common connection
* added mapping for label
* use ayon_api in client codebase
* separated clearing cache of url and username
* bump ayon api version
* rename env 'OP4_TEST' to 'USE_AYON_SERVER'
* Moved and renamed get_addons_info to get_addons_info_as_dict in addon_distribution; should be moved to ayon_api later
* Replaced requests calls with ayon_api
* Replaced OP4_TEST_ENABLED with AYON_SERVER_ENABLED; fixed endpoints
* Hound
* Hound
* OP-3521 - fix wrong key in get_representation_parents ('parents' overloads 'parents')
* OP-3521 - changes for v4 of SiteSync addon
* OP-3521 - fix names
* OP-3521 - remove storing of project_name; it should be safer to go through self.dbcon apparently
* OP-3521 - remove unwanted "context["folder"]"; it can only be in dummy test data
* OP-3521 - move site sync loaders to addon
* Use only project instead of self.project
* OP-3521 - added missing get_progress_for_repre
* base of settings conversion script
* simplified ayon functions in start.py
* added loading of settings from ayon server
* added a note about colors
* fix global and local settings functions
* AvalonMongoDB is not using mongo connection when ayon server is enabled
* 'get_dynamic_modules_dirs' is not checking system settings for paths in settings
* log viewer is disabled when ayon server is enabled
* basic logic of enabling/disabling addons
* don't use mongo logging if ayon server is enabled
* update ayon api
* bump ayon api again
* use ayon_api to get addons info in modules/base
* update ayon api
* moved helper functions to get addons and dependencies dir to common functions
* Initialization of AddonInfo is not crashing on unknown sources
* renamed 'DependencyDownloader' to 'AyonServerDownloader'
* renamed function 'default_addon_downloader' to 'get_default_addon_downloader'
* Added ability to convert 'WebAddonSource' to 'ServerResourceSorce'
* missing dependency package on server won't cause crash
* data sent to downloaders doesn't contain ayon-specific headers
* modified addon distribution to not duplicate 'ayon_api' functionality
* fix doubled function definition
* unzip client file to addon destination
* formatting - unify quotes
* disable usage of mongo connection if in ayon mode
* renamed window.py to login_window.py
* added webpublisher settings conversion
* added maya conversion function
* reuse variable
* reuse variable (similar to previous commit)
* fix ayon addons loading
* fix typo 'AyonSettingsCahe' -> 'AyonSettingsCache'
* fix enabled state changes
* fix rr_path in royal render conversion
* avoid mongo calls in AYON state
* implemented custom AYON start script
* fix formatting (after black)
* ayon_start cleanup
* 'get_addons_dir' and 'get_dependencies_dir' store value to environment variable
* add docstrings to local dir functions
* addon info has full name
* fix modules enabled states
* removed unused 'run_disk_mapping_commands'
* removed ayon logic from 'start.py'
* fix warning message
* renamed 'openpype_common' to 'ayon_common'
* removed unused import
* don't import igniter
* removed startup validations of third parties
* change what's shown in version info
* fix which keys are applied from ayon values
* fix method name
* get applications from attribs
* Implemented UI basics to be able to change user or log out
* merged server.py and credentials.py
* add more metadata to urls
* implemented change token
* implemented change user ui functionality
* implemented change user ui
* modify window to handle username and token value
* pass username to add server
* fix show UI cases
* added login action to tray
* update ayon api
* added missing dependency
* convert applications to config in the right way
* initial implementation of 'nuke' settings conversion
* removed a few nuke comments
* implemented hiero conversion
* added imageio conversion
* added run ayon tray script
* fix a few settings conversions
* Renamed class of source classes as they are not just for addons
* implemented object to track source transfer progress
* Implemented distribution item with multiple sources
* Implemented ayon distribution wrapper to care about multiple things during distribution
* added 'cleanup' method for downloaders
* download gets transfer progress object
* Change UploadState enum
* added missing imports
* use AyonDistribution in ayon_start.py
* removed unused functions
* removed implemented TODOs
* fix import
* fix key used for Web source
* removed temp development fix
* formatting fix
* keep information if source requires distribution
* handle 'require_distribution' attribute in distribution process
* added path attribute to server source
* added option to pass addons info to ayon distribution
* fix tests
* fix formatting
* Fix typo
* Fix typo
* remove '_try_convert_to_server_source'
* renamed attributes and methods to match their content
* it is possible to pass dependency package info to AyonDistribution
* fix called methods in tests
* added public properties for error message and error detail
* Added filename to WebSourceInfo; useful for GDrive sharable links where the target file name is unknown/unparsable, so it should be provided explicitly
* unify source conversion by adding 'convert_source' function
* Fix error message (Co-authored-by: Roy Nieterau <roy_nieterau@hotmail.com>)
* added docstring for 'transfer_progress'
* don't create metadata file on read
* added a few docstrings
* add default folder fields to folder/task queries
* fix generators
* add dependencies when running from code
* add sys paths from distribution to pythonpath env
* fix missing applications
* added missing conversions for maya renderers
* fix formatting
* update ayon api
* fix hashes in lock file
* Use better exception (Co-authored-by: Ondřej Samohel <33513211+antirotor@users.noreply.github.com>)
* Use Python 3 syntax (Co-authored-by: Ondřej Samohel <33513211+antirotor@users.noreply.github.com>)
* apply some of the suggested changes in ayon_start
* added some docstrings and suggested modifications
* copy create env from develop
* fix rendersettings conversion
* change code by suggestions
* added missing args to docstring
* added missing docstrings
* separated downloader and download factory
* fix ayon settings
* added some basic file docstring to ayon_settings
* join else conditions
* fix project settings conversion
* fix created at conversion
* fix workfile info query
* fix publisher UI
* added utils function 'get_ayon_appdirs'
* fix 'get_all_current_info'
* fix server url assignment when url is set
* updated ayon api
* added utils functions to create local site id for ayon
* added helper functions to create global connection
* create global connection in ayon start to start using site id
* use ayon site id in ayon mode
* formatting cleanup
* added header docstring
* fixes after ayon_api update
* load addons from ynput appdirs
* fix function call
* added docstring
* update ayon python api
* fix settings access
* use ayon_api to get root overrides in Anatomy
* bump ayon version to 0.1.13
* nuke: fixing settings keys from settings
* fix burnins definitions
* change v4 to AYON in thumbnail integrate
* fix one more v4 information
* Fixes after rebase
* fix extract burnin conversion
* additional fix of extract burnin
* SiteSync: added missing loaders for v3 compatibility (#4587)
* Added site sync loaders for v3 compatibility
* Fix get_progress_for_repre
* use 'files.name' instead of 'files.baseName'
* update ayon api to 0.1.14
* add common to include files
* change arguments for hero version creation
* skip shotgrid settings conversion if different ayon addon is used
* added ayon icons
* fix labels of application variants
* added option to show login window always on top
* login window on invalid credentials is always on top
* update ayon api
* update ayon api
* add entityType to project and folders
* AYON: Editorial hierarchy creation (#4699)
* disable extract hierarchy avalon when ayon mode is enabled
* implemented extract hierarchy to AYON

---------

Co-authored-by: Petr Kalis <petr.kalis@gmail.com>
Co-authored-by: Roy Nieterau <roy_nieterau@hotmail.com>
Co-authored-by: Ondřej Samohel <33513211+antirotor@users.noreply.github.com>
Co-authored-by: Jakub Jezek <jakubjezek001@gmail.com>

* replace 'legacy_io' with context functions in load plugins
* added 'get_global_context' to pipeline init
* use context getters instead of legacy_io in publish plugins
* use data on context instead of 'legacy_io' in submit publish job
* skip query of asset docs in collect nuke reads
* use context functions in other places
* 'list_looks' expects project name
* remove 'get_context_title'
* don't pass AvalonMongoDB to prelaunch hooks
* change how context is calculated in hiero
* implemented function 'get_fps_for_current_context' for maya
* initialize '_image_dir' and '_image_prefixes' in init
* legacy creator is using 'get_current_project_name'
* fill docstrings
* use context functions in workfile builders
* hound fixes
* 'create_workspace_mel' can expect project settings
* swapped order of arguments
* use information from instance/context data
* Use self.project_name in workfiles tool (Co-authored-by: Roy Nieterau <roy_nieterau@hotmail.com>)
* Remove outdated todo (Co-authored-by: Roy Nieterau <roy_nieterau@hotmail.com>)
* don't query project document in nuke lib
* Fix access to context data
* Use right function to get project name (Co-authored-by: Roy Nieterau <roy_nieterau@hotmail.com>)
* fix submit max deadline and swap order of arguments
* added 'get_context_label' to nuke
* fix import
* fix typo 'curent_context' -> 'current_context'
* fix project_setting variable
* fix submit publish job environments
* use task from context
* Removed unused import

---------

Co-authored-by: Petr Kalis <petr.kalis@gmail.com>
Co-authored-by: Roy Nieterau <roy_nieterau@hotmail.com>
Co-authored-by: Ondřej Samohel <33513211+antirotor@users.noreply.github.com>
Co-authored-by: Jakub Jezek <jakubjezek001@gmail.com>
572 lines
18 KiB
Python
import logging
import json
import os

import contextlib
import copy

import six

from maya import cmds

from openpype.client import (
    get_version_by_name,
    get_last_version_by_subset_id,
    get_representation_by_id,
    get_representation_by_name,
    get_representation_parents,
)
from openpype.pipeline import (
    schema,
    discover_loader_plugins,
    loaders_from_representation,
    load_container,
    update_container,
    remove_container,
    get_representation_path,
    get_current_project_name,
)
from openpype.hosts.maya.api.lib import (
    matrix_equals,
    unique_namespace,
    get_container_transforms,
    DEFAULT_MATRIX
)

log = logging.getLogger("PackageLoader")


def to_namespace(node, namespace):
    """Return node name as if it's inside the namespace.

    Args:
        node (str): Node name
        namespace (str): Namespace

    Returns:
        str: The node in the namespace.

    """
    namespace_prefix = "|{}:".format(namespace)
    node = namespace_prefix.join(node.split("|"))
    return node
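# Illustrative example (added for clarity, not part of the original module):
# "to_namespace" prefixes every level of a DAG path with the given namespace,
# so the hypothetical call below would return
# "|setdress_01:group|setdress_01:model".
#
#   to_namespace("|group|model", "setdress_01")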
@contextlib.contextmanager
def namespaced(namespace, new=True):
    """Work inside namespace during context

    Args:
        namespace (str): The namespace to enter during the context.
        new (bool): When enabled this will rename the namespace to a unique
            namespace if the input namespace already exists.

    Yields:
        str: The namespace that is used during the context

    """
    original = cmds.namespaceInfo(cur=True)
    if new:
        namespace = unique_namespace(namespace)
        cmds.namespace(add=namespace)

    try:
        cmds.namespace(set=namespace)
        yield namespace
    finally:
        cmds.namespace(set=original)
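# Illustrative usage sketch (added for clarity, not part of the original
# module): create nodes inside a (possibly uniquified) namespace and restore
# the previous current namespace afterwards. The namespace name is
# hypothetical.
#
#   with namespaced("setdress_01") as ns:
#       # Nodes created here end up inside the namespace, e.g. "<ns>:pCube1".
#       cmds.polyCube()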
@contextlib.contextmanager
def unlocked(nodes):
    """Temporarily unlock nodes, restoring their lock state afterwards."""
    # Get node state by Maya's uuid
    nodes = cmds.ls(nodes, long=True)
    uuids = cmds.ls(nodes, uuid=True)
    states = cmds.lockNode(nodes, query=True, lock=True)
    states = {uuid: state for uuid, state in zip(uuids, states)}
    originals = {uuid: node for uuid, node in zip(uuids, nodes)}

    try:
        cmds.lockNode(nodes, lock=False)
        yield
    finally:
        # Reapply original states
        _iteritems = getattr(states, "iteritems", states.items)
        for uuid, state in _iteritems():
            nodes_from_id = cmds.ls(uuid, long=True)
            if nodes_from_id:
                node = nodes_from_id[0]
            else:
                # The node could not be found by uuid (e.g. it was renamed
                # or reparented), so fall back to the stored node name.
                node = originals[uuid]
                log.debug("Falling back to node name: %s", node)
            if not cmds.objExists(node):
                log.warning("Unable to find: %s", node)
                continue
            cmds.lockNode(node, lock=state)
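# Illustrative usage sketch (added for clarity, not part of the original
# module): edit nodes that may be locked without permanently changing their
# lock state. The node path is hypothetical.
#
#   with unlocked(["|setdress_01:root_GRP"]):
#       cmds.parent("|setdress_01:root_GRP", world=True)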
def load_package(filepath, name, namespace=None):
    """Load a package that was gathered elsewhere.

    A package is a group of published instances, possibly with additional
    data in a hierarchy.

    """
    if namespace is None:
        # Define a unique namespace for the package
        namespace = os.path.basename(filepath).split(".")[0]
        namespace = unique_namespace(namespace)
    assert isinstance(namespace, six.string_types)

    # Load the setdress package data
    with open(filepath, "r") as fp:
        data = json.load(fp)

    # Load the setdress alembic hierarchy.
    # We import this into the namespace in which we'll load the package's
    # instances afterwards.
    alembic = filepath.replace(".json", ".abc")
    hierarchy = cmds.file(alembic,
                          reference=True,
                          namespace=namespace,
                          returnNewNodes=True,
                          groupReference=True,
                          groupName="{}:{}".format(namespace, name),
                          typ="Alembic")

    # Get the top root node (the reference group)
    root = "{}:{}".format(namespace, name)

    containers = []
    all_loaders = discover_loader_plugins()
    for representation_id, instances in data.items():

        # Find the compatible loaders
        loaders = loaders_from_representation(
            all_loaders, representation_id
        )

        for instance in instances:
            container = _add(instance=instance,
                             representation_id=representation_id,
                             loaders=loaders,
                             namespace=namespace,
                             root=root)
            containers.append(container)

    # TODO: Do we want to cripple? Or do we want to add a 'parent' parameter?
    # Cripple the original avalon containers so they don't show up in the
    # manager
    # for container in containers:
    #     cmds.setAttr("%s.id" % container,
    #                  "setdress.container",
    #                  type="string")

    # TODO: Lock all loaded nodes
    # This is to ensure the hierarchy remains unaltered by the artists
    # for node in nodes:
    #     cmds.lockNode(node, lock=True)

    return containers + hierarchy
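# Illustrative sketch (added for clarity, not part of the original module) of
# the package .json layout that "load_package" consumes: representation ids
# mapping to lists of instance entries. All ids, names and paths below are
# hypothetical; only the keys ("loader", "namespace", "parent", "matrix")
# follow from how the data is used above and in "_add".
#
#   {
#       "5f3e...a1": [
#           {
#               "loader": "ReferenceLoader",
#               "namespace": "chair_01",
#               "parent": "|props_GRP",
#               "matrix": [1, 0, 0, 0,  0, 1, 0, 0,
#                          0, 0, 1, 0,  0, 0, 0, 1]
#           }
#       ]
#   }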
def _add(instance, representation_id, loaders, namespace, root="|"):
    """Add an item from the package

    Args:
        instance (dict): The instance entry from the package build data.
        representation_id (str): The representation id to load.
        loaders (list): Loader plug-ins compatible with the representation.
        namespace (str): The namespace of the package to process in.
        root (str): The root node to parent the loaded content under.

    Returns:
        str: The created Avalon container.

    """
    # Process within the namespace
    with namespaced(namespace, new=False) as namespace:

        # Get the used loader
        Loader = next((x for x in loaders if
                       x.__name__ == instance['loader']),
                      None)

        if Loader is None:
            log.warning("Loader is missing: %s. Skipping %s",
                        instance['loader'], instance)
            raise RuntimeError("Loader is missing.")

        container = load_container(
            Loader,
            representation_id,
            namespace=instance['namespace']
        )

        # Get the root from the loaded container
        loaded_root = get_container_transforms({"objectName": container},
                                               root=True)

        # Apply matrix to the root node (if there are any matrix edits)
        matrix = instance.get("matrix", None)
        if matrix:
            cmds.xform(loaded_root, objectSpace=True, matrix=matrix)

        # Parent into the setdress hierarchy.
        # The namespace is missing from the parent node(s), so add it
        # manually.
        parent = root + to_namespace(instance["parent"], namespace)
        cmds.parent(loaded_root, parent, relative=True)

    return container
# Store root nodes based on representation and namespace
def _instances_by_namespace(data):
    """Rebuild instance data so we can look it up by namespace.

    Note that the `representation` is added into the instance's
    data with a `representation` key.

    Args:
        data (dict): scene build data

    Returns:
        dict

    """
    result = {}
    # Add new assets
    for representation_id, instances in data.items():

        # Ensure we leave the source data unaltered
        instances = copy.deepcopy(instances)
        for instance in instances:
            instance['representation'] = representation_id
            result[instance['namespace']] = instance

    return result
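# Illustrative sketch (added for clarity, not part of the original module):
# the lookup flips the build data from "representation id -> instances" to
# "namespace -> instance", stamping each instance with its representation id.
# The id and namespace below are hypothetical.
#
#   _instances_by_namespace({"5f3e...a1": [{"namespace": "chair_01"}]})
#   # -> {"chair_01": {"namespace": "chair_01",
#   #                  "representation": "5f3e...a1"}}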
def get_contained_containers(container):
    """Get the Avalon containers in this container

    Args:
        container (dict): The container dict.

    Returns:
        list: A list of member container dictionaries.

    """
    from .pipeline import parse_container

    # Get avalon containers in this package setdress container
    containers = []
    members = cmds.sets(container['objectName'], query=True)
    for node in cmds.ls(members, type="objectSet"):
        try:
            member_container = parse_container(node)
            containers.append(member_container)
        except schema.ValidationError:
            pass

    return containers
def update_package_version(container, version):
    """Update package by version number

    Args:
        container (dict): container data of the container node
        version (int): the new version number of the package
            (use -1 to update to the latest version)

    Returns:
        None

    """
    # Versioning (from `core.maya.pipeline`)
    project_name = get_current_project_name()
    current_representation = get_representation_by_id(
        project_name, container["representation"]
    )

    assert current_representation is not None, "This is a bug"

    version_doc, subset_doc, asset_doc, project_doc = (
        get_representation_parents(project_name, current_representation)
    )

    if version == -1:
        new_version = get_last_version_by_subset_id(
            project_name, subset_doc["_id"]
        )
    else:
        new_version = get_version_by_name(
            project_name, version, subset_doc["_id"]
        )

    assert new_version is not None, "This is a bug"

    # Get the new representation (new file)
    new_representation = get_representation_by_name(
        project_name, current_representation["name"], new_version["_id"]
    )

    update_package(container, new_representation)
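# Illustrative usage sketch (added for clarity, not part of the original
# module): update a setdress container to the latest published version. The
# container dict would normally come from the host's container listing.
#
#   update_package_version(container, version=-1)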
def update_package(set_container, representation):
    """Update any matrix changes in the scene based on the new data

    Args:
        set_container (dict): container data from `ls()`
        representation (dict): the representation document from the database

    Returns:
        None

    """
    # Load the original package data
    project_name = get_current_project_name()
    current_representation = get_representation_by_id(
        project_name, set_container["representation"]
    )

    current_file = get_representation_path(current_representation)
    assert current_file.endswith(".json")
    with open(current_file, "r") as fp:
        current_data = json.load(fp)

    # Load the new package data
    new_file = get_representation_path(representation)
    assert new_file.endswith(".json")
    with open(new_file, "r") as fp:
        new_data = json.load(fp)

    # Update scene content
    containers = get_contained_containers(set_container)
    update_scene(set_container, containers, current_data, new_data, new_file)

    # TODO: This should be handled by the pipeline itself
    cmds.setAttr(set_container['objectName'] + ".representation",
                 str(representation['_id']), type="string")
def update_scene(set_container, containers, current_data, new_data, new_file):
    """Updates the hierarchy, assets and their matrix

    Updates the following within the scene:
        * Setdress hierarchy alembic
        * Matrix
        * Parenting
        * Representations

    It removes any assets which are not present in the new build data

    Args:
        set_container (dict): the setdress container of the scene
        containers (list): the list of containers under the setdress
            container
        current_data (dict): the current build data of the setdress
        new_data (dict): the new build data of the setdress
        new_file (str): path to the new build data file (.json)

    Returns:
        processed_containers (list): all new and updated containers

    """
    set_namespace = set_container['namespace']
    project_name = get_current_project_name()

    # Update the setdress hierarchy alembic
    set_root = get_container_transforms(set_container, root=True)
    set_hierarchy_root = cmds.listRelatives(set_root, fullPath=True)[0]
    set_hierarchy_reference = cmds.referenceQuery(set_hierarchy_root,
                                                  referenceNode=True)
    new_alembic = new_file.replace(".json", ".abc")
    assert os.path.exists(new_alembic), "%s does not exist." % new_alembic
    with unlocked(cmds.listRelatives(set_root, ad=True, fullPath=True)):
        cmds.file(new_alembic,
                  loadReference=set_hierarchy_reference,
                  type="Alembic")

    identity = DEFAULT_MATRIX[:]

    processed_namespaces = set()
    processed_containers = list()

    new_lookup = _instances_by_namespace(new_data)
    old_lookup = _instances_by_namespace(current_data)
    for container in containers:
        container_ns = container['namespace']

        # Consider it processed here; even if it fails we want to store
        # that the namespace was already available.
        processed_namespaces.add(container_ns)
        processed_containers.append(container['objectName'])

        if container_ns in new_lookup:
            root = get_container_transforms(container, root=True)
            if not root:
                log.error("Can't find root for %s", container['objectName'])
                continue

            old_instance = old_lookup.get(container_ns, {})
            new_instance = new_lookup[container_ns]

            # Update the matrix.
            # Check the matrix against the old data matrix to find local
            # overrides.
            current_matrix = cmds.xform(root,
                                        query=True,
                                        matrix=True,
                                        objectSpace=True)

            original_matrix = old_instance.get("matrix", identity)
            has_matrix_override = not matrix_equals(current_matrix,
                                                    original_matrix)

            if has_matrix_override:
                log.warning("Matrix override preserved on %s", container_ns)
            else:
                new_matrix = new_instance.get("matrix", identity)
                cmds.xform(root, matrix=new_matrix, objectSpace=True)

            # Update the parenting
            if old_instance.get("parent", None) != new_instance["parent"]:

                parent = to_namespace(new_instance['parent'], set_namespace)
                if not cmds.objExists(parent):
                    log.error("Can't find parent %s", parent)
                    continue

                # Set the new parent
                cmds.lockNode(root, lock=False)
                root = cmds.parent(root, parent, relative=True)
                cmds.lockNode(root, lock=True)

            # Update the representation
            representation_current = container['representation']
            representation_old = old_instance['representation']
            representation_new = new_instance['representation']
            has_representation_override = (representation_current !=
                                           representation_old)

            if representation_new != representation_current:

                if has_representation_override:
                    log.warning("Your scene had local representation "
                                "overrides within the set. New "
                                "representations not loaded for %s.",
                                container_ns)
                    continue

                # We check it against the current 'loader' in the scene
                # instead of the original data of the package that was
                # loaded, because an artist might have made scene local
                # overrides.
                if new_instance['loader'] != container['loader']:
                    log.warning("Loader is switched - local edits will be "
                                "lost. Removing: %s",
                                container_ns)

                    # Remove this from the "has been processed" list so it's
                    # considered a new element and added afterwards.
                    processed_containers.pop()
                    processed_namespaces.remove(container_ns)
                    remove_container(container)
                    continue

                # Check whether the conversion can be done by the Loader.
                # They *must* use the same asset, subset and Loader for
                # `update_container` to make sense.
                old = get_representation_by_id(
                    project_name, representation_current
                )
                new = get_representation_by_id(
                    project_name, representation_new
                )
                is_valid = compare_representations(old=old, new=new)
                if not is_valid:
                    log.error("Skipping: %s. See log for details.",
                              container_ns)
                    continue

                new_version = new["context"]["version"]
                update_container(container, version=new_version)

        else:
            # Remove this container because it's not in the new data
            log.warning("Removing content: %s", container_ns)
            remove_container(container)

    # Add new assets
    all_loaders = discover_loader_plugins()
    for representation_id, instances in new_data.items():

        # Find the compatible loaders
        loaders = loaders_from_representation(
            all_loaders, representation_id
        )
        for instance in instances:

            # Already processed in the update functionality above
            if instance['namespace'] in processed_namespaces:
                continue

            container = _add(instance=instance,
                             representation_id=representation_id,
                             loaders=loaders,
                             namespace=set_container['namespace'],
                             root=set_root)

            # Add to the setdress container
            cmds.sets(container,
                      addElement=set_container['objectName'])

            processed_containers.append(container)

    return processed_containers
def compare_representations(old, new):
    """Check if the old representation given can be updated

    Due to limitations of the `update_container` function we cannot allow
    differences in the following data:

    * Representation name (extension)
    * Asset name
    * Subset name (variation)

    If any of those values differs, the function logs an error and
    returns False.

    Args:
        old (dict): representation data from the database
        new (dict): representation data from the database

    Returns:
        bool: True if the representations are compatible, else False.

    """
    if new["name"] != old["name"]:
        log.error("Cannot switch extensions")
        return False

    new_context = new["context"]
    old_context = old["context"]

    if new_context["asset"] != old_context["asset"]:
        log.error("Changing assets between updates is "
                  "not supported.")
        return False

    if new_context["subset"] != old_context["subset"]:
        log.error("Changing subsets between updates is "
                  "not supported.")
        return False

    return True
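# Illustrative sketch (added for clarity, not part of the original module):
# two hypothetical representation documents that would fail the comparison
# because their subsets differ.
#
#   old = {"name": "abc",
#          "context": {"asset": "chair", "subset": "modelMain"}}
#   new = {"name": "abc",
#          "context": {"asset": "chair", "subset": "modelProxy"}}
#   compare_representations(old=old, new=new)  # -> False, logs an error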