import copy
import json
import collections
import uuid
import datetime

from bson.objectid import ObjectId
from ayon_api import get_server_api_connection

from openpype.client.operations_base import (
    REMOVED_VALUE,
    CreateOperation,
    UpdateOperation,
    DeleteOperation,
    BaseOperationsSession
)

from openpype.client.mongo.operations import (
    CURRENT_THUMBNAIL_SCHEMA,
    CURRENT_REPRESENTATION_SCHEMA,
    CURRENT_HERO_VERSION_SCHEMA,
    CURRENT_VERSION_SCHEMA,
    CURRENT_SUBSET_SCHEMA,
    CURRENT_ASSET_DOC_SCHEMA,
    CURRENT_PROJECT_SCHEMA,
)

from .conversion_utils import (
    convert_create_asset_to_v4,
    convert_create_task_to_v4,
    convert_create_subset_to_v4,
    convert_create_version_to_v4,
    convert_create_hero_version_to_v4,
    convert_create_representation_to_v4,
    convert_create_workfile_info_to_v4,

    convert_update_folder_to_v4,
    convert_update_subset_to_v4,
    convert_update_version_to_v4,
    convert_update_hero_version_to_v4,
    convert_update_representation_to_v4,
    convert_update_workfile_info_to_v4,
)
from .utils import create_entity_id


def _create_or_convert_to_id(entity_id=None):
    if entity_id is None:
        return create_entity_id()

    if isinstance(entity_id, ObjectId):
        raise TypeError("Type of 'ObjectId' is not supported anymore.")

    # Validate that the id can be converted to uuid
    uuid.UUID(entity_id)
    return entity_id


def new_project_document(
    project_name, project_code, config, data=None, entity_id=None
):
    """Create skeleton data of project document.

    Args:
        project_name (str): Name of project. Used as identifier of a project.
        project_code (str): Shorter version of project name without spaces
            and special characters (in most cases). Should also be considered
            a unique name across projects.
        config (Dict[str, Any]): Project config consisting of roots,
            templates, applications and other project Anatomy related data.
        data (Dict[str, Any]): Project data with information about its
            attributes (e.g. 'fps' etc.) or integration specific keys.
        entity_id (Union[str, ObjectId]): Predefined id of document. New id is
            created if not passed.

    Returns:
        Dict[str, Any]: Skeleton of project document.
    """

    if data is None:
        data = {}

    data["code"] = project_code

    return {
        "_id": _create_or_convert_to_id(entity_id),
        "name": project_name,
        "type": CURRENT_PROJECT_SCHEMA,
        "entity_data": data,
        "config": config
    }


def new_asset_document(
    name, project_id, parent_id, parents, data=None, entity_id=None
):
    """Create skeleton data of asset document.

    Args:
        name (str): Is considered as unique identifier of asset in project.
        project_id (Union[str, ObjectId]): Id of project document.
        parent_id (Union[str, ObjectId]): Id of parent asset.
        parents (List[str]): List of parent assets names.
        data (Dict[str, Any]): Asset document data. Empty dictionary is used
            if not passed. Value of 'parent_id' is used to fill
            'visualParent'.
        entity_id (Union[str, ObjectId]): Predefined id of document. New id is
            created if not passed.

    Returns:
        Dict[str, Any]: Skeleton of asset document.
    """

    if data is None:
        data = {}
    if parent_id is not None:
        parent_id = _create_or_convert_to_id(parent_id)
    data["visualParent"] = parent_id
    data["parents"] = parents

    return {
        "_id": _create_or_convert_to_id(entity_id),
        "type": "asset",
        "name": name,
        # This will be ignored
        "parent": project_id,
        "data": data,
        "schema": CURRENT_ASSET_DOC_SCHEMA
    }


def new_subset_document(name, family, asset_id, data=None, entity_id=None):
    """Create skeleton data of subset document.

    Args:
        name (str): Is considered as unique identifier of subset under asset.
        family (str): Subset's family.
        asset_id (Union[str, ObjectId]): Id of parent asset.
        data (Dict[str, Any]): Subset document data. Empty dictionary is used
            if not passed. Value of 'family' is used to fill 'family'.
        entity_id (Union[str, ObjectId]): Predefined id of document. New id is
            created if not passed.

    Returns:
        Dict[str, Any]: Skeleton of subset document.
    """

    if data is None:
        data = {}
    data["family"] = family
    return {
        "_id": _create_or_convert_to_id(entity_id),
        "schema": CURRENT_SUBSET_SCHEMA,
        "type": "subset",
        "name": name,
        "data": data,
        "parent": _create_or_convert_to_id(asset_id)
    }


def new_version_doc(version, subset_id, data=None, entity_id=None):
    """Create skeleton data of version document.

    Args:
        version (int): Is considered as unique identifier of version
            under subset.
        subset_id (Union[str, ObjectId]): Id of parent subset.
        data (Dict[str, Any]): Version document data.
        entity_id (Union[str, ObjectId]): Predefined id of document. New id is
            created if not passed.

    Returns:
        Dict[str, Any]: Skeleton of version document.
    """

    if data is None:
        data = {}

    return {
        "_id": _create_or_convert_to_id(entity_id),
        "schema": CURRENT_VERSION_SCHEMA,
        "type": "version",
        "name": int(version),
        "parent": _create_or_convert_to_id(subset_id),
        "data": data
    }


def new_hero_version_doc(subset_id, data, version=None, entity_id=None):
    """Create skeleton data of hero version document.

    Args:
        subset_id (Union[str, ObjectId]): Id of parent subset.
        data (Dict[str, Any]): Version document data.
        version (int): Version of source version.
        entity_id (Union[str, ObjectId]): Predefined id of document. New id is
            created if not passed.

    Returns:
        Dict[str, Any]: Skeleton of hero version document.
    """

    if version is None:
        version = -1
    elif version > 0:
        version = -version

    return {
        "_id": _create_or_convert_to_id(entity_id),
        "schema": CURRENT_HERO_VERSION_SCHEMA,
        "type": "hero_version",
        "version": version,
        "parent": _create_or_convert_to_id(subset_id),
        "data": data
    }


def new_representation_doc(
    name, version_id, context, data=None, entity_id=None
):
    """Create skeleton data of representation document.

    Args:
        name (str): Representation name considered as unique identifier
            of representation under version.
        version_id (Union[str, ObjectId]): Id of parent version.
        context (Dict[str, Any]): Representation context used to fill
            templates or to query the representation.
        data (Dict[str, Any]): Representation document data.
        entity_id (Union[str, ObjectId]): Predefined id of document. New id is
            created if not passed.

    Returns:
        Dict[str, Any]: Skeleton of representation document.
    """

    if data is None:
        data = {}

    return {
        "_id": _create_or_convert_to_id(entity_id),
        "schema": CURRENT_REPRESENTATION_SCHEMA,
        "type": "representation",
        "parent": _create_or_convert_to_id(version_id),
        "name": name,
        "data": data,

        # Imprint shortcut to context for performance reasons.
        "context": context
    }


def new_thumbnail_doc(data=None, entity_id=None):
    """Create skeleton data of thumbnail document.

    Args:
        data (Dict[str, Any]): Thumbnail document data.
        entity_id (Union[str, ObjectId]): Predefined id of document. New id is
            created if not passed.

    Returns:
        Dict[str, Any]: Skeleton of thumbnail document.
    """

    if data is None:
        data = {}

    return {
        "_id": _create_or_convert_to_id(entity_id),
        "type": "thumbnail",
        "schema": CURRENT_THUMBNAIL_SCHEMA,
        "data": data
    }


def new_workfile_info_doc(
    filename, asset_id, task_name, files, data=None, entity_id=None
):
    """Create skeleton data of workfile info document.

    Workfile document is at this moment used primarily for artist notes.

    Args:
        filename (str): Filename of workfile.
        asset_id (Union[str, ObjectId]): Id of asset under which the workfile
            lives.
        task_name (str): Task under which the workfile was created.
        files (List[str]): List of rootless filepaths related to workfile.
        data (Dict[str, Any]): Additional metadata.
        entity_id (Union[str, ObjectId]): Predefined id of document. New id is
            created if not passed.

    Returns:
        Dict[str, Any]: Skeleton of workfile info document.
    """

    if not data:
        data = {}

    return {
        "_id": _create_or_convert_to_id(entity_id),
        "type": "workfile",
        "parent": _create_or_convert_to_id(asset_id),
        "task_name": task_name,
        "filename": filename,
        "data": data,
        "files": files
    }


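# Illustrative sketch (not part of the module's public API): how the skeleton
# helpers above are typically chained together into a subset -> version ->
# representation hierarchy. The names, family and representation context
# below are made-up placeholder values.
def _example_build_publish_skeletons(asset_id):
    subset_doc = new_subset_document("renderMain", "render", asset_id)
    version_doc = new_version_doc(1, subset_doc["_id"])
    repre_doc = new_representation_doc(
        "exr",
        version_doc["_id"],
        # Context is only a shortcut for template filling/queries
        {"asset": "sh010", "subset": "renderMain", "version": 1, "ext": "exr"}
    )
    return subset_doc, version_doc, repre_doc

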
def _prepare_update_data(old_doc, new_doc, replace):
    changes = {}
    for key, value in new_doc.items():
        if key not in old_doc or value != old_doc[key]:
            changes[key] = value

    if replace:
        for key in old_doc.keys():
            if key not in new_doc:
                changes[key] = REMOVED_VALUE
    return changes


def prepare_subset_update_data(old_doc, new_doc, replace=True):
    """Compare two subset documents and prepare update data.

    Based on compared values creates update data for 'UpdateOperation'.

    Empty output means that documents are identical.

    Returns:
        Dict[str, Any]: Changes between old and new document.
    """

    return _prepare_update_data(old_doc, new_doc, replace)


def prepare_version_update_data(old_doc, new_doc, replace=True):
    """Compare two version documents and prepare update data.

    Based on compared values creates update data for 'UpdateOperation'.

    Empty output means that documents are identical.

    Returns:
        Dict[str, Any]: Changes between old and new document.
    """

    return _prepare_update_data(old_doc, new_doc, replace)


def prepare_hero_version_update_data(old_doc, new_doc, replace=True):
    """Compare two hero version documents and prepare update data.

    Based on compared values creates update data for 'UpdateOperation'.

    Empty output means that documents are identical.

    Returns:
        Dict[str, Any]: Changes between old and new document.
    """

    return _prepare_update_data(old_doc, new_doc, replace)


def prepare_representation_update_data(old_doc, new_doc, replace=True):
    """Compare two representation documents and prepare update data.

    Based on compared values creates update data for 'UpdateOperation'.

    Empty output means that documents are identical.

    Returns:
        Dict[str, Any]: Changes between old and new document.
    """

    return _prepare_update_data(old_doc, new_doc, replace)


def prepare_workfile_info_update_data(old_doc, new_doc, replace=True):
    """Compare two workfile info documents and prepare update data.

    Based on compared values creates update data for 'UpdateOperation'.

    Empty output means that documents are identical.

    Returns:
        Dict[str, Any]: Changes between old and new document.
    """

    return _prepare_update_data(old_doc, new_doc, replace)


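# Illustrative sketch: what the 'prepare_*_update_data' helpers return. With
# 'replace=True', keys present only in the old document are marked with
# 'REMOVED_VALUE', which 'ServerUpdateOperation' later converts to 'None'.
# The documents below are made-up minimal examples.
def _example_prepare_update_data():
    old_doc = {
        "name": "renderMain",
        "data": {"frameStart": 1001},
        "obsolete": 1,
    }
    new_doc = {"name": "renderMain", "data": {"frameStart": 1005}}
    changes = prepare_subset_update_data(old_doc, new_doc)
    # changes == {"data": {"frameStart": 1005}, "obsolete": REMOVED_VALUE}
    return changes

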
class FailedOperations(Exception):
    pass


def entity_data_json_default(value):
    if isinstance(value, datetime.datetime):
        return int(value.timestamp())

    raise TypeError(
        "Object of type {} is not JSON serializable".format(str(type(value)))
    )


def failed_json_default(value):
    return "< Failed value {} > {}".format(type(value), str(value))


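# Illustrative sketch: the json 'default' hooks above in action. Datetime
# values are serialized as integer timestamps; any other unsupported value
# raises so that invalid objects (e.g. 'ObjectId') are caught before the
# payload is sent to server. The payload below is a made-up example.
def _example_json_defaults():
    payload = {"createdAt": datetime.datetime(2023, 1, 1)}
    return json.dumps(payload, default=entity_data_json_default)

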
class ServerCreateOperation(CreateOperation):
    """Operation to create an entity.

    Args:
        project_name (str): On which project operation will happen.
        entity_type (str): Type of entity on which change happens.
            e.g. 'asset', 'representation' etc.
        data (Dict[str, Any]): Data of entity that will be created.
        session (OperationsSession): Session under which the operation is
            created.
    """

    def __init__(self, project_name, entity_type, data, session):
        self._session = session

        if not data:
            data = {}
        data = copy.deepcopy(data)
        if entity_type == "project":
            raise ValueError("Project cannot be created using operations")

        tasks = None
        if entity_type == "asset":
            # TODO handle tasks
            entity_type = "folder"
            if "data" in data:
                tasks = data["data"].get("tasks")

            project = self._session.get_project(project_name)
            new_data = convert_create_asset_to_v4(data, project, self.con)

        elif entity_type == "task":
            project = self._session.get_project(project_name)
            new_data = convert_create_task_to_v4(data, project, self.con)

        elif entity_type == "subset":
            new_data = convert_create_subset_to_v4(data, self.con)

        elif entity_type == "version":
            new_data = convert_create_version_to_v4(data, self.con)

        elif entity_type == "hero_version":
            new_data = convert_create_hero_version_to_v4(
                data, project_name, self.con
            )
            entity_type = "version"

        elif entity_type in ("representation", "archived_representation"):
            new_data = convert_create_representation_to_v4(data, self.con)
            entity_type = "representation"

        elif entity_type == "workfile":
            new_data = convert_create_workfile_info_to_v4(
                data, project_name, self.con
            )

        else:
            raise ValueError(
                "Unhandled entity type \"{}\"".format(entity_type)
            )

        # Simple check if data can be dumped into json
        # - should raise error on 'ObjectId' object
        try:
            new_data = json.loads(
                json.dumps(new_data, default=entity_data_json_default)
            )

        except Exception:
            raise ValueError("Couldn't json parse body: {}".format(
                json.dumps(new_data, default=failed_json_default)
            ))

        super(ServerCreateOperation, self).__init__(
            project_name, entity_type, new_data
        )

        if "id" not in self._data:
            self._data["id"] = create_entity_id()

        if tasks:
            copied_tasks = copy.deepcopy(tasks)
            for task_name, task in copied_tasks.items():
                task["name"] = task_name
                task["folderId"] = self._data["id"]
                self.session.create_entity(
                    project_name, "task", task, nested_id=self.id
                )

    @property
    def con(self):
        return self.session.con

    @property
    def session(self):
        return self._session

    @property
    def entity_id(self):
        return self._data["id"]

    def to_server_operation(self):
        return {
            "id": self.id,
            "type": "create",
            "entityType": self.entity_type,
            "entityId": self.entity_id,
            "data": self._data
        }


class ServerUpdateOperation(UpdateOperation):
    """Operation to update an entity.

    Args:
        project_name (str): On which project operation will happen.
        entity_type (str): Type of entity on which change happens.
            e.g. 'asset', 'representation' etc.
        entity_id (Union[str, ObjectId]): Identifier of an entity.
        update_data (Dict[str, Any]): Key -> value changes that will be set in
            database. If value is set to 'REMOVED_VALUE' the key will be
            removed. Only first level of dictionary is checked (on purpose).
        session (OperationsSession): Session under which the operation is
            created.
    """

    def __init__(
        self, project_name, entity_type, entity_id, update_data, session
    ):
        self._session = session

        update_data = copy.deepcopy(update_data)
        if entity_type == "project":
            raise ValueError("Project cannot be updated using operations")

        if entity_type in ("asset", "archived_asset"):
            new_update_data = convert_update_folder_to_v4(
                project_name, entity_id, update_data, self.con
            )
            entity_type = "folder"

        elif entity_type == "subset":
            new_update_data = convert_update_subset_to_v4(
                project_name, entity_id, update_data, self.con
            )

        elif entity_type == "version":
            new_update_data = convert_update_version_to_v4(
                project_name, entity_id, update_data, self.con
            )

        elif entity_type == "hero_version":
            new_update_data = convert_update_hero_version_to_v4(
                project_name, entity_id, update_data, self.con
            )
            entity_type = "version"

        elif entity_type in ("representation", "archived_representation"):
            new_update_data = convert_update_representation_to_v4(
                project_name, entity_id, update_data, self.con
            )
            entity_type = "representation"

        elif entity_type == "workfile":
            new_update_data = convert_update_workfile_info_to_v4(
                project_name, entity_id, update_data, self.con
            )

        else:
            raise ValueError(
                "Unhandled entity type \"{}\"".format(entity_type)
            )

        try:
            new_update_data = json.loads(
                json.dumps(new_update_data, default=entity_data_json_default)
            )

        except Exception:
            raise ValueError("Couldn't json parse body: {}".format(
                json.dumps(new_update_data, default=failed_json_default)
            ))

        super(ServerUpdateOperation, self).__init__(
            project_name, entity_type, entity_id, new_update_data
        )

    @property
    def con(self):
        return self.session.con

    @property
    def session(self):
        return self._session

    def to_server_operation(self):
        if not self._update_data:
            return None

        update_data = {}
        for key, value in self._update_data.items():
            if value is REMOVED_VALUE:
                value = None
            update_data[key] = value

        return {
            "id": self.id,
            "type": "update",
            "entityType": self.entity_type,
            "entityId": self.entity_id,
            "data": update_data
        }


class ServerDeleteOperation(DeleteOperation):
    """Operation to delete an entity.

    Args:
        project_name (str): On which project operation will happen.
        entity_type (str): Type of entity on which change happens.
            e.g. 'asset', 'representation' etc.
        entity_id (Union[str, ObjectId]): Entity id that will be removed.
        session (OperationsSession): Session under which the operation is
            created.
    """

    def __init__(self, project_name, entity_type, entity_id, session):
        self._session = session

        if entity_type == "asset":
            entity_type = "folder"

        if entity_type == "hero_version":
            entity_type = "version"

        super(ServerDeleteOperation, self).__init__(
            project_name, entity_type, entity_id
        )

    @property
    def con(self):
        return self.session.con

    @property
    def session(self):
        return self._session

    def to_server_operation(self):
        return {
            "id": self.id,
            "type": self.operation_name,
            "entityId": self.entity_id,
            "entityType": self.entity_type,
        }


class OperationsSession(BaseOperationsSession):
    def __init__(self, con=None, *args, **kwargs):
        super(OperationsSession, self).__init__(*args, **kwargs)
        if con is None:
            con = get_server_api_connection()
        self._con = con
        self._project_cache = {}
        self._nested_operations = collections.defaultdict(list)

    @property
    def con(self):
        return self._con

    def get_project(self, project_name):
        if project_name not in self._project_cache:
            self._project_cache[project_name] = self.con.get_project(
                project_name)
        return copy.deepcopy(self._project_cache[project_name])

    def commit(self):
        """Commit session operations."""

        operations, self._operations = self._operations, []
        if not operations:
            return

        operations_by_project = collections.defaultdict(list)
        for operation in operations:
            operations_by_project[operation.project_name].append(operation)

        body_by_id = {}
        results = []
        for project_name, operations in operations_by_project.items():
            operations_body = []
            for operation in operations:
                body = operation.to_server_operation()
                if body is not None:
                    try:
                        json.dumps(body)
                    except Exception:
                        raise ValueError("Couldn't json parse body: {}".format(
                            json.dumps(
                                body, indent=4, default=failed_json_default
                            )
                        ))

                    body_by_id[operation.id] = body
                    operations_body.append(body)

            if operations_body:
                result = self._con.post(
                    "projects/{}/operations".format(project_name),
                    operations=operations_body,
                    canFail=False
                )
                results.append(result.data)

        for result in results:
            if result.get("success"):
                continue

            if "operations" not in result:
                raise FailedOperations(
                    "Operation failed. Content: {}".format(str(result))
                )

            for op_result in result["operations"]:
                if not op_result["success"]:
                    operation_id = op_result["id"]
                    raise FailedOperations((
                        "Operation \"{}\" failed with data:\n{}\nError: {}."
                    ).format(
                        operation_id,
                        json.dumps(body_by_id[operation_id], indent=4),
                        op_result.get("error", "unknown"),
                    ))

    def create_entity(self, project_name, entity_type, data, nested_id=None):
        """Fast access to 'ServerCreateOperation'.

        Args:
            project_name (str): On which project the creation happens.
            entity_type (str): Which entity type will be created.
            data (Dict[str, Any]): Entity data.
            nested_id (str): Id of the operation that triggered this
                operation. Operations can trigger suboperations, but those
                must be added to the operations list after their parent is
                added.

        Returns:
            ServerCreateOperation: Object of create operation.
        """

        operation = ServerCreateOperation(
            project_name, entity_type, data, self
        )

        if nested_id:
            self._nested_operations[nested_id].append(operation)
        else:
            self.add(operation)
            if operation.id in self._nested_operations:
                self.extend(self._nested_operations.pop(operation.id))

        return operation

    def update_entity(
        self, project_name, entity_type, entity_id, update_data, nested_id=None
    ):
        """Fast access to 'ServerUpdateOperation'.

        Returns:
            ServerUpdateOperation: Object of update operation.
        """

        operation = ServerUpdateOperation(
            project_name, entity_type, entity_id, update_data, self
        )
        if nested_id:
            self._nested_operations[nested_id].append(operation)
        else:
            self.add(operation)
            if operation.id in self._nested_operations:
                self.extend(self._nested_operations.pop(operation.id))
        return operation

    def delete_entity(
        self, project_name, entity_type, entity_id, nested_id=None
    ):
        """Fast access to 'ServerDeleteOperation'.

        Returns:
            ServerDeleteOperation: Object of delete operation.
        """

        operation = ServerDeleteOperation(
            project_name, entity_type, entity_id, self
        )
        if nested_id:
            self._nested_operations[nested_id].append(operation)
        else:
            self.add(operation)
            if operation.id in self._nested_operations:
                self.extend(self._nested_operations.pop(operation.id))
        return operation


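# Illustrative usage sketch of 'OperationsSession'. Entity types come from the
# operation classes above; the entity ids and update values are placeholders,
# and a logged-in ayon_api connection is expected to exist.
def _example_operations_session(project_name, folder_id, version_id):
    session = OperationsSession()
    session.create_entity(
        project_name,
        "representation",
        new_representation_doc("exr", version_id, {"ext": "exr"})
    )
    session.update_entity(
        project_name, "version", version_id, {"data.comment": "Reviewed"}
    )
    session.delete_entity(project_name, "asset", folder_id)
    # Nothing is sent until commit; operations are grouped per project and
    # posted to the 'projects/<name>/operations' endpoint in one batch.
    session.commit()

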
def create_project(
    project_name,
    project_code,
    library_project=False,
    preset_name=None,
    con=None
):
    """Create project using OpenPype settings.

    This project creation function does not validate the project document on
    creation. The project document is created blindly with only the minimum
    required information about the project, which are its name, code, type
    and schema.

    Entered project name must be unique and project must not exist yet.

    Note:
        This function is here to be OP v4 ready, but in v3 it has more logic
        to do. That's why inner imports are in the body.

    Args:
        project_name (str): New project name. Should be unique.
        project_code (str): Project's code should be unique too.
        library_project (bool): Project is library project.
        preset_name (str): Name of anatomy preset. Default is used if not
            passed.
        con (ServerAPI): Connection to server with logged user.

    Raises:
        ValueError: When project name already exists.

    Returns:
        dict: Created project document.
    """

    if con is None:
        con = get_server_api_connection()

    return con.create_project(
        project_name,
        project_code,
        library_project,
        preset_name
    )


def delete_project(project_name, con=None):
    if con is None:
        con = get_server_api_connection()

    return con.delete_project(project_name)


def create_thumbnail(project_name, src_filepath, con=None):
    if con is None:
        con = get_server_api_connection()
    return con.create_thumbnail(project_name, src_filepath)
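

# Illustrative usage sketch of the project level helpers above. The project
# name, code and thumbnail path are placeholders; 'con' defaults to the
# global ayon_api connection, which must already be authenticated.
def _example_project_helpers():
    project = create_project("demo_Commercial", "dmc")
    thumbnail_id = create_thumbnail("demo_Commercial", "/tmp/thumbnail.jpg")
    delete_project("demo_Commercial")
    return project, thumbnail_id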