General: Connect to AYON server (base) (#3924)

* implemented 'get_workfile_info' in entities

* removed 'prepare_asset_update_data' which is not used

* disable settings and project manager if in v4 mode

* prepared conversion helper functions for v4 entities

* prepared conversion functions for hero versions

* fix hero versions

* implemented get_archived_representations

* fix get latest versions

* return prepared changes

* handle archived representation

* raise exception on failed json conversion

* map archived to active properly

* make sure default fields are added

* fix conversion of hero version entity

* fix conversion of archived representations

* fix some conversions of representations and versions

* changed active behavior in queries

* fixed hero versions

* implemented basic thumbnail caching
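The basic thumbnail caching could look roughly like this minimal in-memory sketch (class and method names are illustrative, not the actual client API, which caches to disk):

```python
class ThumbnailCache:
    """Minimal sketch of a thumbnail cache keyed by (project, thumbnail id)."""

    def __init__(self):
        self._cache = {}

    def store(self, project_name, thumbnail_id, content):
        # Cache raw thumbnail bytes for later lookups.
        self._cache[(project_name, thumbnail_id)] = content

    def get(self, project_name, thumbnail_id):
        # Return cached bytes, or None when not cached yet.
        return self._cache.get((project_name, thumbnail_id))
```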

* added raw variants of crud methods

* implemented methods to get and create thumbnail

* fix from flat dict

* implemented some basic folder conversion for updates

* fix thumbnail updates for version

* implemented v4 thumbnail integrator

* simplified data mapping

* 'get_thumbnail' function also expects the entity type and entity id for which the thumbnail is received

* implemented 'get_thumbnail' for server

* fix how thumbnail id is received from entity

* removed unnecessary method 'get_thumbnail_id_from_source'

* implemented thumbnail resolver for v4

* removed unnecessary print

* move create and delete project directly to server api

* disable local settings action too on v4

* OP-3521 - added method to check and download updated addons from v4 server

* OP-3521 - added more descriptive error message for missing source

* OP-3521 - added default implementation of addon downloader to import

* OP-3521 - added check for dependency package zips

WIP - server doesn't contain required endpoint. Testing only with mockup data for now.

* OP-3521 - fixed parsing of DependencyItem

Added Server Url type and ServerAddonDownloader - the v4 server doesn't know its own DNS for static files, so it sends a unique name and the URL must be created at runtime.
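Because the server sends only a relative resource name, the absolute download URL has to be assembled at runtime from the configured server URL. A minimal sketch of that joining step (the function name is hypothetical):

```python
from urllib.parse import urljoin


def build_source_url(server_url: str, resource_path: str) -> str:
    """Join the configured server URL with a server-relative resource path.

    The v4 server does not know its public DNS name, so download sources
    carry only a relative path and the full URL is built at runtime.
    """
    # Normalize so exactly one slash separates base and path.
    return urljoin(server_url.rstrip("/") + "/", resource_path.lstrip("/"))
```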

* OP-3521 - fixed creation of target directories

* change env keys to look for and don't set them automatically

* fix task type conversion

* implemented base of loading v4 addons in v3

* Refactored argument name in Downloaders

* Updated parsing to DependencyItem according to current schema

* Implemented downloading of package from server

* Updated resolving of failures

Uses Enum items.

* Introduced passing of authorization token

Better to inject it than to have it from env var.

* Remove weird parsing of server_url

Not necessary, endpoints have same prefix.

* Fix doubling asset version name in addons folder

Zip file should already contain `addonName_addonVersion` as first subfolder
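Since the archive already carries the `addonName_addonVersion` folder as its first subfolder, extraction goes directly into the addons root. A sketch of that step (the helper name is illustrative):

```python
import zipfile
from pathlib import Path


def extract_addon(zip_path, addons_root):
    """Extract an addon zip directly into the addons root.

    The archive is expected to already contain an
    '<addonName>_<addonVersion>' top-level folder, so extracting into a
    per-addon subfolder would double the name
    (e.g. 'foo_1.0.0/foo_1.0.0/...'). Returns the extracted addon folder.
    """
    with zipfile.ZipFile(zip_path) as zf:
        # First path component of any member is the addon folder name.
        top_level = {name.split("/", 1)[0] for name in zf.namelist()}
        zf.extractall(addons_root)
    return Path(addons_root) / next(iter(top_level))
```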


* Made server_endpoint optional

Argument should be better for testing, but for calling from separate methods it would be better to encapsulate it.

Removed unwanted temporary productionPackage value

* Use existing method to pull addon info from Server to load v4 version of addon

* Raise exception when server doesn't have any production dependency package

* added ability to specify v3 alias of addon name

* expect v3_alias as an uppercased constant

* Re-implemented method to get addon info

Previous implementation wouldn't work in Python2 hosts.
Will be refactored in the future.

* fix '__getattr__'

* added ayon api to pyproject.toml and lock file

* use ayon api in common connection

* added mapping for label

* use ayon_api in client codebase

* separated clearing cache of url and username

* bump ayon api version

* rename env 'OP4_TEST' to 'USE_AYON_SERVER'

* Moved and renamed get_addons_info to get_addons_info_as_dict in addon_distribution

Should be moved to ayon_api later

* Replaced requests calls with ayon_api

* Replaced OP4_TEST_ENABLED with AYON_SERVER_ENABLED

fixed endpoints

* Hound

* Hound

* OP-3521 - fix wrong key in get_representation_parents

parents overloads parents

* OP-3521 - changes for v4 of SiteSync addon

* OP-3521 - fix names

* OP-3521 - remove storing project_name

It should be safer to go through self.dbcon apparently

* OP-3521 - remove unwanted

"context["folder"]" can be only in dummy test data

* OP-3521 - move site sync loaders to addon

* Use only project instead of self.project

* OP-3521 - added missing get_progress_for_repre

* base of settings conversion script

* simplified ayon functions in start.py

* added loading of settings from ayon server

* added a note about colors

* fix global and local settings functions

* AvalonMongoDB is not using mongo connection when ayon server is enabled

* 'get_dynamic_modules_dirs' is not checking system settings for paths in settings

* log viewer is disabled when ayon server is enabled

* basic logic of enabling/disabling addons

* don't use mongo logging if ayon server is enabled

* update ayon api

* bump ayon api again

* use ayon_api to get addons info in modules/base

* update ayon api

* moved helper functions to get addons and dependencies dir to common functions

* Initialization of AddonInfo is not crashing on unknown sources

* renamed 'DependencyDownloader' to 'AyonServerDownloader'

* renamed function 'default_addon_downloader' to 'get_default_addon_downloader'

* Added ability to convert 'WebAddonSource' to 'ServerResourceSorce'

* missing dependency package on server won't cause crash

* data sent to downloaders doesn't contain AYON-specific headers

* modified addon distribution to not duplicate 'ayon_api' functionality

* fix doubled function definition

* unzip client file to addon destination

* formatting - unify quotes

* disable usage of mongo connection if in ayon mode

* renamed window.py to login_window.py

* added webpublisher settings conversion

* added maya conversion function

* reuse variable

* reuse variable (similar to previous commit)

* fix ayon addons loading

* fix typo 'AyonSettingsCahe' -> 'AyonSettingsCache'

* fix enabled state changes

* fix rr_path in royal render conversion

* avoid mongo calls in AYON state

* implemented custom AYON start script

* fix formatting (after black)

* ayon_start cleanup

* 'get_addons_dir' and 'get_dependencies_dir' store value to environment variable

* add docstrings to local dir functions

* addon info has full name

* fix modules enabled states

* removed unused 'run_disk_mapping_commands'

* removed ayon logic from 'start.py'

* fix warning message

* renamed 'openpype_common' to 'ayon_common'

* removed unused import

* don't import igniter

* removed startup validations of third parties

* change what's shown in version info

* fix which keys are applied from ayon values

* fix method name

* get applications from attribs

* Implemented UI basics to be able change user or logout

* merged server.py and credentials.py

* add more metadata to urls

* implemented change token

* implemented change user ui functionality

* implemented change user ui

* modify window to handle username and token value

* pass username to add server

* fix show UI cases

* added login action to tray

* update ayon api

* added missing dependency

* convert applications to config in the right way

* initial implementation of 'nuke' settings conversion

* removed few nuke comments

* implemented hiero conversion

* added imageio conversion

* added run ayon tray script

* fix few settings conversions

* Renamed class of source classes as they are not just for addons

* implemented object to track source transfer progress

* Implemented distribution item with multiple sources

* Implemented ayon distribution wrapper to care about multiple things during distribution

* added 'cleanup' method for downloaders

* download gets transfer progress object

* Change UploadState enum

* added missing imports

* use AyonDistribution in ayon_start.py

* removed unused functions

* removed implemented TODOs

* fix import

* fix key used for Web source

* removed temp development fix

* formatting fix

* keep information if source requires distribution

* handle 'require_distribution' attribute in distribution process

* added path attribute to server source

* added option to pass addons info to ayon distribution

* fix tests

* fix formatting

* Fix typo

* Fix typo

* remove '_try_convert_to_server_source'

* renamed attributes and methods to match their content

* it is possible to pass dependency package info to AyonDistribution

* fix called methods in tests

* added public properties for error message and error detail

* Added filename to WebSourceInfo

Useful for GDrive sharable links where target file name is unknown/unparsable, it should be provided explicitly.
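A sketch of how an explicit filename can take precedence over URL parsing (the class here is a simplified stand-in; attribute names are assumptions, not the actual distribution API):

```python
import os
from dataclasses import dataclass
from typing import Optional
from urllib.parse import urlparse


@dataclass
class WebSourceInfo:
    """Simplified web download source with an optional explicit filename."""

    url: str
    filename: Optional[str] = None  # explicit name for unparsable links

    def resolve_filename(self):
        # Prefer the explicit filename (e.g. GDrive share links carry no
        # usable name in the URL path), else fall back to parsing the URL.
        if self.filename:
            return self.filename
        return os.path.basename(urlparse(self.url).path)
```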

* unify source conversion by adding 'convert_source' function

* Fix error message

Co-authored-by: Roy Nieterau <roy_nieterau@hotmail.com>

* added docstring for 'transfer_progress'

* don't create metadata file on read

* added few docstrings

* add default folder fields to folder/task queries

* fix generators

* add dependencies when running from code

* add sys paths from distribution to pythonpath env

* fix missing applications

* added missing conversions for maya renderers

* fix formatting

* update ayon api

* fix hashes in lock file

* Use better exception

Co-authored-by: Ondřej Samohel <33513211+antirotor@users.noreply.github.com>

* Use Python 3 syntax

Co-authored-by: Ondřej Samohel <33513211+antirotor@users.noreply.github.com>

* apply some of suggested changes in ayon_start

* added some docstrings and suggested modifications

* copy create env from develop

* fix rendersettings conversion

* change code by suggestions

* added missing args to docstring

* added missing docstrings

* separated downloader and download factory

* fix ayon settings

* added some basic file docstring to ayon_settings

* join else conditions

* fix project settings conversion

* fix created at conversion

* fix workfile info query

* fix publisher UI

* added utils function 'get_ayon_appdirs'

* fix 'get_all_current_info'

* fix server url assignment when url is set

* updated ayon api

* added utils functions to create local site id for ayon

* added helper functions to create global connection

* create global connection in ayon start to start use site id

* use ayon site id in ayon mode

* formatting cleanup

* added header docstring

* fixes after ayon_api update

* load addons from ynput appdirs

* fix function call

* added docstring

* update ayon python api

* fix settings access

* use ayon_api to get root overrides in Anatomy

* bump ayon version to 0.1.13

* nuke: fixing settings keys from settings

* fix burnins definitions

* change v4 to AYON in thumbnail integrate

* fix one more v4 information

* Fixes after rebase

* fix extract burnin conversion

* additional fix of extract burnin

* SiteSync: added missing loaders for v3 compatibility (#4587)

* Added site sync loaders for v3 compatibility

* Fix get_progress_for_repre

* use 'files.name' instead of 'files.baseName'

* update ayon api to 0.1.14

* add common to include files

* change arguments for hero version creation

* skip shotgrid settings conversion if different ayon addon is used

* added ayon icons

* fix labels of application variants

* added option to show login window always on top

* login window on invalid credentials is always on top

* update ayon api

* update ayon api

* add entityType to project and folders

* AYON: Editorial hierarchy creation (#4699)

* disable extract hierarchy avalon when ayon mode is enabled

* implemented extract hierarchy to AYON

---------

Co-authored-by: Petr Kalis <petr.kalis@gmail.com>
Co-authored-by: Roy Nieterau <roy_nieterau@hotmail.com>
Co-authored-by: Ondřej Samohel <33513211+antirotor@users.noreply.github.com>
Co-authored-by: Jakub Jezek <jakubjezek001@gmail.com>
This commit is contained in:
Jakub Trllo 2023-03-24 14:21:05 +01:00 committed by Jakub Trllo
parent 33148f45e7
commit 47473a8a23
83 changed files with 11472 additions and 3167 deletions

ayon_start.py (new file, 376 lines added)

@@ -0,0 +1,376 @@
# -*- coding: utf-8 -*-
"""Main entry point for AYON command.

Bootstrapping process of AYON.
"""
import os
import sys
import site
import traceback
import contextlib

# Enable logging debug mode when "--verbose" is passed
if "--verbose" in sys.argv:
    expected_values = (
        "Expected: notset, debug, info, warning, error, critical"
        " or integer [0-50]."
    )
    idx = sys.argv.index("--verbose")
    sys.argv.pop(idx)
    if idx < len(sys.argv):
        value = sys.argv.pop(idx)
    else:
        raise RuntimeError((
            f"Expect value after \"--verbose\" argument. {expected_values}"
        ))

    log_level = None
    low_value = value.lower()
    if low_value.isdigit():
        log_level = int(low_value)
    elif low_value == "notset":
        log_level = 0
    elif low_value == "debug":
        log_level = 10
    elif low_value == "info":
        log_level = 20
    elif low_value == "warning":
        log_level = 30
    elif low_value == "error":
        log_level = 40
    elif low_value == "critical":
        log_level = 50

    if log_level is None:
        raise ValueError((
            "Unexpected value after \"--verbose\" "
            f"argument \"{value}\". {expected_values}"
        ))

    os.environ["OPENPYPE_LOG_LEVEL"] = str(log_level)

# Enable debug mode, may affect log level if log level is not defined
if "--debug" in sys.argv:
    sys.argv.remove("--debug")
    os.environ["OPENPYPE_DEBUG"] = "1"

if "--automatic-tests" in sys.argv:
    sys.argv.remove("--automatic-tests")
    os.environ["IS_TEST"] = "1"

if "--use-staging" in sys.argv:
    sys.argv.remove("--use-staging")
    os.environ["OPENPYPE_USE_STAGING"] = "1"

_silent_commands = {
    "run",
    "standalonepublisher",
    "extractenvironments",
    "version"
}
if "--headless" in sys.argv:
    os.environ["OPENPYPE_HEADLESS_MODE"] = "1"
    sys.argv.remove("--headless")
elif os.getenv("OPENPYPE_HEADLESS_MODE") != "1":
    os.environ.pop("OPENPYPE_HEADLESS_MODE", None)

IS_BUILT_APPLICATION = getattr(sys, "frozen", False)
HEADLESS_MODE_ENABLED = os.environ.get("OPENPYPE_HEADLESS_MODE") == "1"
SILENT_MODE_ENABLED = any(arg in _silent_commands for arg in sys.argv)
_pythonpath = os.getenv("PYTHONPATH", "")
_python_paths = _pythonpath.split(os.pathsep)
if not IS_BUILT_APPLICATION:
    # Code root defined by `start.py` directory
    AYON_ROOT = os.path.dirname(os.path.abspath(__file__))
    _dependencies_path = site.getsitepackages()[-1]
else:
    AYON_ROOT = os.path.dirname(sys.executable)

    # add dependencies folder to sys.path for frozen code
    _dependencies_path = os.path.normpath(
        os.path.join(AYON_ROOT, "dependencies")
    )
# add stuff from `<frozen>/dependencies` to PYTHONPATH.
sys.path.append(_dependencies_path)
_python_paths.append(_dependencies_path)

# Vendored python modules that must not be in PYTHONPATH environment but
#   are required for OpenPype processes
sys.path.insert(0, os.path.join(AYON_ROOT, "vendor", "python"))

# Add common package to sys path
# - common contains common code for bootstrapping and OpenPype processes
sys.path.insert(0, os.path.join(AYON_ROOT, "common"))

# This is content of 'core' addon which is ATM part of build
common_python_vendor = os.path.join(
    AYON_ROOT,
    "openpype",
    "vendor",
    "python",
    "common"
)
# Add tools dir to sys path for pyblish UI discovery
tools_dir = os.path.join(AYON_ROOT, "openpype", "tools")
for path in (AYON_ROOT, common_python_vendor, tools_dir):
    while path in _python_paths:
        _python_paths.remove(path)
    while path in sys.path:
        sys.path.remove(path)
    _python_paths.insert(0, path)
    sys.path.insert(0, path)

os.environ["PYTHONPATH"] = os.pathsep.join(_python_paths)

# enabled AYON state
os.environ["USE_AYON_SERVER"] = "1"
# Set this to point either to `python` from venv in case of live code
#   or to `ayon` or `ayon_console` in case of frozen code
os.environ["OPENPYPE_EXECUTABLE"] = sys.executable
os.environ["AYON_ROOT"] = AYON_ROOT
os.environ["OPENPYPE_ROOT"] = AYON_ROOT
os.environ["OPENPYPE_REPOS_ROOT"] = AYON_ROOT
os.environ["AVALON_LABEL"] = "AYON"
# Set name of pyblish UI import
os.environ["PYBLISH_GUI"] = "pyblish_pype"
import blessed  # noqa: E402
import certifi  # noqa: E402


if sys.__stdout__:
    term = blessed.Terminal()

    def _print(message: str):
        if SILENT_MODE_ENABLED:
            pass
        elif message.startswith("!!! "):
            print(f'{term.orangered2("!!! ")}{message[4:]}')
        elif message.startswith(">>> "):
            print(f'{term.aquamarine3(">>> ")}{message[4:]}')
        elif message.startswith("--- "):
            print(f'{term.darkolivegreen3("--- ")}{message[4:]}')
        elif message.startswith("*** "):
            print(f'{term.gold("*** ")}{message[4:]}')
        elif message.startswith("  - "):
            print(f'{term.wheat("  - ")}{message[4:]}')
        elif message.startswith("  . "):
            print(f'{term.tan("  . ")}{message[4:]}')
        elif message.startswith("     - "):
            print(f'{term.seagreen3("     - ")}{message[7:]}')
        elif message.startswith("     ! "):
            print(f'{term.goldenrod("     ! ")}{message[7:]}')
        elif message.startswith("     * "):
            print(f'{term.aquamarine1("     * ")}{message[7:]}')
        elif message.startswith("    "):
            print(f'{term.darkseagreen3("    ")}{message[4:]}')
        else:
            print(message)
else:
    def _print(message: str):
        if not SILENT_MODE_ENABLED:
            print(message)
# if SSL_CERT_FILE is not set prior to OpenPype launch, we set it to point
#   to certifi bundle to make sure we have reasonably new CA certificates.
if not os.getenv("SSL_CERT_FILE"):
    os.environ["SSL_CERT_FILE"] = certifi.where()
elif os.getenv("SSL_CERT_FILE") != certifi.where():
    _print("--- your system is set to use custom CA certificate bundle.")

from ayon_common.connection.credentials import (
    ask_to_login_ui,
    add_server,
    need_server_or_login,
    load_environments,
    set_environments,
    create_global_connection,
    confirm_server_login,
)
from ayon_common.distribution.addon_distribution import AyonDistribution
def set_global_environments() -> None:
    """Set global OpenPype's environments."""
    import acre

    from openpype.settings import get_general_environments

    general_env = get_general_environments()

    # first resolve general environment because merge doesn't expect
    #   values to be list.
    # TODO: switch to OpenPype environment functions
    merged_env = acre.merge(
        acre.compute(acre.parse(general_env), cleanup=False),
        dict(os.environ)
    )
    env = acre.compute(
        merged_env,
        cleanup=False
    )
    os.environ.clear()
    os.environ.update(env)

    # Hardcoded default values
    os.environ["PYBLISH_GUI"] = "pyblish_pype"
    # Change scale factor only if is not set
    if "QT_AUTO_SCREEN_SCALE_FACTOR" not in os.environ:
        os.environ["QT_AUTO_SCREEN_SCALE_FACTOR"] = "1"
def set_addons_environments():
    """Set global environments for OpenPype modules.

    This requires to have OpenPype in `sys.path`.
    """
    import acre

    from openpype.modules import ModulesManager

    modules_manager = ModulesManager()

    # Merge environments with current environments and update values
    if module_envs := modules_manager.collect_global_environments():
        parsed_envs = acre.parse(module_envs)
        env = acre.merge(parsed_envs, dict(os.environ))
        os.environ.clear()
        os.environ.update(env)
def _connect_to_ayon_server():
    load_environments()
    if not need_server_or_login():
        create_global_connection()
        return

    if HEADLESS_MODE_ENABLED:
        _print("!!! Cannot open v4 Login dialog in headless mode.")
        _print((
            "!!! Please use `AYON_SERVER_URL` to specify server address"
            " and 'AYON_TOKEN' to specify user's token."
        ))
        sys.exit(1)

    current_url = os.environ.get("AYON_SERVER_URL")
    url, token, username = ask_to_login_ui(current_url, always_on_top=True)
    if url is not None and token is not None:
        confirm_server_login(url, token, username)
        return

    if url is not None:
        add_server(url, username)

    _print("!!! Login was not successful.")
    sys.exit(0)
def _check_and_update_from_ayon_server():
    """Gets addon info from v4, compares with local folder and updates it.

    Raises:
        RuntimeError
    """
    distribution = AyonDistribution()
    distribution.distribute()
    distribution.validate_distribution()

    python_paths = [
        path
        for path in os.getenv("PYTHONPATH", "").split(os.pathsep)
        if path
    ]

    for path in distribution.get_sys_paths():
        sys.path.insert(0, path)
        if path not in python_paths:
            python_paths.append(path)

    os.environ["PYTHONPATH"] = os.pathsep.join(python_paths)
def boot():
    """Bootstrap OpenPype."""
    from openpype.version import __version__

    # TODO load version
    os.environ["OPENPYPE_VERSION"] = __version__
    os.environ["AYON_VERSION"] = __version__

    use_staging = os.environ.get("OPENPYPE_USE_STAGING") == "1"

    _connect_to_ayon_server()
    _check_and_update_from_ayon_server()

    # delete OpenPype module and its submodules from cache so it is used
    #   from the specific version
    modules_to_del = [
        module_name
        for module_name in tuple(sys.modules)
        if module_name == "openpype" or module_name.startswith("openpype.")
    ]
    for module_name in modules_to_del:
        with contextlib.suppress(AttributeError, KeyError):
            del sys.modules[module_name]

    from openpype import cli
    from openpype.lib import terminal as t

    _print(">>> loading environments ...")
    _print("  - global AYON ...")
    set_global_environments()
    _print("  - for addons ...")
    set_addons_environments()

    # print info when not running scripts defined in 'silent commands'
    if not SILENT_MODE_ENABLED:
        info = get_info(use_staging)
        info.insert(0, f">>> Using AYON from [ {AYON_ROOT} ]")

        t_width = 20
        with contextlib.suppress(ValueError, OSError):
            t_width = os.get_terminal_size().columns - 2

        _header = f"*** AYON [{__version__}] "
        info.insert(0, _header + "-" * (t_width - len(_header)))

        for i in info:
            t.echo(i)

    try:
        cli.main(obj={}, prog_name="openpype")
    except Exception:  # noqa
        exc_info = sys.exc_info()
        _print("!!! OpenPype crashed:")
        traceback.print_exception(*exc_info)
        sys.exit(1)
def get_info(use_staging=None) -> list:
    """Print additional information to console."""
    inf = []
    if use_staging:
        inf.append(("AYON variant", "staging"))
    else:
        inf.append(("AYON variant", "production"))

    # NOTE add addons information

    maximum = max(len(i[0]) for i in inf)
    formatted = []
    for info in inf:
        padding = (maximum - len(info[0])) + 1
        formatted.append(f'... {info[0]}:{" " * padding}[ {info[1]} ]')
    return formatted


if __name__ == "__main__":
    boot()


@@ -0,0 +1,475 @@
"""Handle credentials and connection to server for client application.

Cache and store used server urls. Store/load API keys to/from keyring if
needed. Store metadata about used urls, usernames for the urls and when was
the connection with the username established.

On bootstrap a global connection is created with information about site and
client version. The connection object lives in 'ayon_api'.
"""

import os
import json
import platform
import datetime
import contextlib
from typing import Optional, Union, Any

import ayon_api
from ayon_api.exceptions import UrlError
from ayon_api.utils import (
    validate_url,
    is_token_valid,
    logout_from_server,
)

from ayon_common.utils import get_ayon_appdirs, get_local_site_id


class ChangeUserResult:
    def __init__(
        self, logged_out, old_url, old_token, old_username,
        new_url, new_token, new_username
    ):
        shutdown = logged_out
        restart = new_url is not None and new_url != old_url
        token_changed = new_token is not None and new_token != old_token

        self.logged_out = logged_out
        self.old_url = old_url
        self.old_token = old_token
        self.old_username = old_username
        self.new_url = new_url
        self.new_token = new_token
        self.new_username = new_username

        self.shutdown = shutdown
        self.restart = restart
        self.token_changed = token_changed
def _get_servers_path():
    return get_ayon_appdirs("used_servers.json")


def get_servers_info_data():
    """Metadata about used server on this machine.

    Store data about all used server urls, last used url and user username
    for the url. Using this metadata we can remember which username was used
    per url if the token stored in keyring loses its lifetime.

    Returns:
        dict[str, Any]: Information about servers.
    """
    data = {}
    servers_info_path = _get_servers_path()
    if not os.path.exists(servers_info_path):
        dirpath = os.path.dirname(servers_info_path)
        if not os.path.exists(dirpath):
            os.makedirs(dirpath)
        return data

    with open(servers_info_path, "r") as stream:
        with contextlib.suppress(BaseException):
            data = json.load(stream)
    return data
def add_server(url: str, username: str):
    """Add server to server info metadata.

    This function will also mark the url as last used url on the machine so
    it will be used on next launch.

    Args:
        url (str): Server url.
        username (str): Name of user used to log in.
    """
    servers_info_path = _get_servers_path()
    data = get_servers_info_data()
    data["last_server"] = url
    if "urls" not in data:
        data["urls"] = {}

    data["urls"][url] = {
        "updated_dt": datetime.datetime.now().strftime("%Y/%m/%d %H:%M:%S"),
        "username": username,
    }

    with open(servers_info_path, "w") as stream:
        json.dump(data, stream)


def remove_server(url: str):
    """Remove server url from servers information.

    This should be used on logout to completely lose information about the
    server on the machine.

    Args:
        url (str): Server url.
    """
    if not url:
        return

    servers_info_path = _get_servers_path()
    data = get_servers_info_data()
    if data.get("last_server") == url:
        data["last_server"] = None

    if "urls" in data:
        data["urls"].pop(url, None)

    with open(servers_info_path, "w") as stream:
        json.dump(data, stream)
def get_last_server(
    data: Optional[dict[str, Any]] = None
) -> Union[str, None]:
    """Last server used to log in on this machine.

    Args:
        data (Optional[dict[str, Any]]): Prepared server information data.

    Returns:
        Union[str, None]: Last used server url.
    """
    if data is None:
        data = get_servers_info_data()
    return data.get("last_server")


def get_last_username_by_url(
    url: str,
    data: Optional[dict[str, Any]] = None
) -> Union[str, None]:
    """Get last username which was used for passed url.

    Args:
        url (str): Server url.
        data (Optional[dict[str, Any]]): Servers info.

    Returns:
        Union[str, None]: Username.
    """
    if not url:
        return None

    if data is None:
        data = get_servers_info_data()

    if urls := data.get("urls"):
        if url_info := urls.get(url):
            return url_info.get("username")
    return None


def get_last_server_with_username():
    """Receive last server and username used in last connection.

    Returns:
        tuple[Union[str, None], Union[str, None]]: Url and username.
    """
    data = get_servers_info_data()
    url = get_last_server(data)
    # Reuse already loaded data instead of reading the file again
    username = get_last_username_by_url(url, data)
    return url, username
class TokenKeyring:
    # Fake hardcoded username used as the keyring entry key
    username_key = "username"

    def __init__(self, url):
        try:
            import keyring

        except Exception as exc:
            raise NotImplementedError(
                "Python module `keyring` is not available."
            ) from exc

        # hack for cx_freeze and Windows keyring backend
        if platform.system().lower() == "windows":
            from keyring.backends import Windows

            keyring.set_keyring(Windows.WinVaultKeyring())

        self._url = url
        self._keyring_key = f"AYON/{url}"

    def get_value(self):
        import keyring

        return keyring.get_password(self._keyring_key, self.username_key)

    def set_value(self, value):
        import keyring

        if value is not None:
            keyring.set_password(self._keyring_key, self.username_key, value)
            return

        with contextlib.suppress(keyring.errors.PasswordDeleteError):
            keyring.delete_password(self._keyring_key, self.username_key)
def load_token(url: str) -> Union[str, None]:
    """Get token for url from keyring.

    Args:
        url (str): Server url.

    Returns:
        Union[str, None]: Token for passed url available in keyring.
    """
    return TokenKeyring(url).get_value()


def store_token(url: str, token: str):
    """Store token by url to keyring.

    Args:
        url (str): Server url.
        token (str): User token to server.
    """
    TokenKeyring(url).set_value(token)
def ask_to_login_ui(
    url: Optional[str] = None,
    always_on_top: Optional[bool] = False
) -> tuple[str, str, str]:
    """Ask user to login using UI.

    This should be used only when user is not yet logged in at all or
    available credentials are invalid. To change credentials use
    'change_user_ui' function.

    Args:
        url (Optional[str]): Server url that could be prefilled in UI.
        always_on_top (Optional[bool]): Window will be drawn on top of
            other windows.

    Returns:
        tuple[str, str, str]: Url, user's token and username.
    """
    from .ui import ask_to_login

    if url is None:
        url = get_last_server()
    username = get_last_username_by_url(url)
    return ask_to_login(url, username, always_on_top=always_on_top)
def change_user_ui() -> ChangeUserResult:
    """Change user using UI.

    Show UI where the user can change credentials or url. Output will
    contain all information about old/new values of url, username and api
    key, and whether the user confirmed or declined the values.

    Returns:
        ChangeUserResult: Information about user change.
    """
    from .ui import change_user

    url, username = get_last_server_with_username()
    token = load_token(url)
    result = change_user(url, username, token)
    new_url, new_token, new_username, logged_out = result

    output = ChangeUserResult(
        logged_out, url, token, username,
        new_url, new_token, new_username
    )
    if output.logged_out:
        logout(url, token)

    elif output.token_changed:
        change_token(
            output.new_url,
            output.new_token,
            output.new_username,
            output.old_url
        )
    return output
def change_token(
    url: str,
    token: str,
    username: Optional[str] = None,
    old_url: Optional[str] = None
):
    """Change url and token in currently running session.

    Function can also change server url; in that case the previous
    credentials are NOT removed from cache.

    Args:
        url (str): Url to server.
        token (str): New token to be used for url connection.
        username (Optional[str]): Username of logged user.
        old_url (Optional[str]): Previous url. Value from 'get_last_server'
            is used if not entered.
    """
    if old_url is None:
        old_url = get_last_server()
    if old_url and old_url == url:
        remove_url_cache(old_url)

    # TODO check if ayon_api is already connected
    add_server(url, username)
    store_token(url, token)
    ayon_api.change_token(url, token)


def remove_url_cache(url: str):
    """Clear cache for server url.

    Args:
        url (str): Server url which is removed from cache.
    """
    store_token(url, None)
def remove_token_cache(url: str, token: str):
    """Remove token from local cache of url.

    Skipped if the cached token under the passed url is not the same as the
    passed token.

    Args:
        url (str): Url to server.
        token (str): Token to be removed from url cache.
    """
    if load_token(url) == token:
        remove_url_cache(url)


def logout(url: str, token: str):
    """Logout from server and throw the token away.

    Args:
        url (str): Url from which should be logged out.
        token (str): Token which should be used to log out.
    """
    remove_server(url)
    ayon_api.close_connection()
    ayon_api.set_environments(None, None)
    remove_token_cache(url, token)
    logout_from_server(url, token)
def load_environments():
    """Load environments on startup.

    Handle environments needed for connection with server. Environments are
    'AYON_SERVER_URL' and 'AYON_TOKEN'.

    Server is looked up from environment. An already set environment is not
    changed. If the environment is not filled then the last server stored in
    appdirs is used.

    Token is skipped if url is not available. Otherwise, it is also checked
    from env and if it is not available then 'load_token' is used to try to
    get the token based on server url.
    """
    server_url = os.environ.get("AYON_SERVER_URL")
    if not server_url:
        server_url = get_last_server()
        if not server_url:
            return
        os.environ["AYON_SERVER_URL"] = server_url

    if not os.environ.get("AYON_TOKEN"):
        if token := load_token(server_url):
            os.environ["AYON_TOKEN"] = token


def set_environments(url: str, token: str):
    """Change url and token environments in the currently running process.

    Args:
        url (str): New server url.
        token (str): User's token.
    """
    ayon_api.set_environments(url, token)
def create_global_connection():
"""Create global connection with site id and client version.
Make sure the global connection in 'ayon_api' have entered site id and
client version.
"""
if hasattr(ayon_api, "create_connection"):
ayon_api.create_connection(
get_local_site_id(), os.environ.get("AYON_VERSION")
)
def need_server_or_login() -> bool:
"""Check if server url or login to the server are needed.
It is recommended to call 'load_environments' on startup before this check.
But in some cases this function could be called after startup.
Returns:
bool: 'True' if server and token are available. Otherwise 'False'.
"""
server_url = os.environ.get("AYON_SERVER_URL")
if not server_url:
return True
try:
server_url = validate_url(server_url)
except UrlError:
return True
token = os.environ.get("AYON_TOKEN")
if token:
return not is_token_valid(server_url, token)
token = load_token(server_url)
return not is_token_valid(server_url, token)
def confirm_server_login(url, token, username):
"""Confirm login of user and do necessary steps to apply changes.
This should not be used on "change" of user but on first login.
Args:
url (str): Server url where user authenticated.
token (str): API token used for authentication to server.
username (Union[str, None]): Username related to API token.
"""
add_server(url, username)
store_token(url, token)
set_environments(url, token)
create_global_connection()

@@ -0,0 +1,12 @@
from .login_window import (
ServerLoginWindow,
ask_to_login,
change_user,
)
__all__ = (
"ServerLoginWindow",
"ask_to_login",
"change_user",
)

@@ -0,0 +1,11 @@
def set_style_property(widget, property_name, property_value):
"""Set widget's property that may affect style.
Style of widget is polished if current property value is different.
"""
cur_value = widget.property(property_name)
if cur_value == property_value:
return
widget.setProperty(property_name, property_value)
widget.style().polish(widget)

@@ -0,0 +1,753 @@
import traceback
from Qt import QtWidgets, QtCore, QtGui
from ayon_api.exceptions import UrlError
from ayon_api.utils import validate_url, login_to_server
from ayon_common.resources import (
get_resource_path,
get_icon_path,
load_stylesheet,
)
from .widgets import (
PressHoverButton,
PlaceholderLineEdit,
)
from .lib import set_style_property
class LogoutConfirmDialog(QtWidgets.QDialog):
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
self.setWindowTitle("Logout confirmation")
message_widget = QtWidgets.QWidget(self)
message_label = QtWidgets.QLabel(
(
"You are going to logout. This action will close this"
" application and will invalidate your login."
" All other applications launched with this login won't be"
" able to use it anymore.<br/><br/>"
"You can cancel logout and only change server and user login"
" in login dialog.<br/><br/>"
"Press OK to confirm logout."
),
message_widget
)
message_label.setWordWrap(True)
message_layout = QtWidgets.QHBoxLayout(message_widget)
message_layout.setContentsMargins(0, 0, 0, 0)
message_layout.addWidget(message_label, 1)
sep_frame = QtWidgets.QFrame(self)
sep_frame.setObjectName("Separator")
sep_frame.setMinimumHeight(2)
sep_frame.setMaximumHeight(2)
footer_widget = QtWidgets.QWidget(self)
cancel_btn = QtWidgets.QPushButton("Cancel", footer_widget)
confirm_btn = QtWidgets.QPushButton("OK", footer_widget)
footer_layout = QtWidgets.QHBoxLayout(footer_widget)
footer_layout.setContentsMargins(0, 0, 0, 0)
footer_layout.addStretch(1)
footer_layout.addWidget(cancel_btn, 0)
footer_layout.addWidget(confirm_btn, 0)
main_layout = QtWidgets.QVBoxLayout(self)
main_layout.addWidget(message_widget, 0)
main_layout.addStretch(1)
main_layout.addWidget(sep_frame, 0)
main_layout.addWidget(footer_widget, 0)
cancel_btn.clicked.connect(self._on_cancel_click)
confirm_btn.clicked.connect(self._on_confirm_click)
self._cancel_btn = cancel_btn
self._confirm_btn = confirm_btn
self._result = False
def showEvent(self, event):
super().showEvent(event)
self._match_btns_sizes()
def resizeEvent(self, event):
super().resizeEvent(event)
self._match_btns_sizes()
def _match_btns_sizes(self):
width = max(
self._cancel_btn.sizeHint().width(),
self._confirm_btn.sizeHint().width()
)
self._cancel_btn.setMinimumWidth(width)
self._confirm_btn.setMinimumWidth(width)
def _on_cancel_click(self):
self._result = False
self.reject()
def _on_confirm_click(self):
self._result = True
self.accept()
def get_result(self):
return self._result
class ServerLoginWindow(QtWidgets.QDialog):
default_width = 410
default_height = 170
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
icon_path = get_icon_path()
icon = QtGui.QIcon(icon_path)
self.setWindowIcon(icon)
self.setWindowTitle("Login to server")
edit_icon_path = get_resource_path("edit.png")
edit_icon = QtGui.QIcon(edit_icon_path)
# --- URL page ---
login_widget = QtWidgets.QWidget(self)
user_cred_widget = QtWidgets.QWidget(login_widget)
url_label = QtWidgets.QLabel("URL:", user_cred_widget)
url_widget = QtWidgets.QWidget(user_cred_widget)
url_input = PlaceholderLineEdit(url_widget)
url_input.setPlaceholderText("< https://ayon.server.com >")
url_preview = QtWidgets.QLineEdit(url_widget)
url_preview.setReadOnly(True)
url_preview.setObjectName("LikeDisabledInput")
url_edit_btn = PressHoverButton(user_cred_widget)
url_edit_btn.setIcon(edit_icon)
url_edit_btn.setObjectName("PasswordBtn")
url_layout = QtWidgets.QHBoxLayout(url_widget)
url_layout.setContentsMargins(0, 0, 0, 0)
url_layout.addWidget(url_input, 1)
url_layout.addWidget(url_preview, 1)
# --- URL separator ---
url_cred_sep = QtWidgets.QFrame(self)
url_cred_sep.setObjectName("Separator")
url_cred_sep.setMinimumHeight(2)
url_cred_sep.setMaximumHeight(2)
# --- Login page ---
username_label = QtWidgets.QLabel("Username:", user_cred_widget)
username_widget = QtWidgets.QWidget(user_cred_widget)
username_input = PlaceholderLineEdit(username_widget)
username_input.setPlaceholderText("< Artist >")
username_preview = QtWidgets.QLineEdit(username_widget)
username_preview.setReadOnly(True)
username_preview.setObjectName("LikeDisabledInput")
username_edit_btn = PressHoverButton(user_cred_widget)
username_edit_btn.setIcon(edit_icon)
username_edit_btn.setObjectName("PasswordBtn")
username_layout = QtWidgets.QHBoxLayout(username_widget)
username_layout.setContentsMargins(0, 0, 0, 0)
username_layout.addWidget(username_input, 1)
username_layout.addWidget(username_preview, 1)
password_label = QtWidgets.QLabel("Password:", user_cred_widget)
password_input = PlaceholderLineEdit(user_cred_widget)
password_input.setPlaceholderText("< *********** >")
password_input.setEchoMode(password_input.Password)
api_label = QtWidgets.QLabel("API key:", user_cred_widget)
api_preview = QtWidgets.QLineEdit(user_cred_widget)
api_preview.setReadOnly(True)
api_preview.setObjectName("LikeDisabledInput")
show_password_icon_path = get_resource_path("eye.png")
show_password_icon = QtGui.QIcon(show_password_icon_path)
show_password_btn = PressHoverButton(user_cred_widget)
show_password_btn.setObjectName("PasswordBtn")
show_password_btn.setIcon(show_password_icon)
show_password_btn.setFocusPolicy(QtCore.Qt.ClickFocus)
cred_msg_sep = QtWidgets.QFrame(self)
cred_msg_sep.setObjectName("Separator")
cred_msg_sep.setMinimumHeight(2)
cred_msg_sep.setMaximumHeight(2)
# --- Credentials inputs ---
user_cred_layout = QtWidgets.QGridLayout(user_cred_widget)
user_cred_layout.setContentsMargins(0, 0, 0, 0)
row = 0
user_cred_layout.addWidget(url_label, row, 0, 1, 1)
user_cred_layout.addWidget(url_widget, row, 1, 1, 1)
user_cred_layout.addWidget(url_edit_btn, row, 2, 1, 1)
row += 1
user_cred_layout.addWidget(url_cred_sep, row, 0, 1, 3)
row += 1
user_cred_layout.addWidget(username_label, row, 0, 1, 1)
user_cred_layout.addWidget(username_widget, row, 1, 1, 1)
user_cred_layout.addWidget(username_edit_btn, row, 2, 2, 1)
row += 1
user_cred_layout.addWidget(api_label, row, 0, 1, 1)
user_cred_layout.addWidget(api_preview, row, 1, 1, 1)
row += 1
user_cred_layout.addWidget(password_label, row, 0, 1, 1)
user_cred_layout.addWidget(password_input, row, 1, 1, 1)
user_cred_layout.addWidget(show_password_btn, row, 2, 1, 1)
row += 1
user_cred_layout.addWidget(cred_msg_sep, row, 0, 1, 3)
row += 1
user_cred_layout.setColumnStretch(0, 0)
user_cred_layout.setColumnStretch(1, 1)
user_cred_layout.setColumnStretch(2, 0)
login_layout = QtWidgets.QVBoxLayout(login_widget)
login_layout.setContentsMargins(0, 0, 0, 0)
login_layout.addWidget(user_cred_widget, 1)
# --- Messages ---
# Messages for users (e.g. invalid url etc.)
message_label = QtWidgets.QLabel(self)
message_label.setWordWrap(True)
message_label.setTextInteractionFlags(QtCore.Qt.TextBrowserInteraction)
footer_widget = QtWidgets.QWidget(self)
logout_btn = QtWidgets.QPushButton("Logout", footer_widget)
user_message = QtWidgets.QLabel(footer_widget)
login_btn = QtWidgets.QPushButton("Login", footer_widget)
confirm_btn = QtWidgets.QPushButton("Confirm", footer_widget)
footer_layout = QtWidgets.QHBoxLayout(footer_widget)
footer_layout.setContentsMargins(0, 0, 0, 0)
footer_layout.addWidget(logout_btn, 0)
footer_layout.addWidget(user_message, 1)
footer_layout.addWidget(login_btn, 0)
footer_layout.addWidget(confirm_btn, 0)
main_layout = QtWidgets.QVBoxLayout(self)
main_layout.addWidget(login_widget, 0)
main_layout.addWidget(message_label, 0)
main_layout.addStretch(1)
main_layout.addWidget(footer_widget, 0)
url_input.textChanged.connect(self._on_url_change)
url_input.returnPressed.connect(self._on_url_enter_press)
username_input.textChanged.connect(self._on_user_change)
username_input.returnPressed.connect(self._on_username_enter_press)
password_input.returnPressed.connect(self._on_password_enter_press)
show_password_btn.change_state.connect(self._on_show_password)
url_edit_btn.clicked.connect(self._on_url_edit_click)
username_edit_btn.clicked.connect(self._on_username_edit_click)
logout_btn.clicked.connect(self._on_logout_click)
login_btn.clicked.connect(self._on_login_click)
confirm_btn.clicked.connect(self._on_login_click)
self._message_label = message_label
self._url_widget = url_widget
self._url_input = url_input
self._url_preview = url_preview
self._url_edit_btn = url_edit_btn
self._login_widget = login_widget
self._user_cred_widget = user_cred_widget
self._username_input = username_input
self._username_preview = username_preview
self._username_edit_btn = username_edit_btn
self._password_label = password_label
self._password_input = password_input
self._show_password_btn = show_password_btn
self._api_label = api_label
self._api_preview = api_preview
self._logout_btn = logout_btn
self._user_message = user_message
self._login_btn = login_btn
self._confirm_btn = confirm_btn
self._url_is_valid = None
self._credentials_are_valid = None
self._result = (None, None, None, False)
self._first_show = True
self._allow_logout = False
self._logged_in = False
self._url_edit_mode = False
self._username_edit_mode = False
def set_allow_logout(self, allow_logout):
if allow_logout is self._allow_logout:
return
self._allow_logout = allow_logout
self._update_states_by_edit_mode()
def _set_logged_in(self, logged_in):
if logged_in is self._logged_in:
return
self._logged_in = logged_in
self._update_states_by_edit_mode()
def _set_url_edit_mode(self, edit_mode):
if self._url_edit_mode is not edit_mode:
self._url_edit_mode = edit_mode
self._update_states_by_edit_mode()
def _set_username_edit_mode(self, edit_mode):
if self._username_edit_mode is not edit_mode:
self._username_edit_mode = edit_mode
self._update_states_by_edit_mode()
def _get_url_user_edit(self):
url_edit = True
if self._logged_in and not self._url_edit_mode:
url_edit = False
user_edit = url_edit
if not user_edit and self._logged_in and self._username_edit_mode:
user_edit = True
return url_edit, user_edit
def _update_states_by_edit_mode(self):
url_edit, user_edit = self._get_url_user_edit()
self._url_preview.setVisible(not url_edit)
self._url_input.setVisible(url_edit)
self._url_edit_btn.setVisible(self._allow_logout and not url_edit)
self._username_preview.setVisible(not user_edit)
self._username_input.setVisible(user_edit)
self._username_edit_btn.setVisible(
self._allow_logout and not user_edit
)
self._api_preview.setVisible(not user_edit)
self._api_label.setVisible(not user_edit)
self._password_label.setVisible(user_edit)
self._show_password_btn.setVisible(user_edit)
self._password_input.setVisible(user_edit)
self._logout_btn.setVisible(self._allow_logout and self._logged_in)
self._login_btn.setVisible(not self._allow_logout)
self._confirm_btn.setVisible(self._allow_logout)
self._update_login_btn_state(url_edit, user_edit)
def _update_login_btn_state(self, url_edit=None, user_edit=None, url=None):
if url_edit is None:
url_edit, user_edit = self._get_url_user_edit()
if url is None:
url = self._url_input.text()
enabled = bool(url) and (url_edit or user_edit)
self._login_btn.setEnabled(enabled)
self._confirm_btn.setEnabled(enabled)
def showEvent(self, event):
super().showEvent(event)
if self._first_show:
self._first_show = False
self._on_first_show()
def _on_first_show(self):
self.setStyleSheet(load_stylesheet())
self.resize(self.default_width, self.default_height)
self._center_window()
if self._allow_logout is None:
self.set_allow_logout(False)
self._update_states_by_edit_mode()
if not self._url_input.text():
widget = self._url_input
elif not self._username_input.text():
widget = self._username_input
else:
widget = self._password_input
self._set_input_focus(widget)
def result(self):
"""Result of the login dialog.
Returns:
Tuple[Union[str, None], Union[str, None], Union[str, None], bool]: Url,
token and username used for login (all 'None' if login was not
successful) and whether the user logged out.
"""
return self._result
def _center_window(self):
"""Move window to center of screen."""
desktop = QtWidgets.QApplication.desktop()
screen_idx = desktop.screenNumber(self)
screen_geo = desktop.screenGeometry(screen_idx)
geo = self.frameGeometry()
geo.moveCenter(screen_geo.center())
if geo.y() < screen_geo.y():
geo.setY(screen_geo.y())
self.move(geo.topLeft())
def _on_url_change(self, text):
self._update_login_btn_state(url=text)
self._set_url_valid(None)
self._set_credentials_valid(None)
self._url_preview.setText(text)
def _set_url_valid(self, valid):
if valid is self._url_is_valid:
return
self._url_is_valid = valid
self._set_input_valid_state(self._url_input, valid)
def _set_credentials_valid(self, valid):
if self._credentials_are_valid is valid:
return
self._credentials_are_valid = valid
self._set_input_valid_state(self._username_input, valid)
self._set_input_valid_state(self._password_input, valid)
def _on_url_enter_press(self):
self._set_input_focus(self._username_input)
def _on_user_change(self, username):
self._username_preview.setText(username)
def _on_username_enter_press(self):
self._set_input_focus(self._password_input)
def _on_password_enter_press(self):
self._login()
def _on_show_password(self, show_password):
if show_password:
placeholder_text = "< MySecret124 >"
echo_mode = QtWidgets.QLineEdit.Normal
else:
placeholder_text = "< *********** >"
echo_mode = QtWidgets.QLineEdit.Password
self._password_input.setEchoMode(echo_mode)
self._password_input.setPlaceholderText(placeholder_text)
def _on_username_edit_click(self):
self._username_edit_mode = True
self._update_states_by_edit_mode()
def _on_url_edit_click(self):
self._url_edit_mode = True
self._update_states_by_edit_mode()
def _on_logout_click(self):
dialog = LogoutConfirmDialog(self)
dialog.exec_()
if dialog.get_result():
self._result = (None, None, None, True)
self.accept()
def _on_login_click(self):
self._login()
def _validate_url(self):
"""Use url from input to connect and change window state on success.
Todos:
Threaded check.
"""
url = self._url_input.text()
valid_url = None
try:
valid_url = validate_url(url)
except UrlError as exc:
parts = [f"<b>{exc.title}</b>"]
parts.extend(f"- {hint}" for hint in exc.hints)
self._set_message("<br/>".join(parts))
except KeyboardInterrupt:
# Reraise KeyboardInterrupt error
raise
except BaseException:
self._set_unexpected_error()
return False
if valid_url is None:
return False
self._url_input.setText(valid_url)
return True
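The hint formatting in '_validate_url' joins the error title and hints into an HTML message. A self-contained sketch, with a stub 'UrlError' mirroring only the 'title'/'hints' attributes the handler above relies on:

```python
class UrlError(Exception):
    """Stub mirroring the attributes used by '_validate_url'."""
    def __init__(self, title, hints):
        super().__init__(title)
        self.title = title
        self.hints = hints

exc = UrlError("Invalid URL", ["Missing scheme", "Did you mean 'https://...'?"])
# Title in bold, each hint as a bullet line, joined by HTML line breaks
parts = [f"<b>{exc.title}</b>"]
parts.extend(f"- {hint}" for hint in exc.hints)
message = "<br/>".join(parts)
print(message)
```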
def _login(self):
if (
not self._login_btn.isEnabled()
and not self._confirm_btn.isEnabled()
):
return
if not self._url_is_valid:
self._set_url_valid(self._validate_url())
if not self._url_is_valid:
self._set_input_focus(self._url_input)
self._set_credentials_valid(None)
return
self._clear_message()
url = self._url_input.text()
username = self._username_input.text()
password = self._password_input.text()
try:
token = login_to_server(url, username, password)
except BaseException:
self._set_unexpected_error()
return
if token is not None:
self._result = (url, token, username, False)
self.accept()
return
self._set_credentials_valid(False)
message_lines = ["<b>Invalid credentials</b>"]
if not username.strip():
message_lines.append("- Username is not filled")
if not password.strip():
message_lines.append("- Password is not filled")
if username and password:
message_lines.append("- Check your credentials")
self._set_message("<br/>".join(message_lines))
self._set_input_focus(self._username_input)
def _set_input_focus(self, widget):
widget.setFocus(QtCore.Qt.MouseFocusReason)
def _set_input_valid_state(self, widget, valid):
state = ""
if valid is True:
state = "valid"
elif valid is False:
state = "invalid"
set_style_property(widget, "state", state)
def _set_message(self, message):
self._message_label.setText(message)
def _clear_message(self):
self._message_label.setText("")
def _set_unexpected_error(self):
# TODO add traceback somewhere
# - maybe a button to show or copy?
traceback.print_exc()
lines = [
"<b>Unexpected error happened</b>",
"- Can be caused by wrong url (leading elsewhere)"
]
self._set_message("<br/>".join(lines))
def set_url(self, url):
self._url_preview.setText(url)
self._url_input.setText(url)
self._validate_url()
def set_username(self, username):
self._username_preview.setText(username)
self._username_input.setText(username)
def _set_api_key(self, api_key):
if not api_key or len(api_key) < 3:
self._api_preview.setText(api_key or "")
return
api_key_len = len(api_key)
offset = 6
if api_key_len < offset:
offset = api_key_len // 2
api_key = api_key[:offset] + "." * (api_key_len - offset)
self._api_preview.setText(api_key)
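The masking in '_set_api_key' keeps a short prefix and dots out the rest; extracted here as a standalone function for illustration:

```python
def mask_api_key(api_key, offset=6):
    # Same rules as '_set_api_key': very short keys pass through,
    # otherwise keep up to 'offset' leading characters and dot the rest.
    if not api_key or len(api_key) < 3:
        return api_key or ""
    if len(api_key) < offset:
        offset = len(api_key) // 2
    return api_key[:offset] + "." * (len(api_key) - offset)

print(mask_api_key("verySecretApiKey"))  # verySe..........
print(mask_api_key("ab"))                # ab
```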
def set_logged_in(
self,
logged_in,
url=None,
username=None,
api_key=None,
allow_logout=None
):
if url is not None:
self.set_url(url)
if username is not None:
self.set_username(username)
if api_key:
self._set_api_key(api_key)
if logged_in and allow_logout is None:
allow_logout = True
self._set_logged_in(logged_in)
if allow_logout:
self.set_allow_logout(True)
elif allow_logout is False:
self.set_allow_logout(False)
def ask_to_login(url=None, username=None, always_on_top=False):
"""Ask user to login using Qt dialog.
Function creates a new QApplication if one is not created yet.
Args:
url (Optional[str]): Server url that will be prefilled in dialog.
username (Optional[str]): Username that will be prefilled in dialog.
always_on_top (Optional[bool]): Window will be drawn on top of
other windows.
Returns:
tuple[str, str, str]: Url, user's token and username. The url can be
changed during the dialog's lifetime, which is why it is returned.
"""
app_instance = QtWidgets.QApplication.instance()
if app_instance is None:
for attr_name in (
"AA_EnableHighDpiScaling",
"AA_UseHighDpiPixmaps"
):
attr = getattr(QtCore.Qt, attr_name, None)
if attr is not None:
QtWidgets.QApplication.setAttribute(attr)
app_instance = QtWidgets.QApplication([])
window = ServerLoginWindow()
if always_on_top:
window.setWindowFlags(
window.windowFlags()
| QtCore.Qt.WindowStaysOnTopHint
)
if url:
window.set_url(url)
if username:
window.set_username(username)
_output = {"out": None}
def _exec_window():
window.exec_()
result = window.result()
out_url, out_token, out_username, _logged_out = result
_output["out"] = out_url, out_token, out_username
return _output["out"]
# Use QTimer to exec dialog if application is not running yet
# - calling 'exec_' before the app loop runs leaves the window stuck
if not app_instance.startingUp():
return _exec_window()
timer = QtCore.QTimer()
timer.setSingleShot(True)
timer.timeout.connect(_exec_window)
timer.start()
# This can become main Qt loop. Maybe should live elsewhere
app_instance.exec_()
return _output["out"]
def change_user(url, username, api_key, always_on_top=False):
"""Ask user to change login using Qt dialog.
Function creates a new QApplication if one is not created yet.
Args:
url (str): Server url that will be prefilled in dialog.
username (str): Username that will be prefilled in dialog.
api_key (str): API key that will be prefilled in dialog.
always_on_top (Optional[bool]): Window will be drawn on top of
other windows.
Returns:
Tuple[str, str, str, bool]: Url, user's token, username and whether
the user logged out. The url can be changed during the dialog's
lifetime, which is why it is returned.
"""
app_instance = QtWidgets.QApplication.instance()
if app_instance is None:
for attr_name in (
"AA_EnableHighDpiScaling",
"AA_UseHighDpiPixmaps"
):
attr = getattr(QtCore.Qt, attr_name, None)
if attr is not None:
QtWidgets.QApplication.setAttribute(attr)
app_instance = QtWidgets.QApplication([])
window = ServerLoginWindow()
if always_on_top:
window.setWindowFlags(
window.windowFlags()
| QtCore.Qt.WindowStaysOnTopHint
)
window.set_logged_in(True, url, username, api_key)
_output = {"out": None}
def _exec_window():
window.exec_()
_output["out"] = window.result()
return _output["out"]
# Use QTimer to exec dialog if application is not running yet
# - calling 'exec_' before the app loop runs leaves the window stuck
if not app_instance.startingUp():
return _exec_window()
timer = QtCore.QTimer()
timer.setSingleShot(True)
timer.timeout.connect(_exec_window)
timer.start()
# This can become main Qt loop. Maybe should live elsewhere
app_instance.exec_()
return _output["out"]

@@ -0,0 +1,47 @@
from Qt import QtWidgets, QtCore, QtGui
class PressHoverButton(QtWidgets.QPushButton):
"""Keep track of mouse press/release and enter/leave."""
_mouse_pressed = False
_mouse_hovered = False
change_state = QtCore.Signal(bool)
def mousePressEvent(self, event):
self._mouse_pressed = True
self._mouse_hovered = True
self.change_state.emit(self._mouse_hovered)
super(PressHoverButton, self).mousePressEvent(event)
def mouseReleaseEvent(self, event):
self._mouse_pressed = False
self._mouse_hovered = False
self.change_state.emit(self._mouse_hovered)
super(PressHoverButton, self).mouseReleaseEvent(event)
def mouseMoveEvent(self, event):
mouse_pos = self.mapFromGlobal(QtGui.QCursor.pos())
under_mouse = self.rect().contains(mouse_pos)
if under_mouse != self._mouse_hovered:
self._mouse_hovered = under_mouse
self.change_state.emit(self._mouse_hovered)
super(PressHoverButton, self).mouseMoveEvent(event)
class PlaceholderLineEdit(QtWidgets.QLineEdit):
"""Set placeholder color of QLineEdit in Qt 5.12 and higher."""
def __init__(self, *args, **kwargs):
super(PlaceholderLineEdit, self).__init__(*args, **kwargs)
# Change placeholder palette color
if hasattr(QtGui.QPalette, "PlaceholderText"):
filter_palette = self.palette()
color = QtGui.QColor("#D3D8DE")
color.setAlpha(67)
filter_palette.setColor(
QtGui.QPalette.PlaceholderText,
color
)
self.setPalette(filter_palette)

File diff suppressed because it is too large.
@@ -0,0 +1,177 @@
import attr
from enum import Enum
class UrlType(Enum):
HTTP = "http"
GIT = "git"
FILESYSTEM = "filesystem"
SERVER = "server"
@attr.s
class MultiPlatformPath(object):
windows = attr.ib(default=None)
linux = attr.ib(default=None)
darwin = attr.ib(default=None)
@attr.s
class SourceInfo(object):
type = attr.ib()
@attr.s
class LocalSourceInfo(SourceInfo):
path = attr.ib(default=attr.Factory(MultiPlatformPath))
@attr.s
class WebSourceInfo(SourceInfo):
url = attr.ib(default=None)
headers = attr.ib(default=None)
filename = attr.ib(default=None)
@attr.s
class ServerSourceInfo(SourceInfo):
filename = attr.ib(default=None)
path = attr.ib(default=None)
def convert_source(source):
"""Create source object from data information.
Args:
source (Dict[str, Any]): Information about the source.
Returns:
Union[None, SourceInfo]: Object with source information if type is
known.
"""
source_type = source.get("type")
if not source_type:
return None
if source_type == UrlType.FILESYSTEM.value:
return LocalSourceInfo(
type=source_type,
path=source["path"]
)
if source_type == UrlType.HTTP.value:
url = source["path"]
return WebSourceInfo(
type=source_type,
url=url,
headers=source.get("headers"),
filename=source.get("filename")
)
if source_type == UrlType.SERVER.value:
return ServerSourceInfo(
type=source_type,
filename=source.get("filename"),
path=source.get("path")
)
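For reference, a dispatch sketch of the three payload shapes 'convert_source' understands; plain tuples stand in for the attrs classes so the example is self-contained. Note that an http source carries its url under the "path" key:

```python
def convert_source_sketch(source):
    # Mirrors the dispatch in 'convert_source' with plain tuples
    # standing in for the attrs classes.
    source_type = source.get("type")
    if source_type == "filesystem":
        return ("local", source["path"])
    if source_type == "http":
        # The url arrives under the "path" key in the v4 payload
        return (
            "web",
            source["path"],
            source.get("headers"),
            source.get("filename"),
        )
    if source_type == "server":
        return ("server", source.get("filename"), source.get("path"))
    return None

print(convert_source_sketch({"type": "http", "path": "https://host/a.zip"}))
# ('web', 'https://host/a.zip', None, None)
print(convert_source_sketch({"type": "unknown"}))  # None
```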
@attr.s
class VersionData(object):
version_data = attr.ib(default=None)
@attr.s
class AddonInfo(object):
"""Object matching json payload from Server"""
name = attr.ib()
version = attr.ib()
full_name = attr.ib()
title = attr.ib(default=None)
require_distribution = attr.ib(default=False)
sources = attr.ib(default=attr.Factory(list))
unknown_sources = attr.ib(default=attr.Factory(list))
hash = attr.ib(default=None)
description = attr.ib(default=None)
license = attr.ib(default=None)
authors = attr.ib(default=None)
@classmethod
def from_dict(cls, data):
sources = []
unknown_sources = []
production_version = data.get("productionVersion")
if not production_version:
return None
# server payload contains info about all versions
# active addon must have 'productionVersion' and matching version info
version_data = data.get("versions", {})[production_version]
source_info = version_data.get("clientSourceInfo")
require_distribution = source_info is not None
for source in (source_info or []):
addon_source = convert_source(source)
if addon_source is not None:
sources.append(addon_source)
else:
unknown_sources.append(source)
print(f"Unknown source {source.get('type')}")
full_name = "{}_{}".format(data["name"], production_version)
return cls(
name=data.get("name"),
version=production_version,
full_name=full_name,
require_distribution=require_distribution,
sources=sources,
unknown_sources=unknown_sources,
hash=data.get("hash"),
description=data.get("description"),
title=data.get("title"),
license=data.get("license"),
authors=data.get("authors")
)
@attr.s
class DependencyItem(object):
"""Object matching payload from Server about single dependency package"""
name = attr.ib()
platform = attr.ib()
checksum = attr.ib()
require_distribution = attr.ib()
sources = attr.ib(default=attr.Factory(list))
unknown_sources = attr.ib(default=attr.Factory(list))
addon_list = attr.ib(default=attr.Factory(list))
python_modules = attr.ib(default=attr.Factory(dict))
@classmethod
def from_dict(cls, package):
sources = []
unknown_sources = []
package_sources = package.get("sources")
require_distribution = package_sources is not None
for source in (package_sources or []):
dependency_source = convert_source(source)
if dependency_source is not None:
sources.append(dependency_source)
else:
print(f"Unknown source {source.get('type')}")
unknown_sources.append(source)
addon_list = [
    f"{name}_{version}"
    for name, version in package.get("supportedAddons", {}).items()
]
return cls(
name=package.get("name"),
platform=package.get("platform"),
require_distribution=require_distribution,
sources=sources,
unknown_sources=unknown_sources,
checksum=package.get("checksum"),
addon_list=addon_list,
python_modules=package.get("pythonModules")
)
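The "name_version" composition used for both 'AddonInfo.full_name' and 'DependencyItem.addon_list' can be seen in isolation (sample addon names are made up for illustration):

```python
# Sample 'supportedAddons' payload fragment (hypothetical values)
supported_addons = {"openpype_slack": "1.0.0", "ftrack": "2.1.3"}

# Same composition as 'AddonInfo.full_name' and 'DependencyItem.addon_list'
addon_list = [
    f"{name}_{version}"
    for name, version in supported_addons.items()
]
print(addon_list)  # ['openpype_slack_1.0.0', 'ftrack_2.1.3']
```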

@@ -62,7 +62,7 @@ class RemoteFileHandler:
return True
if not hash_type:
raise ValueError("Provide hash type, md5 or sha256")
if hash_type == 'md5':
if hash_type == "md5":
return RemoteFileHandler.check_md5(fpath, hash_value)
if hash_type == "sha256":
return RemoteFileHandler.check_sha256(fpath, hash_value)
@@ -70,7 +70,7 @@ class RemoteFileHandler:
@staticmethod
def download_url(
url, root, filename=None,
sha256=None, max_redirect_hops=3
sha256=None, max_redirect_hops=3, headers=None
):
"""Download a file from a url and place it in root.
Args:
@@ -82,6 +82,7 @@
If None, do not check
max_redirect_hops (int, optional): Maximum number of redirect
hops allowed
headers (dict, optional): Additional required headers, e.g. for
    authentication.
"""
root = os.path.expanduser(root)
if not filename:
@@ -93,12 +94,13 @@
# check if file is already present locally
if RemoteFileHandler.check_integrity(fpath,
sha256, hash_type="sha256"):
print('Using downloaded and verified file: ' + fpath)
print(f"Using downloaded and verified file: {fpath}")
return
# expand redirect chain if needed
url = RemoteFileHandler._get_redirect_url(url,
max_hops=max_redirect_hops)
max_hops=max_redirect_hops,
headers=headers)
# check if file is located on Google Drive
file_id = RemoteFileHandler._get_google_drive_file_id(url)
@@ -108,14 +110,17 @@
# download the file
try:
print('Downloading ' + url + ' to ' + fpath)
RemoteFileHandler._urlretrieve(url, fpath)
print(f"Downloading {url} to {fpath}")
RemoteFileHandler._urlretrieve(url, fpath, headers=headers)
except (urllib.error.URLError, IOError) as e:
if url[:5] == 'https':
url = url.replace('https:', 'http:')
print('Failed download. Trying https -> http instead.'
' Downloading ' + url + ' to ' + fpath)
RemoteFileHandler._urlretrieve(url, fpath)
if url[:5] == "https":
url = url.replace("https:", "http:")
print((
"Failed download. Trying https -> http instead."
f" Downloading {url} to {fpath}"
))
RemoteFileHandler._urlretrieve(url, fpath,
headers=headers)
else:
raise e
@@ -216,11 +221,16 @@
tar_file.close()
@staticmethod
def _urlretrieve(url, filename, chunk_size):
def _urlretrieve(url, filename, chunk_size=None, headers=None):
final_headers = {"User-Agent": USER_AGENT}
if headers:
final_headers.update(headers)
chunk_size = chunk_size or 8192
with open(filename, "wb") as fh:
with urllib.request.urlopen(
urllib.request.Request(url,
headers={"User-Agent": USER_AGENT})) \
headers=final_headers)) \
as response:
for chunk in iter(lambda: response.read(chunk_size), ""):
if not chunk:
@@ -228,13 +238,15 @@
fh.write(chunk)
@staticmethod
def _get_redirect_url(url, max_hops):
def _get_redirect_url(url, max_hops, headers=None):
initial_url = url
headers = {"Method": "HEAD", "User-Agent": USER_AGENT}
final_headers = {"Method": "HEAD", "User-Agent": USER_AGENT}
if headers:
final_headers.update(headers)
for _ in range(max_hops + 1):
with urllib.request.urlopen(
urllib.request.Request(url, headers=headers)) as response:
urllib.request.Request(url,
headers=final_headers)) as response:
if response.url == url or response.url is None:
return url
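The header merging added in this hunk always keeps the 'User-Agent' default and lets caller-supplied headers (e.g. Authorization) extend it; a minimal sketch, with a placeholder USER_AGENT value since the real constant lives elsewhere in the module:

```python
USER_AGENT = "openpype-client"  # placeholder value for illustration

def build_headers(headers=None):
    # Mirrors the merging in '_urlretrieve' / '_get_redirect_url':
    # start from the default, then overlay any caller-supplied headers.
    final_headers = {"User-Agent": USER_AGENT}
    if headers:
        final_headers.update(headers)
    return final_headers

print(build_headers({"Authorization": "Bearer tok"}))
# {'User-Agent': 'openpype-client', 'Authorization': 'Bearer tok'}
```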

@@ -2,29 +2,29 @@ import pytest
import attr
import tempfile
from common.openpype_common.distribution.addon_distribution import (
AddonDownloader,
OSAddonDownloader,
HTTPAddonDownloader,
from common.ayon_common.distribution.addon_distribution import (
DownloadFactory,
OSDownloader,
HTTPDownloader,
AddonInfo,
update_addon_state,
AyonDistribution,
UpdateState
)
from common.openpype_common.distribution.addon_info import UrlType
from common.ayon_common.distribution.addon_info import UrlType
@pytest.fixture
def addon_downloader():
addon_downloader = AddonDownloader()
addon_downloader.register_format(UrlType.FILESYSTEM, OSAddonDownloader)
addon_downloader.register_format(UrlType.HTTP, HTTPAddonDownloader)
def addon_download_factory():
addon_downloader = DownloadFactory()
addon_downloader.register_format(UrlType.FILESYSTEM, OSDownloader)
addon_downloader.register_format(UrlType.HTTP, HTTPDownloader)
yield addon_downloader
@pytest.fixture
def http_downloader(addon_downloader):
yield addon_downloader.get_downloader(UrlType.HTTP.value)
def http_downloader(addon_download_factory):
yield addon_download_factory.get_downloader(UrlType.HTTP.value)
@pytest.fixture
@@ -55,7 +55,8 @@ def sample_addon_info():
"clientSourceInfo": [
{
"type": "http",
"url": "https://drive.google.com/file/d/1TcuV8c2OV8CcbPeWi7lxOdqWsEqQNPYy/view?usp=sharing" # noqa
"path": "https://drive.google.com/file/d/1TcuV8c2OV8CcbPeWi7lxOdqWsEqQNPYy/view?usp=sharing", # noqa
"filename": "dummy.zip"
},
{
"type": "filesystem",
@@ -84,19 +85,19 @@
def test_register(printer):
addon_downloader = AddonDownloader()
download_factory = DownloadFactory()
assert len(addon_downloader._downloaders) == 0, "Contains registered"
assert len(download_factory._downloaders) == 0, "Contains registered"
addon_downloader.register_format(UrlType.FILESYSTEM, OSAddonDownloader)
assert len(addon_downloader._downloaders) == 1, "Should contain one"
download_factory.register_format(UrlType.FILESYSTEM, OSDownloader)
assert len(download_factory._downloaders) == 1, "Should contain one"
def test_get_downloader(printer, addon_downloader):
assert addon_downloader.get_downloader(UrlType.FILESYSTEM.value), "Should find" # noqa
def test_get_downloader(printer, download_factory):
assert download_factory.get_downloader(UrlType.FILESYSTEM.value), "Should find" # noqa
with pytest.raises(ValueError):
addon_downloader.get_downloader("unknown"), "Shouldn't find"
download_factory.get_downloader("unknown"), "Shouldn't find"
def test_addon_info(printer, sample_addon_info):
@ -147,21 +148,36 @@ def test_addon_info(printer, sample_addon_info):
def test_update_addon_state(printer, sample_addon_info,
temp_folder, addon_downloader):
temp_folder, download_factory):
"""Tests possible cases of addon update."""
addon_info = AddonInfo.from_dict(sample_addon_info)
orig_hash = addon_info.hash
# Cause crash because of invalid hash
addon_info.hash = "brokenhash"
result = update_addon_state([addon_info], temp_folder, addon_downloader)
assert result["openpype_slack_1.0.0"] == UpdateState.FAILED.value, \
"Update should failed because of wrong hash"
distribution = AyonDistribution(
temp_folder, temp_folder, download_factory, [addon_info], None
)
distribution.distribute()
dist_items = distribution.get_addons_dist_items()
slack_state = dist_items["openpype_slack_1.0.0"].state
assert slack_state == UpdateState.UPDATE_FAILED, (
"Update should have failed because of wrong hash")
# Fix cache and validate if was updated
addon_info.hash = orig_hash
result = update_addon_state([addon_info], temp_folder, addon_downloader)
assert result["openpype_slack_1.0.0"] == UpdateState.UPDATED.value, \
"Addon should have been updated"
distribution = AyonDistribution(
temp_folder, temp_folder, download_factory, [addon_info], None
)
distribution.distribute()
dist_items = distribution.get_addons_dist_items()
assert dist_items["openpype_slack_1.0.0"].state == UpdateState.UPDATED, (
"Addon should have been updated")
result = update_addon_state([addon_info], temp_folder, addon_downloader)
assert result["openpype_slack_1.0.0"] == UpdateState.EXISTS.value, \
"Addon should already exist"
# Is UPDATED without calling distribute
distribution = AyonDistribution(
temp_folder, temp_folder, download_factory, [addon_info], None
)
dist_items = distribution.get_addons_dist_items()
assert dist_items["openpype_slack_1.0.0"].state == UpdateState.UPDATED, (
"Addon should already exist")
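The three phases the updated test walks through (broken hash fails, fixed hash distributes, a second run finds the addon already in place) can be sketched with a heavily simplified stand-in; all names here are hypothetical reductions of `AyonDistribution`:

```python
import os
import tempfile
from enum import Enum


class UpdateState(Enum):
    UPDATED = "updated"
    UPDATE_FAILED = "failed"


def distribute(addon_dir, full_name, addon_hash, expected_hash):
    """Mimic one dist item: skip if present, fail on hash mismatch."""
    dest = os.path.join(addon_dir, full_name)
    if os.path.isdir(dest):
        # Already distributed: state is UPDATED without downloading
        return UpdateState.UPDATED
    if addon_hash != expected_hash:
        return UpdateState.UPDATE_FAILED
    os.makedirs(dest)
    return UpdateState.UPDATED


root = tempfile.mkdtemp()
print(distribute(root, "openpype_slack_1.0.0", "brokenhash", "abc"))
print(distribute(root, "openpype_slack_1.0.0", "abc", "abc"))
print(distribute(root, "openpype_slack_1.0.0", "abc", "abc"))
```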

Binary files (images) not shown.

@ -0,0 +1,21 @@
import os
RESOURCES_DIR = os.path.dirname(os.path.abspath(__file__))
def get_resource_path(*args):
path_items = list(args)
path_items.insert(0, RESOURCES_DIR)
return os.path.sep.join(path_items)
def get_icon_path():
return get_resource_path("AYON.png")
def load_stylesheet():
stylesheet_path = get_resource_path("stylesheet.css")
with open(stylesheet_path, "r") as stream:
content = stream.read()
return content
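The helpers above join path items with `os.path.sep` rather than `os.path.join`; a standalone sketch (with `RESOURCES_DIR` faked for illustration, since the real value is the module's own directory):

```python
import os

# Faked for illustration; the real value is os.path.dirname(__file__)
RESOURCES_DIR = os.path.join("openpype", "resources")


def get_resource_path(*args):
    # Prepend the resources root, then join with the OS separator
    path_items = list(args)
    path_items.insert(0, RESOURCES_DIR)
    return os.path.sep.join(path_items)


print(get_resource_path("AYON.png"))
```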

Binary files (images) not shown.

@ -0,0 +1,84 @@
* {
font-size: 10pt;
font-family: "Noto Sans";
font-weight: 450;
outline: none;
}
QWidget {
color: #D3D8DE;
background: #2C313A;
border-radius: 0px;
}
QWidget:disabled {
color: #5b6779;
}
QLabel {
background: transparent;
}
QPushButton {
text-align:center center;
border: 0px solid transparent;
border-radius: 0.2em;
padding: 3px 5px 3px 5px;
background: #434a56;
}
QPushButton:hover {
background: rgba(168, 175, 189, 0.3);
color: #F0F2F5;
}
QPushButton:pressed {}
QPushButton:disabled {
background: #434a56;
}
QLineEdit {
border: 1px solid #373D48;
border-radius: 0.3em;
background: #21252B;
padding: 0.1em;
}
QLineEdit:disabled {
background: #2C313A;
}
QLineEdit:hover {
border-color: rgba(168, 175, 189, .3);
}
QLineEdit:focus {
border-color: rgb(92, 173, 214);
}
QLineEdit[state="invalid"] {
border-color: #AA5050;
}
#Separator {
background: rgba(75, 83, 98, 127);
}
#PasswordBtn {
border: none;
padding: 0.1em;
background: transparent;
}
#PasswordBtn:hover {
background: #434a56;
}
#LikeDisabledInput {
background: #2C313A;
}
#LikeDisabledInput:hover {
border-color: #373D48;
}
#LikeDisabledInput:focus {
border-color: #373D48;
}


@ -0,0 +1,52 @@
import os
import appdirs
def get_ayon_appdirs(*args):
"""Local app data directory of AYON client.
Args:
*args (Iterable[str]): Subdirectories/files in local app data dir.
Returns:
str: Path to directory/file in local app data dir.
"""
return os.path.join(
appdirs.user_data_dir("ayon", "ynput"),
*args
)
def _create_local_site_id():
"""Create a local site identifier."""
from coolname import generate_slug
new_id = generate_slug(3)
print("Created local site id \"{}\"".format(new_id))
return new_id
def get_local_site_id():
"""Get local site identifier.
Site id is created if it does not exist yet.
"""
# used for background syncing
site_id = os.environ.get("AYON_SITE_ID")
if site_id:
return site_id
site_id_path = get_ayon_appdirs("site_id")
if os.path.exists(site_id_path):
with open(site_id_path, "r") as stream:
site_id = stream.read()
if not site_id:
site_id = _create_local_site_id()
with open(site_id_path, "w") as stream:
stream.write(site_id)
return site_id
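The lookup order in `get_local_site_id` (environment variable wins, then the cached file, then a freshly generated id written back) can be sketched without the `appdirs`/`coolname` dependencies; the `env` parameter and `uuid` generator are stand-ins for testing:

```python
import os
import tempfile
import uuid


def get_local_site_id(site_id_path, env=None):
    """Resolve a site id: env var, then cache file, then generate+persist."""
    env = env if env is not None else os.environ
    site_id = env.get("AYON_SITE_ID")
    if site_id:
        return site_id
    if os.path.exists(site_id_path):
        with open(site_id_path, "r") as stream:
            site_id = stream.read()
    if not site_id:
        # Stand-in for coolname.generate_slug(3)
        site_id = uuid.uuid4().hex
        with open(site_id_path, "w") as stream:
            stream.write(site_id)
    return site_id


path = os.path.join(tempfile.mkdtemp(), "site_id")
first = get_local_site_id(path, env={})
second = get_local_site_id(path, env={})
print(first == second)
```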


@ -1,208 +0,0 @@
import os
from enum import Enum
from abc import abstractmethod
import attr
import logging
import requests
import platform
import shutil
from .file_handler import RemoteFileHandler
from .addon_info import AddonInfo
class UpdateState(Enum):
EXISTS = "exists"
UPDATED = "updated"
FAILED = "failed"
class AddonDownloader:
log = logging.getLogger(__name__)
def __init__(self):
self._downloaders = {}
def register_format(self, downloader_type, downloader):
self._downloaders[downloader_type.value] = downloader
def get_downloader(self, downloader_type):
downloader = self._downloaders.get(downloader_type)
if not downloader:
raise ValueError(f"{downloader_type} not implemented")
return downloader()
@classmethod
@abstractmethod
def download(cls, source, destination):
"""Returns url to downloaded addon zip file.
Args:
source (dict): {type:"http", "url":"https://} ...}
destination (str): local folder to unzip
Returns:
(str) local path to addon zip file
"""
pass
@classmethod
def check_hash(cls, addon_path, addon_hash):
"""Compares 'hash' of downloaded 'addon_url' file.
Args:
addon_path (str): local path to addon zip file
addon_hash (str): sha256 hash of zip file
Raises:
ValueError if hashes doesn't match
"""
if not os.path.exists(addon_path):
raise ValueError(f"{addon_path} doesn't exist.")
if not RemoteFileHandler.check_integrity(addon_path,
addon_hash,
hash_type="sha256"):
raise ValueError(f"{addon_path} doesn't match expected hash.")
@classmethod
def unzip(cls, addon_zip_path, destination):
"""Unzips local 'addon_zip_path' to 'destination'.
Args:
addon_zip_path (str): local path to addon zip file
destination (str): local folder to unzip
"""
RemoteFileHandler.unzip(addon_zip_path, destination)
os.remove(addon_zip_path)
@classmethod
def remove(cls, addon_url):
pass
class OSAddonDownloader(AddonDownloader):
@classmethod
def download(cls, source, destination):
# OS doesnt need to download, unzip directly
addon_url = source["path"].get(platform.system().lower())
if not os.path.exists(addon_url):
raise ValueError("{} is not accessible".format(addon_url))
return addon_url
class HTTPAddonDownloader(AddonDownloader):
CHUNK_SIZE = 100000
@classmethod
def download(cls, source, destination):
source_url = source["url"]
cls.log.debug(f"Downloading {source_url} to {destination}")
file_name = os.path.basename(destination)
_, ext = os.path.splitext(file_name)
if (ext.replace(".", '') not
in set(RemoteFileHandler.IMPLEMENTED_ZIP_FORMATS)):
file_name += ".zip"
RemoteFileHandler.download_url(source_url,
destination,
filename=file_name)
return os.path.join(destination, file_name)
def get_addons_info(server_endpoint):
"""Returns list of addon information from Server"""
# TODO temp
# addon_info = AddonInfo(
# **{"name": "openpype_slack",
# "version": "1.0.0",
# "addon_url": "c:/projects/openpype_slack_1.0.0.zip",
# "type": UrlType.FILESYSTEM,
# "hash": "4be25eb6215e91e5894d3c5475aeb1e379d081d3f5b43b4ee15b0891cf5f5658"}) # noqa
#
# http_addon = AddonInfo(
# **{"name": "openpype_slack",
# "version": "1.0.0",
# "addon_url": "https://drive.google.com/file/d/1TcuV8c2OV8CcbPeWi7lxOdqWsEqQNPYy/view?usp=sharing", # noqa
# "type": UrlType.HTTP,
# "hash": "4be25eb6215e91e5894d3c5475aeb1e379d081d3f5b43b4ee15b0891cf5f5658"}) # noqa
response = requests.get(server_endpoint)
if not response.ok:
raise Exception(response.text)
addons_info = []
for addon in response.json():
addons_info.append(AddonInfo(**addon))
return addons_info
def update_addon_state(addon_infos, destination_folder, factory,
log=None):
"""Loops through all 'addon_infos', compares local version, unzips.
Loops through server provided list of dictionaries with information about
available addons. Looks if each addon is already present and deployed.
If isn't, addon zip gets downloaded and unzipped into 'destination_folder'.
Args:
addon_infos (list of AddonInfo)
destination_folder (str): local path
factory (AddonDownloader): factory to get appropriate downloader per
addon type
log (logging.Logger)
Returns:
(dict): {"addon_full_name": UpdateState.value
(eg. "exists"|"updated"|"failed")
"""
if not log:
log = logging.getLogger(__name__)
download_states = {}
for addon in addon_infos:
full_name = "{}_{}".format(addon.name, addon.version)
addon_dest = os.path.join(destination_folder, full_name)
if os.path.isdir(addon_dest):
log.debug(f"Addon version folder {addon_dest} already exists.")
download_states[full_name] = UpdateState.EXISTS.value
continue
for source in addon.sources:
download_states[full_name] = UpdateState.FAILED.value
try:
downloader = factory.get_downloader(source.type)
zip_file_path = downloader.download(attr.asdict(source),
addon_dest)
downloader.check_hash(zip_file_path, addon.hash)
downloader.unzip(zip_file_path, addon_dest)
download_states[full_name] = UpdateState.UPDATED.value
break
except Exception:
log.warning(f"Error happened during updating {addon.name}",
exc_info=True)
if os.path.isdir(addon_dest):
log.debug(f"Cleaning {addon_dest}")
shutil.rmtree(addon_dest)
return download_states
def check_addons(server_endpoint, addon_folder, downloaders):
"""Main entry point to compare existing addons with those on server.
Args:
server_endpoint (str): url to v4 server endpoint
addon_folder (str): local dir path for addons
downloaders (AddonDownloader): factory of downloaders
Raises:
(RuntimeError) if any addon failed update
"""
addons_info = get_addons_info(server_endpoint)
result = update_addon_state(addons_info,
addon_folder,
downloaders)
if UpdateState.FAILED.value in result.values():
raise RuntimeError(f"Unable to update some addons {result}")
def cli(*args):
raise NotImplementedError


@ -1,80 +0,0 @@
import attr
from enum import Enum
class UrlType(Enum):
HTTP = "http"
GIT = "git"
FILESYSTEM = "filesystem"
@attr.s
class MultiPlatformPath(object):
windows = attr.ib(default=None)
linux = attr.ib(default=None)
darwin = attr.ib(default=None)
@attr.s
class AddonSource(object):
type = attr.ib()
@attr.s
class LocalAddonSource(AddonSource):
path = attr.ib(default=attr.Factory(MultiPlatformPath))
@attr.s
class WebAddonSource(AddonSource):
url = attr.ib(default=None)
@attr.s
class VersionData(object):
version_data = attr.ib(default=None)
@attr.s
class AddonInfo(object):
"""Object matching json payload from Server"""
name = attr.ib()
version = attr.ib()
title = attr.ib(default=None)
sources = attr.ib(default=attr.Factory(dict))
hash = attr.ib(default=None)
description = attr.ib(default=None)
license = attr.ib(default=None)
authors = attr.ib(default=None)
@classmethod
def from_dict(cls, data):
sources = []
production_version = data.get("productionVersion")
if not production_version:
return
# server payload contains info about all versions
# active addon must have 'productionVersion' and matching version info
version_data = data.get("versions", {})[production_version]
for source in version_data.get("clientSourceInfo", []):
if source.get("type") == UrlType.FILESYSTEM.value:
source_addon = LocalAddonSource(type=source["type"],
path=source["path"])
if source.get("type") == UrlType.HTTP.value:
source_addon = WebAddonSource(type=source["type"],
url=source["url"])
sources.append(source_addon)
return cls(name=data.get("name"),
version=production_version,
sources=sources,
hash=data.get("hash"),
description=data.get("description"),
title=data.get("title"),
license=data.get("license"),
authors=data.get("authors"))


@ -3,3 +3,5 @@ import os
PACKAGE_DIR = os.path.dirname(os.path.abspath(__file__))
PLUGINS_DIR = os.path.join(PACKAGE_DIR, "plugins")
AYON_SERVER_ENABLED = os.environ.get("USE_AYON_SERVER") == "1"

File diff suppressed because it is too large


@ -1,243 +1,6 @@
from .mongo import get_project_connection
from .entities import (
get_assets,
get_asset_by_id,
get_version_by_id,
get_representation_by_id,
convert_id,
)
from openpype import AYON_SERVER_ENABLED
def get_linked_asset_ids(project_name, asset_doc=None, asset_id=None):
"""Extract linked asset ids from asset document.
One of asset document or asset id must be passed.
Note:
Asset links now works only from asset to assets.
Args:
asset_doc (dict): Asset document from DB.
Returns:
List[Union[ObjectId, str]]: Asset ids of input links.
"""
output = []
if not asset_doc and not asset_id:
return output
if not asset_doc:
asset_doc = get_asset_by_id(
project_name, asset_id, fields=["data.inputLinks"]
)
input_links = asset_doc["data"].get("inputLinks")
if not input_links:
return output
for item in input_links:
# Backwards compatibility for "_id" key which was replaced with
# "id"
if "_id" in item:
link_id = item["_id"]
else:
link_id = item["id"]
output.append(link_id)
return output
def get_linked_assets(
project_name, asset_doc=None, asset_id=None, fields=None
):
"""Return linked assets based on passed asset document.
One of asset document or asset id must be passed.
Args:
project_name (str): Name of project where to look for queried entities.
asset_doc (Dict[str, Any]): Asset document from database.
asset_id (Union[ObjectId, str]): Asset id. Can be used instead of
asset document.
fields (Iterable[str]): Fields that should be returned. All fields are
returned if 'None' is passed.
Returns:
List[Dict[str, Any]]: Asset documents of input links for passed
asset doc.
"""
if not asset_doc:
if not asset_id:
return []
asset_doc = get_asset_by_id(
project_name,
asset_id,
fields=["data.inputLinks"]
)
if not asset_doc:
return []
link_ids = get_linked_asset_ids(project_name, asset_doc=asset_doc)
if not link_ids:
return []
return list(get_assets(project_name, asset_ids=link_ids, fields=fields))
def get_linked_representation_id(
project_name, repre_doc=None, repre_id=None, link_type=None, max_depth=None
):
"""Returns list of linked ids of particular type (if provided).
One of representation document or representation id must be passed.
Note:
Representation links now works only from representation through version
back to representations.
Args:
project_name (str): Name of project where look for links.
repre_doc (Dict[str, Any]): Representation document.
repre_id (Union[ObjectId, str]): Representation id.
link_type (str): Type of link (e.g. 'reference', ...).
max_depth (int): Limit recursion level. Default: 0
Returns:
List[ObjectId] Linked representation ids.
"""
if repre_doc:
repre_id = repre_doc["_id"]
if repre_id:
repre_id = convert_id(repre_id)
if not repre_id and not repre_doc:
return []
version_id = None
if repre_doc:
version_id = repre_doc.get("parent")
if not version_id:
repre_doc = get_representation_by_id(
project_name, repre_id, fields=["parent"]
)
version_id = repre_doc["parent"]
if not version_id:
return []
version_doc = get_version_by_id(
project_name, version_id, fields=["type", "version_id"]
)
if version_doc["type"] == "hero_version":
version_id = version_doc["version_id"]
if max_depth is None:
max_depth = 0
match = {
"_id": version_id,
# Links are not stored to hero versions at this moment so filter
# is limited to just versions
"type": "version"
}
graph_lookup = {
"from": project_name,
"startWith": "$data.inputLinks.id",
"connectFromField": "data.inputLinks.id",
"connectToField": "_id",
"as": "outputs_recursive",
"depthField": "depth"
}
if max_depth != 0:
# We offset by -1 since 0 basically means no recursion
# but the recursion only happens after the initial lookup
# for outputs.
graph_lookup["maxDepth"] = max_depth - 1
query_pipeline = [
# Match
{"$match": match},
# Recursive graph lookup for inputs
{"$graphLookup": graph_lookup}
]
conn = get_project_connection(project_name)
result = conn.aggregate(query_pipeline)
referenced_version_ids = _process_referenced_pipeline_result(
result, link_type
)
if not referenced_version_ids:
return []
ref_ids = conn.distinct(
"_id",
filter={
"parent": {"$in": list(referenced_version_ids)},
"type": "representation"
}
)
return list(ref_ids)
def _process_referenced_pipeline_result(result, link_type):
"""Filters result from pipeline for particular link_type.
Pipeline cannot use link_type directly in a query.
Returns:
(list)
"""
referenced_version_ids = set()
correctly_linked_ids = set()
for item in result:
input_links = item.get("data", {}).get("inputLinks")
if not input_links:
continue
_filter_input_links(
input_links,
link_type,
correctly_linked_ids
)
# outputs_recursive in random order, sort by depth
outputs_recursive = item.get("outputs_recursive")
if not outputs_recursive:
continue
for output in sorted(outputs_recursive, key=lambda o: o["depth"]):
output_links = output.get("data", {}).get("inputLinks")
if not output_links and output["type"] != "hero_version":
continue
# Leaf
if output["_id"] not in correctly_linked_ids:
continue
_filter_input_links(
output_links,
link_type,
correctly_linked_ids
)
referenced_version_ids.add(output["_id"])
return referenced_version_ids
def _filter_input_links(input_links, link_type, correctly_linked_ids):
if not input_links: # to handle hero versions
return
for input_link in input_links:
if link_type and input_link["type"] != link_type:
continue
link_id = input_link.get("id") or input_link.get("_id")
if link_id is not None:
correctly_linked_ids.add(link_id)
if not AYON_SERVER_ENABLED:
from .mongo.entity_links import *
else:
from .server.entity_links import *


@ -0,0 +1,20 @@
from .mongo import (
MongoEnvNotSet,
get_default_components,
should_add_certificate_path_to_mongo_url,
validate_mongo_connection,
OpenPypeMongoConnection,
get_project_database,
get_project_connection,
)
__all__ = (
"MongoEnvNotSet",
"get_default_components",
"should_add_certificate_path_to_mongo_url",
"validate_mongo_connection",
"OpenPypeMongoConnection",
"get_project_database",
"get_project_connection",
)

File diff suppressed because it is too large


@ -0,0 +1,244 @@
from .mongo import get_project_connection
from .entities import (
get_assets,
get_asset_by_id,
get_version_by_id,
get_representation_by_id,
convert_id,
)
def get_linked_asset_ids(project_name, asset_doc=None, asset_id=None):
"""Extract linked asset ids from asset document.
One of asset document or asset id must be passed.
Note:
Asset links now work only from asset to assets.
Args:
asset_doc (dict): Asset document from DB.
Returns:
List[Union[ObjectId, str]]: Asset ids of input links.
"""
output = []
if not asset_doc and not asset_id:
return output
if not asset_doc:
asset_doc = get_asset_by_id(
project_name, asset_id, fields=["data.inputLinks"]
)
input_links = asset_doc["data"].get("inputLinks")
if not input_links:
return output
for item in input_links:
# Backwards compatibility for "_id" key which was replaced with
# "id"
if "_id" in item:
link_id = item["_id"]
else:
link_id = item["id"]
output.append(link_id)
return output
def get_linked_assets(
project_name, asset_doc=None, asset_id=None, fields=None
):
"""Return linked assets based on passed asset document.
One of asset document or asset id must be passed.
Args:
project_name (str): Name of project where to look for queried entities.
asset_doc (Dict[str, Any]): Asset document from database.
asset_id (Union[ObjectId, str]): Asset id. Can be used instead of
asset document.
fields (Iterable[str]): Fields that should be returned. All fields are
returned if 'None' is passed.
Returns:
List[Dict[str, Any]]: Asset documents of input links for passed
asset doc.
"""
if not asset_doc:
if not asset_id:
return []
asset_doc = get_asset_by_id(
project_name,
asset_id,
fields=["data.inputLinks"]
)
if not asset_doc:
return []
link_ids = get_linked_asset_ids(project_name, asset_doc=asset_doc)
if not link_ids:
return []
return list(get_assets(project_name, asset_ids=link_ids, fields=fields))
def get_linked_representation_id(
project_name, repre_doc=None, repre_id=None, link_type=None, max_depth=None
):
"""Returns list of linked ids of particular type (if provided).
One of representation document or representation id must be passed.
Note:
Representation links now work only from representation through version
back to representations.
Args:
project_name (str): Name of project where to look for links.
repre_doc (Dict[str, Any]): Representation document.
repre_id (Union[ObjectId, str]): Representation id.
link_type (str): Type of link (e.g. 'reference', ...).
max_depth (int): Limit recursion level. Default: 0
Returns:
List[ObjectId] Linked representation ids.
"""
if repre_doc:
repre_id = repre_doc["_id"]
if repre_id:
repre_id = convert_id(repre_id)
if not repre_id and not repre_doc:
return []
version_id = None
if repre_doc:
version_id = repre_doc.get("parent")
if not version_id:
repre_doc = get_representation_by_id(
project_name, repre_id, fields=["parent"]
)
version_id = repre_doc["parent"]
if not version_id:
return []
version_doc = get_version_by_id(
project_name, version_id, fields=["type", "version_id"]
)
if version_doc["type"] == "hero_version":
version_id = version_doc["version_id"]
if max_depth is None:
max_depth = 0
match = {
"_id": version_id,
# Links are not stored to hero versions at this moment so filter
# is limited to just versions
"type": "version"
}
graph_lookup = {
"from": project_name,
"startWith": "$data.inputLinks.id",
"connectFromField": "data.inputLinks.id",
"connectToField": "_id",
"as": "outputs_recursive",
"depthField": "depth"
}
if max_depth != 0:
# We offset by -1 since 0 basically means no recursion
# but the recursion only happens after the initial lookup
# for outputs.
graph_lookup["maxDepth"] = max_depth - 1
query_pipeline = [
# Match
{"$match": match},
# Recursive graph lookup for inputs
{"$graphLookup": graph_lookup}
]
conn = get_project_connection(project_name)
result = conn.aggregate(query_pipeline)
referenced_version_ids = _process_referenced_pipeline_result(
result, link_type
)
if not referenced_version_ids:
return []
ref_ids = conn.distinct(
"_id",
filter={
"parent": {"$in": list(referenced_version_ids)},
"type": "representation"
}
)
return list(ref_ids)
def _process_referenced_pipeline_result(result, link_type):
"""Filters result from pipeline for particular link_type.
Pipeline cannot use link_type directly in a query.
Returns:
(list)
"""
referenced_version_ids = set()
correctly_linked_ids = set()
for item in result:
input_links = item.get("data", {}).get("inputLinks")
if not input_links:
continue
_filter_input_links(
input_links,
link_type,
correctly_linked_ids
)
# outputs_recursive in random order, sort by depth
outputs_recursive = item.get("outputs_recursive")
if not outputs_recursive:
continue
for output in sorted(outputs_recursive, key=lambda o: o["depth"]):
output_links = output.get("data", {}).get("inputLinks")
if not output_links and output["type"] != "hero_version":
continue
# Leaf
if output["_id"] not in correctly_linked_ids:
continue
_filter_input_links(
output_links,
link_type,
correctly_linked_ids
)
referenced_version_ids.add(output["_id"])
return referenced_version_ids
def _filter_input_links(input_links, link_type, correctly_linked_ids):
if not input_links: # to handle hero versions
return
for input_link in input_links:
if link_type and input_link["type"] != link_type:
continue
link_id = input_link.get("id") or input_link.get("_id")
if link_id is not None:
correctly_linked_ids.add(link_id)
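The `maxDepth` offset in the pipeline above is easy to miss; assembling the pipeline in isolation makes it visible (pure dict construction, no MongoDB connection needed — the function name is a hypothetical extraction of the logic inside `get_linked_representation_id`):

```python
def build_link_pipeline(project_name, version_id, max_depth=None):
    """Assemble the $match + $graphLookup pipeline as done above."""
    if max_depth is None:
        max_depth = 0
    graph_lookup = {
        "from": project_name,
        "startWith": "$data.inputLinks.id",
        "connectFromField": "data.inputLinks.id",
        "connectToField": "_id",
        "as": "outputs_recursive",
        "depthField": "depth",
    }
    if max_depth != 0:
        # Offset by -1 since the recursion only happens after the
        # initial lookup for outputs (see comment in the diff)
        graph_lookup["maxDepth"] = max_depth - 1
    return [
        {"$match": {"_id": version_id, "type": "version"}},
        {"$graphLookup": graph_lookup},
    ]


pipeline = build_link_pipeline("my_project", "some-version-id", max_depth=3)
print(pipeline[1]["$graphLookup"]["maxDepth"])
```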


@ -11,6 +11,7 @@ from bson.json_util import (
CANONICAL_JSON_OPTIONS
)
from openpype import AYON_SERVER_ENABLED
if sys.version_info[0] == 2:
from urlparse import urlparse, parse_qs
else:
@ -206,6 +207,8 @@ class OpenPypeMongoConnection:
@classmethod
def create_connection(cls, mongo_url, timeout=None, retry_attempts=None):
if AYON_SERVER_ENABLED:
raise RuntimeError("Created mongo connection in AYON mode")
parsed = urlparse(mongo_url)
# Force validation of scheme
if parsed.scheme not in ["mongodb", "mongodb+srv"]:


@ -0,0 +1,632 @@
import re
import copy
import collections
from bson.objectid import ObjectId
from pymongo import DeleteOne, InsertOne, UpdateOne
from openpype.client.operations_base import (
REMOVED_VALUE,
CreateOperation,
UpdateOperation,
DeleteOperation,
BaseOperationsSession
)
from .mongo import get_project_connection
from .entities import get_project
PROJECT_NAME_ALLOWED_SYMBOLS = "a-zA-Z0-9_"
PROJECT_NAME_REGEX = re.compile(
"^[{}]+$".format(PROJECT_NAME_ALLOWED_SYMBOLS)
)
CURRENT_PROJECT_SCHEMA = "openpype:project-3.0"
CURRENT_PROJECT_CONFIG_SCHEMA = "openpype:config-2.0"
CURRENT_ASSET_DOC_SCHEMA = "openpype:asset-3.0"
CURRENT_SUBSET_SCHEMA = "openpype:subset-3.0"
CURRENT_VERSION_SCHEMA = "openpype:version-3.0"
CURRENT_HERO_VERSION_SCHEMA = "openpype:hero_version-1.0"
CURRENT_REPRESENTATION_SCHEMA = "openpype:representation-2.0"
CURRENT_WORKFILE_INFO_SCHEMA = "openpype:workfile-1.0"
CURRENT_THUMBNAIL_SCHEMA = "openpype:thumbnail-1.0"
def _create_or_convert_to_mongo_id(mongo_id):
if mongo_id is None:
return ObjectId()
return ObjectId(mongo_id)
def new_project_document(
project_name, project_code, config, data=None, entity_id=None
):
"""Create skeleton data of project document.
Args:
project_name (str): Name of project. Used as identifier of a project.
project_code (str): Shorter version of project name without spaces and
special characters (in most cases). Should also be considered
a unique name across projects.
config (Dict[str, Any]): Project config consists of roots, templates,
applications and other project Anatomy related data.
data (Dict[str, Any]): Project data with information about its
attributes (e.g. 'fps' etc.) or integration specific keys.
entity_id (Union[str, ObjectId]): Predefined id of document. New id is
created if not passed.
Returns:
Dict[str, Any]: Skeleton of project document.
"""
if data is None:
data = {}
data["code"] = project_code
return {
"_id": _create_or_convert_to_mongo_id(entity_id),
"name": project_name,
"type": CURRENT_PROJECT_SCHEMA,
"entity_data": data,
"config": config
}
def new_asset_document(
name, project_id, parent_id, parents, data=None, entity_id=None
):
"""Create skeleton data of asset document.
Args:
name (str): Is considered as unique identifier of asset in project.
project_id (Union[str, ObjectId]): Id of project document.
parent_id (Union[str, ObjectId]): Id of parent asset.
parents (List[str]): List of parent assets names.
data (Dict[str, Any]): Asset document data. Empty dictionary is used
if not passed. Value of 'parent_id' is used to fill 'visualParent'.
entity_id (Union[str, ObjectId]): Predefined id of document. New id is
created if not passed.
Returns:
Dict[str, Any]: Skeleton of asset document.
"""
if data is None:
data = {}
if parent_id is not None:
parent_id = ObjectId(parent_id)
data["visualParent"] = parent_id
data["parents"] = parents
return {
"_id": _create_or_convert_to_mongo_id(entity_id),
"type": "asset",
"name": name,
"parent": ObjectId(project_id),
"data": data,
"schema": CURRENT_ASSET_DOC_SCHEMA
}
def new_subset_document(name, family, asset_id, data=None, entity_id=None):
"""Create skeleton data of subset document.
Args:
name (str): Is considered as unique identifier of subset under asset.
family (str): Subset's family.
asset_id (Union[str, ObjectId]): Id of parent asset.
data (Dict[str, Any]): Subset document data. Empty dictionary is used
if not passed. Value of 'family' is used to fill 'family'.
entity_id (Union[str, ObjectId]): Predefined id of document. New id is
created if not passed.
Returns:
Dict[str, Any]: Skeleton of subset document.
"""
if data is None:
data = {}
data["family"] = family
return {
"_id": _create_or_convert_to_mongo_id(entity_id),
"schema": CURRENT_SUBSET_SCHEMA,
"type": "subset",
"name": name,
"data": data,
"parent": asset_id
}
def new_version_doc(version, subset_id, data=None, entity_id=None):
"""Create skeleton data of version document.
Args:
version (int): Is considered as unique identifier of version
under subset.
subset_id (Union[str, ObjectId]): Id of parent subset.
data (Dict[str, Any]): Version document data.
entity_id (Union[str, ObjectId]): Predefined id of document. New id is
created if not passed.
Returns:
Dict[str, Any]: Skeleton of version document.
"""
if data is None:
data = {}
return {
"_id": _create_or_convert_to_mongo_id(entity_id),
"schema": CURRENT_VERSION_SCHEMA,
"type": "version",
"name": int(version),
"parent": subset_id,
"data": data
}
def new_hero_version_doc(version_id, subset_id, data=None, entity_id=None):
"""Create skeleton data of hero version document.
Args:
version_id (ObjectId): Is considered as unique identifier of version
under subset.
subset_id (Union[str, ObjectId]): Id of parent subset.
data (Dict[str, Any]): Version document data.
entity_id (Union[str, ObjectId]): Predefined id of document. New id is
created if not passed.
Returns:
Dict[str, Any]: Skeleton of hero version document.
"""
if data is None:
data = {}
return {
"_id": _create_or_convert_to_mongo_id(entity_id),
"schema": CURRENT_HERO_VERSION_SCHEMA,
"type": "hero_version",
"version_id": version_id,
"parent": subset_id,
"data": data
}
def new_representation_doc(
name, version_id, context, data=None, entity_id=None
):
"""Create skeleton data of representation document.
Args:
name (str): Is considered as unique identifier of representation
under version.
version_id (Union[str, ObjectId]): Id of parent version.
context (Dict[str, Any]): Representation context used to fill
templates or to query.
data (Dict[str, Any]): Representation document data.
entity_id (Union[str, ObjectId]): Predefined id of document. New id is
created if not passed.
Returns:
Dict[str, Any]: Skeleton of representation document.
"""
if data is None:
data = {}
return {
"_id": _create_or_convert_to_mongo_id(entity_id),
"schema": CURRENT_REPRESENTATION_SCHEMA,
"type": "representation",
"parent": version_id,
"name": name,
"data": data,
# Imprint shortcut to context for performance reasons.
"context": context
}
def new_thumbnail_doc(data=None, entity_id=None):
"""Create skeleton data of thumbnail document.
Args:
data (Dict[str, Any]): Thumbnail document data.
entity_id (Union[str, ObjectId]): Predefined id of document. New id is
created if not passed.
Returns:
Dict[str, Any]: Skeleton of thumbnail document.
"""
if data is None:
data = {}
return {
"_id": _create_or_convert_to_mongo_id(entity_id),
"type": "thumbnail",
"schema": CURRENT_THUMBNAIL_SCHEMA,
"data": data
}
def new_workfile_info_doc(
filename, asset_id, task_name, files, data=None, entity_id=None
):
"""Create skeleton data of workfile info document.
Workfile document is at this moment used primarily for artist notes.
Args:
filename (str): Filename of workfile.
asset_id (Union[str, ObjectId]): Id of asset under which workfile lives.
task_name (str): Task under which the workfile was created.
files (List[str]): List of rootless filepaths related to workfile.
data (Dict[str, Any]): Additional metadata.
Returns:
Dict[str, Any]: Skeleton of workfile info document.
"""
if not data:
data = {}
return {
"_id": _create_or_convert_to_mongo_id(entity_id),
"type": "workfile",
"parent": ObjectId(asset_id),
"task_name": task_name,
"filename": filename,
"data": data,
"files": files
}
def _prepare_update_data(old_doc, new_doc, replace):
changes = {}
for key, value in new_doc.items():
if key not in old_doc or value != old_doc[key]:
changes[key] = value
if replace:
for key in old_doc.keys():
if key not in new_doc:
changes[key] = REMOVED_VALUE
return changes
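The comparison logic above is small enough to illustrate standalone. A minimal sketch, with a local `REMOVED_VALUE` sentinel standing in for the one imported from `operations_base`:

```python
# Sentinel marking keys that should be removed from the stored document.
REMOVED_VALUE = object()


def prepare_update_data(old_doc, new_doc, replace=True):
    """Return first-level changes between two documents."""
    changes = {}
    for key, value in new_doc.items():
        # New keys and changed values are both reported as changes.
        if key not in old_doc or value != old_doc[key]:
            changes[key] = value
    if replace:
        # Keys missing from the new document are marked for removal.
        for key in old_doc:
            if key not in new_doc:
                changes[key] = REMOVED_VALUE
    return changes


old_doc = {"name": "main", "data": {"fps": 25}, "obsolete": True}
new_doc = {"name": "main", "data": {"fps": 24}}
changes = prepare_update_data(old_doc, new_doc)
# 'name' is unchanged, 'data' changed, 'obsolete' is marked for removal.
```

Note that only the first dictionary level is compared, so a nested change (like `data["fps"]` here) replaces the whole nested value.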
def prepare_subset_update_data(old_doc, new_doc, replace=True):
"""Compare two subset documents and prepare update data.
    Based on the compared values, creates update data for
    'MongoUpdateOperation'.
Empty output means that documents are identical.
Returns:
Dict[str, Any]: Changes between old and new document.
"""
return _prepare_update_data(old_doc, new_doc, replace)
def prepare_version_update_data(old_doc, new_doc, replace=True):
"""Compare two version documents and prepare update data.
    Based on the compared values, creates update data for
    'MongoUpdateOperation'.
Empty output means that documents are identical.
Returns:
Dict[str, Any]: Changes between old and new document.
"""
return _prepare_update_data(old_doc, new_doc, replace)
def prepare_hero_version_update_data(old_doc, new_doc, replace=True):
"""Compare two hero version documents and prepare update data.
    Based on the compared values, creates update data for
    'MongoUpdateOperation'.
Empty output means that documents are identical.
Returns:
Dict[str, Any]: Changes between old and new document.
"""
return _prepare_update_data(old_doc, new_doc, replace)
def prepare_representation_update_data(old_doc, new_doc, replace=True):
"""Compare two representation documents and prepare update data.
    Based on the compared values, creates update data for
    'MongoUpdateOperation'.
Empty output means that documents are identical.
Returns:
Dict[str, Any]: Changes between old and new document.
"""
return _prepare_update_data(old_doc, new_doc, replace)
def prepare_workfile_info_update_data(old_doc, new_doc, replace=True):
"""Compare two workfile info documents and prepare update data.
    Based on the compared values, creates update data for
    'MongoUpdateOperation'.
Empty output means that documents are identical.
Returns:
Dict[str, Any]: Changes between old and new document.
"""
return _prepare_update_data(old_doc, new_doc, replace)
class MongoCreateOperation(CreateOperation):
"""Operation to create an entity.
Args:
project_name (str): On which project operation will happen.
entity_type (str): Type of entity on which change happens.
e.g. 'asset', 'representation' etc.
data (Dict[str, Any]): Data of entity that will be created.
"""
operation_name = "create"
def __init__(self, project_name, entity_type, data):
super(MongoCreateOperation, self).__init__(
project_name, entity_type, data
)
if "_id" not in self._data:
self._data["_id"] = ObjectId()
else:
self._data["_id"] = ObjectId(self._data["_id"])
@property
def entity_id(self):
return self._data["_id"]
def to_mongo_operation(self):
return InsertOne(copy.deepcopy(self._data))
class MongoUpdateOperation(UpdateOperation):
"""Operation to update an entity.
Args:
project_name (str): On which project operation will happen.
entity_type (str): Type of entity on which change happens.
e.g. 'asset', 'representation' etc.
entity_id (Union[str, ObjectId]): Identifier of an entity.
update_data (Dict[str, Any]): Key -> value changes that will be set in
database. If value is set to 'REMOVED_VALUE' the key will be
removed. Only first level of dictionary is checked (on purpose).
"""
operation_name = "update"
def __init__(self, project_name, entity_type, entity_id, update_data):
super(MongoUpdateOperation, self).__init__(
project_name, entity_type, entity_id, update_data
)
self._entity_id = ObjectId(self._entity_id)
def to_mongo_operation(self):
unset_data = {}
set_data = {}
for key, value in self._update_data.items():
if value is REMOVED_VALUE:
unset_data[key] = None
else:
set_data[key] = value
op_data = {}
if unset_data:
op_data["$unset"] = unset_data
if set_data:
op_data["$set"] = set_data
if not op_data:
return None
return UpdateOne(
{"_id": self.entity_id},
op_data
)
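The split into `$set` and `$unset` performed by `to_mongo_operation` can be mirrored without pymongo. This sketch builds the raw operation payload that would be handed to `UpdateOne`:

```python
REMOVED_VALUE = object()  # stand-in for the shared sentinel


def build_update_payload(update_data):
    """Split update data into Mongo '$set'/'$unset' sections."""
    unset_data = {}
    set_data = {}
    for key, value in update_data.items():
        if value is REMOVED_VALUE:
            # Mongo's $unset ignores the value; only the key matters.
            unset_data[key] = None
        else:
            set_data[key] = value
    op_data = {}
    if unset_data:
        op_data["$unset"] = unset_data
    if set_data:
        op_data["$set"] = set_data
    # Empty payload means there is nothing to write.
    return op_data or None


payload = build_update_payload({"name": "v002", "old_key": REMOVED_VALUE})
```

Returning `None` for an empty payload matches how the operation classes above skip no-op updates before `bulk_write`.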
class MongoDeleteOperation(DeleteOperation):
"""Operation to delete an entity.
Args:
project_name (str): On which project operation will happen.
entity_type (str): Type of entity on which change happens.
e.g. 'asset', 'representation' etc.
entity_id (Union[str, ObjectId]): Entity id that will be removed.
"""
operation_name = "delete"
def __init__(self, project_name, entity_type, entity_id):
super(MongoDeleteOperation, self).__init__(
project_name, entity_type, entity_id
)
self._entity_id = ObjectId(self._entity_id)
def to_mongo_operation(self):
return DeleteOne({"_id": self.entity_id})
class MongoOperationsSession(BaseOperationsSession):
"""Session storing operations that should happen in an order.
    At this moment the session does not handle anything special and can be
    considered a plain list of operations that will happen one after
    another. If creation of the same entity is added multiple times it is
    not handled in any special way and document values are not validated.
    Operations may target multiple projects; on commit they are grouped
    and written per project.
    """
def commit(self):
"""Commit session operations."""
operations, self._operations = self._operations, []
if not operations:
return
operations_by_project = collections.defaultdict(list)
for operation in operations:
operations_by_project[operation.project_name].append(operation)
for project_name, operations in operations_by_project.items():
bulk_writes = []
for operation in operations:
mongo_op = operation.to_mongo_operation()
if mongo_op is not None:
bulk_writes.append(mongo_op)
if bulk_writes:
collection = get_project_connection(project_name)
collection.bulk_write(bulk_writes)
def create_entity(self, project_name, entity_type, data):
"""Fast access to 'MongoCreateOperation'.
Returns:
            MongoCreateOperation: Object of create operation.
"""
operation = MongoCreateOperation(project_name, entity_type, data)
self.add(operation)
return operation
def update_entity(self, project_name, entity_type, entity_id, update_data):
"""Fast access to 'MongoUpdateOperation'.
Returns:
MongoUpdateOperation: Object of update operation.
"""
operation = MongoUpdateOperation(
project_name, entity_type, entity_id, update_data
)
self.add(operation)
return operation
def delete_entity(self, project_name, entity_type, entity_id):
"""Fast access to 'MongoDeleteOperation'.
Returns:
MongoDeleteOperation: Object of delete operation.
"""
operation = MongoDeleteOperation(project_name, entity_type, entity_id)
self.add(operation)
return operation
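The session pattern used above (queue operations, then flush them in one batch per project on commit) can be sketched with plain lists. `TinySession` is an illustrative stand-in, not part of the OpenPype API; entity ids and Mongo specifics are omitted:

```python
import collections


class TinySession:
    """Minimal stand-in for MongoOperationsSession's queue/commit flow."""

    def __init__(self):
        self._operations = []

    def add(self, project_name, op):
        self._operations.append((project_name, op))

    def commit(self):
        # Take ownership of queued operations and reset the queue, the
        # same swap 'MongoOperationsSession.commit' does.
        operations, self._operations = self._operations, []
        by_project = collections.defaultdict(list)
        for project_name, op in operations:
            by_project[project_name].append(op)
        # One "bulk write" per project, preserving insertion order.
        return {name: ops for name, ops in by_project.items()}


session = TinySession()
session.add("projA", "create asset")
session.add("projB", "update version")
session.add("projA", "delete representation")
batches = session.commit()
```

Grouping at commit time is what lets a single session mix operations for several projects while each project still gets one `bulk_write` call.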
def create_project(
project_name,
project_code,
library_project=False,
):
"""Create project using OpenPype settings.
    This project creation function does not validate the project document
    on creation, because the document is created blindly with only the
    minimum required information about the project: its name, code, type
    and schema.
    Entered project name must be unique and project must not exist yet.
    Note:
        This function is written to be OP v4 ready, but in v3 it has more
        logic to perform. That is why the imports are inside the function
        body.
    Args:
        project_name (str): New project name. Must be unique.
        project_code (str): Project's code. Should be unique too.
        library_project (bool): Whether the project is a library project.
Raises:
ValueError: When project name already exists in MongoDB.
Returns:
dict: Created project document.
"""
from openpype.settings import ProjectSettings, SaveWarningExc
from openpype.pipeline.schema import validate
if get_project(project_name, fields=["name"]):
raise ValueError("Project with name \"{}\" already exists".format(
project_name
))
if not PROJECT_NAME_REGEX.match(project_name):
raise ValueError((
"Project name \"{}\" contain invalid characters"
).format(project_name))
project_doc = {
"type": "project",
"name": project_name,
"data": {
"code": project_code,
"library_project": library_project
},
"schema": CURRENT_PROJECT_SCHEMA
}
op_session = MongoOperationsSession()
# Insert document with basic data
create_op = op_session.create_entity(
project_name, project_doc["type"], project_doc
)
op_session.commit()
# Load ProjectSettings for the project and save it to store all attributes
# and Anatomy
try:
project_settings_entity = ProjectSettings(project_name)
project_settings_entity.save()
except SaveWarningExc as exc:
print(str(exc))
except Exception:
op_session.delete_entity(
project_name, project_doc["type"], create_op.entity_id
)
op_session.commit()
raise
project_doc = get_project(project_name)
try:
# Validate created project document
validate(project_doc)
except Exception:
# Remove project if is not valid
op_session.delete_entity(
project_name, project_doc["type"], create_op.entity_id
)
op_session.commit()
raise
return project_doc
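The create-then-roll-back flow in `create_project` (insert a minimal document, then delete it again if the settings save or validation fails) is a generic pattern. A sketch against an in-memory store; the names here are illustrative, not part of the OpenPype API:

```python
def create_with_rollback(store, name, build_doc, validate):
    """Insert a document, roll it back if validation raises."""
    if name in store:
        raise ValueError('Project "{}" already exists'.format(name))
    store[name] = build_doc(name)
    try:
        validate(store[name])
    except Exception:
        # Remove the half-created document before re-raising, the same
        # cleanup 'create_project' does with a delete operation.
        del store[name]
        raise
    return store[name]


store = {}
doc = create_with_rollback(
    store, "demo", lambda n: {"name": n, "type": "project"},
    validate=lambda d: None,
)

failed = False
try:
    create_with_rollback(
        store, "broken", lambda n: {"name": n},
        validate=lambda d: (_ for _ in ()).throw(ValueError("invalid")),
    )
except ValueError:
    failed = True
```

Re-raising after cleanup keeps the caller's error handling intact while guaranteeing no orphaned project document is left behind.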


@@ -1,797 +1,24 @@
import re
import uuid
import copy
import collections
from abc import ABCMeta, abstractmethod, abstractproperty
from openpype import AYON_SERVER_ENABLED
import six
from bson.objectid import ObjectId
from pymongo import DeleteOne, InsertOne, UpdateOne
from .operations_base import REMOVED_VALUE
if not AYON_SERVER_ENABLED:
from .mongo.operations import *
OperationsSession = MongoOperationsSession
else:
    from ayon_api.server_api import (
        PROJECT_NAME_ALLOWED_SYMBOLS,
        PROJECT_NAME_REGEX,
    )
    from .server.operations import *
    from .mongo.operations import (
        CURRENT_PROJECT_SCHEMA,
        CURRENT_PROJECT_CONFIG_SCHEMA,
        CURRENT_ASSET_DOC_SCHEMA,
        CURRENT_SUBSET_SCHEMA,
        CURRENT_VERSION_SCHEMA,
        CURRENT_HERO_VERSION_SCHEMA,
        CURRENT_REPRESENTATION_SCHEMA,
        CURRENT_WORKFILE_INFO_SCHEMA,
        CURRENT_THUMBNAIL_SCHEMA
    )
from .mongo import get_project_connection
from .entities import get_project
REMOVED_VALUE = object()
PROJECT_NAME_ALLOWED_SYMBOLS = "a-zA-Z0-9_"
PROJECT_NAME_REGEX = re.compile(
"^[{}]+$".format(PROJECT_NAME_ALLOWED_SYMBOLS)
)
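`PROJECT_NAME_REGEX` only admits ASCII letters, digits and underscores, which is what `create_project` relies on when rejecting names. A quick check of what passes and what is rejected:

```python
import re

PROJECT_NAME_ALLOWED_SYMBOLS = "a-zA-Z0-9_"
PROJECT_NAME_REGEX = re.compile(
    "^[{}]+$".format(PROJECT_NAME_ALLOWED_SYMBOLS)
)

# Names made only of allowed symbols match.
valid = [
    name for name in ("my_project", "Show01", "x")
    if PROJECT_NAME_REGEX.match(name)
]
# Spaces, dashes, empty names and non-ASCII letters are rejected.
invalid = [
    name for name in ("my project", "show-01", "", "šn")
    if not PROJECT_NAME_REGEX.match(name)
]
```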
CURRENT_PROJECT_SCHEMA = "openpype:project-3.0"
CURRENT_PROJECT_CONFIG_SCHEMA = "openpype:config-2.0"
CURRENT_ASSET_DOC_SCHEMA = "openpype:asset-3.0"
CURRENT_SUBSET_SCHEMA = "openpype:subset-3.0"
CURRENT_VERSION_SCHEMA = "openpype:version-3.0"
CURRENT_HERO_VERSION_SCHEMA = "openpype:hero_version-1.0"
CURRENT_REPRESENTATION_SCHEMA = "openpype:representation-2.0"
CURRENT_WORKFILE_INFO_SCHEMA = "openpype:workfile-1.0"
CURRENT_THUMBNAIL_SCHEMA = "openpype:thumbnail-1.0"
def _create_or_convert_to_mongo_id(mongo_id):
if mongo_id is None:
return ObjectId()
return ObjectId(mongo_id)
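`_create_or_convert_to_mongo_id` is a common "use the given id or mint a new one" helper. The same shape with stdlib `uuid` standing in for bson's `ObjectId` (a hypothetical parallel, not OpenPype code; `ObjectId()` accepts both str and ObjectId input, while `uuid.UUID()` only accepts strings, hence the extra isinstance check):

```python
import uuid


def create_or_convert_to_uuid(entity_id=None):
    """Return a new uuid when no id is given, else normalize the input."""
    if entity_id is None:
        return uuid.uuid4()
    # uuid.UUID() only takes string input, so pass UUID objects through
    # unchanged and parse anything else as a hex string.
    if isinstance(entity_id, uuid.UUID):
        return entity_id
    return uuid.UUID(entity_id)


existing = create_or_convert_to_uuid("12345678123456781234567812345678")
fresh = create_or_convert_to_uuid()
```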
def new_project_document(
project_name, project_code, config, data=None, entity_id=None
):
"""Create skeleton data of project document.
Args:
project_name (str): Name of project. Used as identifier of a project.
        project_code (str): Shorter version of project name without spaces
            and special characters (in most cases). Should also be
            considered a unique name across projects.
        config (Dict[str, Any]): Project config consisting of roots,
            templates, applications and other project Anatomy related data.
        data (Dict[str, Any]): Project data with information about its
            attributes (e.g. 'fps' etc.) or integration specific keys.
entity_id (Union[str, ObjectId]): Predefined id of document. New id is
created if not passed.
Returns:
Dict[str, Any]: Skeleton of project document.
"""
if data is None:
data = {}
data["code"] = project_code
return {
"_id": _create_or_convert_to_mongo_id(entity_id),
"name": project_name,
"type": CURRENT_PROJECT_SCHEMA,
"entity_data": data,
"config": config
}
def new_asset_document(
name, project_id, parent_id, parents, data=None, entity_id=None
):
"""Create skeleton data of asset document.
Args:
name (str): Is considered as unique identifier of asset in project.
        project_id (Union[str, ObjectId]): Id of project document.
parent_id (Union[str, ObjectId]): Id of parent asset.
parents (List[str]): List of parent assets names.
data (Dict[str, Any]): Asset document data. Empty dictionary is used
if not passed. Value of 'parent_id' is used to fill 'visualParent'.
entity_id (Union[str, ObjectId]): Predefined id of document. New id is
created if not passed.
Returns:
Dict[str, Any]: Skeleton of asset document.
"""
if data is None:
data = {}
if parent_id is not None:
parent_id = ObjectId(parent_id)
data["visualParent"] = parent_id
data["parents"] = parents
return {
"_id": _create_or_convert_to_mongo_id(entity_id),
"type": "asset",
"name": name,
"parent": ObjectId(project_id),
"data": data,
"schema": CURRENT_ASSET_DOC_SCHEMA
}
def new_subset_document(name, family, asset_id, data=None, entity_id=None):
"""Create skeleton data of subset document.
Args:
name (str): Is considered as unique identifier of subset under asset.
family (str): Subset's family.
asset_id (Union[str, ObjectId]): Id of parent asset.
data (Dict[str, Any]): Subset document data. Empty dictionary is used
if not passed. Value of 'family' is used to fill 'family'.
entity_id (Union[str, ObjectId]): Predefined id of document. New id is
created if not passed.
Returns:
Dict[str, Any]: Skeleton of subset document.
"""
if data is None:
data = {}
data["family"] = family
return {
"_id": _create_or_convert_to_mongo_id(entity_id),
"schema": CURRENT_SUBSET_SCHEMA,
"type": "subset",
"name": name,
"data": data,
"parent": asset_id
}
def new_version_doc(version, subset_id, data=None, entity_id=None):
"""Create skeleton data of version document.
Args:
version (int): Is considered as unique identifier of version
under subset.
subset_id (Union[str, ObjectId]): Id of parent subset.
data (Dict[str, Any]): Version document data.
entity_id (Union[str, ObjectId]): Predefined id of document. New id is
created if not passed.
Returns:
Dict[str, Any]: Skeleton of version document.
"""
if data is None:
data = {}
return {
"_id": _create_or_convert_to_mongo_id(entity_id),
"schema": CURRENT_VERSION_SCHEMA,
"type": "version",
"name": int(version),
"parent": subset_id,
"data": data
}
def new_hero_version_doc(version_id, subset_id, data=None, entity_id=None):
"""Create skeleton data of hero version document.
Args:
        version_id (Union[str, ObjectId]): Id of the source version from
            which the hero version is created.
subset_id (Union[str, ObjectId]): Id of parent subset.
data (Dict[str, Any]): Version document data.
entity_id (Union[str, ObjectId]): Predefined id of document. New id is
created if not passed.
Returns:
        Dict[str, Any]: Skeleton of hero version document.
"""
if data is None:
data = {}
return {
"_id": _create_or_convert_to_mongo_id(entity_id),
"schema": CURRENT_HERO_VERSION_SCHEMA,
"type": "hero_version",
"version_id": version_id,
"parent": subset_id,
"data": data
}
def new_representation_doc(
name, version_id, context, data=None, entity_id=None
):
"""Create skeleton data of asset document.
Args:
version (int): Is considered as unique identifier of version
under subset.
version_id (Union[str, ObjectId]): Id of parent version.
context (Dict[str, Any]): Representation context used for fill template
of to query.
data (Dict[str, Any]): Representation document data.
entity_id (Union[str, ObjectId]): Predefined id of document. New id is
created if not passed.
Returns:
Dict[str, Any]: Skeleton of version document.
"""
if data is None:
data = {}
return {
"_id": _create_or_convert_to_mongo_id(entity_id),
"schema": CURRENT_REPRESENTATION_SCHEMA,
"type": "representation",
"parent": version_id,
"name": name,
"data": data,
# Imprint shortcut to context for performance reasons.
"context": context
}
def new_thumbnail_doc(data=None, entity_id=None):
"""Create skeleton data of thumbnail document.
Args:
data (Dict[str, Any]): Thumbnail document data.
entity_id (Union[str, ObjectId]): Predefined id of document. New id is
created if not passed.
Returns:
Dict[str, Any]: Skeleton of thumbnail document.
"""
if data is None:
data = {}
return {
"_id": _create_or_convert_to_mongo_id(entity_id),
"type": "thumbnail",
"schema": CURRENT_THUMBNAIL_SCHEMA,
"data": data
}
def new_workfile_info_doc(
filename, asset_id, task_name, files, data=None, entity_id=None
):
"""Create skeleton data of workfile info document.
Workfile document is at this moment used primarily for artist notes.
Args:
filename (str): Filename of workfile.
        asset_id (Union[str, ObjectId]): Id of asset under which the
            workfile lives.
        task_name (str): Task under which the workfile was created.
        files (List[str]): List of rootless filepaths related to workfile.
        data (Dict[str, Any]): Additional metadata.
        entity_id (Union[str, ObjectId]): Predefined id of document. New id
            is created if not passed.
Returns:
Dict[str, Any]: Skeleton of workfile info document.
"""
if not data:
data = {}
return {
"_id": _create_or_convert_to_mongo_id(entity_id),
"type": "workfile",
"parent": ObjectId(asset_id),
"task_name": task_name,
"filename": filename,
"data": data,
"files": files
}
def _prepare_update_data(old_doc, new_doc, replace):
changes = {}
for key, value in new_doc.items():
if key not in old_doc or value != old_doc[key]:
changes[key] = value
if replace:
for key in old_doc.keys():
if key not in new_doc:
changes[key] = REMOVED_VALUE
return changes
def prepare_subset_update_data(old_doc, new_doc, replace=True):
"""Compare two subset documents and prepare update data.
    Based on the compared values, creates update data for 'UpdateOperation'.
Empty output means that documents are identical.
Returns:
Dict[str, Any]: Changes between old and new document.
"""
return _prepare_update_data(old_doc, new_doc, replace)
def prepare_version_update_data(old_doc, new_doc, replace=True):
"""Compare two version documents and prepare update data.
    Based on the compared values, creates update data for 'UpdateOperation'.
Empty output means that documents are identical.
Returns:
Dict[str, Any]: Changes between old and new document.
"""
return _prepare_update_data(old_doc, new_doc, replace)
def prepare_hero_version_update_data(old_doc, new_doc, replace=True):
"""Compare two hero version documents and prepare update data.
    Based on the compared values, creates update data for 'UpdateOperation'.
Empty output means that documents are identical.
Returns:
Dict[str, Any]: Changes between old and new document.
"""
return _prepare_update_data(old_doc, new_doc, replace)
def prepare_representation_update_data(old_doc, new_doc, replace=True):
"""Compare two representation documents and prepare update data.
    Based on the compared values, creates update data for 'UpdateOperation'.
Empty output means that documents are identical.
Returns:
Dict[str, Any]: Changes between old and new document.
"""
return _prepare_update_data(old_doc, new_doc, replace)
def prepare_workfile_info_update_data(old_doc, new_doc, replace=True):
"""Compare two workfile info documents and prepare update data.
    Based on the compared values, creates update data for 'UpdateOperation'.
Empty output means that documents are identical.
Returns:
Dict[str, Any]: Changes between old and new document.
"""
return _prepare_update_data(old_doc, new_doc, replace)
@six.add_metaclass(ABCMeta)
class AbstractOperation(object):
"""Base operation class.
    Operation represents a call into the database. The call can create,
    change or remove data.
Args:
project_name (str): On which project operation will happen.
entity_type (str): Type of entity on which change happens.
e.g. 'asset', 'representation' etc.
"""
def __init__(self, project_name, entity_type):
self._project_name = project_name
self._entity_type = entity_type
self._id = str(uuid.uuid4())
@property
def project_name(self):
return self._project_name
@property
def id(self):
"""Identifier of operation."""
return self._id
@property
def entity_type(self):
return self._entity_type
@abstractproperty
def operation_name(self):
"""Stringified type of operation."""
pass
@abstractmethod
def to_mongo_operation(self):
"""Convert operation to Mongo batch operation."""
pass
def to_data(self):
"""Convert operation to data that can be converted to json or others.
Warning:
Current state returns ObjectId objects which cannot be parsed by
json.
Returns:
Dict[str, Any]: Description of operation.
"""
return {
"id": self._id,
"entity_type": self.entity_type,
"project_name": self.project_name,
"operation": self.operation_name
}
class CreateOperation(AbstractOperation):
"""Operation to create an entity.
Args:
project_name (str): On which project operation will happen.
entity_type (str): Type of entity on which change happens.
e.g. 'asset', 'representation' etc.
data (Dict[str, Any]): Data of entity that will be created.
"""
operation_name = "create"
def __init__(self, project_name, entity_type, data):
super(CreateOperation, self).__init__(project_name, entity_type)
if not data:
data = {}
else:
data = copy.deepcopy(dict(data))
if "_id" not in data:
data["_id"] = ObjectId()
else:
data["_id"] = ObjectId(data["_id"])
self._entity_id = data["_id"]
self._data = data
def __setitem__(self, key, value):
self.set_value(key, value)
def __getitem__(self, key):
return self.data[key]
def set_value(self, key, value):
self.data[key] = value
def get(self, key, *args, **kwargs):
return self.data.get(key, *args, **kwargs)
@property
def entity_id(self):
return self._entity_id
@property
def data(self):
return self._data
def to_mongo_operation(self):
return InsertOne(copy.deepcopy(self._data))
def to_data(self):
output = super(CreateOperation, self).to_data()
output["data"] = copy.deepcopy(self.data)
return output
class UpdateOperation(AbstractOperation):
"""Operation to update an entity.
Args:
project_name (str): On which project operation will happen.
entity_type (str): Type of entity on which change happens.
e.g. 'asset', 'representation' etc.
entity_id (Union[str, ObjectId]): Identifier of an entity.
update_data (Dict[str, Any]): Key -> value changes that will be set in
database. If value is set to 'REMOVED_VALUE' the key will be
removed. Only first level of dictionary is checked (on purpose).
"""
operation_name = "update"
def __init__(self, project_name, entity_type, entity_id, update_data):
super(UpdateOperation, self).__init__(project_name, entity_type)
self._entity_id = ObjectId(entity_id)
self._update_data = update_data
@property
def entity_id(self):
return self._entity_id
@property
def update_data(self):
return self._update_data
def to_mongo_operation(self):
unset_data = {}
set_data = {}
for key, value in self._update_data.items():
if value is REMOVED_VALUE:
unset_data[key] = None
else:
set_data[key] = value
op_data = {}
if unset_data:
op_data["$unset"] = unset_data
if set_data:
op_data["$set"] = set_data
if not op_data:
return None
return UpdateOne(
{"_id": self.entity_id},
op_data
        )
def to_data(self):
changes = {}
for key, value in self._update_data.items():
if value is REMOVED_VALUE:
value = None
changes[key] = value
output = super(UpdateOperation, self).to_data()
output.update({
"entity_id": self.entity_id,
"changes": changes
})
return output
class DeleteOperation(AbstractOperation):
"""Operation to delete an entity.
Args:
project_name (str): On which project operation will happen.
entity_type (str): Type of entity on which change happens.
e.g. 'asset', 'representation' etc.
entity_id (Union[str, ObjectId]): Entity id that will be removed.
"""
operation_name = "delete"
def __init__(self, project_name, entity_type, entity_id):
super(DeleteOperation, self).__init__(project_name, entity_type)
self._entity_id = ObjectId(entity_id)
@property
def entity_id(self):
return self._entity_id
def to_mongo_operation(self):
return DeleteOne({"_id": self.entity_id})
def to_data(self):
output = super(DeleteOperation, self).to_data()
output["entity_id"] = self.entity_id
return output
class OperationsSession(object):
"""Session storing operations that should happen in an order.
    At this moment the session does not handle anything special and can be
    considered a plain list of operations that will happen one after
    another. If creation of the same entity is added multiple times it is
    not handled in any special way and document values are not validated.
    Operations may target multiple projects; on commit they are grouped
    and written per project.
    """
def __init__(self):
self._operations = []
def add(self, operation):
"""Add operation to be processed.
Args:
operation (BaseOperation): Operation that should be processed.
"""
if not isinstance(
operation,
(CreateOperation, UpdateOperation, DeleteOperation)
):
raise TypeError("Expected Operation object got {}".format(
str(type(operation))
))
self._operations.append(operation)
def append(self, operation):
"""Add operation to be processed.
Args:
operation (BaseOperation): Operation that should be processed.
"""
self.add(operation)
def extend(self, operations):
"""Add operations to be processed.
Args:
operations (List[BaseOperation]): Operations that should be
processed.
"""
for operation in operations:
self.add(operation)
def remove(self, operation):
"""Remove operation."""
self._operations.remove(operation)
def clear(self):
"""Clear all registered operations."""
self._operations = []
def to_data(self):
return [
operation.to_data()
for operation in self._operations
]
def commit(self):
"""Commit session operations."""
operations, self._operations = self._operations, []
if not operations:
return
operations_by_project = collections.defaultdict(list)
for operation in operations:
operations_by_project[operation.project_name].append(operation)
for project_name, operations in operations_by_project.items():
bulk_writes = []
for operation in operations:
mongo_op = operation.to_mongo_operation()
if mongo_op is not None:
bulk_writes.append(mongo_op)
if bulk_writes:
collection = get_project_connection(project_name)
collection.bulk_write(bulk_writes)
def create_entity(self, project_name, entity_type, data):
"""Fast access to 'CreateOperation'.
Returns:
            CreateOperation: Object of create operation.
"""
operation = CreateOperation(project_name, entity_type, data)
self.add(operation)
return operation
def update_entity(self, project_name, entity_type, entity_id, update_data):
"""Fast access to 'UpdateOperation'.
Returns:
UpdateOperation: Object of update operation.
"""
operation = UpdateOperation(
project_name, entity_type, entity_id, update_data
)
self.add(operation)
return operation
def delete_entity(self, project_name, entity_type, entity_id):
"""Fast access to 'DeleteOperation'.
Returns:
DeleteOperation: Object of delete operation.
"""
operation = DeleteOperation(project_name, entity_type, entity_id)
self.add(operation)
return operation
def create_project(
project_name,
project_code,
library_project=False,
):
"""Create project using OpenPype settings.
    This project creation function does not validate the project document
    on creation, because the document is created blindly with only the
    minimum required information about the project: its name, code, type
    and schema.
    Entered project name must be unique and project must not exist yet.
    Note:
        This function is written to be OP v4 ready, but in v3 it has more
        logic to perform. That is why the imports are inside the function
        body.
    Args:
        project_name (str): New project name. Must be unique.
        project_code (str): Project's code. Should be unique too.
        library_project (bool): Whether the project is a library project.
Raises:
ValueError: When project name already exists in MongoDB.
Returns:
dict: Created project document.
"""
from openpype.settings import ProjectSettings, SaveWarningExc
from openpype.pipeline.schema import validate
if get_project(project_name, fields=["name"]):
raise ValueError("Project with name \"{}\" already exists".format(
project_name
))
if not PROJECT_NAME_REGEX.match(project_name):
raise ValueError((
"Project name \"{}\" contain invalid characters"
).format(project_name))
project_doc = {
"type": "project",
"name": project_name,
"data": {
"code": project_code,
"library_project": library_project,
},
"schema": CURRENT_PROJECT_SCHEMA
}
op_session = OperationsSession()
# Insert document with basic data
create_op = op_session.create_entity(
project_name, project_doc["type"], project_doc
)
op_session.commit()
# Load ProjectSettings for the project and save it to store all attributes
# and Anatomy
try:
project_settings_entity = ProjectSettings(project_name)
project_settings_entity.save()
except SaveWarningExc as exc:
print(str(exc))
except Exception:
op_session.delete_entity(
project_name, project_doc["type"], create_op.entity_id
)
op_session.commit()
raise
project_doc = get_project(project_name)
try:
# Validate created project document
validate(project_doc)
except Exception:
# Remove project if is not valid
op_session.delete_entity(
project_name, project_doc["type"], create_op.entity_id
)
op_session.commit()
raise
return project_doc


@@ -0,0 +1,289 @@
import uuid
import copy
from abc import ABCMeta, abstractmethod, abstractproperty
import six
REMOVED_VALUE = object()
@six.add_metaclass(ABCMeta)
class AbstractOperation(object):
"""Base operation class.
    Operation represents a call into the database. The call can create,
    change or remove data.
Args:
project_name (str): On which project operation will happen.
entity_type (str): Type of entity on which change happens.
e.g. 'asset', 'representation' etc.
"""
def __init__(self, project_name, entity_type):
self._project_name = project_name
self._entity_type = entity_type
self._id = str(uuid.uuid4())
@property
def project_name(self):
return self._project_name
@property
def id(self):
"""Identifier of operation."""
return self._id
@property
def entity_type(self):
return self._entity_type
@abstractproperty
def operation_name(self):
"""Stringified type of operation."""
pass
def to_data(self):
"""Convert operation to data that can be converted to json or others.
Warning:
Current state returns ObjectId objects which cannot be parsed by
json.
Returns:
Dict[str, Any]: Description of operation.
"""
return {
"id": self._id,
"entity_type": self.entity_type,
"project_name": self.project_name,
"operation": self.operation_name
}
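`to_data` above serializes only the operation's identity into a plain dict; subclasses extend it with their payloads. A minimal concrete operation (without the six/ABCMeta machinery) shows the shape:

```python
import uuid


class TinyOperation:
    """Bare-bones operation mirroring AbstractOperation.to_data()."""

    operation_name = "noop"

    def __init__(self, project_name, entity_type):
        self._project_name = project_name
        self._entity_type = entity_type
        # Each operation gets a unique identifier on construction.
        self._id = str(uuid.uuid4())

    def to_data(self):
        return {
            "id": self._id,
            "entity_type": self._entity_type,
            "project_name": self._project_name,
            "operation": self.operation_name,
        }


op = TinyOperation("projA", "asset")
data = op.to_data()
```

Since `_id` is a uuid string rather than an ObjectId, this base payload is json-serializable; the warning in the docstring applies to the entity ids subclasses add on top.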
class CreateOperation(AbstractOperation):
"""Operation to create an entity.
Args:
project_name (str): On which project operation will happen.
entity_type (str): Type of entity on which change happens.
e.g. 'asset', 'representation' etc.
data (Dict[str, Any]): Data of entity that will be created.
"""
operation_name = "create"
def __init__(self, project_name, entity_type, data):
super(CreateOperation, self).__init__(project_name, entity_type)
if not data:
data = {}
else:
data = copy.deepcopy(dict(data))
self._data = data
def __setitem__(self, key, value):
self.set_value(key, value)
def __getitem__(self, key):
return self.data[key]
def set_value(self, key, value):
self.data[key] = value
def get(self, key, *args, **kwargs):
return self.data.get(key, *args, **kwargs)
@abstractproperty
def entity_id(self):
pass
@property
def data(self):
return self._data
def to_data(self):
output = super(CreateOperation, self).to_data()
output["data"] = copy.deepcopy(self.data)
return output
class UpdateOperation(AbstractOperation):
"""Operation to update an entity.
Args:
project_name (str): On which project operation will happen.
entity_type (str): Type of entity on which change happens.
e.g. 'asset', 'representation' etc.
entity_id (Union[str, ObjectId]): Identifier of an entity.
update_data (Dict[str, Any]): Key -> value changes that will be set in
database. If value is set to 'REMOVED_VALUE' the key will be
removed. Only first level of dictionary is checked (on purpose).
"""
operation_name = "update"
def __init__(self, project_name, entity_type, entity_id, update_data):
super(UpdateOperation, self).__init__(project_name, entity_type)
self._entity_id = entity_id
self._update_data = update_data
@property
def entity_id(self):
return self._entity_id
@property
def update_data(self):
return self._update_data
def to_data(self):
changes = {}
for key, value in self._update_data.items():
if value is REMOVED_VALUE:
value = None
changes[key] = value
output = super(UpdateOperation, self).to_data()
output.update({
"entity_id": self.entity_id,
"changes": changes
})
return output
class DeleteOperation(AbstractOperation):
"""Operation to delete an entity.
Args:
project_name (str): On which project operation will happen.
entity_type (str): Type of entity on which change happens.
e.g. 'asset', 'representation' etc.
entity_id (Union[str, ObjectId]): Entity id that will be removed.
"""
operation_name = "delete"
def __init__(self, project_name, entity_type, entity_id):
super(DeleteOperation, self).__init__(project_name, entity_type)
self._entity_id = entity_id
@property
def entity_id(self):
return self._entity_id
def to_data(self):
output = super(DeleteOperation, self).to_data()
output["entity_id"] = self.entity_id
return output
class BaseOperationsSession(object):
"""Session storing operations that should happen in an order.
At this moment does not handle anything special can be considered as
stupid list of operations that will happen after each other. If creation
of same entity is there multiple times it's handled in any way and document
values are not validated.
"""
def __init__(self):
self._operations = []
def __len__(self):
return len(self._operations)
def add(self, operation):
"""Add operation to be processed.
Args:
operation (BaseOperation): Operation that should be processed.
"""
if not isinstance(
operation,
(CreateOperation, UpdateOperation, DeleteOperation)
):
raise TypeError("Expected Operation object got {}".format(
str(type(operation))
))
self._operations.append(operation)
def append(self, operation):
"""Add operation to be processed.
Args:
operation (BaseOperation): Operation that should be processed.
"""
self.add(operation)
def extend(self, operations):
"""Add operations to be processed.
Args:
operations (List[BaseOperation]): Operations that should be
processed.
"""
for operation in operations:
self.add(operation)
def remove(self, operation):
"""Remove operation."""
self._operations.remove(operation)
def clear(self):
"""Clear all registered operations."""
self._operations = []
def to_data(self):
return [
operation.to_data()
for operation in self._operations
]
@abstractmethod
def commit(self):
"""Commit session operations."""
pass
def create_entity(self, project_name, entity_type, data):
"""Fast access to 'CreateOperation'.
Returns:
CreateOperation: Object of update operation.
"""
operation = CreateOperation(project_name, entity_type, data)
self.add(operation)
return operation
def update_entity(self, project_name, entity_type, entity_id, update_data):
"""Fast access to 'UpdateOperation'.
Returns:
UpdateOperation: Object of update operation.
"""
operation = UpdateOperation(
project_name, entity_type, entity_id, update_data
)
self.add(operation)
return operation
def delete_entity(self, project_name, entity_type, entity_id):
"""Fast access to 'DeleteOperation'.
Returns:
DeleteOperation: Object of delete operation.
"""
operation = DeleteOperation(project_name, entity_type, entity_id)
self.add(operation)
return operation


@ -0,0 +1,83 @@
# --- Project ---
DEFAULT_PROJECT_FIELDS = {
"active",
"name",
"code",
"config",
"data",
"createdAt",
}
# --- Folders ---
DEFAULT_FOLDER_FIELDS = {
"id",
"name",
"path",
"parentId",
"active",
"parents",
"thumbnailId"
}
# --- Tasks ---
DEFAULT_TASK_FIELDS = {
"id",
"name",
"taskType",
"assignees",
}
# --- Subsets ---
DEFAULT_SUBSET_FIELDS = {
"id",
"name",
"active",
"family",
"folderId",
}
# --- Versions ---
DEFAULT_VERSION_FIELDS = {
"id",
"name",
"version",
"active",
"subsetId",
"taskId",
"author",
"thumbnailId",
"createdAt",
"updatedAt",
}
# --- Representations ---
DEFAULT_REPRESENTATION_FIELDS = {
"id",
"name",
"context",
"createdAt",
"active",
"versionId",
}
REPRESENTATION_FILES_FIELDS = {
"files.name",
"files.hash",
"files.id",
"files.path",
"files.size",
}
DEFAULT_WORKFILE_INFO_FIELDS = {
"active",
"createdAt",
"createdBy",
"id",
"name",
"path",
"projectName",
"taskId",
"thumbnailId",
"updatedAt",
"updatedBy",
}

File diff suppressed because it is too large


@ -0,0 +1,655 @@
import collections
from ayon_api import get_server_api_connection
from openpype.client.mongo.operations import CURRENT_THUMBNAIL_SCHEMA
from .openpype_comp import get_folders_with_tasks
from .conversion_utils import (
project_fields_v3_to_v4,
convert_v4_project_to_v3,
folder_fields_v3_to_v4,
convert_v4_folder_to_v3,
subset_fields_v3_to_v4,
convert_v4_subset_to_v3,
version_fields_v3_to_v4,
convert_v4_version_to_v3,
representation_fields_v3_to_v4,
convert_v4_representation_to_v3,
workfile_info_fields_v3_to_v4,
convert_v4_workfile_info_to_v3,
)
def get_projects(active=True, inactive=False, library=None, fields=None):
if not active and not inactive:
return
if active and inactive:
active = None
elif active:
active = True
elif inactive:
active = False
con = get_server_api_connection()
fields = project_fields_v3_to_v4(fields, con)
for project in con.get_projects(active, library, fields=fields):
yield convert_v4_project_to_v3(project)
def get_project(project_name, active=True, inactive=False, fields=None):
# 'active' and 'inactive' filters are not applied on project query
con = get_server_api_connection()
fields = project_fields_v3_to_v4(fields, con)
return convert_v4_project_to_v3(
con.get_project(project_name, fields=fields)
)
def get_whole_project(*args, **kwargs):
raise NotImplementedError("'get_whole_project' not implemented")
def _get_subsets(
project_name,
subset_ids=None,
subset_names=None,
folder_ids=None,
names_by_folder_ids=None,
archived=False,
fields=None
):
# Convert fields and add minimum required fields
con = get_server_api_connection()
fields = subset_fields_v3_to_v4(fields, con)
if fields is not None:
for key in (
"id",
"active"
):
fields.add(key)
active = None
if archived:
active = False
for subset in con.get_subsets(
project_name,
subset_ids,
subset_names,
folder_ids,
names_by_folder_ids,
active,
fields
):
yield convert_v4_subset_to_v3(subset)
def _get_versions(
project_name,
version_ids=None,
subset_ids=None,
versions=None,
hero=True,
standard=True,
latest=None,
fields=None
):
con = get_server_api_connection()
fields = version_fields_v3_to_v4(fields, con)
# Make sure 'subsetId' and 'version' are available when hero versions
# are queried
if fields and hero:
fields = set(fields)
fields |= {"subsetId", "version"}
queried_versions = con.get_versions(
project_name,
version_ids,
subset_ids,
versions,
hero,
standard,
latest,
fields=fields
)
versions = []
hero_versions = []
for version in queried_versions:
if version["version"] < 0:
hero_versions.append(version)
else:
versions.append(convert_v4_version_to_v3(version))
if hero_versions:
subset_ids = set()
versions_nums = set()
for hero_version in hero_versions:
versions_nums.add(abs(hero_version["version"]))
subset_ids.add(hero_version["subsetId"])
hero_eq_versions = con.get_versions(
project_name,
subset_ids=subset_ids,
versions=versions_nums,
hero=False,
fields=["id", "version", "subsetId"]
)
hero_eq_by_subset_id = collections.defaultdict(list)
for version in hero_eq_versions:
hero_eq_by_subset_id[version["subsetId"]].append(version)
for hero_version in hero_versions:
abs_version = abs(hero_version["version"])
subset_id = hero_version["subsetId"]
version_id = None
for version in hero_eq_by_subset_id.get(subset_id, []):
if version["version"] == abs_version:
version_id = version["id"]
break
conv_hero = convert_v4_version_to_v3(hero_version)
conv_hero["version_id"] = version_id
versions.append(conv_hero)
return versions
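The hero-version pairing done in `_get_versions` above can be sketched as a self-contained helper (illustrative names; the real logic works on queried server entities):

```python
import collections

def match_hero_to_standard(hero_versions, standard_versions):
    # A hero version stores the negative of its source version number,
    # so its source is the standard version with the same absolute
    # number under the same subset.
    by_subset = collections.defaultdict(dict)
    for version in standard_versions:
        by_subset[version["subsetId"]][version["version"]] = version["id"]
    return {
        hero["id"]: by_subset[hero["subsetId"]].get(abs(hero["version"]))
        for hero in hero_versions
    }
```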
def get_asset_by_id(project_name, asset_id, fields=None):
assets = get_assets(
project_name, asset_ids=[asset_id], fields=fields
)
for asset in assets:
return asset
return None
def get_asset_by_name(project_name, asset_name, fields=None):
assets = get_assets(
project_name, asset_names=[asset_name], fields=fields
)
for asset in assets:
return asset
return None
def get_assets(
project_name,
asset_ids=None,
asset_names=None,
parent_ids=None,
archived=False,
fields=None
):
if not project_name:
return
active = True
if archived:
active = False
con = get_server_api_connection()
fields = folder_fields_v3_to_v4(fields, con)
kwargs = dict(
folder_ids=asset_ids,
folder_names=asset_names,
parent_ids=parent_ids,
active=active,
fields=fields
)
if fields is None or "tasks" in fields:
folders = get_folders_with_tasks(con, project_name, **kwargs)
else:
folders = con.get_folders(project_name, **kwargs)
for folder in folders:
yield convert_v4_folder_to_v3(folder, project_name)
def get_archived_assets(*args, **kwargs):
raise NotImplementedError("'get_archived_assets' not implemented")
def get_asset_ids_with_subsets(project_name, asset_ids=None):
con = get_server_api_connection()
return con.get_asset_ids_with_subsets(project_name, asset_ids)
def get_subset_by_id(project_name, subset_id, fields=None):
subsets = get_subsets(
project_name, subset_ids=[subset_id], fields=fields
)
for subset in subsets:
return subset
return None
def get_subset_by_name(project_name, subset_name, asset_id, fields=None):
subsets = get_subsets(
project_name,
subset_names=[subset_name],
asset_ids=[asset_id],
fields=fields
)
for subset in subsets:
return subset
return None
def get_subsets(
project_name,
subset_ids=None,
subset_names=None,
asset_ids=None,
names_by_asset_ids=None,
archived=False,
fields=None
):
return _get_subsets(
project_name,
subset_ids,
subset_names,
asset_ids,
names_by_asset_ids,
archived,
fields=fields
)
def get_subset_families(project_name, subset_ids=None):
con = get_server_api_connection()
return con.get_subset_families(project_name, subset_ids)
def get_version_by_id(project_name, version_id, fields=None):
versions = get_versions(
project_name,
version_ids=[version_id],
fields=fields,
hero=True
)
for version in versions:
return version
return None
def get_version_by_name(project_name, version, subset_id, fields=None):
versions = get_versions(
project_name,
subset_ids=[subset_id],
versions=[version],
fields=fields
)
for version in versions:
return version
return None
def get_versions(
project_name,
version_ids=None,
subset_ids=None,
versions=None,
hero=False,
fields=None
):
return _get_versions(
project_name,
version_ids,
subset_ids,
versions,
hero=hero,
standard=True,
fields=fields
)
def get_hero_version_by_id(project_name, version_id, fields=None):
versions = get_hero_versions(
project_name,
version_ids=[version_id],
fields=fields
)
for version in versions:
return version
return None
def get_hero_version_by_subset_id(
project_name, subset_id, fields=None
):
versions = get_hero_versions(
project_name,
subset_ids=[subset_id],
fields=fields
)
for version in versions:
return version
return None
def get_hero_versions(
project_name, subset_ids=None, version_ids=None, fields=None
):
return _get_versions(
project_name,
version_ids=version_ids,
subset_ids=subset_ids,
hero=True,
standard=False,
fields=fields
)
def get_last_versions(project_name, subset_ids, fields=None):
if fields:
fields = set(fields)
fields.add("parent")
versions = _get_versions(
project_name,
subset_ids=subset_ids,
latest=True,
hero=False,
fields=fields
)
return {
version["parent"]: version
for version in versions
}
def get_last_version_by_subset_id(project_name, subset_id, fields=None):
versions = _get_versions(
project_name,
subset_ids=[subset_id],
latest=True,
hero=False,
fields=fields
)
if versions:
return versions[0]
return None
def get_last_version_by_subset_name(
project_name,
subset_name,
asset_id=None,
asset_name=None,
fields=None
):
if not asset_id and not asset_name:
return None
if not asset_id:
asset = get_asset_by_name(
project_name, asset_name, fields=["_id"]
)
if not asset:
return None
asset_id = asset["_id"]
subset = get_subset_by_name(
project_name, subset_name, asset_id, fields=["_id"]
)
if not subset:
return None
return get_last_version_by_subset_id(
project_name, subset["id"], fields=fields
)
def get_output_link_versions(*args, **kwargs):
raise NotImplementedError("'get_output_link_versions' not implemented")
def version_is_latest(project_name, version_id):
con = get_server_api_connection()
return con.version_is_latest(project_name, version_id)
def get_representation_by_id(project_name, representation_id, fields=None):
representations = get_representations(
project_name,
representation_ids=[representation_id],
fields=fields
)
for representation in representations:
return representation
return None
def get_representation_by_name(
project_name, representation_name, version_id, fields=None
):
representations = get_representations(
project_name,
representation_names=[representation_name],
version_ids=[version_id],
fields=fields
)
for representation in representations:
return representation
return None
def get_representations(
project_name,
representation_ids=None,
representation_names=None,
version_ids=None,
context_filters=None,
names_by_version_ids=None,
archived=False,
standard=True,
fields=None
):
if context_filters is not None:
# TODO should we add the support?
# - there was an ability to filter using regex
raise ValueError("OP v4 can't filter by representation context.")
if not archived and not standard:
return
if archived and not standard:
active = False
elif not archived and standard:
active = True
else:
active = None
con = get_server_api_connection()
fields = representation_fields_v3_to_v4(fields, con)
if fields and active is not None:
fields.add("active")
representations = con.get_representations(
project_name,
representation_ids,
representation_names,
version_ids,
names_by_version_ids,
active,
fields=fields
)
for representation in representations:
yield convert_v4_representation_to_v3(representation)
def get_representation_parents(project_name, representation):
if not representation:
return None
repre_id = representation["_id"]
parents_by_repre_id = get_representations_parents(
project_name, [representation]
)
return parents_by_repre_id[repre_id]
def get_representations_parents(project_name, representations):
repre_ids = {
repre["_id"]
for repre in representations
}
con = get_server_api_connection()
parents_by_repre_id = con.get_representations_parents(
project_name, repre_ids
)
folder_ids = set()
for parents in parents_by_repre_id.values():
folder_ids.add(parents[2]["id"])
tasks_by_folder_id = {}
new_parents = {}
for repre_id, parents in parents_by_repre_id.items():
version, subset, folder, project = parents
folder_tasks = tasks_by_folder_id.get(folder["id"]) or {}
folder["tasks"] = folder_tasks
new_parents[repre_id] = (
convert_v4_version_to_v3(version),
convert_v4_subset_to_v3(subset),
convert_v4_folder_to_v3(folder, project_name),
project
)
return new_parents
def get_archived_representations(
project_name,
representation_ids=None,
representation_names=None,
version_ids=None,
context_filters=None,
names_by_version_ids=None,
fields=None
):
return get_representations(
project_name,
representation_ids=representation_ids,
representation_names=representation_names,
version_ids=version_ids,
context_filters=context_filters,
names_by_version_ids=names_by_version_ids,
archived=True,
standard=False,
fields=fields
)
def get_thumbnail(
project_name, thumbnail_id, entity_type, entity_id, fields=None
):
"""Receive thumbnail entity data.
Args:
project_name (str): Name of project where to look for queried entities.
thumbnail_id (Union[str, ObjectId]): Id of thumbnail entity.
entity_type (str): Type of entity for which the thumbnail should be
received.
entity_id (str): Id of entity for which the thumbnail should be
received.
fields (Iterable[str]): Fields that should be returned. All fields are
returned if 'None' is passed.
Returns:
None: If thumbnail with specified id was not found.
Dict: Thumbnail entity data which can be reduced to specified 'fields'.
"""
if not thumbnail_id or not entity_type or not entity_id:
return None
if entity_type == "asset":
entity_type = "folder"
elif entity_type == "hero_version":
entity_type = "version"
return {
"_id": thumbnail_id,
"type": "thumbnail",
"schema": CURRENT_THUMBNAIL_SCHEMA,
"data": {
"entity_type": entity_type,
"entity_id": entity_id
}
}
def get_thumbnails(project_name, thumbnail_contexts, fields=None):
# Thumbnail items are dictionaries, which are not hashable,
# so collect them in a list instead of a set
thumbnail_items = []
for thumbnail_context in thumbnail_contexts:
thumbnail_id, entity_type, entity_id = thumbnail_context
thumbnail_item = get_thumbnail(
project_name, thumbnail_id, entity_type, entity_id
)
if thumbnail_item:
thumbnail_items.append(thumbnail_item)
return thumbnail_items
def get_thumbnail_id_from_source(project_name, src_type, src_id):
"""Receive thumbnail id from source entity.
Args:
project_name (str): Name of project where to look for queried entities.
src_type (str): Type of source entity ('asset', 'version').
src_id (Union[str, ObjectId]): Id of source entity.
Returns:
ObjectId: Thumbnail id assigned to entity.
None: If Source entity does not have any thumbnail id assigned.
"""
if not src_type or not src_id:
return None
if src_type == "version":
version = get_version_by_id(
project_name, src_id, fields=["data.thumbnail_id"]
) or {}
return version.get("data", {}).get("thumbnail_id")
if src_type == "asset":
asset = get_asset_by_id(
project_name, src_id, fields=["data.thumbnail_id"]
) or {}
return asset.get("data", {}).get("thumbnail_id")
return None
def get_workfile_info(
project_name, asset_id, task_name, filename, fields=None
):
if not asset_id or not task_name or not filename:
return None
con = get_server_api_connection()
task = con.get_task_by_name(
project_name, asset_id, task_name, fields=["id", "name", "folderId"]
)
if not task:
return None
fields = workfile_info_fields_v3_to_v4(fields)
for workfile_info in con.get_workfiles_info(
project_name, task_ids=[task["id"]], fields=fields
):
if workfile_info["name"] == filename:
return convert_v4_workfile_info_to_v3(workfile_info, task)
return None


@ -0,0 +1,65 @@
def get_linked_asset_ids(project_name, asset_doc=None, asset_id=None):
"""Extract linked asset ids from asset document.
One of asset document or asset id must be passed.
Note:
Asset links currently work only from asset to assets.
Args:
project_name (str): Project where to look for asset.
asset_doc (dict): Asset document from DB.
asset_id (str): Asset id to find its document.
Returns:
List[Union[ObjectId, str]]: Asset ids of input links.
"""
return []
def get_linked_assets(
project_name, asset_doc=None, asset_id=None, fields=None
):
"""Return linked assets based on passed asset document.
One of asset document or asset id must be passed.
Args:
project_name (str): Name of project where to look for queried entities.
asset_doc (Dict[str, Any]): Asset document from database.
asset_id (Union[ObjectId, str]): Asset id. Can be used instead of
asset document.
fields (Iterable[str]): Fields that should be returned. All fields are
returned if 'None' is passed.
Returns:
List[Dict[str, Any]]: Asset documents of input links for passed
asset doc.
"""
return []
def get_linked_representation_id(
project_name, repre_doc=None, repre_id=None, link_type=None, max_depth=None
):
"""Returns list of linked ids of particular type (if provided).
One of representation document or representation id must be passed.
Note:
Representation links currently work only from representation through
version back to representations.
Args:
project_name (str): Name of project where look for links.
repre_doc (Dict[str, Any]): Representation document.
repre_id (Union[ObjectId, str]): Representation id.
link_type (str): Type of link (e.g. 'reference', ...).
max_depth (int): Limit recursion level. Default: 0
Returns:
List[ObjectId]: Linked representation ids.
"""
return []


@ -0,0 +1,156 @@
import collections
from ayon_api.graphql import GraphQlQuery, FIELD_VALUE, fields_to_dict
from .constants import DEFAULT_FOLDER_FIELDS
def folders_tasks_graphql_query(fields):
query = GraphQlQuery("FoldersQuery")
project_name_var = query.add_variable("projectName", "String!")
folder_ids_var = query.add_variable("folderIds", "[String!]")
parent_folder_ids_var = query.add_variable("parentFolderIds", "[String!]")
folder_paths_var = query.add_variable("folderPaths", "[String!]")
folder_names_var = query.add_variable("folderNames", "[String!]")
has_subsets_var = query.add_variable("folderHasSubsets", "Boolean!")
project_field = query.add_field("project")
project_field.set_filter("name", project_name_var)
folders_field = project_field.add_field("folders", has_edges=True)
folders_field.set_filter("ids", folder_ids_var)
folders_field.set_filter("parentIds", parent_folder_ids_var)
folders_field.set_filter("names", folder_names_var)
folders_field.set_filter("paths", folder_paths_var)
folders_field.set_filter("hasSubsets", has_subsets_var)
fields = set(fields)
fields.discard("tasks")
tasks_field = folders_field.add_field("tasks", has_edges=True)
tasks_field.add_field("name")
tasks_field.add_field("taskType")
nested_fields = fields_to_dict(fields)
query_queue = collections.deque()
for key, value in nested_fields.items():
query_queue.append((key, value, folders_field))
while query_queue:
item = query_queue.popleft()
key, value, parent = item
field = parent.add_field(key)
if value is FIELD_VALUE:
continue
for k, v in value.items():
query_queue.append((k, v, field))
return query
def get_folders_with_tasks(
con,
project_name,
folder_ids=None,
folder_paths=None,
folder_names=None,
parent_ids=None,
active=True,
fields=None
):
"""Query folders with tasks from server.
This exists for compatibility with v3, where tasks were stored on assets.
Querying folders together with their tasks this way is inefficient, so it
was added only as a compatibility function.
Todos:
Folder name is not a unique identifier, so folder path filtering
should be added.
Notes:
The 'active' filter has no direct counterpart in GraphQl and is
applied after the query.
Args:
con (ServerAPI): Connection to server.
project_name (str): Name of project where folders are.
folder_ids (Iterable[str]): Folder ids to filter.
folder_paths (Iterable[str]): Folder paths used for filtering.
folder_names (Iterable[str]): Folder names used for filtering.
parent_ids (Iterable[str]): Ids of folder parents. Use 'None'
if folder is direct child of project.
active (Union[bool, None]): Filter active/inactive folders. Both
are returned if is set to None.
fields (Union[Iterable(str), None]): Fields to be queried
for folder. All possible folder fields are returned if 'None'
is passed.
Returns:
List[Dict[str, Any]]: Queried folder entities.
"""
if not project_name:
return []
filters = {
"projectName": project_name
}
if folder_ids is not None:
folder_ids = set(folder_ids)
if not folder_ids:
return []
filters["folderIds"] = list(folder_ids)
if folder_paths is not None:
folder_paths = set(folder_paths)
if not folder_paths:
return []
filters["folderPaths"] = list(folder_paths)
if folder_names is not None:
folder_names = set(folder_names)
if not folder_names:
return []
filters["folderNames"] = list(folder_names)
if parent_ids is not None:
parent_ids = set(parent_ids)
if not parent_ids:
return []
if None in parent_ids:
# Replace 'None' with '"root"' which is used during GraphQl
# query for parent ids filter for folders without folder
# parent
parent_ids.remove(None)
parent_ids.add("root")
if project_name in parent_ids:
# Replace project name with '"root"' which is used during
# GraphQl query for parent ids filter for folders without
# folder parent
parent_ids.remove(project_name)
parent_ids.add("root")
filters["parentFolderIds"] = list(parent_ids)
if fields:
fields = set(fields)
else:
fields = con.get_default_fields_for_type("folder")
fields |= DEFAULT_FOLDER_FIELDS
if active is not None:
fields.add("active")
query = folders_tasks_graphql_query(fields)
for attr, filter_value in filters.items():
query.set_variable_value(attr, filter_value)
parsed_data = query.query(con)
folders = parsed_data["project"]["folders"]
if active is None:
return folders
return [
folder
for folder in folders
if folder["active"] is active
]
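The parent-id handling in `get_folders_with_tasks` above can be sketched as a self-contained helper: both `None` and the project name mean "direct child of project", which the GraphQl filter expresses as the special id `"root"`:

```python
def normalize_parent_ids(parent_ids, project_name):
    # Replace both the None placeholder and the project name with the
    # "root" id used by the GraphQl parent ids filter.
    ids = set(parent_ids)
    if None in ids:
        ids.discard(None)
        ids.add("root")
    if project_name in ids:
        ids.discard(project_name)
        ids.add("root")
    return ids
```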


@ -0,0 +1,863 @@
import copy
import json
import collections
import uuid
import datetime
from bson.objectid import ObjectId
from ayon_api import get_server_api_connection
from openpype.client.operations_base import (
REMOVED_VALUE,
CreateOperation,
UpdateOperation,
DeleteOperation,
BaseOperationsSession
)
from openpype.client.mongo.operations import (
CURRENT_THUMBNAIL_SCHEMA,
CURRENT_REPRESENTATION_SCHEMA,
CURRENT_HERO_VERSION_SCHEMA,
CURRENT_VERSION_SCHEMA,
CURRENT_SUBSET_SCHEMA,
CURRENT_ASSET_DOC_SCHEMA,
CURRENT_PROJECT_SCHEMA,
)
from .conversion_utils import (
convert_create_asset_to_v4,
convert_create_task_to_v4,
convert_create_subset_to_v4,
convert_create_version_to_v4,
convert_create_hero_version_to_v4,
convert_create_representation_to_v4,
convert_create_workfile_info_to_v4,
convert_update_folder_to_v4,
convert_update_subset_to_v4,
convert_update_version_to_v4,
convert_update_hero_version_to_v4,
convert_update_representation_to_v4,
convert_update_workfile_info_to_v4,
)
from .utils import create_entity_id
def _create_or_convert_to_id(entity_id=None):
if entity_id is None:
return create_entity_id()
if isinstance(entity_id, ObjectId):
raise TypeError("Type of 'ObjectId' is not supported anymore.")
# Validate if can be converted to uuid
uuid.UUID(entity_id)
return entity_id
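A self-contained sketch of `_create_or_convert_to_id` above (v4 entity ids are uuid strings; anything that does not parse as a uuid is rejected):

```python
import uuid

def create_or_convert_to_id(entity_id=None):
    # Create a fresh uuid hex string, or validate an existing one.
    if entity_id is None:
        return uuid.uuid4().hex
    uuid.UUID(entity_id)  # raises ValueError for invalid input
    return entity_id
```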
def new_project_document(
project_name, project_code, config, data=None, entity_id=None
):
"""Create skeleton data of project document.
Args:
project_name (str): Name of project. Used as identifier of a project.
project_code (str): Shorter version of the project name without spaces
and special characters (in most cases). Should also be considered
unique across projects.
config (Dict[str, Any]): Project config consisting of roots, templates,
applications and other project Anatomy related data.
data (Dict[str, Any]): Project data with information about its
attributes (e.g. 'fps' etc.) or integration specific keys.
entity_id (Union[str, ObjectId]): Predefined id of document. New id is
created if not passed.
Returns:
Dict[str, Any]: Skeleton of project document.
"""
if data is None:
data = {}
data["code"] = project_code
return {
"_id": _create_or_convert_to_id(entity_id),
"name": project_name,
"type": CURRENT_PROJECT_SCHEMA,
"entity_data": data,
"config": config
}
def new_asset_document(
name, project_id, parent_id, parents, data=None, entity_id=None
):
"""Create skeleton data of asset document.
Args:
name (str): Is considered as unique identifier of asset in project.
project_id (Union[str, ObjectId]): Id of project document.
parent_id (Union[str, ObjectId]): Id of parent asset.
parents (List[str]): List of parent assets names.
data (Dict[str, Any]): Asset document data. Empty dictionary is used
if not passed. Value of 'parent_id' is used to fill 'visualParent'.
entity_id (Union[str, ObjectId]): Predefined id of document. New id is
created if not passed.
Returns:
Dict[str, Any]: Skeleton of asset document.
"""
if data is None:
data = {}
if parent_id is not None:
parent_id = _create_or_convert_to_id(parent_id)
data["visualParent"] = parent_id
data["parents"] = parents
return {
"_id": _create_or_convert_to_id(entity_id),
"type": "asset",
"name": name,
# This will be ignored
"parent": project_id,
"data": data,
"schema": CURRENT_ASSET_DOC_SCHEMA
}
def new_subset_document(name, family, asset_id, data=None, entity_id=None):
"""Create skeleton data of subset document.
Args:
name (str): Is considered as unique identifier of subset under asset.
family (str): Subset's family.
asset_id (Union[str, ObjectId]): Id of parent asset.
data (Dict[str, Any]): Subset document data. Empty dictionary is used
if not passed. Value of 'family' is used to fill 'family'.
entity_id (Union[str, ObjectId]): Predefined id of document. New id is
created if not passed.
Returns:
Dict[str, Any]: Skeleton of subset document.
"""
if data is None:
data = {}
data["family"] = family
return {
"_id": _create_or_convert_to_id(entity_id),
"schema": CURRENT_SUBSET_SCHEMA,
"type": "subset",
"name": name,
"data": data,
"parent": _create_or_convert_to_id(asset_id)
}
def new_version_doc(version, subset_id, data=None, entity_id=None):
"""Create skeleton data of version document.
Args:
version (int): Is considered as unique identifier of version
under subset.
subset_id (Union[str, ObjectId]): Id of parent subset.
data (Dict[str, Any]): Version document data.
entity_id (Union[str, ObjectId]): Predefined id of document. New id is
created if not passed.
Returns:
Dict[str, Any]: Skeleton of version document.
"""
if data is None:
data = {}
return {
"_id": _create_or_convert_to_id(entity_id),
"schema": CURRENT_VERSION_SCHEMA,
"type": "version",
"name": int(version),
"parent": _create_or_convert_to_id(subset_id),
"data": data
}
def new_hero_version_doc(subset_id, data, version=None, entity_id=None):
"""Create skeleton data of hero version document.
Args:
subset_id (Union[str, ObjectId]): Id of parent subset.
data (Dict[str, Any]): Version document data.
version (int): Version of source version.
entity_id (Union[str, ObjectId]): Predefined id of document. New id is
created if not passed.
Returns:
Dict[str, Any]: Skeleton of version document.
"""
if version is None:
version = -1
elif version > 0:
version = -version
return {
"_id": _create_or_convert_to_id(entity_id),
"schema": CURRENT_HERO_VERSION_SCHEMA,
"type": "hero_version",
"version": version,
"parent": _create_or_convert_to_id(subset_id),
"data": data
}
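The version-number convention in `new_hero_version_doc` above can be isolated into a tiny helper (a sketch mirroring the code, not part of the module):

```python
def hero_version_number(version=None):
    # Hero versions are stored with the negative of their source version
    # number; -1 marks a hero version without an explicit source.
    if version is None:
        return -1
    if version > 0:
        return -version
    return version
```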
def new_representation_doc(
name, version_id, context, data=None, entity_id=None
):
"""Create skeleton data of representation document.
Args:
name (str): Representation name considered as unique identifier
of representation under version.
version_id (Union[str, ObjectId]): Id of parent version.
context (Dict[str, Any]): Representation context used to fill
templates or for querying.
data (Dict[str, Any]): Representation document data.
entity_id (Union[str, ObjectId]): Predefined id of document. New id is
created if not passed.
Returns:
Dict[str, Any]: Skeleton of version document.
"""
if data is None:
data = {}
return {
"_id": _create_or_convert_to_id(entity_id),
"schema": CURRENT_REPRESENTATION_SCHEMA,
"type": "representation",
"parent": _create_or_convert_to_id(version_id),
"name": name,
"data": data,
# Imprint shortcut to context for performance reasons.
"context": context
}
def new_thumbnail_doc(data=None, entity_id=None):
"""Create skeleton data of thumbnail document.
Args:
data (Dict[str, Any]): Thumbnail document data.
entity_id (Union[str, ObjectId]): Predefined id of document. New id is
created if not passed.
Returns:
Dict[str, Any]: Skeleton of thumbnail document.
"""
if data is None:
data = {}
return {
"_id": _create_or_convert_to_id(entity_id),
"type": "thumbnail",
"schema": CURRENT_THUMBNAIL_SCHEMA,
"data": data
}
def new_workfile_info_doc(
filename, asset_id, task_name, files, data=None, entity_id=None
):
"""Create skeleton data of workfile info document.
Workfile document is at this moment used primarily for artist notes.
Args:
filename (str): Filename of workfile.
asset_id (Union[str, ObjectId]): Id of asset under which workfile live.
task_name (str): Task under which was workfile created.
files (List[str]): List of rootless filepaths related to workfile.
data (Dict[str, Any]): Additional metadata.
entity_id (Union[str, ObjectId]): Predefined id of document. New id is
created if not passed.
Returns:
Dict[str, Any]: Skeleton of workfile info document.
"""
if not data:
data = {}
return {
"_id": _create_or_convert_to_id(entity_id),
"type": "workfile",
"parent": _create_or_convert_to_id(asset_id),
"task_name": task_name,
"filename": filename,
"data": data,
"files": files
}
def _prepare_update_data(old_doc, new_doc, replace):
changes = {}
for key, value in new_doc.items():
if key not in old_doc or value != old_doc[key]:
changes[key] = value
if replace:
for key in old_doc.keys():
if key not in new_doc:
changes[key] = REMOVED_VALUE
return changes
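The diffing done by `_prepare_update_data` above, sketched self-contained (the sentinel is a stand-in for openpype's `REMOVED_VALUE`):

```python
REMOVED_VALUE = object()  # stand-in sentinel for illustration

def prepare_update_data(old_doc, new_doc, replace=True):
    # Changed or added keys are copied; with 'replace' enabled, keys
    # missing from the new document are marked for removal.
    changes = {}
    for key, value in new_doc.items():
        if key not in old_doc or value != old_doc[key]:
            changes[key] = value
    if replace:
        for key in old_doc:
            if key not in new_doc:
                changes[key] = REMOVED_VALUE
    return changes
```

An empty result means the two documents are identical at the first level.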
def prepare_subset_update_data(old_doc, new_doc, replace=True):
"""Compare two subset documents and prepare update data.
Based on the compared values, update data for 'UpdateOperation'
is created.
Empty output means that documents are identical.
Returns:
Dict[str, Any]: Changes between old and new document.
"""
return _prepare_update_data(old_doc, new_doc, replace)
def prepare_version_update_data(old_doc, new_doc, replace=True):
"""Compare two version documents and prepare update data.
Based on the compared values, update data for 'UpdateOperation'
is created.
Empty output means that documents are identical.
Returns:
Dict[str, Any]: Changes between old and new document.
"""
return _prepare_update_data(old_doc, new_doc, replace)
def prepare_hero_version_update_data(old_doc, new_doc, replace=True):
"""Compare two hero version documents and prepare update data.
Based on the compared values, update data for 'UpdateOperation' is created.
Empty output means that documents are identical.
Returns:
Dict[str, Any]: Changes between old and new document.
"""
return _prepare_update_data(old_doc, new_doc, replace)
def prepare_representation_update_data(old_doc, new_doc, replace=True):
"""Compare two representation documents and prepare update data.
Based on the compared values, creates update data for
'MongoUpdateOperation'.
Empty output means that documents are identical.
Returns:
Dict[str, Any]: Changes between old and new document.
"""
return _prepare_update_data(old_doc, new_doc, replace)
def prepare_workfile_info_update_data(old_doc, new_doc, replace=True):
"""Compare two workfile info documents and prepare update data.
Based on the compared values, creates update data for
'MongoUpdateOperation'.
Empty output means that documents are identical.
Returns:
Dict[str, Any]: Changes between old and new document.
"""
return _prepare_update_data(old_doc, new_doc, replace)
class FailedOperations(Exception):
pass
def entity_data_json_default(value):
if isinstance(value, datetime.datetime):
return int(value.timestamp())
raise TypeError(
"Object of type {} is not JSON serializable".format(str(type(value)))
)
def failed_json_default(value):
return "< Failed value {} > {}".format(type(value), str(value))
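The `default` hook above converts datetimes to integer unix timestamps so entity payloads survive `json.dumps`; anything else raises `TypeError`. A quick self-contained illustration (the `createdAt` key is made up for the example):

```python
import datetime
import json

def entity_data_json_default(value):
    # Datetimes are serialized as integer unix timestamps
    if isinstance(value, datetime.datetime):
        return int(value.timestamp())
    raise TypeError(
        "Object of type {} is not JSON serializable".format(type(value))
    )

# fromtimestamp/timestamp round-trip is timezone safe
created = datetime.datetime.fromtimestamp(1577836800)
body = json.dumps({"createdAt": created}, default=entity_data_json_default)
```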
class ServerCreateOperation(CreateOperation):
"""Operation to create an entity.
Args:
project_name (str): On which project operation will happen.
entity_type (str): Type of entity on which change happens.
e.g. 'asset', 'representation' etc.
data (Dict[str, Any]): Data of entity that will be created.
"""
def __init__(self, project_name, entity_type, data, session):
self._session = session
if not data:
data = {}
data = copy.deepcopy(data)
if entity_type == "project":
raise ValueError("Project cannot be created using operations")
tasks = None
if entity_type == "asset":
# TODO handle tasks
entity_type = "folder"
if "data" in data:
tasks = data["data"].get("tasks")
project = self._session.get_project(project_name)
new_data = convert_create_asset_to_v4(data, project, self.con)
elif entity_type == "task":
project = self._session.get_project(project_name)
new_data = convert_create_task_to_v4(data, project, self.con)
elif entity_type == "subset":
new_data = convert_create_subset_to_v4(data, self.con)
elif entity_type == "version":
new_data = convert_create_version_to_v4(data, self.con)
elif entity_type == "hero_version":
new_data = convert_create_hero_version_to_v4(
data, project_name, self.con
)
entity_type = "version"
elif entity_type in ("representation", "archived_representation"):
new_data = convert_create_representation_to_v4(data, self.con)
entity_type = "representation"
elif entity_type == "workfile":
new_data = convert_create_workfile_info_to_v4(
data, project_name, self.con
)
else:
raise ValueError(
"Unhandled entity type \"{}\"".format(entity_type)
)
# Simple check if data can be dumped into json
# - should raise error on 'ObjectId' object
try:
new_data = json.loads(
json.dumps(new_data, default=entity_data_json_default)
)
except Exception:
raise ValueError("Couldn't json parse body: {}".format(
json.dumps(new_data, default=failed_json_default)
))
super(ServerCreateOperation, self).__init__(
project_name, entity_type, new_data
)
if "id" not in self._data:
self._data["id"] = create_entity_id()
if tasks:
copied_tasks = copy.deepcopy(tasks)
for task_name, task in copied_tasks.items():
task["name"] = task_name
task["folderId"] = self._data["id"]
self.session.create_entity(
project_name, "task", task, nested_id=self.id
)
@property
def con(self):
return self.session.con
@property
def session(self):
return self._session
@property
def entity_id(self):
return self._data["id"]
def to_server_operation(self):
return {
"id": self.id,
"type": "create",
"entityType": self.entity_type,
"entityId": self.entity_id,
"data": self._data
}
class ServerUpdateOperation(UpdateOperation):
"""Operation to update an entity.
Args:
project_name (str): On which project operation will happen.
entity_type (str): Type of entity on which change happens.
e.g. 'asset', 'representation' etc.
entity_id (Union[str, ObjectId]): Identifier of an entity.
update_data (Dict[str, Any]): Key -> value changes that will be set in
database. If value is set to 'REMOVED_VALUE' the key will be
removed. Only first level of dictionary is checked (on purpose).
"""
def __init__(
self, project_name, entity_type, entity_id, update_data, session
):
self._session = session
update_data = copy.deepcopy(update_data)
if entity_type == "project":
raise ValueError("Project cannot be updated using operations")
if entity_type in ("asset", "archived_asset"):
new_update_data = convert_update_folder_to_v4(
project_name, entity_id, update_data, self.con
)
entity_type = "folder"
elif entity_type == "subset":
new_update_data = convert_update_subset_to_v4(
project_name, entity_id, update_data, self.con
)
elif entity_type == "version":
new_update_data = convert_update_version_to_v4(
project_name, entity_id, update_data, self.con
)
elif entity_type == "hero_version":
new_update_data = convert_update_hero_version_to_v4(
project_name, entity_id, update_data, self.con
)
entity_type = "version"
elif entity_type in ("representation", "archived_representation"):
new_update_data = convert_update_representation_to_v4(
project_name, entity_id, update_data, self.con
)
entity_type = "representation"
elif entity_type == "workfile":
new_update_data = convert_update_workfile_info_to_v4(
project_name, entity_id, update_data, self.con
)
else:
raise ValueError(
"Unhandled entity type \"{}\"".format(entity_type)
)
try:
new_update_data = json.loads(
json.dumps(new_update_data, default=entity_data_json_default)
)
except Exception:
raise ValueError("Couldn't json parse body: {}".format(
json.dumps(new_update_data, default=failed_json_default)
))
super(ServerUpdateOperation, self).__init__(
project_name, entity_type, entity_id, new_update_data
)
@property
def con(self):
return self.session.con
@property
def session(self):
return self._session
def to_server_operation(self):
if not self._update_data:
return None
update_data = {}
for key, value in self._update_data.items():
if value is REMOVED_VALUE:
value = None
update_data[key] = value
return {
"id": self.id,
"type": "update",
"entityType": self.entity_type,
"entityId": self.entity_id,
"data": update_data
}
class ServerDeleteOperation(DeleteOperation):
"""Operation to delete an entity.
Args:
project_name (str): On which project operation will happen.
entity_type (str): Type of entity on which change happens.
e.g. 'asset', 'representation' etc.
entity_id (Union[str, ObjectId]): Entity id that will be removed.
"""
def __init__(self, project_name, entity_type, entity_id, session):
self._session = session
if entity_type == "asset":
entity_type = "folder"
if entity_type == "hero_version":
entity_type = "version"
super(ServerDeleteOperation, self).__init__(
project_name, entity_type, entity_id
)
@property
def con(self):
return self.session.con
@property
def session(self):
return self._session
def to_server_operation(self):
return {
"id": self.id,
"type": self.operation_name,
"entityId": self.entity_id,
"entityType": self.entity_type,
}
class OperationsSession(BaseOperationsSession):
def __init__(self, con=None, *args, **kwargs):
super(OperationsSession, self).__init__(*args, **kwargs)
if con is None:
con = get_server_api_connection()
self._con = con
self._project_cache = {}
self._nested_operations = collections.defaultdict(list)
@property
def con(self):
return self._con
def get_project(self, project_name):
if project_name not in self._project_cache:
self._project_cache[project_name] = self.con.get_project(
project_name)
return copy.deepcopy(self._project_cache[project_name])
def commit(self):
"""Commit session operations."""
operations, self._operations = self._operations, []
if not operations:
return
operations_by_project = collections.defaultdict(list)
for operation in operations:
operations_by_project[operation.project_name].append(operation)
body_by_id = {}
results = []
for project_name, operations in operations_by_project.items():
operations_body = []
for operation in operations:
body = operation.to_server_operation()
if body is not None:
try:
json.dumps(body)
except Exception:
raise ValueError("Couldn't json parse body: {}".format(
json.dumps(
body, indent=4, default=failed_json_default
)
))
body_by_id[operation.id] = body
operations_body.append(body)
if operations_body:
result = self._con.post(
"projects/{}/operations".format(project_name),
operations=operations_body,
canFail=False
)
results.append(result.data)
for result in results:
if result.get("success"):
continue
if "operations" not in result:
raise FailedOperations(
"Operation failed. Content: {}".format(str(result))
)
for op_result in result["operations"]:
if not op_result["success"]:
operation_id = op_result["id"]
raise FailedOperations((
"Operation \"{}\" failed with data:\n{}\nError: {}."
).format(
operation_id,
json.dumps(body_by_id[operation_id], indent=4),
op_result.get("error", "unknown"),
))
def create_entity(self, project_name, entity_type, data, nested_id=None):
"""Fast access to 'ServerCreateOperation'.
Args:
project_name (str): On which project the creation happens.
entity_type (str): Which entity type will be created.
data (Dict[str, Any]): Entity data.
nested_id (str): Id of the operation that triggered this one.
Operations can trigger suboperations, but a suboperation must be
added to the operations list after its parent.
Returns:
ServerCreateOperation: Object of create operation.
"""
operation = ServerCreateOperation(
project_name, entity_type, data, self
)
if nested_id:
self._nested_operations[nested_id].append(operation)
else:
self.add(operation)
if operation.id in self._nested_operations:
self.extend(self._nested_operations.pop(operation.id))
return operation
def update_entity(
self, project_name, entity_type, entity_id, update_data, nested_id=None
):
"""Fast access to 'ServerUpdateOperation'.
Returns:
ServerUpdateOperation: Object of update operation.
"""
operation = ServerUpdateOperation(
project_name, entity_type, entity_id, update_data, self
)
if nested_id:
self._nested_operations[nested_id].append(operation)
else:
self.add(operation)
if operation.id in self._nested_operations:
self.extend(self._nested_operations.pop(operation.id))
return operation
def delete_entity(
self, project_name, entity_type, entity_id, nested_id=None
):
"""Fast access to 'ServerDeleteOperation'.
Returns:
ServerDeleteOperation: Object of delete operation.
"""
operation = ServerDeleteOperation(
project_name, entity_type, entity_id, self
)
if nested_id:
self._nested_operations[nested_id].append(operation)
else:
self.add(operation)
if operation.id in self._nested_operations:
self.extend(self._nested_operations.pop(operation.id))
return operation
def create_project(
project_name,
project_code,
library_project=False,
preset_name=None,
con=None
):
"""Create project using OpenPype settings.
This project creation function does not validate the project document on
creation. The project document is created blindly with only the minimum
required information about the project: its name, code, type and schema.
The entered project name must be unique and the project must not exist yet.
Note:
This function is here to be OP v4 ready, but in v3 it has more logic
to do. That's why the inner imports are in the body.
Args:
project_name (str): New project name. Should be unique.
project_code (str): Project's code should be unique too.
library_project (bool): Project is library project.
preset_name (str): Name of anatomy preset. Default is used if not
passed.
con (ServerAPI): Connection to server with logged user.
Raises:
ValueError: When project name already exists.
Returns:
dict: Created project document.
"""
if con is None:
con = get_server_api_connection()
return con.create_project(
project_name,
project_code,
library_project,
preset_name
)
def delete_project(project_name, con=None):
if con is None:
con = get_server_api_connection()
return con.delete_project(project_name)
def create_thumbnail(project_name, src_filepath, con=None):
if con is None:
con = get_server_api_connection()
return con.create_thumbnail(project_name, src_filepath)

View file

@ -0,0 +1,109 @@
import uuid
from openpype.client.operations_base import REMOVED_VALUE
def create_entity_id():
return uuid.uuid1().hex
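Entity ids are plain 32-character hex strings built from `uuid.uuid1()` (the `.hex` form drops the dashes). A small sketch of the same helper:

```python
import uuid

def create_entity_id():
    # uuid1().hex drops the dashes, leaving 32 lowercase hex characters
    return uuid.uuid1().hex

entity_id = create_entity_id()
```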
def prepare_attribute_changes(old_entity, new_entity, replace=False):
"""Prepare changes of attributes on entities.
Compare 'attrib' of old and new entity data to prepare only changed
values that should be sent to server for update.
Example:
>>> # Limited entity data to 'attrib'
>>> old_entity = {
... "attrib": {"attr_1": 1, "attr_2": "MyString", "attr_3": True}
... }
>>> new_entity = {
... "attrib": {"attr_1": 2, "attr_3": True, "attr_4": 3}
... }
>>> # Changes if replacement should not happen
>>> expected_changes = {
... "attr_1": 2,
... "attr_4": 3
... }
>>> changes = prepare_attribute_changes(old_entity, new_entity)
>>> changes == expected_changes
True
>>> # Changes if replacement should happen
>>> expected_changes_replace = {
... "attr_1": 2,
... "attr_2": REMOVED_VALUE,
... "attr_4": 3
... }
>>> changes_replace = prepare_attribute_changes(
... old_entity, new_entity, True)
>>> changes_replace == expected_changes_replace
True
Args:
old_entity (dict[str, Any]): Data of entity queried from server.
new_entity (dict[str, Any]): Entity data with applied changes.
replace (bool): New entity should fully replace all old entity values.
Returns:
Dict[str, Any]: Values from new entity only if value has changed.
"""
attrib_changes = {}
new_attrib = new_entity.get("attrib")
old_attrib = old_entity.get("attrib")
if new_attrib is None:
if not replace:
return attrib_changes
new_attrib = {}
if old_attrib is None:
return new_attrib
for attr, new_attr_value in new_attrib.items():
old_attr_value = old_attrib.get(attr)
if old_attr_value != new_attr_value:
attrib_changes[attr] = new_attr_value
if replace:
for attr in old_attrib:
if attr not in new_attrib:
attrib_changes[attr] = REMOVED_VALUE
return attrib_changes
def prepare_entity_changes(old_entity, new_entity, replace=False):
"""Prepare changes of AYON entities.
Compare old and new entity to filter values from new data that changed.
Args:
old_entity (dict[str, Any]): Data of entity queried from server.
new_entity (dict[str, Any]): Entity data with applied changes.
replace (bool): All attributes should be replaced by new values. So
all attribute values that are not on new entity will be removed.
Returns:
Dict[str, Any]: Only values from new entity that changed.
"""
changes = {}
for key, new_value in new_entity.items():
if key == "attrib":
continue
old_value = old_entity.get(key)
if old_value != new_value:
changes[key] = new_value
if replace:
for key in old_entity:
if key not in new_entity:
changes[key] = REMOVED_VALUE
attr_changes = prepare_attribute_changes(old_entity, new_entity, replace)
if attr_changes:
changes["attrib"] = attr_changes
return changes
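`prepare_entity_changes` treats `attrib` separately from top-level keys and nests attribute diffs under one `attrib` key. A condensed standalone illustration (replace mode omitted, sample entities made up):

```python
def prepare_entity_changes(old_entity, new_entity):
    # Top-level keys (except 'attrib') are compared directly
    changes = {
        key: value for key, value in new_entity.items()
        if key != "attrib" and old_entity.get(key) != value
    }
    # 'attrib' is diffed key-by-key and nested under one 'attrib' key
    old_attrib = old_entity.get("attrib") or {}
    new_attrib = new_entity.get("attrib") or {}
    attrib_changes = {
        attr: value for attr, value in new_attrib.items()
        if old_attrib.get(attr) != value
    }
    if attrib_changes:
        changes["attrib"] = attrib_changes
    return changes

old_entity = {"label": "Shot 10", "attrib": {"fps": 24, "handle": 10}}
new_entity = {"label": "Shot 010", "attrib": {"fps": 25, "handle": 10}}
changes = prepare_entity_changes(old_entity, new_entity)
# Only the changed label and the changed 'fps' attribute survive
```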

View file

@ -1128,6 +1128,10 @@ def format_anatomy(data):
anatomy = Anatomy()
log.debug("__ anatomy.templates: {}".format(anatomy.templates))
padding = None
if "frame_padding" in anatomy.templates.keys():
padding = int(anatomy.templates["frame_padding"])
elif "render" in anatomy.templates.keys():
padding = int(
anatomy.templates["render"].get(
"frame_padding"

View file

@ -5,6 +5,8 @@ import platform
import json
import tempfile
from openpype import AYON_SERVER_ENABLED
from .log import Logger
from .vendor_bin_utils import find_executable
@ -321,19 +323,22 @@ def get_openpype_execute_args(*args):
It is possible to pass any arguments that will be added after pype
executables.
"""
pype_executable = os.environ["OPENPYPE_EXECUTABLE"]
pype_args = [pype_executable]
executable = os.environ["OPENPYPE_EXECUTABLE"]
launch_args = [executable]
executable_filename = os.path.basename(pype_executable)
executable_filename = os.path.basename(executable)
if "python" in executable_filename.lower():
pype_args.append(
os.path.join(os.environ["OPENPYPE_ROOT"], "start.py")
filename = "start.py"
if AYON_SERVER_ENABLED:
filename = "ayon_start.py"
launch_args.append(
os.path.join(os.environ["OPENPYPE_ROOT"], filename)
)
if args:
pype_args.extend(args)
launch_args.extend(args)
return pype_args
return launch_args
def get_linux_launcher_args(*args):

View file

@ -29,6 +29,7 @@ except ImportError:
import six
import appdirs
from openpype import AYON_SERVER_ENABLED
from openpype.settings import (
get_local_settings,
get_system_settings
@ -517,11 +518,54 @@ def _create_local_site_id(registry=None):
return new_id
def get_ayon_appdirs(*args):
"""Local app data directory of AYON client.
Args:
*args (Iterable[str]): Subdirectories/files in local app data dir.
Returns:
str: Path to directory/file in local app data dir.
"""
return os.path.join(
appdirs.user_data_dir("ayon", "ynput"),
*args
)
def _get_ayon_local_site_id():
# used for background syncing
site_id = os.environ.get("AYON_SITE_ID")
if site_id:
return site_id
site_id_path = get_ayon_appdirs("site_id")
if os.path.exists(site_id_path):
with open(site_id_path, "r") as stream:
site_id = stream.read()
if site_id:
return site_id
try:
from ayon_common.utils import get_local_site_id as _get_local_site_id
site_id = _get_local_site_id()
except ImportError:
raise ValueError("Couldn't access local site id")
return site_id
def get_local_site_id():
"""Get local site identifier.
Identifier is created if it does not exist yet.
"""
if AYON_SERVER_ENABLED:
return _get_ayon_local_site_id()
# override local id from environment
# used for background syncing
if os.environ.get("OPENPYPE_LOCAL_ID"):

View file

@ -24,6 +24,7 @@ import traceback
import threading
import copy
from openpype import AYON_SERVER_ENABLED
from openpype.client.mongo import (
MongoEnvNotSet,
get_default_components,
@ -212,7 +213,7 @@ class Logger:
log_mongo_url_components = None
# Database name in Mongo
log_database_name = os.environ["OPENPYPE_DATABASE_NAME"]
log_database_name = os.environ.get("OPENPYPE_DATABASE_NAME")
# Collection name under database in Mongo
log_collection_name = "logs"
@ -326,12 +327,17 @@ class Logger:
# Change initialization state to prevent runtime changes
# if is executed during runtime
cls.initialized = False
if not AYON_SERVER_ENABLED:
cls.log_mongo_url_components = get_default_components()
# Define if should logging to mongo be used
use_mongo_logging = bool(log4mongo is not None)
if use_mongo_logging:
use_mongo_logging = os.environ.get("OPENPYPE_LOG_TO_SERVER") == "1"
if AYON_SERVER_ENABLED:
use_mongo_logging = False
else:
use_mongo_logging = (
log4mongo is not None
and os.environ.get("OPENPYPE_LOG_TO_SERVER") == "1"
)
# Set mongo id for process (ONLY ONCE)
if use_mongo_logging and cls.mongo_process_id is None:
@ -453,6 +459,9 @@ class Logger:
if not cls.use_mongo_logging:
return
if not cls.log_database_name:
raise ValueError("Database name for logs is not set")
client = log4mongo.handlers._connection
if not client:
client = cls.get_log_mongo_connection()

View file

@ -5,6 +5,7 @@ import platform
import getpass
import socket
from openpype import AYON_SERVER_ENABLED
from openpype.settings.lib import get_local_settings
from .execute import get_openpype_execute_args
from .local_settings import get_local_site_id
@ -33,6 +34,21 @@ def get_openpype_info():
}
def get_ayon_info():
executable_args = get_openpype_execute_args()
if is_running_from_build():
version_type = "build"
else:
version_type = "code"
return {
"build_version": get_build_version(),
"version_type": version_type,
"executable": executable_args[-1],
"ayon_root": os.environ["AYON_ROOT"],
"server_url": os.environ["AYON_SERVER_URL"]
}
def get_workstation_info():
"""Basic information about workstation."""
host_name = socket.gethostname()
@ -52,12 +68,17 @@ def get_workstation_info():
def get_all_current_info():
"""All information about current process in one dictionary."""
return {
"pype": get_openpype_info(),
output = {
"workstation": get_workstation_info(),
"env": os.environ.copy(),
"local_settings": get_local_settings()
}
if AYON_SERVER_ENABLED:
output["ayon"] = get_ayon_info()
else:
output["openpype"] = get_openpype_info()
return output
def extract_pype_info_to_file(dirpath):

View file

@ -12,8 +12,12 @@ import collections
import traceback
from uuid import uuid4
from abc import ABCMeta, abstractmethod
import six
import six
import appdirs
import ayon_api
from openpype import AYON_SERVER_ENABLED
from openpype.settings import (
get_system_settings,
SYSTEM_SETTINGS_KEY,
@ -186,7 +190,11 @@ def get_dynamic_modules_dirs():
Returns:
list: Paths loaded from studio overrides.
"""
output = []
if AYON_SERVER_ENABLED:
return output
value = get_studio_system_settings_overrides()
for key in ("modules", "addon_paths", platform.system().lower()):
if key not in value:
@ -299,6 +307,108 @@ def load_modules(force=False):
time.sleep(0.1)
def _get_ayon_addons_information():
"""Receive information about addons to use from server.
Todos:
Actually ask server for the information.
Allow project name as optional argument to be able to query information
about used addons for specific project.
Returns:
List[Dict[str, Any]]: List of addon information to use.
"""
return ayon_api.get_addons_info()["addons"]
def _load_ayon_addons(openpype_modules, modules_key, log):
"""Load AYON addons based on information from server.
This function should not trigger downloading of any addons but only use
what is already available on the machine (at least in first stages of
development).
Args:
openpype_modules (_ModuleClass): Module object where modules are
stored.
log (logging.Logger): Logger object.
Returns:
List[str]: Names of v3 addons to skip loading because a v4
alternative was imported.
"""
v3_addons_to_skip = []
addons_info = _get_ayon_addons_information()
if not addons_info:
return v3_addons_to_skip
addons_dir = os.path.join(
appdirs.user_data_dir("ayon", "ynput"),
"addons"
)
if not os.path.exists(addons_dir):
log.warning("Addons directory does not exist. Path \"{}\"".format(
addons_dir
))
return v3_addons_to_skip
for addon_info in addons_info:
addon_name = addon_info["name"]
addon_version = addon_info.get("productionVersion")
if not addon_version:
continue
folder_name = "{}_{}".format(addon_name, addon_version)
addon_dir = os.path.join(addons_dir, folder_name)
if not os.path.exists(addon_dir):
log.warning((
"Directory for addon {} {} does not exist. Path \"{}\""
).format(addon_name, addon_version, addon_dir))
continue
sys.path.insert(0, addon_dir)
imported_modules = []
for name in os.listdir(addon_dir):
path = os.path.join(addon_dir, name)
basename, ext = os.path.splitext(name)
is_dir = os.path.isdir(path)
is_py_file = ext.lower() == ".py"
if not is_py_file and not is_dir:
continue
try:
mod = __import__(basename, fromlist=("",))
imported_modules.append(mod)
except BaseException:
log.warning(
"Failed to import \"{}\"".format(basename),
exc_info=True
)
if not imported_modules:
log.warning("Addon {} {} has no content to import".format(
addon_name, addon_version
))
continue
if len(imported_modules) == 1:
mod = imported_modules[0]
addon_alias = getattr(mod, "V3_ALIAS", None)
if not addon_alias:
addon_alias = addon_name
v3_addons_to_skip.append(addon_alias)
new_import_str = "{}.{}".format(modules_key, addon_alias)
sys.modules[new_import_str] = mod
setattr(openpype_modules, addon_alias, mod)
else:
log.info("More than one module was imported")
return v3_addons_to_skip
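The per-addon import mechanism above (prepend the addon directory to `sys.path`, then `__import__` each top-level name) can be sketched with a throwaway directory; the module name and attribute here are invented for illustration:

```python
import os
import sys
import tempfile

# Create a throwaway "addon" directory with one importable module
addon_dir = tempfile.mkdtemp()
with open(os.path.join(addon_dir, "demo_addon.py"), "w") as stream:
    stream.write("V3_ALIAS = 'demo'\n")

# Same mechanism as _load_ayon_addons: prepend dir, import by basename
sys.path.insert(0, addon_dir)
mod = __import__("demo_addon", fromlist=("",))

# Prefer the addon's declared v3 alias, fall back to the module name
alias = getattr(mod, "V3_ALIAS", None) or "demo_addon"
```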
def _load_modules():
# Key under which will be modules imported in `sys.modules`
modules_key = "openpype_modules"
@ -308,6 +418,12 @@ def _load_modules():
log = Logger.get_logger("ModulesLoader")
ignore_addon_names = []
if AYON_SERVER_ENABLED:
ignore_addon_names = _load_ayon_addons(
openpype_modules, modules_key, log
)
# Look for OpenPype modules in paths defined with `get_module_dirs`
# - dynamically imported OpenPype modules and addons
module_dirs = get_module_dirs()
@ -351,6 +467,9 @@ def _load_modules():
fullpath = os.path.join(dirpath, filename)
basename, ext = os.path.splitext(filename)
if basename in ignore_addon_names:
continue
# Validations
if os.path.isdir(fullpath):
# Check existence of init file

View file

@ -1,3 +1,4 @@
from openpype import AYON_SERVER_ENABLED
from openpype.modules import OpenPypeModule, ITrayModule
@ -7,6 +8,8 @@ class LogViewModule(OpenPypeModule, ITrayModule):
def initialize(self, modules_settings):
logging_settings = modules_settings[self.name]
self.enabled = logging_settings["enabled"]
if AYON_SERVER_ENABLED:
self.enabled = False
# Tray attributes
self.window = None

View file

@ -1,3 +1,4 @@
from openpype import AYON_SERVER_ENABLED
from openpype.modules import OpenPypeModule, ITrayAction
@ -11,6 +12,9 @@ class ProjectManagerAction(OpenPypeModule, ITrayAction):
module_settings = modules_settings.get(self.name)
if module_settings:
enabled = module_settings.get("enabled", enabled)
if AYON_SERVER_ENABLED:
enabled = False
self.enabled = enabled
# Tray attributes

View file

@ -1,3 +1,4 @@
from openpype import AYON_SERVER_ENABLED
from openpype.modules import OpenPypeModule, ITrayAction
@ -10,6 +11,8 @@ class SettingsAction(OpenPypeModule, ITrayAction):
def initialize(self, _modules_settings):
# This action is always enabled
self.enabled = True
if AYON_SERVER_ENABLED:
self.enabled = False
# User role
# TODO should be changeable
@ -80,6 +83,8 @@ class LocalSettingsAction(OpenPypeModule, ITrayAction):
def initialize(self, _modules_settings):
# This action is always enabled
self.enabled = True
if AYON_SERVER_ENABLED:
self.enabled = False
# Tray attributes
self.settings_window = None

View file

@ -536,8 +536,8 @@ class SyncServerThread(threading.Thread):
_site_is_working(self.module, project_name, remote_site,
remote_site_config)]):
self.log.debug(
"Some of the sites {} - {} is not working properly".format(
local_site, remote_site
"Some of the sites {} - {} in {} is not working properly".format( # noqa
local_site, remote_site, project_name
)
)

View file

@ -15,7 +15,7 @@ from openpype.client import (
get_representations,
get_representation_by_id,
)
from openpype.modules import OpenPypeModule, ITrayModule
from openpype.modules import OpenPypeModule, ITrayModule, IPluginPaths
from openpype.settings import (
get_project_settings,
get_system_settings,
@ -39,7 +39,7 @@ from .utils import time_function, SyncStatus, SiteAlreadyPresentError
log = Logger.get_logger("SyncServer")
class SyncServerModule(OpenPypeModule, ITrayModule):
class SyncServerModule(OpenPypeModule, ITrayModule, IPluginPaths):
"""
Synchronization server that is syncing published files from local to
any of implemented providers (like GDrive, S3 etc.)
@ -136,6 +136,13 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
# projects that long tasks are running on
self.projects_processed = set()
def get_plugin_paths(self):
"""Sync server plugin paths."""
current_dir = os.path.dirname(os.path.abspath(__file__))
return {
"load": [os.path.join(current_dir, "plugins", "load")]
}
""" Start of Public API """
def add_site(self, project_name, representation_id, site_name=None,
force=False, priority=None, reset_timer=False):
@ -204,6 +211,58 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
if remove_local_files:
self._remove_local_file(project_name, representation_id, site_name)
def get_progress_for_repre(self, doc, active_site, remote_site):
"""
Calculates average progress for representation.
If site has created_dt >> fully available >> progress == 1
Could be calculated in aggregate if it would be too slow
Args:
doc (dict): Representation document.
active_site (str): Name of the active (local) site.
remote_site (str): Name of the remote site.
Returns:
(dict) with active and remote sites progress
{'studio': 1.0, 'gdrive': -1} - gdrive site is not present
-1 is used to highlight the site should be added
{'studio': 1.0, 'gdrive': 0.0} - gdrive site is present, not
uploaded yet
"""
progress = {active_site: -1,
remote_site: -1}
if not doc:
return progress
files = {active_site: 0, remote_site: 0}
doc_files = doc.get("files") or []
for doc_file in doc_files:
if not isinstance(doc_file, dict):
continue
sites = doc_file.get("sites") or []
for site in sites:
if (
# Pype 2 compatibility
not isinstance(site, dict)
# Check if site name is one of progress sites
or site["name"] not in progress
):
continue
files[site["name"]] += 1
norm_progress = max(progress[site["name"]], 0)
if site.get("created_dt"):
progress[site["name"]] = norm_progress + 1
elif site.get("progress"):
progress[site["name"]] = norm_progress + site["progress"]
else: # site exists, might be failed, do not add again
progress[site["name"]] = 0
# for example 13 fully avail. files out of 26 >> 13/26 = 0.5
avg_progress = {}
avg_progress[active_site] = \
progress[active_site] / max(files[active_site], 1)
avg_progress[remote_site] = \
progress[remote_site] / max(files[remote_site], 1)
return avg_progress
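The averaging above can be sketched as a standalone function over the v3 file/site layout the method expects (the sample documents below are hypothetical):

```python
def average_site_progress(doc_files, active_site, remote_site):
    # -1 marks a site that is not present on any file
    progress = {active_site: -1, remote_site: -1}
    files = {active_site: 0, remote_site: 0}
    for doc_file in doc_files:
        for site in doc_file.get("sites") or []:
            name = site.get("name")
            if name not in progress:
                continue
            files[name] += 1
            norm = max(progress[name], 0)
            if site.get("created_dt"):
                progress[name] = norm + 1  # file fully available
            elif site.get("progress"):
                progress[name] = norm + site["progress"]
            else:
                progress[name] = 0  # present but not started (or failed)
    return {
        site: progress[site] / max(files[site], 1)
        for site in (active_site, remote_site)
    }

doc_files = [
    {"sites": [{"name": "studio", "created_dt": "2023-01-01"},
               {"name": "gdrive", "progress": 0.5}]},
    {"sites": [{"name": "studio", "created_dt": "2023-01-01"},
               {"name": "gdrive", "progress": 0.5}]},
]
result = average_site_progress(doc_files, "studio", "gdrive")
# studio is fully synced (1.0), gdrive is halfway (0.5)
```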
def compute_resource_sync_sites(self, project_name):
"""Get available resource sync sites state for publish process.

View file

@ -5,17 +5,19 @@ import platform
import collections
import numbers
import ayon_api
import six
import time
from openpype import AYON_SERVER_ENABLED
from openpype.settings.lib import (
get_local_settings,
)
from openpype.settings.constants import (
DEFAULT_PROJECT_KEY
)
from openpype.client import get_project
from openpype.lib import Logger, get_local_site_id
from openpype.lib.path_templates import (
TemplateUnsolved,
TemplateResult,
@ -23,7 +25,6 @@ from openpype.lib.path_templates import (
TemplatesDict,
FormatObject,
)
from openpype.lib.log import Logger
from openpype.modules import ModulesManager
log = Logger.get_logger(__name__)
@ -475,6 +476,13 @@ class Anatomy(BaseAnatomy):
Union[Dict[str, str], None]): Local root overrides.
"""
if AYON_SERVER_ENABLED:
if not project_name:
return
return ayon_api.get_project_roots_for_site(
project_name, get_local_site_id()
)
if local_settings is None:
local_settings = get_local_settings()

View file

@ -4,6 +4,7 @@ import sys
import logging
import functools
from openpype import AYON_SERVER_ENABLED
from . import schema
from .mongodb import AvalonMongoDB, session_data_from_environment
@ -39,6 +40,7 @@ def install():
_connection_object.Session.update(session)
_connection_object.install()
if not AYON_SERVER_ENABLED:
module._mongo_client = _connection_object.mongo_client
module._database = module.database = _connection_object.database

View file

@ -5,6 +5,7 @@ import logging
import pymongo
from uuid import uuid4
from openpype import AYON_SERVER_ENABLED
from openpype.client import OpenPypeMongoConnection
from . import schema
@ -187,6 +188,7 @@ class AvalonMongoDB:
return
self._installed = True
if not AYON_SERVER_ENABLED:
self._database = self.mongo_client[str(os.environ["AVALON_DB"])]
def uninstall(self):

View file

@ -2,6 +2,7 @@ import os
import copy
import logging
from openpype import AYON_SERVER_ENABLED
from openpype.client import get_project
from . import legacy_io
from .anatomy import Anatomy
@ -131,6 +132,32 @@ class BinaryThumbnail(ThumbnailResolver):
return thumbnail_entity["data"].get("binary_data")
class ServerThumbnailResolver(ThumbnailResolver):
def process(self, thumbnail_entity, thumbnail_type):
if not AYON_SERVER_ENABLED:
return None
data = thumbnail_entity["data"]
entity_type = data.get("entity_type")
entity_id = data.get("entity_id")
if not entity_type or not entity_id:
return None
from openpype.client.server.server_api import get_server_api_connection
project_name = self.dbcon.active_project()
thumbnail_id = thumbnail_entity["_id"]
con = get_server_api_connection()
filepath = con.get_thumbnail(
project_name, entity_type, entity_id, thumbnail_id
)
content = None
if filepath:
with open(filepath, "rb") as stream:
content = stream.read()
return content
# Thumbnail resolvers
def discover_thumbnail_resolvers():
return discover(ThumbnailResolver)
@ -146,3 +173,4 @@ def register_thumbnail_resolver_path(path):
register_thumbnail_resolver(TemplateResolver)
register_thumbnail_resolver(BinaryThumbnail)
register_thumbnail_resolver(ServerThumbnailResolver)

View file

@ -1,6 +1,7 @@
import collections
from copy import deepcopy
import pyblish.api
from openpype import AYON_SERVER_ENABLED
from openpype.client import (
get_assets,
get_archived_assets
@ -16,6 +17,9 @@ class ExtractHierarchyToAvalon(pyblish.api.ContextPlugin):
families = ["clip", "shot"]
def process(self, context):
if AYON_SERVER_ENABLED:
return
if "hierarchyContext" not in context.data:
self.log.info("skipping IntegrateHierarchyToAvalon")
return


@ -0,0 +1,234 @@
import collections
import copy
import json
import uuid
import pyblish.api
from ayon_api import slugify_string
from ayon_api.entity_hub import EntityHub
from openpype import AYON_SERVER_ENABLED
def _default_json_parse(value):
return str(value)
class ExtractHierarchyToAYON(pyblish.api.ContextPlugin):
"""Create entities in AYON based on collected data."""
order = pyblish.api.ExtractorOrder - 0.01
label = "Extract Hierarchy To AYON"
families = ["clip", "shot"]
def process(self, context):
if not AYON_SERVER_ENABLED:
return
hierarchy_context = context.data.get("hierarchyContext")
if not hierarchy_context:
self.log.info("Skipping")
return
project_name = context.data["projectName"]
hierarchy_context = self._filter_hierarchy(context)
if not hierarchy_context:
self.log.info("All folders were filtered out")
return
self.log.debug("Hierarchy_context: {}".format(
json.dumps(hierarchy_context, default=_default_json_parse)
))
entity_hub = EntityHub(project_name)
project = entity_hub.project_entity
hierarchy_match_queue = collections.deque()
hierarchy_match_queue.append((project, hierarchy_context))
while hierarchy_match_queue:
item = hierarchy_match_queue.popleft()
entity, entity_info = item
# Update attributes of entities
for attr_name, attr_value in entity_info["attributes"].items():
if attr_name in entity.attribs:
entity.attribs[attr_name] = attr_value
# Check if info has any children to sync
children_info = entity_info["children"]
tasks_info = entity_info["tasks"]
if not tasks_info and not children_info:
continue
# Prepare children by lowered name to easily find matching entities
children_by_low_name = {
child.name.lower(): child
for child in entity.children
}
# Create tasks if they are not available
for task_info in tasks_info:
task_label = task_info["name"]
task_name = slugify_string(task_label)
if task_name == task_label:
task_label = None
task_entity = children_by_low_name.get(task_name.lower())
# TODO propagate updates of tasks if there are any
# TODO check if existing entity has 'task' type
if task_entity is None:
task_entity = entity_hub.add_new_task(
task_info["type"],
parent_id=entity.id,
name=task_name
)
if task_label:
task_entity.label = task_label
# Create/Update sub-folders
for child_info in children_info:
child_label = child_info["name"]
child_name = slugify_string(child_label)
if child_name == child_label:
child_label = None
# TODO check if existing entity has 'folder' type
child_entity = children_by_low_name.get(child_name.lower())
if child_entity is None:
child_entity = entity_hub.add_new_folder(
child_info["entity_type"],
parent_id=entity.id,
name=child_name
)
if child_label:
child_entity.label = child_label
# Add folder to queue
hierarchy_match_queue.append((child_entity, child_info))
entity_hub.commit_changes()
def _filter_hierarchy(self, context):
"""Filter hierarchy context by active folder names.
The hierarchy context is filtered down to folder names found on active
instances and converted to a unified structure which suits the entity
creation logic.
Output example:
{
"name": "MyProject",
"entity_type": "Project",
"attributes": {},
"tasks": [],
"children": [
{
"name": "seq_01",
"entity_type": "Sequence",
"attributes": {},
"tasks": [],
"children": [
...
]
},
...
]
}
Todos:
Change how active folders are defined (names won't be enough in
AYON).
Args:
context (pyblish.api.Context): Pyblish context.
Returns:
dict[str, Any]: Hierarchy structure filtered by folder names.
"""
# filter only the active publishing instances
active_folder_names = set()
for instance in context:
if instance.data.get("publish") is not False:
active_folder_names.add(instance.data.get("asset"))
active_folder_names.discard(None)
self.log.debug("Active folder names: {}".format(active_folder_names))
if not active_folder_names:
return None
project_item = None
project_children_context = None
for key, value in context.data["hierarchyContext"].items():
project_item = copy.deepcopy(value)
project_children_context = project_item.pop("childs", None)
project_item["name"] = key
project_item["tasks"] = []
project_item["attributes"] = project_item.pop(
"custom_attributes", {}
)
project_item["children"] = []
if not project_children_context:
return None
project_id = uuid.uuid4().hex
items_by_id = {project_id: project_item}
parent_id_by_item_id = {project_id: None}
valid_ids = set()
hierarchy_queue = collections.deque()
hierarchy_queue.append((project_id, project_children_context))
while hierarchy_queue:
queue_item = hierarchy_queue.popleft()
parent_id, children_context = queue_item
if not children_context:
continue
for asset_name, asset_info in children_context.items():
if (
asset_name not in active_folder_names
and not asset_info.get("childs")
):
continue
item_id = uuid.uuid4().hex
new_item = copy.deepcopy(asset_info)
new_item["name"] = asset_name
new_item["children"] = []
new_children_context = new_item.pop("childs", None)
tasks = new_item.pop("tasks", {})
task_items = []
for task_name, task_info in tasks.items():
task_info["name"] = task_name
task_items.append(task_info)
new_item["tasks"] = task_items
new_item["attributes"] = new_item.pop("custom_attributes", {})
items_by_id[item_id] = new_item
parent_id_by_item_id[item_id] = parent_id
if asset_name in active_folder_names:
valid_ids.add(item_id)
hierarchy_queue.append((item_id, new_children_context))
if not valid_ids:
return None
for item_id in set(valid_ids):
parent_id = parent_id_by_item_id[item_id]
while parent_id is not None and parent_id not in valid_ids:
valid_ids.add(parent_id)
parent_id = parent_id_by_item_id[parent_id]
valid_ids.discard(project_id)
for item_id in valid_ids:
parent_id = parent_id_by_item_id[item_id]
item = items_by_id[item_id]
parent_item = items_by_id[parent_id]
parent_item["children"].append(item)
if not project_item["children"]:
return None
return project_item
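The key step in the filtering above is keeping every active folder plus all of its ancestors. That propagation can be illustrated standalone (the ids and hierarchy below are hypothetical):

```python
def propagate_to_parents(valid_ids, parent_id_by_item_id):
    """Extend the set of valid ids with all ancestors of each valid id."""
    result = set(valid_ids)
    for item_id in set(valid_ids):
        parent_id = parent_id_by_item_id[item_id]
        # Walk up until the root (parent is None) or an already-valid ancestor
        while parent_id is not None and parent_id not in result:
            result.add(parent_id)
            parent_id = parent_id_by_item_id[parent_id]
    return result


# project -> seq_01 -> sh010; only 'sh010' belongs to an active instance
parents = {"project": None, "seq_01": "project", "sh010": "seq_01"}
print(propagate_to_parents({"sh010"}, parents))
```

Siblings of `sh010` never enter the result set, which is why inactive branches are dropped from the hierarchy context.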


@ -6,6 +6,7 @@ import shutil
import pyblish.api
from openpype import AYON_SERVER_ENABLED
from openpype.client import (
get_version_by_id,
get_hero_version_by_subset_id,
@ -195,6 +196,15 @@ class IntegrateHeroVersion(pyblish.api.InstancePlugin):
entity_id = None
if old_version:
entity_id = old_version["_id"]
if AYON_SERVER_ENABLED:
new_hero_version = new_hero_version_doc(
src_version_entity["parent"],
copy.deepcopy(src_version_entity["data"]),
src_version_entity["name"],
entity_id=entity_id
)
else:
new_hero_version = new_hero_version_doc(
src_version_entity["_id"],
src_version_entity["parent"],


@ -18,6 +18,7 @@ import collections
import six
import pyblish.api
from openpype import AYON_SERVER_ENABLED
from openpype.client import get_versions
from openpype.client.operations import OperationsSession, new_thumbnail_doc
from openpype.pipeline.publish import get_publish_instance_label
@ -39,6 +40,10 @@ class IntegrateThumbnails(pyblish.api.ContextPlugin):
]
def process(self, context):
if AYON_SERVER_ENABLED:
self.log.info("AYON is enabled. Skipping v3 thumbnail integration")
return
# Filter instances which can be used for integration
filtered_instance_items = self._prepare_instances(context)
if not filtered_instance_items:


@ -0,0 +1,207 @@
"""Integrate thumbnails for OpenPype use in Loaders.
This thumbnail is different from the 'thumbnail' representation, which can
be uploaded to Ftrack or used like any other representation in Loaders to
pull into a scene.
This one is used only as an image describing the content of a published
item and shows up only in the right column section of the Loader.
"""
import os
import collections
import pyblish.api
from openpype import AYON_SERVER_ENABLED
from openpype.client import get_versions
from openpype.client.operations import OperationsSession
InstanceFilterResult = collections.namedtuple(
"InstanceFilterResult",
["instance", "thumbnail_path", "version_id"]
)
class IntegrateThumbnailsAYON(pyblish.api.ContextPlugin):
"""Integrate thumbnails for OpenPype use in Loaders."""
label = "Integrate Thumbnails to AYON"
order = pyblish.api.IntegratorOrder + 0.01
required_context_keys = [
"project", "asset", "task", "subset", "version"
]
def process(self, context):
if not AYON_SERVER_ENABLED:
self.log.info("AYON is not enabled. Skipping")
return
# Filter instances which can be used for integration
filtered_instance_items = self._prepare_instances(context)
if not filtered_instance_items:
self.log.info(
"All instances were filtered. Thumbnail integration skipped."
)
return
project_name = context.data["projectName"]
# Collect version ids from all filtered instance
version_ids = {
instance_items.version_id
for instance_items in filtered_instance_items
}
# Query versions
version_docs = get_versions(
project_name,
version_ids=version_ids,
hero=True,
fields=["_id", "type", "name"]
)
# Store version by their id (converted to string)
version_docs_by_str_id = {
str(version_doc["_id"]): version_doc
for version_doc in version_docs
}
self._integrate_thumbnails(
filtered_instance_items,
version_docs_by_str_id,
project_name
)
def _prepare_instances(self, context):
context_thumbnail_path = context.get("thumbnailPath")
valid_context_thumbnail = bool(
context_thumbnail_path
and os.path.exists(context_thumbnail_path)
)
filtered_instances = []
for instance in context:
instance_label = self._get_instance_label(instance)
# Skip instances without published representations
# - there is no place where to put the thumbnail
published_repres = instance.data.get("published_representations")
if not published_repres:
self.log.debug((
"There are no published representations"
" on the instance {}."
).format(instance_label))
continue
# Find thumbnail path on instance
thumbnail_path = self._get_instance_thumbnail_path(
published_repres)
if thumbnail_path:
self.log.debug((
"Found thumbnail path for instance \"{}\"."
" Thumbnail path: {}"
).format(instance_label, thumbnail_path))
elif valid_context_thumbnail:
# Use context thumbnail path if is available
thumbnail_path = context_thumbnail_path
self.log.debug((
"Using context thumbnail path for instance \"{}\"."
" Thumbnail path: {}"
).format(instance_label, thumbnail_path))
# Skip instance if thumbnail path is not available for it
if not thumbnail_path:
self.log.info((
"Skipping thumbnail integration for instance \"{}\"."
" Instance and context"
" thumbnail paths are not available."
).format(instance_label))
continue
version_id = str(self._get_version_id(published_repres))
filtered_instances.append(
InstanceFilterResult(instance, thumbnail_path, version_id)
)
return filtered_instances
def _get_version_id(self, published_representations):
for repre_info in published_representations.values():
return repre_info["representation"]["parent"]
def _get_instance_thumbnail_path(self, published_representations):
thumb_repre_doc = None
for repre_info in published_representations.values():
repre_doc = repre_info["representation"]
if repre_doc["name"].lower() == "thumbnail":
thumb_repre_doc = repre_doc
break
if thumb_repre_doc is None:
self.log.debug(
"There is no representation with name \"thumbnail\""
)
return None
path = thumb_repre_doc["data"]["path"]
if not os.path.exists(path):
self.log.warning(
"Thumbnail file cannot be found. Path: {}".format(path)
)
return None
return os.path.normpath(path)
def _integrate_thumbnails(
self,
filtered_instance_items,
version_docs_by_str_id,
project_name
):
from openpype.client.server.operations import create_thumbnail
op_session = OperationsSession()
for instance_item in filtered_instance_items:
instance, thumbnail_path, version_id = instance_item
instance_label = self._get_instance_label(instance)
version_doc = version_docs_by_str_id.get(version_id)
if not version_doc:
self.log.warning((
"Version entity for instance \"{}\" was not found."
).format(instance_label))
continue
thumbnail_id = create_thumbnail(project_name, thumbnail_path)
# Set thumbnail id for version
op_session.update_entity(
project_name,
version_doc["type"],
version_doc["_id"],
{"data.thumbnail_id": thumbnail_id}
)
if version_doc["type"] == "hero_version":
version_name = "Hero"
else:
version_name = version_doc["name"]
self.log.debug("Setting thumbnail for version \"{}\" <{}>".format(
version_name, version_id
))
asset_entity = instance.data["assetEntity"]
op_session.update_entity(
project_name,
asset_entity["type"],
asset_entity["_id"],
{"data.thumbnail_id": thumbnail_id}
)
self.log.debug("Setting thumbnail for asset \"{}\" <{}>".format(
asset_entity["name"], version_id
))
op_session.commit()
def _get_instance_label(self, instance):
return (
instance.data.get("label")
or instance.data.get("name")
or "N/A"
)


@ -1,4 +1,5 @@
import os
from openpype import AYON_SERVER_ENABLED
from openpype.lib.openpype_version import is_running_staging
RESOURCES_DIR = os.path.dirname(os.path.abspath(__file__))
@ -40,11 +41,17 @@ def get_liberation_font_path(bold=False, italic=False):
def get_openpype_production_icon_filepath():
return get_resource("icons", "openpype_icon.png")
filename = "openpype_icon.png"
if AYON_SERVER_ENABLED:
filename = "AYON_icon.png"
return get_resource("icons", filename)
def get_openpype_staging_icon_filepath():
return get_resource("icons", "openpype_icon_staging.png")
filename = "openpype_icon_staging.png"
if AYON_SERVER_ENABLED:
filename = "AYON_icon.png"
return get_resource("icons", filename)
def get_openpype_icon_filepath(staging=None):
@ -60,7 +67,9 @@ def get_openpype_splash_filepath(staging=None):
if staging is None:
staging = is_running_staging()
if staging:
if AYON_SERVER_ENABLED:
splash_file_name = "AYON_splash.png"
elif staging:
splash_file_name = "openpype_splash_staging.png"
else:
splash_file_name = "openpype_splash.png"


@ -7,10 +7,14 @@ from abc import ABCMeta, abstractmethod
import six
import openpype.version
from openpype.client.mongo import OpenPypeMongoConnection
from openpype.client.entities import get_project_connection, get_project
from openpype.client.mongo import (
OpenPypeMongoConnection,
get_project_connection,
)
from openpype.client.entities import get_project
from openpype.lib.pype_info import get_workstation_info
from .constants import (
GLOBAL_SETTINGS_KEY,
SYSTEM_SETTINGS_KEY,


@ -4,6 +4,9 @@ import functools
import logging
import platform
import copy
from openpype import AYON_SERVER_ENABLED
from .exceptions import (
SaveWarningExc
)
@ -18,6 +21,11 @@ from .constants import (
DEFAULT_PROJECT_KEY
)
from .ayon_settings import (
get_ayon_project_settings,
get_ayon_system_settings
)
log = logging.getLogger(__name__)
# Py2 + Py3 json decode exception
@ -40,36 +48,17 @@ _SETTINGS_HANDLER = None
_LOCAL_SETTINGS_HANDLER = None
def require_handler(func):
@functools.wraps(func)
def wrapper(*args, **kwargs):
global _SETTINGS_HANDLER
if _SETTINGS_HANDLER is None:
_SETTINGS_HANDLER = create_settings_handler()
return func(*args, **kwargs)
return wrapper
def require_local_handler(func):
@functools.wraps(func)
def wrapper(*args, **kwargs):
global _LOCAL_SETTINGS_HANDLER
if _LOCAL_SETTINGS_HANDLER is None:
_LOCAL_SETTINGS_HANDLER = create_local_settings_handler()
return func(*args, **kwargs)
return wrapper
def create_settings_handler():
from .handlers import MongoSettingsHandler
# Handler can't be created in global space on initialization, only when
# needed. Logic deciding which handler is used may be added here later.
return MongoSettingsHandler()
def create_local_settings_handler():
from .handlers import MongoLocalSettingsHandler
return MongoLocalSettingsHandler()
def clear_metadata_from_settings(values):
"""Remove all metadata keys from loaded settings."""
if isinstance(values, dict):
for key in tuple(values.keys()):
if key in METADATA_KEYS:
values.pop(key)
else:
clear_metadata_from_settings(values[key])
elif isinstance(values, list):
for item in values:
clear_metadata_from_settings(item)
def calculate_changes(old_value, new_value):
@ -91,6 +80,42 @@ def calculate_changes(old_value, new_value):
return changes
def create_settings_handler():
if AYON_SERVER_ENABLED:
raise RuntimeError("Mongo settings handler was triggered in AYON mode")
from .handlers import MongoSettingsHandler
# Handler can't be created in global space on initialization, only when
# needed. Logic deciding which handler is used may be added here later.
return MongoSettingsHandler()
def create_local_settings_handler():
if AYON_SERVER_ENABLED:
raise RuntimeError("Mongo local settings handler was triggered in AYON mode")
from .handlers import MongoLocalSettingsHandler
return MongoLocalSettingsHandler()
def require_handler(func):
@functools.wraps(func)
def wrapper(*args, **kwargs):
global _SETTINGS_HANDLER
if _SETTINGS_HANDLER is None:
_SETTINGS_HANDLER = create_settings_handler()
return func(*args, **kwargs)
return wrapper
def require_local_handler(func):
@functools.wraps(func)
def wrapper(*args, **kwargs):
global _LOCAL_SETTINGS_HANDLER
if _LOCAL_SETTINGS_HANDLER is None:
_LOCAL_SETTINGS_HANDLER = create_local_settings_handler()
return func(*args, **kwargs)
return wrapper
@require_handler
def get_system_last_saved_info():
return _SETTINGS_HANDLER.get_system_last_saved_info()
@ -494,10 +519,17 @@ def save_local_settings(data):
@require_local_handler
def get_local_settings():
def _get_local_settings():
return _LOCAL_SETTINGS_HANDLER.get_local_settings()
def get_local_settings():
if not AYON_SERVER_ENABLED:
return _get_local_settings()
# TODO add AYON implementation of local settings
return {}
def load_openpype_default_settings():
"""Load openpype default settings."""
return load_jsons_from_dir(DEFAULTS_DIR)
@ -890,7 +922,7 @@ def apply_local_settings_on_project_settings(
sync_server_config["remote_site"] = remote_site
def get_system_settings(clear_metadata=True, exclude_locals=None):
def _get_system_settings(clear_metadata=True, exclude_locals=None):
"""System settings with applied studio overrides."""
default_values = get_default_settings()[SYSTEM_SETTINGS_KEY]
studio_values = get_studio_system_settings_overrides()
@ -992,7 +1024,7 @@ def get_anatomy_settings(
return result
def get_project_settings(
def _get_project_settings(
project_name, clear_metadata=True, exclude_locals=None
):
"""Project settings with applied studio and project overrides."""
@ -1043,7 +1075,7 @@ def get_current_project_settings():
@require_handler
def get_global_settings():
def _get_global_settings():
default_settings = load_openpype_default_settings()
default_values = default_settings["system_settings"]["general"]
studio_values = _SETTINGS_HANDLER.get_global_settings()
@ -1053,7 +1085,14 @@ def get_global_settings():
}
def get_general_environments():
def get_global_settings():
if not AYON_SERVER_ENABLED:
return _get_global_settings()
default_settings = load_openpype_default_settings()
return default_settings["system_settings"]["general"]
def _get_general_environments():
"""Get general environments.
Function is implemented to be able load general environments without using
@ -1082,14 +1121,24 @@ def get_general_environments():
return environments
def clear_metadata_from_settings(values):
"""Remove all metadata keys from loaded settings."""
if isinstance(values, dict):
for key in tuple(values.keys()):
if key in METADATA_KEYS:
values.pop(key)
else:
clear_metadata_from_settings(values[key])
elif isinstance(values, list):
for item in values:
clear_metadata_from_settings(item)
def get_general_environments():
if not AYON_SERVER_ENABLED:
return _get_general_environments()
value = get_system_settings()
return value["general"]["environment"]
def get_system_settings(*args, **kwargs):
if not AYON_SERVER_ENABLED:
return _get_system_settings(*args, **kwargs)
default_settings = get_default_settings()[SYSTEM_SETTINGS_KEY]
return get_ayon_system_settings(default_settings)
def get_project_settings(project_name, *args, **kwargs):
if not AYON_SERVER_ENABLED:
return _get_project_settings(project_name, *args, **kwargs)
default_settings = get_default_settings()[PROJECT_SETTINGS_KEY]
return get_ayon_project_settings(default_settings, project_name)


@ -1173,9 +1173,9 @@ class RepresentationModel(TreeModel, BaseRepresentationModel):
repre_groups_items[doc["name"]] = 0
group = group_item
progress = lib.get_progress_for_repre(
doc, self.active_site, self.remote_site
)
progress = self.sync_server.get_progress_for_repre(
doc,
self.active_site, self.remote_site)
active_site_icon = self._icons.get(self.active_provider)
remote_site_icon = self._icons.get(self.remote_provider)


@ -886,7 +886,9 @@ class ThumbnailWidget(QtWidgets.QLabel):
self.set_pixmap()
return
thumbnail_ent = get_thumbnail(project_name, thumbnail_id)
thumbnail_ent = get_thumbnail(
project_name, thumbnail_id, src_type, src_id
)
if not thumbnail_ent:
return


@ -28,55 +28,3 @@ def get_site_icons():
return icons
def get_progress_for_repre(repre_doc, active_site, remote_site):
"""
Calculates average progress for representation.
If site has created_dt >> fully available >> progress == 1
Could be calculated in aggregate if it would be too slow
Args:
repre_doc(dict): representation dict
Returns:
(dict) with active and remote sites progress
{'studio': 1.0, 'gdrive': -1} - gdrive site is not present
-1 is used to highlight the site should be added
{'studio': 1.0, 'gdrive': 0.0} - gdrive site is present, not
uploaded yet
"""
progress = {active_site: -1, remote_site: -1}
if not repre_doc:
return progress
files = {active_site: 0, remote_site: 0}
doc_files = repre_doc.get("files") or []
for doc_file in doc_files:
if not isinstance(doc_file, dict):
continue
sites = doc_file.get("sites") or []
for site in sites:
if (
# Pype 2 compatibility
not isinstance(site, dict)
# Check if site name is one of progress sites
or site["name"] not in progress
):
continue
files[site["name"]] += 1
norm_progress = max(progress[site["name"]], 0)
if site.get("created_dt"):
progress[site["name"]] = norm_progress + 1
elif site.get("progress"):
progress[site["name"]] = norm_progress + site["progress"]
else: # site exists, might be failed, do not add again
progress[site["name"]] = 0
# for example 13 fully avail. files out of 26 >> 13/26 = 0.5
avg_progress = {
active_site: progress[active_site] / max(files[active_site], 1),
remote_site: progress[remote_site] / max(files[remote_site], 1)
}
return avg_progress
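The averaging above can be checked with a toy representation document: one of two files fully available on a site gives 0.5. This is a simplified standalone re-implementation of the math for a single site, not the plugin code:

```python
def avg_progress_for_site(files, site_name):
    """Average availability of `files` on one site.

    A site entry with 'created_dt' counts as fully available (1),
    one with a 'progress' value counts partially, an empty entry as 0.
    """
    progress = -1  # -1 marks 'site not present on any file'
    count = 0
    for doc_file in files:
        for site in doc_file.get("sites", []):
            if site["name"] != site_name:
                continue
            count += 1
            norm = max(progress, 0)
            if site.get("created_dt"):
                progress = norm + 1
            elif site.get("progress"):
                progress = norm + site["progress"]
            else:
                progress = 0
    return progress / max(count, 1)


files = [
    {"sites": [{"name": "studio"}]},  # present on site, not uploaded yet
    {"sites": [{"name": "studio", "created_dt": "2023-01-01"}]},
]
print(avg_progress_for_site(files, "studio"))  # 0.5
```

A representation with no entry for the site at all keeps the sentinel `-1`, which the UI uses to highlight that the site should be added.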


@ -27,7 +27,6 @@ from openpype.modules import ModulesManager
from .lib import (
get_site_icons,
walk_hierarchy,
get_progress_for_repre
)
@ -80,7 +79,7 @@ class InventoryModel(TreeModel):
project_name, remote_site
)
# self.sync_server = sync_server
self.sync_server = sync_server
self.active_site = active_site
self.active_provider = active_provider
self.remote_site = remote_site
@ -445,7 +444,7 @@ class InventoryModel(TreeModel):
group_node["group"] = subset["data"].get("subsetGroup")
if self.sync_enabled:
progress = get_progress_for_repre(
progress = self.sync_server.get_progress_for_repre(
representation, self.active_site, self.remote_site
)
group_node["active_site"] = self.active_site


@ -23,7 +23,6 @@ from openpype.pipeline import (
)
from openpype.modules import ModulesManager
from openpype.tools.utils.lib import (
get_progress_for_repre,
iter_model_rows,
format_version
)
@ -361,7 +360,7 @@ class SceneInventoryView(QtWidgets.QTreeView):
if not repre_doc:
continue
progress = get_progress_for_repre(
progress = self.sync_server.get_progress_for_repre(
repre_doc,
active_site,
remote_site


@ -2,11 +2,14 @@ import os
import json
import collections
import ayon_api
from qtpy import QtCore, QtGui, QtWidgets
from openpype import style
from openpype import resources
from openpype import AYON_SERVER_ENABLED
from openpype.settings.lib import get_local_settings
from openpype.lib import get_openpype_execute_args
from openpype.lib.pype_info import (
get_all_current_info,
get_openpype_info,
@ -327,6 +330,7 @@ class PypeInfoSubWidget(QtWidgets.QWidget):
main_layout.addWidget(self._create_openpype_info_widget(), 0)
main_layout.addWidget(self._create_separator(), 0)
main_layout.addWidget(self._create_workstation_widget(), 0)
if not AYON_SERVER_ENABLED:
main_layout.addWidget(self._create_separator(), 0)
main_layout.addWidget(self._create_local_settings_widget(), 0)
main_layout.addWidget(self._create_separator(), 0)
@ -425,14 +429,41 @@ class PypeInfoSubWidget(QtWidgets.QWidget):
def _create_openpype_info_widget(self):
"""Create widget with information about OpenPype application."""
if AYON_SERVER_ENABLED:
executable_args = get_openpype_execute_args()
username = "N/A"
user_info = ayon_api.get_user()
if user_info:
username = user_info.get("name") or username
full_name = user_info.get("attrib", {}).get("fullName")
if full_name:
username = "{} ({})".format(full_name, username)
info_values = {
"executable": executable_args[-1],
"server_url": os.environ["AYON_SERVER_URL"],
"username": username
}
key_label_mapping = {
"executable": "AYON Executable:",
"server_url": "AYON Server:",
"username": "AYON Username:"
}
# Prepare keys order
keys_order = [
"server_url",
"username",
"executable",
]
else:
# Get pype info data
pype_info = get_openpype_info()
info_values = get_openpype_info()
# Modify version key/values
version_value = "{} ({})".format(
pype_info.pop("version", self.not_applicable),
pype_info.pop("version_type", self.not_applicable)
info_values.pop("version", self.not_applicable),
info_values.pop("version_type", self.not_applicable)
)
pype_info["version_value"] = version_value
info_values["version_value"] = version_value
# Prepare label mapping
key_label_mapping = {
"version_value": "Running version:",
@ -449,7 +480,8 @@ class PypeInfoSubWidget(QtWidgets.QWidget):
"pype_root",
"mongo_url"
]
for key in pype_info.keys():
for key in info_values.keys():
if key not in keys_order:
keys_order.append(key)
@ -466,9 +498,9 @@ class PypeInfoSubWidget(QtWidgets.QWidget):
info_layout.addWidget(title_label, 0, 0, 1, 2)
for key in keys_order:
if key not in pype_info:
if key not in info_values:
continue
value = pype_info[key]
value = info_values[key]
label = key_label_mapping.get(key, key)
row = info_layout.rowCount()
info_layout.addWidget(


@ -8,6 +8,7 @@ import platform
from qtpy import QtCore, QtGui, QtWidgets
import openpype.version
from openpype import AYON_SERVER_ENABLED
from openpype import resources, style
from openpype.lib import (
Logger,
@ -589,6 +590,11 @@ class TrayManager:
self.tray_widget.showMessage(*args, **kwargs)
def _add_version_item(self):
if AYON_SERVER_ENABLED:
login_action = QtWidgets.QAction("Login", self.tray_widget)
login_action.triggered.connect(self._on_ayon_login)
self.tray_widget.menu.addAction(login_action)
subversion = os.environ.get("OPENPYPE_SUBVERSION")
client_name = os.environ.get("OPENPYPE_CLIENT")
@ -614,6 +620,19 @@ class TrayManager:
self._restart_action = restart_action
def _on_ayon_login(self):
self.execute_in_main_thread(self._show_ayon_login)
def _show_ayon_login(self):
from ayon_common.connection.credentials import change_user_ui
result = change_user_ui()
if result.shutdown:
self.exit()
elif result.restart or result.token_changed:
self.restart()
def _on_restart_action(self):
self.restart(use_expected_version=True)


@ -752,61 +752,6 @@ def get_repre_icons():
return icons
def get_progress_for_repre(doc, active_site, remote_site):
"""
Calculates average progress for representation.
If site has created_dt >> fully available >> progress == 1
Could be calculated in aggregate if it would be too slow
Args:
doc(dict): representation dict
Returns:
(dict) with active and remote sites progress
{'studio': 1.0, 'gdrive': -1} - gdrive site is not present
-1 is used to highlight the site should be added
{'studio': 1.0, 'gdrive': 0.0} - gdrive site is present, not
uploaded yet
"""
progress = {active_site: -1,
remote_site: -1}
if not doc:
return progress
files = {active_site: 0, remote_site: 0}
doc_files = doc.get("files") or []
for doc_file in doc_files:
if not isinstance(doc_file, dict):
continue
sites = doc_file.get("sites") or []
for site in sites:
if (
# Pype 2 compatibility
not isinstance(site, dict)
# Check if site name is one of progress sites
or site["name"] not in progress
):
continue
files[site["name"]] += 1
norm_progress = max(progress[site["name"]], 0)
if site.get("created_dt"):
progress[site["name"]] = norm_progress + 1
elif site.get("progress"):
progress[site["name"]] = norm_progress + site["progress"]
else: # site exists, might be failed, do not add again
progress[site["name"]] = 0
# for example 13 fully avail. files out of 26 >> 13/26 = 0.5
avg_progress = {}
avg_progress[active_site] = \
progress[active_site] / max(files[active_site], 1)
avg_progress[remote_site] = \
progress[remote_site] / max(files[remote_site], 1)
return avg_progress
def is_sync_loader(loader):
return is_remove_site_loader(loader) or is_add_site_loader(loader)

poetry.lock generated

@ -302,6 +302,24 @@ files = [
pycodestyle = ">=2.10.0"
tomli = {version = "*", markers = "python_version < \"3.11\""}
[[package]]
name = "ayon-python-api"
version = "0.1.16"
description = "AYON Python API"
category = "main"
optional = false
python-versions = "*"
files = [
{file = "ayon-python-api-0.1.16.tar.gz", hash = "sha256:666110954dd75b2be1699a29b4732cfb0bcb09d01f64fba4449bfc8ac1fb43f1"},
{file = "ayon_python_api-0.1.16-py3-none-any.whl", hash = "sha256:bbcd6df1f80ddf32e653a1bb31289cb5fd1a8bea36ab4c8e6aef08c41b6393de"},
]
[package.dependencies]
appdirs = ">=1,<2"
requests = ">=2.27.1"
six = ">=1.15"
Unidecode = ">=1.2.0"
[[package]]
name = "babel"
version = "2.11.0"
@ -3371,14 +3389,14 @@ test = ["coverage", "pytest", "pytest-cov"]
[[package]]
name = "unidecode"
version = "1.3.6"
version = "1.2.0"
description = "ASCII transliterations of Unicode text"
category = "dev"
optional = true
python-versions = ">=3.5"
category = "main"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
files = [
{file = "Unidecode-1.3.6-py3-none-any.whl", hash = "sha256:547d7c479e4f377b430dd91ac1275d593308dce0fc464fb2ab7d41f82ec653be"},
{file = "Unidecode-1.3.6.tar.gz", hash = "sha256:fed09cf0be8cf415b391642c2a5addfc72194407caee4f98719e40ec2a72b830"},
{file = "Unidecode-1.2.0-py2.py3-none-any.whl", hash = "sha256:12435ef2fc4cdfd9cf1035a1db7e98b6b047fe591892e81f34e94959591fad00"},
{file = "Unidecode-1.2.0.tar.gz", hash = "sha256:8d73a97d387a956922344f6b74243c2c6771594659778744b2dbdaad8f6b727d"},
]
[[package]]
@ -3672,10 +3690,7 @@ files = [
docs = ["furo", "jaraco.packaging (>=9)", "jaraco.tidelift (>=1.4)", "rst.linker (>=1.9)", "sphinx (>=3.5)"]
testing = ["flake8 (<5)", "func-timeout", "jaraco.functools", "jaraco.itertools", "more-itertools", "pytest (>=6)", "pytest-black (>=0.3.7)", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=1.3)", "pytest-flake8", "pytest-mypy (>=0.9.1)"]
[extras]
docs = []
[metadata]
lock-version = "2.0"
python-versions = ">=3.9.1,<3.10"
content-hash = "47518c544a90cdb3e99e83533557515d0d47079ac4461708ce71ab3ce97b9987"
content-hash = "6bdb0572a9e255898497ad5ec4d7368d4e0850ce9f4d5c72a37394a2f8f7ec06"


@ -70,7 +70,9 @@ requests = "^2.25.1"
pysftp = "^0.2.9"
dropbox = "^11.20.0"
aiohttp-middlewares = "^2.0.0"
ayon-python-api = "^0.1"
opencolorio = "^2.2.0"
Unidecode = "^1.2"
[tool.poetry.dev-dependencies]
flake8 = "^6.0"


@ -126,6 +126,7 @@ bin_includes = [
include_files = [
"igniter",
"openpype",
"common",
"schema",
"LICENSE",
"README.md"
@ -158,11 +159,35 @@ bdist_mac_options = dict(
)
executables = [
Executable("start.py", base=base,
target_name="openpype_gui", icon=icon_path.as_posix()),
Executable("start.py", base=None,
target_name="openpype_console", icon=icon_path.as_posix())
Executable(
"start.py",
base=base,
target_name="openpype_gui",
icon=icon_path.as_posix()
),
Executable(
"start.py",
base=None,
target_name="openpype_console",
icon=icon_path.as_posix()
),
Executable(
"ayon_start.py",
base=base,
target_name="ayon",
icon=icon_path.as_posix()
),
]
if IS_WINDOWS:
executables.append(
Executable(
"ayon_start.py",
base=None,
target_name="ayon_console",
icon=icon_path.as_posix()
)
)
if IS_LINUX:
executables.append(
Executable(


@@ -133,6 +133,10 @@ else:
vendor_python_path = os.path.join(OPENPYPE_ROOT, "vendor", "python")
sys.path.insert(0, vendor_python_path)
# Add common package to sys path
# - 'common' contains shared code for bootstrapping and OpenPype processes
sys.path.insert(0, os.path.join(OPENPYPE_ROOT, "common"))
import blessed # noqa: E402
import certifi # noqa: E402


@@ -12,7 +12,7 @@ import requests
import re
from tests.lib.db_handler import DBHandler
from common.openpype_common.distribution.file_handler import RemoteFileHandler
from common.ayon_common.distribution.file_handler import RemoteFileHandler
from openpype.modules import ModulesManager
from openpype.settings import get_project_settings

tools/run_tray_ayon.ps1 Normal file

@@ -0,0 +1,41 @@
<#
.SYNOPSIS
Helper script to run the AYON Tray.
.DESCRIPTION
Runs ayon_start.py with the "tray" argument through the Poetry virtual environment.
.EXAMPLE
PS> .\run_tray_ayon.ps1
#>
$current_dir = Get-Location
$script_dir = Split-Path -Path $MyInvocation.MyCommand.Definition -Parent
$openpype_root = (Get-Item $script_dir).parent.FullName
# Install PSWriteColor to support colorized output to terminal
$env:PSModulePath = $env:PSModulePath + ";$($openpype_root)\tools\modules\powershell"
$env:_INSIDE_OPENPYPE_TOOL = "1"
# make sure Poetry is in PATH
if (-not (Test-Path 'env:POETRY_HOME')) {
$env:POETRY_HOME = "$openpype_root\.poetry"
}
$env:PATH = "$($env:PATH);$($env:POETRY_HOME)\bin"
Set-Location -Path $openpype_root
Write-Color -Text ">>> ", "Reading Poetry ... " -Color Green, Gray -NoNewline
if (-not (Test-Path -PathType Container -Path "$($env:POETRY_HOME)\bin")) {
Write-Color -Text "NOT FOUND" -Color Yellow
Write-Color -Text "*** ", "We need to install Poetry and create virtual env first ..." -Color Yellow, Gray
& "$openpype_root\tools\create_env.ps1"
} else {
Write-Color -Text "OK" -Color Green
}
& "$($env:POETRY_HOME)\bin\poetry" run python "$($openpype_root)\ayon_start.py" tray --debug
Set-Location -Path $current_dir
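The script above locates Poetry under `$POETRY_HOME` (defaulting to `.poetry` in the repository root) and invokes `poetry run python ayon_start.py tray --debug`. A cross-platform sketch of the same command construction in Python (a sketch only; paths assume the same `.poetry` layout the script uses):

```python
import os

def build_tray_command(openpype_root, poetry_home=None):
    """Build the argv that run_tray_ayon.ps1 executes:
    '<poetry> run python <root>/ayon_start.py tray --debug'.
    Does not run anything; callers can pass the result to subprocess.run
    with cwd=openpype_root."""
    poetry_home = poetry_home or os.path.join(openpype_root, ".poetry")
    poetry_bin = os.path.join(poetry_home, "bin", "poetry")
    return [
        poetry_bin, "run", "python",
        os.path.join(openpype_root, "ayon_start.py"),
        "tray", "--debug",
    ]
```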