Merge branch 'develop' into enhancement/OP-3555_Nuke-Render-and-Prerender-nodes-Process-Order

This commit is contained in:
Jakub Ježek 2023-07-31 16:34:19 +02:00 committed by GitHub
commit 5a5285337a
317 changed files with 18956 additions and 6764 deletions


@@ -35,6 +35,10 @@ body:
label: Version
description: What version are you running? Look to OpenPype Tray
options:
- 3.16.3-nightly.2
- 3.16.3-nightly.1
- 3.16.2
- 3.16.2-nightly.2
- 3.16.2-nightly.1
- 3.16.1
- 3.16.0
@@ -131,10 +135,6 @@ body:
- 3.14.7-nightly.3
- 3.14.7-nightly.2
- 3.14.7-nightly.1
- 3.14.6
- 3.14.6-nightly.3
- 3.14.6-nightly.2
- 3.14.6-nightly.1
validations:
required: true
- type: dropdown

.gitignore

@@ -37,7 +37,7 @@ Temporary Items
###########
/build
/dist/
/server_addon/package/*
/server_addon/packages/*
/vendor/bin/*
/vendor/python/*


@@ -1,6 +1,186 @@
# Changelog
## [3.16.2](https://github.com/ynput/OpenPype/tree/3.16.2)
[Full Changelog](https://github.com/ynput/OpenPype/compare/3.16.1...3.16.2)
### **🆕 New features**
<details>
<summary>Fusion - Set selected tool to active <a href="https://github.com/ynput/OpenPype/pull/5327">#5327</a></summary>
When you run the action to select a node, this PR makes the node flow show the selected node and displays the node's controls in the inspector.
___
</details>
### **🚀 Enhancements**
<details>
<summary>Maya: All base create plugins <a href="https://github.com/ynput/OpenPype/pull/5326">#5326</a></summary>
Prepared base classes for each creator type in Maya. Extended `MayaCreatorBase` with default implementations of the instance-handling logic that is shared by every plugin type (a rough sketch of the pattern follows below).
___
</details>
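A purely illustrative sketch of the shared-base idea described above; the class layout and method names here are hypothetical, not OpenPype's actual creator API:

```python
# Hypothetical sketch only -- not OpenPype's real creator API.
class MayaCreatorBase:
    """Default implementations of instance bookkeeping shared by creator types."""

    identifier = None

    def __init__(self, scene_data=None):
        # 'scene_data' stands in for instance data read from the Maya scene.
        self._scene_data = scene_data or {}

    def collect_instances(self):
        # Shared logic: return only data belonging to this creator.
        return [
            data for data in self._scene_data.values()
            if data.get("creator_identifier") == self.identifier
        ]

    def update_instances(self, update_list):
        # Shared logic: merge changed values back into the stored data.
        for instance_id, changes in update_list:
            self._scene_data.setdefault(instance_id, {}).update(changes)


class RenderCreator(MayaCreatorBase):
    identifier = "render"
```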
<details>
<summary>Windows: Support long paths on zip updates. <a href="https://github.com/ynput/OpenPype/pull/5265">#5265</a></summary>
Support long paths when extracting version zips on Windows. The use case is long paths inside, for example, an addon: you can install to the C drive, but because the zip files are extracted in the local user's folder, additional subdirectories are added to the paths, which quickly become too long for Windows to handle the zip updates (see the sketch of the long-path technique below).
___
</details>
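The PR's exact implementation is not shown in this diff; a minimal sketch of the usual Windows long-path technique (prefixing absolute paths with `\\?\`) looks like this:

```python
import os
import platform


def to_long_path(path: str) -> str:
    """Prefix an absolute Windows path so Win32 APIs accept paths over 260 chars."""
    if platform.system().lower() != "windows":
        return path
    path = os.path.abspath(path)
    if path.startswith("\\\\?\\"):
        return path
    if path.startswith("\\\\"):
        # UNC share: \\server\share\... -> \\?\UNC\server\share\...
        return "\\\\?\\UNC\\" + path.lstrip("\\")
    return "\\\\?\\" + path
```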
<details>
<summary>Blender: Added setting to set resolution and start/end frames at startup <a href="https://github.com/ynput/OpenPype/pull/5338">#5338</a></summary>
This PR adds the `set_resolution_startup` and `set_frames_startup` settings. They automatically set the resolution and the start/end frames plus FPS, respectively, in Blender when opening a file or creating a new one (a minimal sketch follows below).
___
</details>
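A minimal sketch of applying such settings when a file is opened in Blender; the resolution, frame range and FPS values below are placeholders for the project settings the PR actually reads:

```python
import bpy
from bpy.app.handlers import persistent

# Placeholder values standing in for the OpenPype project settings.
RESOLUTION = (1920, 1080)
FRAME_RANGE = (1001, 1100)
FPS = 25


@persistent
def _apply_context_settings(_dummy=None):
    scene = bpy.context.scene
    scene.render.resolution_x, scene.render.resolution_y = RESOLUTION
    scene.frame_start, scene.frame_end = FRAME_RANGE
    scene.render.fps = FPS


# Apply on file open, mirroring the "at startup" behavior of the new settings.
bpy.app.handlers.load_post.append(_apply_context_settings)
```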
<details>
<summary>Blender: Support for ExtractBurnin <a href="https://github.com/ynput/OpenPype/pull/5339">#5339</a></summary>
This PR adds support for ExtractBurnin for Blender, when publishing a Review.
___
</details>
<details>
<summary>Blender: Extract Camera as Alembic <a href="https://github.com/ynput/OpenPype/pull/5343">#5343</a></summary>
Added support to extract Alembic Cameras in Blender.
___
</details>
### **🐛 Bug fixes**
<details>
<summary>Maya: Validate Instance In Context <a href="https://github.com/ynput/OpenPype/pull/5335">#5335</a></summary>
Added the missing new publisher error so that the repair action shows up.
___
</details>
<details>
<summary>Settings: Fix default settings <a href="https://github.com/ynput/OpenPype/pull/5311">#5311</a></summary>
Fixed default settings for Shotgrid. Renamed `FarmRootEnumEntity` to `DynamicEnumEntity` and removed the doubled ABC metaclass definition (all settings entities already have an abstract metaclass).
___
</details>
<details>
<summary>Deadline: missing context argument <a href="https://github.com/ynput/OpenPype/pull/5312">#5312</a></summary>
Updated the function arguments to include the missing context.
___
</details>
<details>
<summary>Qt UI: Multiselection combobox PySide6 compatibility <a href="https://github.com/ynput/OpenPype/pull/5314">#5314</a></summary>
- The check states are replaced with their values for PySide6 compatibility (see the sketch below)
- `QtCore.Qt.ItemIsUserTristate` is used instead of `QtCore.Qt.ItemIsTristate` to avoid crashes on PySide6
___
</details>
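A minimal sketch of the compatibility approach, assuming `qtpy` as the binding wrapper used by the OpenPype Qt tools; the helper names are illustrative:

```python
from qtpy import QtCore


def checkstate_value(state):
    # PySide2 exposes check states as ints, PySide6 as Qt.CheckState enum
    # members, so compare by raw value.
    return getattr(state, "value", state)


def is_checked(state):
    return checkstate_value(state) == checkstate_value(QtCore.Qt.Checked)


# 'ItemIsUserTristate' replaces the deprecated 'ItemIsTristate' flag,
# which crashes on PySide6.
TRISTATE_FLAG = QtCore.Qt.ItemIsUserTristate
```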
<details>
<summary>Docker: handle openssl 1.1.1 for centos 7 docker build <a href="https://github.com/ynput/OpenPype/pull/5319">#5319</a></summary>
The move to Python 3.9 added the need for OpenSSL 1.1.x, which is not available by default on the CentOS 7 image. This fixes it.
___
</details>
<details>
<summary>houdini: fix typo in redshift proxy <a href="https://github.com/ynput/OpenPype/pull/5320">#5320</a></summary>
I believe there's a typo (an extra backtick) in the filename in `create_redshift_proxy.py`, and I made this PR to suggest a fix.
___
</details>
<details>
<summary>Houdini: fix wrong creator identifier in pointCache workflow <a href="https://github.com/ynput/OpenPype/pull/5324">#5324</a></summary>
Fixing a bug in publishing Alembics, where an invalid creator identifier caused a missing family association.
___
</details>
<details>
<summary>Fix colorspace compatibility check <a href="https://github.com/ynput/OpenPype/pull/5334">#5334</a></summary>
For some reason a user may have `PyOpenColorIO` installed on their machine; in my case it came with RenderMan. This can trick the compatibility check, because `import PyOpenColorIO` won't raise an error even though it may be an old version, as in my case. Before: the compatibility check passed and the wrapper was used directly. After the fix: the wrapper is used via a subprocess instead (see the sketch below).
___
</details>
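The wrapper itself is not part of this diff; a hedged sketch of the general idea (probing the module in a clean subprocess instead of trusting a direct import, with the major-version gate being an assumption) could look like this:

```python
import subprocess
import sys


def is_ocio_wrapper_usable() -> bool:
    """Return True only if PyOpenColorIO imports in a subprocess and reports
    a new enough major version (assumed to be 2+ here)."""
    probe = (
        "import sys;"
        "import PyOpenColorIO as ocio;"
        "major = int(getattr(ocio, '__version__', '1').split('.')[0]);"
        "sys.exit(0 if major >= 2 else 1)"
    )
    result = subprocess.run([sys.executable, "-c", probe])
    return result.returncode == 0
```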
### **Merged pull requests**
<details>
<summary>Remove forgotten dev logging <a href="https://github.com/ynput/OpenPype/pull/5315">#5315</a></summary>
___
</details>
## [3.16.1](https://github.com/ynput/OpenPype/tree/3.16.1)


@@ -1,483 +0,0 @@
# -*- coding: utf-8 -*-
"""Main entry point for AYON command.
Bootstrapping process of AYON.
"""
import os
import sys
import site
import traceback
import contextlib
# Set logging level when "--verbose" is passed
if "--verbose" in sys.argv:
expected_values = (
"Expected: notset, debug, info, warning, error, critical"
" or integer [0-50]."
)
idx = sys.argv.index("--verbose")
sys.argv.pop(idx)
if idx < len(sys.argv):
value = sys.argv.pop(idx)
else:
raise RuntimeError((
f"Expect value after \"--verbose\" argument. {expected_values}"
))
log_level = None
low_value = value.lower()
if low_value.isdigit():
log_level = int(low_value)
elif low_value == "notset":
log_level = 0
elif low_value == "debug":
log_level = 10
elif low_value == "info":
log_level = 20
elif low_value == "warning":
log_level = 30
elif low_value == "error":
log_level = 40
elif low_value == "critical":
log_level = 50
if log_level is None:
raise ValueError((
"Unexpected value after \"--verbose\" "
f"argument \"{value}\". {expected_values}"
))
os.environ["OPENPYPE_LOG_LEVEL"] = str(log_level)
os.environ["AYON_LOG_LEVEL"] = str(log_level)
# Enable debug mode, may affect log level if log level is not defined
if "--debug" in sys.argv:
sys.argv.remove("--debug")
os.environ["AYON_DEBUG"] = "1"
os.environ["OPENPYPE_DEBUG"] = "1"
if "--automatic-tests" in sys.argv:
sys.argv.remove("--automatic-tests")
os.environ["IS_TEST"] = "1"
SKIP_HEADERS = False
if "--skip-headers" in sys.argv:
sys.argv.remove("--skip-headers")
SKIP_HEADERS = True
SKIP_BOOTSTRAP = False
if "--skip-bootstrap" in sys.argv:
sys.argv.remove("--skip-bootstrap")
SKIP_BOOTSTRAP = True
if "--use-staging" in sys.argv:
sys.argv.remove("--use-staging")
os.environ["AYON_USE_STAGING"] = "1"
os.environ["OPENPYPE_USE_STAGING"] = "1"
if "--headless" in sys.argv:
os.environ["AYON_HEADLESS_MODE"] = "1"
os.environ["OPENPYPE_HEADLESS_MODE"] = "1"
sys.argv.remove("--headless")
elif (
os.getenv("AYON_HEADLESS_MODE") != "1"
or os.getenv("OPENPYPE_HEADLESS_MODE") != "1"
):
os.environ.pop("AYON_HEADLESS_MODE", None)
os.environ.pop("OPENPYPE_HEADLESS_MODE", None)
elif (
os.getenv("AYON_HEADLESS_MODE")
!= os.getenv("OPENPYPE_HEADLESS_MODE")
):
os.environ["OPENPYPE_HEADLESS_MODE"] = (
os.environ["AYON_HEADLESS_MODE"]
)
IS_BUILT_APPLICATION = getattr(sys, "frozen", False)
HEADLESS_MODE_ENABLED = os.getenv("AYON_HEADLESS_MODE") == "1"
_pythonpath = os.getenv("PYTHONPATH", "")
_python_paths = _pythonpath.split(os.pathsep)
if not IS_BUILT_APPLICATION:
# Code root defined by `start.py` directory
AYON_ROOT = os.path.dirname(os.path.abspath(__file__))
_dependencies_path = site.getsitepackages()[-1]
else:
AYON_ROOT = os.path.dirname(sys.executable)
# add dependencies folder to sys.path for frozen code
_dependencies_path = os.path.normpath(
os.path.join(AYON_ROOT, "dependencies")
)
# add stuff from `<frozen>/dependencies` to PYTHONPATH.
sys.path.append(_dependencies_path)
_python_paths.append(_dependencies_path)
# Vendored python modules that must not be in PYTHONPATH environment but
# are required for OpenPype processes
sys.path.insert(0, os.path.join(AYON_ROOT, "vendor", "python"))
# Add common package to sys path
# - common contains common code for bootstrapping and OpenPype processes
sys.path.insert(0, os.path.join(AYON_ROOT, "common"))
# This is content of 'core' addon which is ATM part of build
common_python_vendor = os.path.join(
AYON_ROOT,
"openpype",
"vendor",
"python",
"common"
)
# Add tools dir to sys path for pyblish UI discovery
tools_dir = os.path.join(AYON_ROOT, "openpype", "tools")
for path in (AYON_ROOT, common_python_vendor, tools_dir):
while path in _python_paths:
_python_paths.remove(path)
while path in sys.path:
sys.path.remove(path)
_python_paths.insert(0, path)
sys.path.insert(0, path)
os.environ["PYTHONPATH"] = os.pathsep.join(_python_paths)
# enabled AYON state
os.environ["USE_AYON_SERVER"] = "1"
# Set this to point either to `python` from venv in case of live code
# or to `ayon` or `ayon_console` in case of frozen code
os.environ["AYON_EXECUTABLE"] = sys.executable
os.environ["OPENPYPE_EXECUTABLE"] = sys.executable
os.environ["AYON_ROOT"] = AYON_ROOT
os.environ["OPENPYPE_ROOT"] = AYON_ROOT
os.environ["OPENPYPE_REPOS_ROOT"] = AYON_ROOT
os.environ["AYON_MENU_LABEL"] = "AYON"
os.environ["AVALON_LABEL"] = "AYON"
# Set name of pyblish UI import
os.environ["PYBLISH_GUI"] = "pyblish_pype"
# Set builtin OCIO root
os.environ["BUILTIN_OCIO_ROOT"] = os.path.join(
AYON_ROOT,
"vendor",
"bin",
"ocioconfig",
"OpenColorIOConfigs"
)
import blessed # noqa: E402
import certifi # noqa: E402
if sys.__stdout__:
term = blessed.Terminal()
def _print(message: str):
if message.startswith("!!! "):
print(f'{term.orangered2("!!! ")}{message[4:]}')
elif message.startswith(">>> "):
print(f'{term.aquamarine3(">>> ")}{message[4:]}')
elif message.startswith("--- "):
print(f'{term.darkolivegreen3("--- ")}{message[4:]}')
elif message.startswith("*** "):
print(f'{term.gold("*** ")}{message[4:]}')
elif message.startswith(" - "):
print(f'{term.wheat(" - ")}{message[4:]}')
elif message.startswith(" . "):
print(f'{term.tan(" . ")}{message[4:]}')
elif message.startswith(" - "):
print(f'{term.seagreen3(" - ")}{message[7:]}')
elif message.startswith(" ! "):
print(f'{term.goldenrod(" ! ")}{message[7:]}')
elif message.startswith(" * "):
print(f'{term.aquamarine1(" * ")}{message[7:]}')
elif message.startswith(" "):
print(f'{term.darkseagreen3(" ")}{message[4:]}')
else:
print(message)
else:
def _print(message: str):
print(message)
# if SSL_CERT_FILE is not set prior to OpenPype launch, we set it to point
# to certifi bundle to make sure we have reasonably new CA certificates.
if not os.getenv("SSL_CERT_FILE"):
os.environ["SSL_CERT_FILE"] = certifi.where()
elif os.getenv("SSL_CERT_FILE") != certifi.where():
_print("--- your system is set to use custom CA certificate bundle.")
from ayon_api import get_base_url
from ayon_api.constants import SERVER_URL_ENV_KEY, SERVER_API_ENV_KEY
from ayon_common import is_staging_enabled
from ayon_common.connection.credentials import (
ask_to_login_ui,
add_server,
need_server_or_login,
load_environments,
set_environments,
create_global_connection,
confirm_server_login,
)
from ayon_common.distribution import (
AyonDistribution,
BundleNotFoundError,
show_missing_bundle_information,
)
def set_global_environments() -> None:
"""Set global OpenPype's environments."""
import acre
from openpype.settings import get_general_environments
general_env = get_general_environments()
# first resolve general environment because merge doesn't expect
# values to be list.
# TODO: switch to OpenPype environment functions
merged_env = acre.merge(
acre.compute(acre.parse(general_env), cleanup=False),
dict(os.environ)
)
env = acre.compute(
merged_env,
cleanup=False
)
os.environ.clear()
os.environ.update(env)
# Hardcoded default values
os.environ["PYBLISH_GUI"] = "pyblish_pype"
# Change scale factor only if is not set
if "QT_AUTO_SCREEN_SCALE_FACTOR" not in os.environ:
os.environ["QT_AUTO_SCREEN_SCALE_FACTOR"] = "1"
def set_addons_environments():
"""Set global environments for OpenPype modules.
This requires OpenPype to be in `sys.path`.
"""
import acre
from openpype.modules import ModulesManager
modules_manager = ModulesManager()
# Merge environments with current environments and update values
if module_envs := modules_manager.collect_global_environments():
parsed_envs = acre.parse(module_envs)
env = acre.merge(parsed_envs, dict(os.environ))
os.environ.clear()
os.environ.update(env)
def _connect_to_ayon_server():
load_environments()
if not need_server_or_login():
create_global_connection()
return
if HEADLESS_MODE_ENABLED:
_print("!!! Cannot open v4 Login dialog in headless mode.")
_print((
"!!! Please use `{}` to specify server address"
" and '{}' to specify user's token."
).format(SERVER_URL_ENV_KEY, SERVER_API_ENV_KEY))
sys.exit(1)
current_url = os.environ.get(SERVER_URL_ENV_KEY)
url, token, username = ask_to_login_ui(current_url, always_on_top=True)
if url is not None and token is not None:
confirm_server_login(url, token, username)
return
if url is not None:
add_server(url, username)
_print("!!! Login was not successful.")
sys.exit(0)
def _check_and_update_from_ayon_server():
"""Gets addon info from v4, compares with local folder and updates it.
Raises:
RuntimeError
"""
distribution = AyonDistribution()
bundle = None
bundle_name = None
try:
bundle = distribution.bundle_to_use
if bundle is not None:
bundle_name = bundle.name
except BundleNotFoundError as exc:
bundle_name = exc.bundle_name
if bundle is None:
url = get_base_url()
if not HEADLESS_MODE_ENABLED:
show_missing_bundle_information(url, bundle_name)
elif bundle_name:
_print((
f"!!! Requested release bundle '{bundle_name}'"
" is not available on server."
))
_print(
"!!! Check if selected release bundle"
f" is available on the server '{url}'."
)
else:
mode = "staging" if is_staging_enabled() else "production"
_print(
f"!!! No release bundle is set as {mode} on the AYON server."
)
_print(
"!!! Make sure there is a release bundle set"
f" as \"{mode}\" on the AYON server '{url}'."
)
sys.exit(1)
distribution.distribute()
distribution.validate_distribution()
os.environ["AYON_BUNDLE_NAME"] = bundle_name
python_paths = [
path
for path in os.getenv("PYTHONPATH", "").split(os.pathsep)
if path
]
for path in distribution.get_sys_paths():
sys.path.insert(0, path)
if path not in python_paths:
python_paths.append(path)
os.environ["PYTHONPATH"] = os.pathsep.join(python_paths)
def boot():
"""Bootstrap OpenPype."""
from openpype.version import __version__
# TODO load version
os.environ["OPENPYPE_VERSION"] = __version__
os.environ["AYON_VERSION"] = __version__
_connect_to_ayon_server()
_check_and_update_from_ayon_server()
# delete OpenPype module and its submodules from cache so they are used from
# the specific version
modules_to_del = [
sys.modules.pop(module_name)
for module_name in tuple(sys.modules)
if module_name == "openpype" or module_name.startswith("openpype.")
]
for module_name in modules_to_del:
with contextlib.suppress(AttributeError, KeyError):
del sys.modules[module_name]
def main_cli():
from openpype import cli
from openpype.version import __version__
from openpype.lib import terminal as t
_print(">>> loading environments ...")
_print(" - global AYON ...")
set_global_environments()
_print(" - for addons ...")
set_addons_environments()
# print info when not running scripts defined in 'silent commands'
if not SKIP_HEADERS:
info = get_info(is_staging_enabled())
info.insert(0, f">>> Using AYON from [ {AYON_ROOT} ]")
t_width = 20
with contextlib.suppress(ValueError, OSError):
t_width = os.get_terminal_size().columns - 2
_header = f"*** AYON [{__version__}] "
info.insert(0, _header + "-" * (t_width - len(_header)))
for i in info:
t.echo(i)
try:
cli.main(obj={}, prog_name="ayon")
except Exception: # noqa
exc_info = sys.exc_info()
_print("!!! AYON crashed:")
traceback.print_exception(*exc_info)
sys.exit(1)
def script_cli():
"""Run and execute script."""
filepath = os.path.abspath(sys.argv[1])
# Find '__main__.py' in directory
if os.path.isdir(filepath):
new_filepath = os.path.join(filepath, "__main__.py")
if not os.path.exists(new_filepath):
raise RuntimeError(
f"can't find '__main__' module in '{filepath}'")
filepath = new_filepath
# Add parent dir to sys path
sys.path.insert(0, os.path.dirname(filepath))
# Read content and execute
with open(filepath, "r") as stream:
content = stream.read()
exec(compile(content, filepath, "exec"), globals())
def get_info(use_staging=None) -> list:
"""Print additional information to console."""
inf = []
if use_staging:
inf.append(("AYON variant", "staging"))
else:
inf.append(("AYON variant", "production"))
inf.append(("AYON bundle", os.getenv("AYON_BUNDLE_NAME")))
# NOTE add addons information
maximum = max(len(i[0]) for i in inf)
formatted = []
for info in inf:
padding = (maximum - len(info[0])) + 1
formatted.append(f'... {info[0]}:{" " * padding}[ {info[1]} ]')
return formatted
def main():
if not SKIP_BOOTSTRAP:
boot()
args = list(sys.argv)
args.pop(0)
if args and os.path.exists(args[0]):
script_cli()
else:
main_cli()
if __name__ == "__main__":
main()


@@ -1,16 +0,0 @@
from .utils import (
IS_BUILT_APPLICATION,
is_staging_enabled,
get_local_site_id,
get_ayon_appdirs,
get_ayon_launch_args,
)
__all__ = (
"IS_BUILT_APPLICATION",
"is_staging_enabled",
"get_local_site_id",
"get_ayon_appdirs",
"get_ayon_launch_args",
)


@@ -1,511 +0,0 @@
"""Handle credentials and connection to server for client application.
Cache and store used server urls. Store/load API keys to/from keyring if
needed. Store metadata about used urls, usernames for the urls and when the
connection with the username was established.
On bootstrap a global connection is created with information about site and
client version. The connection object lives in 'ayon_api'.
"""
import os
import json
import platform
import datetime
import contextlib
import subprocess
import tempfile
from typing import Optional, Union, Any
import ayon_api
from ayon_api.constants import SERVER_URL_ENV_KEY, SERVER_API_ENV_KEY
from ayon_api.exceptions import UrlError
from ayon_api.utils import (
validate_url,
is_token_valid,
logout_from_server,
)
from ayon_common.utils import (
get_ayon_appdirs,
get_local_site_id,
get_ayon_launch_args,
is_staging_enabled,
)
class ChangeUserResult:
def __init__(
self, logged_out, old_url, old_token, old_username,
new_url, new_token, new_username
):
shutdown = logged_out
restart = new_url is not None and new_url != old_url
token_changed = new_token is not None and new_token != old_token
self.logged_out = logged_out
self.old_url = old_url
self.old_token = old_token
self.old_username = old_username
self.new_url = new_url
self.new_token = new_token
self.new_username = new_username
self.shutdown = shutdown
self.restart = restart
self.token_changed = token_changed
def _get_servers_path():
return get_ayon_appdirs("used_servers.json")
def get_servers_info_data():
"""Metadata about used server on this machine.
Store data about all used server urls, last used url and user username for
the url. Using this metadata we can remember which username was used per
url if token stored in keyring loose lifetime.
Returns:
dict[str, Any]: Information about servers.
"""
data = {}
servers_info_path = _get_servers_path()
if not os.path.exists(servers_info_path):
dirpath = os.path.dirname(servers_info_path)
if not os.path.exists(dirpath):
os.makedirs(dirpath)
return data
with open(servers_info_path, "r") as stream:
with contextlib.suppress(BaseException):
data = json.load(stream)
return data
def add_server(url: str, username: str):
"""Add server to server info metadata.
This function will also mark the url as the last used url on the machine so
it will be used on the next launch.
Args:
url (str): Server url.
username (str): Name of user used to log in.
"""
servers_info_path = _get_servers_path()
data = get_servers_info_data()
data["last_server"] = url
if "urls" not in data:
data["urls"] = {}
data["urls"][url] = {
"updated_dt": datetime.datetime.now().strftime("%Y/%m/%d %H:%M:%S"),
"username": username,
}
with open(servers_info_path, "w") as stream:
json.dump(data, stream)
def remove_server(url: str):
"""Remove server url from servers information.
This should be used on logout to completely remove information about the
server from the machine.
Args:
url (str): Server url.
"""
if not url:
return
servers_info_path = _get_servers_path()
data = get_servers_info_data()
if data.get("last_server") == url:
data["last_server"] = None
if "urls" in data:
data["urls"].pop(url, None)
with open(servers_info_path, "w") as stream:
json.dump(data, stream)
def get_last_server(
data: Optional[dict[str, Any]] = None
) -> Union[str, None]:
"""Last server used to log in on this machine.
Args:
data (Optional[dict[str, Any]]): Prepared server information data.
Returns:
Union[str, None]: Last used server url.
"""
if data is None:
data = get_servers_info_data()
return data.get("last_server")
def get_last_username_by_url(
url: str,
data: Optional[dict[str, Any]] = None
) -> Union[str, None]:
"""Get last username which was used for passed url.
Args:
url (str): Server url.
data (Optional[dict[str, Any]]): Servers info.
Returns:
Union[str, None]: Username.
"""
if not url:
return None
if data is None:
data = get_servers_info_data()
if urls := data.get("urls"):
if url_info := urls.get(url):
return url_info.get("username")
return None
def get_last_server_with_username():
"""Receive last server and username used in last connection.
Returns:
tuple[Union[str, None], Union[str, None]]: Url and username.
"""
data = get_servers_info_data()
url = get_last_server(data)
username = get_last_username_by_url(url)
return url, username
class TokenKeyring:
# Keyring entry is stored under a fake, hardcoded username
username_key = "username"
def __init__(self, url):
try:
import keyring
except Exception as exc:
raise NotImplementedError(
"Python module `keyring` is not available."
) from exc
# hack for cx_freeze and Windows keyring backend
if platform.system().lower() == "windows":
from keyring.backends import Windows
keyring.set_keyring(Windows.WinVaultKeyring())
self._url = url
self._keyring_key = f"AYON/{url}"
def get_value(self):
import keyring
return keyring.get_password(self._keyring_key, self.username_key)
def set_value(self, value):
import keyring
if value is not None:
keyring.set_password(self._keyring_key, self.username_key, value)
return
with contextlib.suppress(keyring.errors.PasswordDeleteError):
keyring.delete_password(self._keyring_key, self.username_key)
def load_token(url: str) -> Union[str, None]:
"""Get token for url from keyring.
Args:
url (str): Server url.
Returns:
Union[str, None]: Token for passed url available in keyring.
"""
return TokenKeyring(url).get_value()
def store_token(url: str, token: str):
"""Store token by url to keyring.
Args:
url (str): Server url.
token (str): User token to server.
"""
TokenKeyring(url).set_value(token)
def ask_to_login_ui(
url: Optional[str] = None,
always_on_top: Optional[bool] = False
) -> tuple[str, str, str]:
"""Ask user to login using UI.
This should be used only when user is not yet logged in at all or available
credentials are invalid. To change credentials use 'change_user_ui'
function.
Use a subprocess to show UI.
Args:
url (Optional[str]): Server url that could be prefilled in UI.
always_on_top (Optional[bool]): Window will be drawn on top of
other windows.
Returns:
tuple[str, str, str]: Url, user's token and username.
"""
current_dir = os.path.dirname(os.path.abspath(__file__))
ui_dir = os.path.join(current_dir, "ui")
if url is None:
url = get_last_server()
username = get_last_username_by_url(url)
data = {
"url": url,
"username": username,
"always_on_top": always_on_top,
}
with tempfile.NamedTemporaryFile(
mode="w", prefix="ayon_login", suffix=".json", delete=False
) as tmp:
output = tmp.name
json.dump(data, tmp)
code = subprocess.call(
get_ayon_launch_args(ui_dir, "--skip-bootstrap", output))
if code != 0:
raise RuntimeError("Failed to show login UI")
with open(output, "r") as stream:
data = json.load(stream)
os.remove(output)
return data["output"]
def change_user_ui() -> ChangeUserResult:
"""Change user using UI.
Show UI where the user can change credentials or url. Output will contain
all information about old/new values of url, username and api key, and
whether the user confirmed or declined the values.
Returns:
ChangeUserResult: Information about user change.
"""
from .ui import change_user
url, username = get_last_server_with_username()
token = load_token(url)
result = change_user(url, username, token)
new_url, new_token, new_username, logged_out = result
output = ChangeUserResult(
logged_out, url, token, username,
new_url, new_token, new_username
)
if output.logged_out:
logout(url, token)
elif output.token_changed:
change_token(
output.new_url,
output.new_token,
output.new_username,
output.old_url
)
return output
def change_token(
url: str,
token: str,
username: Optional[str] = None,
old_url: Optional[str] = None
):
"""Change url and token in currently running session.
Function can also change the server url; in that case the previous
credentials are NOT removed from cache.
Args:
url (str): Url to server.
token (str): New token to be used for url connection.
username (Optional[str]): Username of logged user.
old_url (Optional[str]): Previous url. Value from 'get_last_server'
is used if not entered.
"""
if old_url is None:
old_url = get_last_server()
if old_url and old_url == url:
remove_url_cache(old_url)
# TODO check if ayon_api is already connected
add_server(url, username)
store_token(url, token)
ayon_api.change_token(url, token)
def remove_url_cache(url: str):
"""Clear cache for server url.
Args:
url (str): Server url which is removed from cache.
"""
store_token(url, None)
def remove_token_cache(url: str, token: str):
"""Remove token from local cache of url.
Is skipped if cached token under the passed url is not the same
as passed token.
Args:
url (str): Url to server.
token (str): Token to be removed from url cache.
"""
if load_token(url) == token:
remove_url_cache(url)
def logout(url: str, token: str):
"""Logout from server and throw token away.
Args:
url (str): Url from which should be logged out.
token (str): Token which should be used to log out.
"""
remove_server(url)
ayon_api.close_connection()
ayon_api.set_environments(None, None)
remove_token_cache(url, token)
logout_from_server(url, token)
def load_environments():
"""Load environments on startup.
Handle environments needed for connection with server. Environments are
'AYON_SERVER_URL' and 'AYON_API_KEY'.
Server is looked up from the environment. An already set environment is not
changed. If the environment is not filled then the last server stored in
appdirs is used.
Token is skipped if the url is not available. Otherwise, it is also checked
from the environment and, if not available there, 'load_token' is used to try
to get the token based on the server url.
"""
server_url = os.environ.get(SERVER_URL_ENV_KEY)
if not server_url:
server_url = get_last_server()
if not server_url:
return
os.environ[SERVER_URL_ENV_KEY] = server_url
if not os.environ.get(SERVER_API_ENV_KEY):
if token := load_token(server_url):
os.environ[SERVER_API_ENV_KEY] = token
def set_environments(url: str, token: str):
"""Change url and token environemnts in currently running process.
Args:
url (str): New server url.
token (str): User's token.
"""
ayon_api.set_environments(url, token)
def create_global_connection():
"""Create global connection with site id and client version.
Make sure the global connection in 'ayon_api' has the site id and client
version entered.
Set default settings variant to use based on 'is_staging_enabled'.
"""
ayon_api.create_connection(
get_local_site_id(), os.environ.get("AYON_VERSION")
)
ayon_api.set_default_settings_variant(
"staging" if is_staging_enabled() else "production"
)
def need_server_or_login() -> bool:
"""Check if server url or login to the server are needed.
It is recommended to call 'load_environments' on startup before this check.
But in some cases this function could be called after startup.
Returns:
bool: 'True' if a server url or login is needed. Otherwise 'False'.
"""
server_url = os.environ.get(SERVER_URL_ENV_KEY)
if not server_url:
return True
try:
server_url = validate_url(server_url)
except UrlError:
return True
token = os.environ.get(SERVER_API_ENV_KEY)
if token:
return not is_token_valid(server_url, token)
token = load_token(server_url)
if token:
return not is_token_valid(server_url, token)
return True
def confirm_server_login(url, token, username):
"""Confirm login of user and do necessary stepts to apply changes.
This should not be used on "change" of user but on first login.
Args:
url (str): Server url where user authenticated.
token (str): API token used for authentication to server.
username (Union[str, None]): Username related to API token.
"""
add_server(url, username)
store_token(url, token)
set_environments(url, token)
create_global_connection()


@@ -1,12 +0,0 @@
from .login_window import (
ServerLoginWindow,
ask_to_login,
change_user,
)
__all__ = (
"ServerLoginWindow",
"ask_to_login",
"change_user",
)


@@ -1,23 +0,0 @@
import sys
import json
from ayon_common.connection.ui.login_window import ask_to_login
def main(output_path):
with open(output_path, "r") as stream:
data = json.load(stream)
url = data.get("url")
username = data.get("username")
always_on_top = data.get("always_on_top", False)
out_url, out_token, out_username = ask_to_login(
url, username, always_on_top=always_on_top)
data["output"] = [out_url, out_token, out_username]
with open(output_path, "w") as stream:
json.dump(data, stream)
if __name__ == "__main__":
main(sys.argv[-1])


@@ -1,710 +0,0 @@
import traceback
from qtpy import QtWidgets, QtCore, QtGui
from ayon_api.exceptions import UrlError
from ayon_api.utils import validate_url, login_to_server
from ayon_common.resources import (
get_resource_path,
get_icon_path,
load_stylesheet,
)
from ayon_common.ui_utils import set_style_property, get_qt_app
from .widgets import (
PressHoverButton,
PlaceholderLineEdit,
)
class LogoutConfirmDialog(QtWidgets.QDialog):
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
self.setWindowTitle("Logout confirmation")
message_widget = QtWidgets.QWidget(self)
message_label = QtWidgets.QLabel(
(
"You are going to logout. This action will close this"
" application and will invalidate your login."
" All other applications launched with this login won't be"
" able to use it anymore.<br/><br/>"
"You can cancel logout and only change server and user login"
" in login dialog.<br/><br/>"
"Press OK to confirm logout."
),
message_widget
)
message_label.setWordWrap(True)
message_layout = QtWidgets.QHBoxLayout(message_widget)
message_layout.setContentsMargins(0, 0, 0, 0)
message_layout.addWidget(message_label, 1)
sep_frame = QtWidgets.QFrame(self)
sep_frame.setObjectName("Separator")
sep_frame.setMinimumHeight(2)
sep_frame.setMaximumHeight(2)
footer_widget = QtWidgets.QWidget(self)
cancel_btn = QtWidgets.QPushButton("Cancel", footer_widget)
confirm_btn = QtWidgets.QPushButton("OK", footer_widget)
footer_layout = QtWidgets.QHBoxLayout(footer_widget)
footer_layout.setContentsMargins(0, 0, 0, 0)
footer_layout.addStretch(1)
footer_layout.addWidget(cancel_btn, 0)
footer_layout.addWidget(confirm_btn, 0)
main_layout = QtWidgets.QVBoxLayout(self)
main_layout.addWidget(message_widget, 0)
main_layout.addStretch(1)
main_layout.addWidget(sep_frame, 0)
main_layout.addWidget(footer_widget, 0)
cancel_btn.clicked.connect(self._on_cancel_click)
confirm_btn.clicked.connect(self._on_confirm_click)
self._cancel_btn = cancel_btn
self._confirm_btn = confirm_btn
self._result = False
def showEvent(self, event):
super().showEvent(event)
self._match_btns_sizes()
def resizeEvent(self, event):
super().resizeEvent(event)
self._match_btns_sizes()
def _match_btns_sizes(self):
width = max(
self._cancel_btn.sizeHint().width(),
self._confirm_btn.sizeHint().width()
)
self._cancel_btn.setMinimumWidth(width)
self._confirm_btn.setMinimumWidth(width)
def _on_cancel_click(self):
self._result = False
self.reject()
def _on_confirm_click(self):
self._result = True
self.accept()
def get_result(self):
return self._result
class ServerLoginWindow(QtWidgets.QDialog):
default_width = 410
default_height = 170
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
icon_path = get_icon_path()
icon = QtGui.QIcon(icon_path)
self.setWindowIcon(icon)
self.setWindowTitle("Login to server")
edit_icon_path = get_resource_path("edit.png")
edit_icon = QtGui.QIcon(edit_icon_path)
# --- URL page ---
login_widget = QtWidgets.QWidget(self)
user_cred_widget = QtWidgets.QWidget(login_widget)
url_label = QtWidgets.QLabel("URL:", user_cred_widget)
url_widget = QtWidgets.QWidget(user_cred_widget)
url_input = PlaceholderLineEdit(url_widget)
url_input.setPlaceholderText("< https://ayon.server.com >")
url_preview = QtWidgets.QLineEdit(url_widget)
url_preview.setReadOnly(True)
url_preview.setObjectName("LikeDisabledInput")
url_edit_btn = PressHoverButton(user_cred_widget)
url_edit_btn.setIcon(edit_icon)
url_edit_btn.setObjectName("PasswordBtn")
url_layout = QtWidgets.QHBoxLayout(url_widget)
url_layout.setContentsMargins(0, 0, 0, 0)
url_layout.addWidget(url_input, 1)
url_layout.addWidget(url_preview, 1)
# --- URL separator ---
url_cred_sep = QtWidgets.QFrame(self)
url_cred_sep.setObjectName("Separator")
url_cred_sep.setMinimumHeight(2)
url_cred_sep.setMaximumHeight(2)
# --- Login page ---
username_label = QtWidgets.QLabel("Username:", user_cred_widget)
username_widget = QtWidgets.QWidget(user_cred_widget)
username_input = PlaceholderLineEdit(username_widget)
username_input.setPlaceholderText("< Artist >")
username_preview = QtWidgets.QLineEdit(username_widget)
username_preview.setReadOnly(True)
username_preview.setObjectName("LikeDisabledInput")
username_edit_btn = PressHoverButton(user_cred_widget)
username_edit_btn.setIcon(edit_icon)
username_edit_btn.setObjectName("PasswordBtn")
username_layout = QtWidgets.QHBoxLayout(username_widget)
username_layout.setContentsMargins(0, 0, 0, 0)
username_layout.addWidget(username_input, 1)
username_layout.addWidget(username_preview, 1)
password_label = QtWidgets.QLabel("Password:", user_cred_widget)
password_input = PlaceholderLineEdit(user_cred_widget)
password_input.setPlaceholderText("< *********** >")
password_input.setEchoMode(PlaceholderLineEdit.Password)
api_label = QtWidgets.QLabel("API key:", user_cred_widget)
api_preview = QtWidgets.QLineEdit(user_cred_widget)
api_preview.setReadOnly(True)
api_preview.setObjectName("LikeDisabledInput")
show_password_icon_path = get_resource_path("eye.png")
show_password_icon = QtGui.QIcon(show_password_icon_path)
show_password_btn = PressHoverButton(user_cred_widget)
show_password_btn.setObjectName("PasswordBtn")
show_password_btn.setIcon(show_password_icon)
show_password_btn.setFocusPolicy(QtCore.Qt.ClickFocus)
cred_msg_sep = QtWidgets.QFrame(self)
cred_msg_sep.setObjectName("Separator")
cred_msg_sep.setMinimumHeight(2)
cred_msg_sep.setMaximumHeight(2)
# --- Credentials inputs ---
user_cred_layout = QtWidgets.QGridLayout(user_cred_widget)
user_cred_layout.setContentsMargins(0, 0, 0, 0)
row = 0
user_cred_layout.addWidget(url_label, row, 0, 1, 1)
user_cred_layout.addWidget(url_widget, row, 1, 1, 1)
user_cred_layout.addWidget(url_edit_btn, row, 2, 1, 1)
row += 1
user_cred_layout.addWidget(url_cred_sep, row, 0, 1, 3)
row += 1
user_cred_layout.addWidget(username_label, row, 0, 1, 1)
user_cred_layout.addWidget(username_widget, row, 1, 1, 1)
user_cred_layout.addWidget(username_edit_btn, row, 2, 2, 1)
row += 1
user_cred_layout.addWidget(api_label, row, 0, 1, 1)
user_cred_layout.addWidget(api_preview, row, 1, 1, 1)
row += 1
user_cred_layout.addWidget(password_label, row, 0, 1, 1)
user_cred_layout.addWidget(password_input, row, 1, 1, 1)
user_cred_layout.addWidget(show_password_btn, row, 2, 1, 1)
row += 1
user_cred_layout.addWidget(cred_msg_sep, row, 0, 1, 3)
row += 1
user_cred_layout.setColumnStretch(0, 0)
user_cred_layout.setColumnStretch(1, 1)
user_cred_layout.setColumnStretch(2, 0)
login_layout = QtWidgets.QVBoxLayout(login_widget)
login_layout.setContentsMargins(0, 0, 0, 0)
login_layout.addWidget(user_cred_widget, 1)
# --- Messages ---
# Messages for users (e.g. invalid url etc.)
message_label = QtWidgets.QLabel(self)
message_label.setWordWrap(True)
message_label.setTextInteractionFlags(QtCore.Qt.TextBrowserInteraction)
footer_widget = QtWidgets.QWidget(self)
logout_btn = QtWidgets.QPushButton("Logout", footer_widget)
user_message = QtWidgets.QLabel(footer_widget)
login_btn = QtWidgets.QPushButton("Login", footer_widget)
confirm_btn = QtWidgets.QPushButton("Confirm", footer_widget)
footer_layout = QtWidgets.QHBoxLayout(footer_widget)
footer_layout.setContentsMargins(0, 0, 0, 0)
footer_layout.addWidget(logout_btn, 0)
footer_layout.addWidget(user_message, 1)
footer_layout.addWidget(login_btn, 0)
footer_layout.addWidget(confirm_btn, 0)
main_layout = QtWidgets.QVBoxLayout(self)
main_layout.addWidget(login_widget, 0)
main_layout.addWidget(message_label, 0)
main_layout.addStretch(1)
main_layout.addWidget(footer_widget, 0)
url_input.textChanged.connect(self._on_url_change)
url_input.returnPressed.connect(self._on_url_enter_press)
username_input.textChanged.connect(self._on_user_change)
username_input.returnPressed.connect(self._on_username_enter_press)
password_input.returnPressed.connect(self._on_password_enter_press)
show_password_btn.change_state.connect(self._on_show_password)
url_edit_btn.clicked.connect(self._on_url_edit_click)
username_edit_btn.clicked.connect(self._on_username_edit_click)
logout_btn.clicked.connect(self._on_logout_click)
login_btn.clicked.connect(self._on_login_click)
confirm_btn.clicked.connect(self._on_login_click)
self._message_label = message_label
self._url_widget = url_widget
self._url_input = url_input
self._url_preview = url_preview
self._url_edit_btn = url_edit_btn
self._login_widget = login_widget
self._user_cred_widget = user_cred_widget
self._username_input = username_input
self._username_preview = username_preview
self._username_edit_btn = username_edit_btn
self._password_label = password_label
self._password_input = password_input
self._show_password_btn = show_password_btn
self._api_label = api_label
self._api_preview = api_preview
self._logout_btn = logout_btn
self._user_message = user_message
self._login_btn = login_btn
self._confirm_btn = confirm_btn
self._url_is_valid = None
self._credentials_are_valid = None
self._result = (None, None, None, False)
self._first_show = True
self._allow_logout = False
self._logged_in = False
self._url_edit_mode = False
self._username_edit_mode = False
def set_allow_logout(self, allow_logout):
if allow_logout is self._allow_logout:
return
self._allow_logout = allow_logout
self._update_states_by_edit_mode()
def _set_logged_in(self, logged_in):
if logged_in is self._logged_in:
return
self._logged_in = logged_in
self._update_states_by_edit_mode()
def _set_url_edit_mode(self, edit_mode):
if self._url_edit_mode is not edit_mode:
self._url_edit_mode = edit_mode
self._update_states_by_edit_mode()
def _set_username_edit_mode(self, edit_mode):
if self._username_edit_mode is not edit_mode:
self._username_edit_mode = edit_mode
self._update_states_by_edit_mode()
def _get_url_user_edit(self):
url_edit = True
if self._logged_in and not self._url_edit_mode:
url_edit = False
user_edit = url_edit
if not user_edit and self._logged_in and self._username_edit_mode:
user_edit = True
return url_edit, user_edit
def _update_states_by_edit_mode(self):
url_edit, user_edit = self._get_url_user_edit()
self._url_preview.setVisible(not url_edit)
self._url_input.setVisible(url_edit)
self._url_edit_btn.setVisible(self._allow_logout and not url_edit)
self._username_preview.setVisible(not user_edit)
self._username_input.setVisible(user_edit)
self._username_edit_btn.setVisible(
self._allow_logout and not user_edit
)
self._api_preview.setVisible(not user_edit)
self._api_label.setVisible(not user_edit)
self._password_label.setVisible(user_edit)
self._show_password_btn.setVisible(user_edit)
self._password_input.setVisible(user_edit)
self._logout_btn.setVisible(self._allow_logout and self._logged_in)
self._login_btn.setVisible(not self._allow_logout)
self._confirm_btn.setVisible(self._allow_logout)
self._update_login_btn_state(url_edit, user_edit)
def _update_login_btn_state(self, url_edit=None, user_edit=None, url=None):
if url_edit is None:
url_edit, user_edit = self._get_url_user_edit()
if url is None:
url = self._url_input.text()
enabled = bool(url) and (url_edit or user_edit)
self._login_btn.setEnabled(enabled)
self._confirm_btn.setEnabled(enabled)
def showEvent(self, event):
super().showEvent(event)
if self._first_show:
self._first_show = False
self._on_first_show()
def _on_first_show(self):
self.setStyleSheet(load_stylesheet())
self.resize(self.default_width, self.default_height)
self._center_window()
if self._allow_logout is None:
self.set_allow_logout(False)
self._update_states_by_edit_mode()
if not self._url_input.text():
widget = self._url_input
elif not self._username_input.text():
widget = self._username_input
else:
widget = self._password_input
self._set_input_focus(widget)
def result(self):
"""Result url and token or login.
Returns:
Union[Tuple[str, str], Tuple[None, None]]: Url and token used for
login if it was successful, otherwise both are set to None.
"""
return self._result
def _center_window(self):
"""Move window to center of screen."""
if hasattr(QtWidgets.QApplication, "desktop"):
desktop = QtWidgets.QApplication.desktop()
screen_idx = desktop.screenNumber(self)
screen_geo = desktop.screenGeometry(screen_idx)
else:
screen = self.screen()
screen_geo = screen.geometry()
geo = self.frameGeometry()
geo.moveCenter(screen_geo.center())
if geo.y() < screen_geo.y():
geo.setY(screen_geo.y())
self.move(geo.topLeft())
def _on_url_change(self, text):
self._update_login_btn_state(url=text)
self._set_url_valid(None)
self._set_credentials_valid(None)
self._url_preview.setText(text)
def _set_url_valid(self, valid):
if valid is self._url_is_valid:
return
self._url_is_valid = valid
self._set_input_valid_state(self._url_input, valid)
def _set_credentials_valid(self, valid):
if self._credentials_are_valid is valid:
return
self._credentials_are_valid = valid
self._set_input_valid_state(self._username_input, valid)
self._set_input_valid_state(self._password_input, valid)
def _on_url_enter_press(self):
self._set_input_focus(self._username_input)
def _on_user_change(self, username):
self._username_preview.setText(username)
def _on_username_enter_press(self):
self._set_input_focus(self._password_input)
def _on_password_enter_press(self):
self._login()
def _on_show_password(self, show_password):
if show_password:
placeholder_text = "< MySecret124 >"
echo_mode = QtWidgets.QLineEdit.Normal
else:
placeholder_text = "< *********** >"
echo_mode = QtWidgets.QLineEdit.Password
self._password_input.setEchoMode(echo_mode)
self._password_input.setPlaceholderText(placeholder_text)
def _on_username_edit_click(self):
self._username_edit_mode = True
self._update_states_by_edit_mode()
def _on_url_edit_click(self):
self._url_edit_mode = True
self._update_states_by_edit_mode()
def _on_logout_click(self):
dialog = LogoutConfirmDialog(self)
dialog.exec_()
if dialog.get_result():
self._result = (None, None, None, True)
self.accept()
def _on_login_click(self):
self._login()
def _validate_url(self):
"""Use url from input to connect and change window state on success.
Todos:
Threaded check.
"""
url = self._url_input.text()
valid_url = None
try:
valid_url = validate_url(url)
except UrlError as exc:
parts = [f"<b>{exc.title}</b>"]
parts.extend(f"- {hint}" for hint in exc.hints)
self._set_message("<br/>".join(parts))
except KeyboardInterrupt:
# Reraise KeyboardInterrupt error
raise
except BaseException:
self._set_unexpected_error()
return
if valid_url is None:
return False
self._url_input.setText(valid_url)
return True
def _login(self):
if (
not self._login_btn.isEnabled()
and not self._confirm_btn.isEnabled()
):
return
if not self._url_is_valid:
self._set_url_valid(self._validate_url())
if not self._url_is_valid:
self._set_input_focus(self._url_input)
self._set_credentials_valid(None)
return
self._clear_message()
url = self._url_input.text()
username = self._username_input.text()
password = self._password_input.text()
try:
token = login_to_server(url, username, password)
except BaseException:
self._set_unexpected_error()
return
if token is not None:
self._result = (url, token, username, False)
self.accept()
return
self._set_credentials_valid(False)
message_lines = ["<b>Invalid credentials</b>"]
if not username.strip():
message_lines.append("- Username is not filled")
if not password.strip():
message_lines.append("- Password is not filled")
if username and password:
message_lines.append("- Check your credentials")
self._set_message("<br/>".join(message_lines))
self._set_input_focus(self._username_input)
def _set_input_focus(self, widget):
widget.setFocus(QtCore.Qt.MouseFocusReason)
def _set_input_valid_state(self, widget, valid):
state = ""
if valid is True:
state = "valid"
elif valid is False:
state = "invalid"
set_style_property(widget, "state", state)
def _set_message(self, message):
self._message_label.setText(message)
def _clear_message(self):
self._message_label.setText("")
def _set_unexpected_error(self):
# TODO add traceback somewhere
# - maybe a button to show or copy?
traceback.print_exc()
lines = [
"<b>Unexpected error happened</b>",
"- Can be caused by wrong url (leading elsewhere)"
]
self._set_message("<br/>".join(lines))
def set_url(self, url):
self._url_preview.setText(url)
self._url_input.setText(url)
self._validate_url()
def set_username(self, username):
self._username_preview.setText(username)
self._username_input.setText(username)
def _set_api_key(self, api_key):
if not api_key or len(api_key) < 3:
self._api_preview.setText(api_key or "")
return
api_key_len = len(api_key)
offset = 6
if api_key_len < offset:
offset = api_key_len // 2
api_key = api_key[:offset] + "." * (api_key_len - offset)
self._api_preview.setText(api_key)
def set_logged_in(
self,
logged_in,
url=None,
username=None,
api_key=None,
allow_logout=None
):
if url is not None:
self.set_url(url)
if username is not None:
self.set_username(username)
if api_key:
self._set_api_key(api_key)
if logged_in and allow_logout is None:
allow_logout = True
self._set_logged_in(logged_in)
if allow_logout:
self.set_allow_logout(True)
elif allow_logout is False:
self.set_allow_logout(False)
def ask_to_login(url=None, username=None, always_on_top=False):
"""Ask user to login using Qt dialog.
Function creates a new QApplication if one is not created yet.
Args:
url (Optional[str]): Server url that will be prefilled in dialog.
username (Optional[str]): Username that will be prefilled in dialog.
always_on_top (Optional[bool]): Window will be drawn on top of
other windows.
Returns:
tuple[str, str, str]: Returns url, user's token and username. The url can
be changed during the dialog lifetime, which is why it is returned.
"""
app_instance = get_qt_app()
window = ServerLoginWindow()
if always_on_top:
window.setWindowFlags(
window.windowFlags()
| QtCore.Qt.WindowStaysOnTopHint
)
if url:
window.set_url(url)
if username:
window.set_username(username)
if not app_instance.startingUp():
window.exec_()
else:
window.open()
app_instance.exec_()
result = window.result()
out_url, out_token, out_username, _ = result
return out_url, out_token, out_username
def change_user(url, username, api_key, always_on_top=False):
"""Ask user to login using Qt dialog.
Function creates a new QApplication if one is not created yet.
Args:
url (str): Server url that will be prefilled in dialog.
username (str): Username that will be prefilled in dialog.
api_key (str): API key that will be prefilled in dialog.
always_on_top (Optional[bool]): Window will be drawn on top of
other windows.
Returns:
Tuple[str, str]: Returns url and user's token. The url can be changed
during the dialog lifetime, which is why it is returned.
"""
app_instance = get_qt_app()
window = ServerLoginWindow()
if always_on_top:
window.setWindowFlags(
window.windowFlags()
| QtCore.Qt.WindowStaysOnTopHint
)
window.set_logged_in(True, url, username, api_key)
if not app_instance.startingUp():
window.exec_()
else:
window.open()
# This can become main Qt loop. Maybe should live elsewhere
app_instance.exec_()
return window.result()


@@ -1,47 +0,0 @@
from qtpy import QtWidgets, QtCore, QtGui
class PressHoverButton(QtWidgets.QPushButton):
"""Keep track about mouse press/release and enter/leave."""
_mouse_pressed = False
_mouse_hovered = False
change_state = QtCore.Signal(bool)
def mousePressEvent(self, event):
self._mouse_pressed = True
self._mouse_hovered = True
self.change_state.emit(self._mouse_hovered)
super(PressHoverButton, self).mousePressEvent(event)
def mouseReleaseEvent(self, event):
self._mouse_pressed = False
self._mouse_hovered = False
self.change_state.emit(self._mouse_hovered)
super(PressHoverButton, self).mouseReleaseEvent(event)
def mouseMoveEvent(self, event):
mouse_pos = self.mapFromGlobal(QtGui.QCursor.pos())
under_mouse = self.rect().contains(mouse_pos)
if under_mouse != self._mouse_hovered:
self._mouse_hovered = under_mouse
self.change_state.emit(self._mouse_hovered)
super(PressHoverButton, self).mouseMoveEvent(event)
class PlaceholderLineEdit(QtWidgets.QLineEdit):
"""Set placeholder color of QLineEdit in Qt 5.12 and higher."""
def __init__(self, *args, **kwargs):
super(PlaceholderLineEdit, self).__init__(*args, **kwargs)
# Change placeholder palette color
if hasattr(QtGui.QPalette, "PlaceholderText"):
filter_palette = self.palette()
color = QtGui.QColor("#D3D8DE")
color.setAlpha(67)
filter_palette.setColor(
QtGui.QPalette.PlaceholderText,
color
)
self.setPalette(filter_palette)


@@ -1,18 +0,0 @@
Addon distribution tool
------------------------
Code in this folder is the backend portion of the addon distribution logic for the v4 server.
Each host and module will be a separate addon in the future. Each v4 server could run a different set of addons.
The client (running on the artist's machine) will first ask v4 for a list of enabled addons.
(It expects a list of json documents matching the `addon_distribution.py:AddonInfo` object.)
Next it will compare the presence of each enabled addon version in the local folder. If a version of
an addon is missing, the client will use the information in the addon to download a zip file
(from http / shared local disk / git) and unzip it.
A required part of addon distribution will be the sharing of dependencies (python libraries, utilities), which is not part of this folder.
The location of this folder might change in the future as it will be required for a client to add this folder to sys.path reliably.
This code needs to be independent of OpenPype code as much as possible!
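A hypothetical sketch of that flow; the helper name, the data shape and the http-only source handling are illustrative, not the real `AyonDistribution` implementation:

```python
import os
import urllib.request
import zipfile


def distribute_addons(enabled_addons, addons_root):
    """Download and unzip every enabled addon version that is missing locally."""
    for addon in enabled_addons:  # e.g. dicts parsed from the server's AddonInfo payload
        target_dir = os.path.join(addons_root, f"{addon['name']}_{addon['version']}")
        if os.path.isdir(target_dir):
            continue  # this version is already distributed
        zip_path = target_dir + ".zip"
        urllib.request.urlretrieve(addon["url"], zip_path)  # http source assumed
        with zipfile.ZipFile(zip_path) as zip_file:
            zip_file.extractall(target_dir)
        os.remove(zip_path)
```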


@@ -1,9 +0,0 @@
from .control import AyonDistribution, BundleNotFoundError
from .utils import show_missing_bundle_information
__all__ = (
"AyonDistribution",
"BundleNotFoundError",
"show_missing_bundle_information",
)

File diff suppressed because it is too large.


@@ -1,265 +0,0 @@
import attr
from enum import Enum
class UrlType(Enum):
HTTP = "http"
GIT = "git"
FILESYSTEM = "filesystem"
SERVER = "server"
@attr.s
class MultiPlatformValue(object):
windows = attr.ib(default=None)
linux = attr.ib(default=None)
darwin = attr.ib(default=None)
@attr.s
class SourceInfo(object):
type = attr.ib()
@attr.s
class LocalSourceInfo(SourceInfo):
path = attr.ib(default=attr.Factory(MultiPlatformValue))
@attr.s
class WebSourceInfo(SourceInfo):
url = attr.ib(default=None)
headers = attr.ib(default=None)
filename = attr.ib(default=None)
@attr.s
class ServerSourceInfo(SourceInfo):
filename = attr.ib(default=None)
path = attr.ib(default=None)
def convert_source(source):
"""Create source object from data information.
Args:
source (Dict[str, any]): Information about source.
Returns:
Union[None, SourceInfo]: Object with source information if type is
known.
"""
source_type = source.get("type")
if not source_type:
return None
if source_type == UrlType.FILESYSTEM.value:
return LocalSourceInfo(
type=source_type,
path=source["path"]
)
if source_type == UrlType.HTTP.value:
url = source["path"]
return WebSourceInfo(
type=source_type,
url=url,
headers=source.get("headers"),
filename=source.get("filename")
)
if source_type == UrlType.SERVER.value:
return ServerSourceInfo(
type=source_type,
filename=source.get("filename"),
path=source.get("path")
)
def prepare_sources(src_sources):
sources = []
unknown_sources = []
for source in (src_sources or []):
dependency_source = convert_source(source)
if dependency_source is not None:
sources.append(dependency_source)
else:
print(f"Unknown source {source.get('type')}")
unknown_sources.append(source)
return sources, unknown_sources
@attr.s
class VersionData(object):
version_data = attr.ib(default=None)
@attr.s
class AddonVersionInfo(object):
version = attr.ib()
full_name = attr.ib()
title = attr.ib(default=None)
require_distribution = attr.ib(default=False)
sources = attr.ib(default=attr.Factory(list))
unknown_sources = attr.ib(default=attr.Factory(list))
hash = attr.ib(default=None)
@classmethod
def from_dict(
cls, addon_name, addon_title, addon_version, version_data
):
"""Addon version info.
Args:
addon_name (str): Name of addon.
addon_title (str): Title of addon.
addon_version (str): Version of addon.
version_data (dict[str, Any]): Addon version information from
server.
Returns:
AddonVersionInfo: Addon version info.
"""
full_name = f"{addon_name}_{addon_version}"
title = f"{addon_title} {addon_version}"
source_info = version_data.get("clientSourceInfo")
require_distribution = source_info is not None
sources, unknown_sources = prepare_sources(source_info)
return cls(
version=addon_version,
full_name=full_name,
require_distribution=require_distribution,
sources=sources,
unknown_sources=unknown_sources,
hash=version_data.get("hash"),
title=title
)
@attr.s
class AddonInfo(object):
"""Object matching json payload from Server"""
name = attr.ib()
versions = attr.ib(default=attr.Factory(dict))
title = attr.ib(default=None)
description = attr.ib(default=None)
license = attr.ib(default=None)
authors = attr.ib(default=None)
@classmethod
def from_dict(cls, data):
"""Addon info by available versions.
Args:
data (dict[str, Any]): Addon information from server. Should
contain information about every version under 'versions'.
Returns:
AddonInfo: Addon info with available versions.
"""
# server payload contains info about all versions
addon_name = data["name"]
title = data.get("title") or addon_name
src_versions = data.get("versions") or {}
dst_versions = {
addon_version: AddonVersionInfo.from_dict(
addon_name, title, addon_version, version_data
)
for addon_version, version_data in src_versions.items()
}
return cls(
name=addon_name,
versions=dst_versions,
description=data.get("description"),
title=data.get("title") or addon_name,
license=data.get("license"),
authors=data.get("authors")
)
@attr.s
class DependencyItem(object):
"""Object matching payload from Server about single dependency package"""
name = attr.ib()
platform_name = attr.ib()
checksum = attr.ib()
sources = attr.ib(default=attr.Factory(list))
unknown_sources = attr.ib(default=attr.Factory(list))
source_addons = attr.ib(default=attr.Factory(dict))
python_modules = attr.ib(default=attr.Factory(dict))
@classmethod
def from_dict(cls, package):
src_sources = package.get("sources") or []
for source in src_sources:
if source.get("type") == "server" and not source.get("filename"):
source["filename"] = package["filename"]
sources, unknown_sources = prepare_sources(src_sources)
return cls(
name=package["filename"],
platform_name=package["platform"],
sources=sources,
unknown_sources=unknown_sources,
checksum=package["checksum"],
source_addons=package["sourceAddons"],
python_modules=package["pythonModules"]
)
@attr.s
class Installer:
version = attr.ib()
filename = attr.ib()
platform_name = attr.ib()
size = attr.ib()
checksum = attr.ib()
python_version = attr.ib()
python_modules = attr.ib()
sources = attr.ib(default=attr.Factory(list))
unknown_sources = attr.ib(default=attr.Factory(list))
@classmethod
def from_dict(cls, installer_info):
sources, unknown_sources = prepare_sources(
installer_info.get("sources"))
return cls(
version=installer_info["version"],
filename=installer_info["filename"],
platform_name=installer_info["platform"],
size=installer_info["size"],
sources=sources,
unknown_sources=unknown_sources,
checksum=installer_info["checksum"],
python_version=installer_info["pythonVersion"],
python_modules=installer_info["pythonModules"]
)
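# Minimal usage sketch of 'Installer.from_dict' (all values are
# illustrative; '_demo_installer' is a hypothetical helper name).
def _demo_installer():
    installer = Installer.from_dict({
        "version": "1.0.0",
        "filename": "AYON-1.0.0-windows.exe",
        "platform": "windows",
        "size": 123456789,
        "checksum": "<sha256 checksum>",
        "pythonVersion": "3.9",
        "pythonModules": {"six": "1.16.0"},
        "sources": [],
    })
    return installer.platform_name  # -> "windows"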
@attr.s
class Bundle:
"""Class representing bundle information."""
name = attr.ib()
installer_version = attr.ib()
addon_versions = attr.ib(default=attr.Factory(dict))
dependency_packages = attr.ib(default=attr.Factory(dict))
is_production = attr.ib(default=False)
is_staging = attr.ib(default=False)
@classmethod
def from_dict(cls, data):
return cls(
name=data["name"],
installer_version=data.get("installerVersion"),
addon_versions=data.get("addons", {}),
dependency_packages=data.get("dependencyPackages", {}),
is_production=data["isProduction"],
is_staging=data["isStaging"],
)
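# Minimal usage sketch of 'Bundle.from_dict' (mirrors the 'sample_bundles'
# fixture in the distribution tests below; '_demo_bundle' is a hypothetical
# helper name).
def _demo_bundle():
    bundle = Bundle.from_dict({
        "name": "TestBundle",
        "installerVersion": None,
        "addons": {"slack": "1.0.0"},
        "dependencyPackages": {},
        "isProduction": True,
        "isStaging": False,
    })
    return bundle.addon_versions  # -> {"slack": "1.0.0"}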

View file

@ -1,250 +0,0 @@
import os
import logging
import platform
from abc import ABCMeta, abstractmethod
import ayon_api
from .file_handler import RemoteFileHandler
from .data_structures import UrlType
class SourceDownloader(metaclass=ABCMeta):
"""Abstract class for source downloader."""
log = logging.getLogger(__name__)
@classmethod
@abstractmethod
def download(cls, source, destination_dir, data, transfer_progress):
"""Returns url of downloaded addon zip file.
Tranfer progress can be ignored, in that case file transfer won't
be shown as 0-100% but as 'running'. First step should be to set
destination content size and then add transferred chunk sizes.
Args:
source (dict): {type:"http", "url":"https://} ...}
destination_dir (str): local folder to unzip
data (dict): More information about download content. Always have
'type' key in.
transfer_progress (ayon_api.TransferProgress): Progress of
transferred (copy/download) content.
Returns:
(str) local path to addon zip file
"""
pass
@classmethod
@abstractmethod
def cleanup(cls, source, destination_dir, data):
"""Cleanup files when distribution finishes or crashes.
Cleanup e.g. temporary files (downloaded zip) or other content related
to the downloader.
"""
pass
@classmethod
def check_hash(cls, addon_path, addon_hash, hash_type="sha256"):
"""Compares 'hash' of downloaded 'addon_url' file.
Args:
addon_path (str): Local path to addon file.
addon_hash (str): Hash of downloaded file.
hash_type (str): Type of hash.
Raises:
ValueError: If the hashes don't match.
"""
if not os.path.exists(addon_path):
raise ValueError(f"{addon_path} doesn't exist.")
if not RemoteFileHandler.check_integrity(
addon_path, addon_hash, hash_type=hash_type
):
raise ValueError(f"{addon_path} doesn't match expected hash.")
@classmethod
def unzip(cls, addon_zip_path, destination_dir):
"""Unzips local 'addon_zip_path' to 'destination'.
Args:
addon_zip_path (str): local path to addon zip file
destination_dir (str): local folder to unzip
"""
RemoteFileHandler.unzip(addon_zip_path, destination_dir)
os.remove(addon_zip_path)
class OSDownloader(SourceDownloader):
"""Downloader using files from file drive."""
@classmethod
def download(cls, source, destination_dir, data, transfer_progress):
# OS doesn't need to download, unzip directly
addon_url = source["path"].get(platform.system().lower())
if not os.path.exists(addon_url):
raise ValueError(f"{addon_url} is not accessible")
return addon_url
@classmethod
def cleanup(cls, source, destination_dir, data):
# Nothing to do - download does not copy anything
pass
class HTTPDownloader(SourceDownloader):
"""Downloader using http or https protocol."""
CHUNK_SIZE = 100000
@staticmethod
def get_filename(source):
source_url = source["url"]
filename = source.get("filename")
if not filename:
filename = os.path.basename(source_url)
basename, ext = os.path.splitext(filename)
allowed_exts = set(RemoteFileHandler.IMPLEMENTED_ZIP_FORMATS)
if ext.lower().lstrip(".") not in allowed_exts:
filename = f"{basename}.zip"
return filename
@classmethod
def download(cls, source, destination_dir, data, transfer_progress):
source_url = source["url"]
cls.log.debug(f"Downloading {source_url} to {destination_dir}")
headers = source.get("headers")
filename = cls.get_filename(source)
# TODO use transfer progress
RemoteFileHandler.download_url(
source_url,
destination_dir,
filename,
headers=headers
)
return os.path.join(destination_dir, filename)
@classmethod
def cleanup(cls, source, destination_dir, data):
filename = cls.get_filename(source)
filepath = os.path.join(destination_dir, filename)
if os.path.exists(filepath) and os.path.isfile(filepath):
os.remove(filepath)
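# Illustrative behaviour of 'HTTPDownloader.get_filename' above: a source
# without an explicit filename whose URL basename has no known archive
# extension gets a ".zip" name derived from it (the URL and helper name
# are made up for illustration).
def _demo_http_filename():
    return HTTPDownloader.get_filename(
        {"url": "https://example.com/downloads/addon"}
    )  # -> "addon.zip"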
class AyonServerDownloader(SourceDownloader):
"""Downloads static resource file from AYON Server.
Expects filled env var AYON_SERVER_URL.
"""
CHUNK_SIZE = 8192
@classmethod
def download(cls, source, destination_dir, data, transfer_progress):
path = source["path"]
filename = source["filename"]
if path and not filename:
filename = path.split("/")[-1]
cls.log.debug(f"Downloading {filename} to {destination_dir}")
_, ext = os.path.splitext(filename)
ext = ext.lower().lstrip(".")
valid_exts = set(RemoteFileHandler.IMPLEMENTED_ZIP_FORMATS)
if ext not in valid_exts:
raise ValueError((
f"Invalid file extension \"{ext}\"."
f" Expected {', '.join(valid_exts)}"
))
if path:
filepath = os.path.join(destination_dir, filename)
return ayon_api.download_file(
path,
filepath,
chunk_size=cls.CHUNK_SIZE,
progress=transfer_progress
)
# dst_filepath = os.path.join(destination_dir, filename)
if data["type"] == "dependency_package":
return ayon_api.download_dependency_package(
data["name"],
destination_dir,
filename,
platform_name=data["platform"],
chunk_size=cls.CHUNK_SIZE,
progress=transfer_progress
)
if data["type"] == "addon":
return ayon_api.download_addon_private_file(
data["name"],
data["version"],
filename,
destination_dir,
chunk_size=cls.CHUNK_SIZE,
progress=transfer_progress
)
raise ValueError(f"Unknown type to download \"{data['type']}\"")
@classmethod
def cleanup(cls, source, destination_dir, data):
filename = source["filename"]
filepath = os.path.join(destination_dir, filename)
if os.path.exists(filepath) and os.path.isfile(filepath):
os.remove(filepath)
class DownloadFactory:
"""Factory for downloaders."""
def __init__(self):
self._downloaders = {}
def register_format(self, downloader_type, downloader):
"""Register downloader for download type.
Args:
downloader_type (UrlType): Type of source.
downloader (SourceDownloader): Downloader which handles
download, hash check and unzipping.
"""
self._downloaders[downloader_type.value] = downloader
def get_downloader(self, downloader_type):
"""Registered downloader for type.
Args:
downloader_type (UrlType): Type of source.
Returns:
SourceDownloader: Downloader object which should handle the file
distribution.
Raises:
ValueError: If type does not have registered downloader.
"""
if downloader := self._downloaders.get(downloader_type):
return downloader()
raise ValueError(f"{downloader_type} not implemented")
def get_default_download_factory():
download_factory = DownloadFactory()
download_factory.register_format(UrlType.FILESYSTEM, OSDownloader)
download_factory.register_format(UrlType.HTTP, HTTPDownloader)
download_factory.register_format(UrlType.SERVER, AyonServerDownloader)
return download_factory
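# Minimal usage sketch: the default factory resolves a downloader class by
# source type, the same way the distribution logic and tests do
# ('_demo_downloader_lookup' is a hypothetical helper name).
def _demo_downloader_lookup():
    factory = get_default_download_factory()
    downloader = factory.get_downloader(UrlType.HTTP.value)
    return isinstance(downloader, HTTPDownloader)  # -> True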

View file

@ -1,248 +0,0 @@
import os
import sys
import copy
import tempfile
import attr
import pytest
current_dir = os.path.dirname(os.path.abspath(__file__))
root_dir = os.path.abspath(os.path.join(current_dir, "..", "..", "..", ".."))
sys.path.append(root_dir)
from common.ayon_common.distribution.downloaders import (
DownloadFactory,
OSDownloader,
HTTPDownloader,
)
from common.ayon_common.distribution.control import (
AyonDistribution,
UpdateState,
)
from common.ayon_common.distribution.data_structures import (
AddonInfo,
UrlType,
)
@pytest.fixture
def download_factory():
addon_downloader = DownloadFactory()
addon_downloader.register_format(UrlType.FILESYSTEM, OSDownloader)
addon_downloader.register_format(UrlType.HTTP, HTTPDownloader)
yield addon_downloader
@pytest.fixture
def http_downloader(download_factory):
yield download_factory.get_downloader(UrlType.HTTP.value)
@pytest.fixture
def temp_folder():
yield tempfile.mkdtemp(prefix="ayon_test_")
@pytest.fixture
def sample_bundles():
yield {
"bundles": [
{
"name": "TestBundle",
"createdAt": "2023-06-29T00:00:00.0+00:00",
"installerVersion": None,
"addons": {
"slack": "1.0.0"
},
"dependencyPackages": {},
"isProduction": True,
"isStaging": False
}
],
"productionBundle": "TestBundle",
"stagingBundle": None
}
@pytest.fixture
def sample_addon_info():
yield {
"name": "slack",
"title": "Slack addon",
"versions": {
"1.0.0": {
"hasSettings": True,
"hasSiteSettings": False,
"clientPyproject": {
"tool": {
"poetry": {
"dependencies": {
"nxtools": "^1.6",
"orjson": "^3.6.7",
"typer": "^0.4.1",
"email-validator": "^1.1.3",
"python": "^3.10",
"fastapi": "^0.73.0"
}
}
}
},
"clientSourceInfo": [
{
"type": "http",
"path": "https://drive.google.com/file/d/1TcuV8c2OV8CcbPeWi7lxOdqWsEqQNPYy/view?usp=sharing", # noqa
"filename": "dummy.zip"
},
{
"type": "filesystem",
"path": {
"windows": "P:/sources/some_file.zip",
"linux": "/mnt/srv/sources/some_file.zip",
"darwin": "/Volumes/srv/sources/some_file.zip"
}
}
],
"frontendScopes": {
"project": {
"sidebar": "hierarchy",
}
},
"hash": "4be25eb6215e91e5894d3c5475aeb1e379d081d3f5b43b4ee15b0891cf5f5658" # noqa
}
},
"description": ""
}
def test_register(printer):
download_factory = DownloadFactory()
assert len(download_factory._downloaders) == 0, "Contains registered"
download_factory.register_format(UrlType.FILESYSTEM, OSDownloader)
assert len(download_factory._downloaders) == 1, "Should contain one"
def test_get_downloader(printer, download_factory):
assert download_factory.get_downloader(UrlType.FILESYSTEM.value), "Should find" # noqa
with pytest.raises(ValueError):
download_factory.get_downloader("unknown"), "Shouldn't find"
def test_addon_info(printer, sample_addon_info):
"""Tests parsing of expected payload from v4 server into AadonInfo."""
valid_minimum = {
"name": "slack",
"versions": {
"1.0.0": {
"clientSourceInfo": [
{
"type": "filesystem",
"path": {
"windows": "P:/sources/some_file.zip",
"linux": "/mnt/srv/sources/some_file.zip",
"darwin": "/Volumes/srv/sources/some_file.zip"
}
}
]
}
}
}
assert AddonInfo.from_dict(valid_minimum), "Missing required fields"
addon = AddonInfo.from_dict(sample_addon_info)
assert addon, "Should be created"
assert addon.name == "slack", "Incorrect name"
assert "1.0.0" in addon.versions, "Version is not in versions"
with pytest.raises(TypeError):
assert addon["name"], "Dict approach not implemented"
addon_as_dict = attr.asdict(addon)
assert addon_as_dict["name"], "Dict approach should work"
def _get_dist_item(dist_items, name, version):
final_dist_info = next(
(
dist_info
for dist_info in dist_items
if (
dist_info["addon_name"] == name
and dist_info["addon_version"] == version
)
),
{}
)
return final_dist_info["dist_item"]
def test_update_addon_state(
printer, sample_addon_info, temp_folder, download_factory, sample_bundles
):
"""Tests possible cases of addon update."""
addon_version = list(sample_addon_info["versions"])[0]
broken_addon_info = copy.deepcopy(sample_addon_info)
# Cause crash because of invalid hash
broken_addon_info["versions"][addon_version]["hash"] = "brokenhash"
distribution = AyonDistribution(
addon_dirpath=temp_folder,
dependency_dirpath=temp_folder,
dist_factory=download_factory,
addons_info=[broken_addon_info],
dependency_packages_info=[],
bundles_info=sample_bundles
)
distribution.distribute()
dist_items = distribution.get_addon_dist_items()
slack_dist_item = _get_dist_item(
dist_items,
sample_addon_info["name"],
addon_version
)
slack_state = slack_dist_item.state
assert slack_state == UpdateState.UPDATE_FAILED, (
"Update should have failed because of wrong hash")
# Fix cache and validate if was updated
distribution = AyonDistribution(
addon_dirpath=temp_folder,
dependency_dirpath=temp_folder,
dist_factory=download_factory,
addons_info=[sample_addon_info],
dependency_packages_info=[],
bundles_info=sample_bundles
)
distribution.distribute()
dist_items = distribution.get_addon_dist_items()
slack_dist_item = _get_dist_item(
dist_items,
sample_addon_info["name"],
addon_version
)
assert slack_dist_item.state == UpdateState.UPDATED, (
"Addon should have been updated")
# Is UPDATED without calling distribute
distribution = AyonDistribution(
addon_dirpath=temp_folder,
dependency_dirpath=temp_folder,
dist_factory=download_factory,
addons_info=[sample_addon_info],
dependency_packages_info=[],
bundles_info=sample_bundles
)
dist_items = distribution.get_addon_dist_items()
slack_dist_item = _get_dist_item(
dist_items,
sample_addon_info["name"],
addon_version
)
assert slack_dist_item.state == UpdateState.UPDATED, (
"Addon should already exist")

View file

@ -1,146 +0,0 @@
import sys
from qtpy import QtWidgets, QtGui
from ayon_common import is_staging_enabled
from ayon_common.resources import (
get_icon_path,
load_stylesheet,
)
from ayon_common.ui_utils import get_qt_app
class MissingBundleWindow(QtWidgets.QDialog):
default_width = 410
default_height = 170
def __init__(
self, url=None, bundle_name=None, use_staging=None, parent=None
):
super().__init__(parent)
icon_path = get_icon_path()
icon = QtGui.QIcon(icon_path)
self.setWindowIcon(icon)
self.setWindowTitle("Missing Bundle")
self._url = url
self._bundle_name = bundle_name
self._use_staging = use_staging
self._first_show = True
info_label = QtWidgets.QLabel("", self)
info_label.setWordWrap(True)
btns_widget = QtWidgets.QWidget(self)
confirm_btn = QtWidgets.QPushButton("Exit", btns_widget)
btns_layout = QtWidgets.QHBoxLayout(btns_widget)
btns_layout.setContentsMargins(0, 0, 0, 0)
btns_layout.addStretch(1)
btns_layout.addWidget(confirm_btn, 0)
main_layout = QtWidgets.QVBoxLayout(self)
main_layout.addWidget(info_label, 0)
main_layout.addStretch(1)
main_layout.addWidget(btns_widget, 0)
confirm_btn.clicked.connect(self._on_confirm_click)
self._info_label = info_label
self._confirm_btn = confirm_btn
self._update_label()
def set_url(self, url):
if url == self._url:
return
self._url = url
self._update_label()
def set_bundle_name(self, bundle_name):
if bundle_name == self._bundle_name:
return
self._bundle_name = bundle_name
self._update_label()
def set_use_staging(self, use_staging):
if self._use_staging == use_staging:
return
self._use_staging = use_staging
self._update_label()
def showEvent(self, event):
super().showEvent(event)
if self._first_show:
self._first_show = False
self._on_first_show()
self._recalculate_sizes()
def resizeEvent(self, event):
super().resizeEvent(event)
self._recalculate_sizes()
def _recalculate_sizes(self):
hint = self._confirm_btn.sizeHint()
new_width = max((hint.width(), hint.height() * 3))
self._confirm_btn.setMinimumWidth(new_width)
def _on_first_show(self):
self.setStyleSheet(load_stylesheet())
self.resize(self.default_width, self.default_height)
def _on_confirm_click(self):
self.accept()
self.close()
def _update_label(self):
self._info_label.setText(self._get_label())
def _get_label(self):
url_part = f" <b>{self._url}</b>" if self._url else ""
if self._bundle_name:
return (
f"Requested release bundle <b>{self._bundle_name}</b>"
f" is not available on server{url_part}."
"<br/><br/>Try to restart AYON desktop launcher. Please"
" contact your administrator if issue persist."
)
mode = "staging" if self._use_staging else "production"
return (
f"No release bundle is set as {mode} on the AYON"
f" server{url_part} so there is nothing to launch."
"<br/><br/>Please contact your administrator"
" to resolve the issue."
)
def main():
"""Show message that server does not have set bundle to use.
It is possible to pass url as argument to show it in the message. To use
this feature, pass `--url <url>` as argument to this script.
"""
url = None
bundle_name = None
if "--url" in sys.argv:
url_index = sys.argv.index("--url") + 1
if url_index < len(sys.argv):
url = sys.argv[url_index]
if "--bundle" in sys.argv:
bundle_index = sys.argv.index("--bundle") + 1
if bundle_index < len(sys.argv):
bundle_name = sys.argv[bundle_index]
use_staging = is_staging_enabled()
app = get_qt_app()
window = MissingBundleWindow(url, bundle_name, use_staging)
window.show()
app.exec_()
if __name__ == "__main__":
main()

View file

@ -1,90 +0,0 @@
import os
import subprocess
from ayon_common.utils import get_ayon_appdirs, get_ayon_launch_args
def get_local_dir(*subdirs):
"""Get product directory in user's home directory.
Each user on the machine has their own local directory where updates,
addons etc. are downloaded.
Returns:
str: Path to product local directory.
"""
if not subdirs:
raise ValueError("Must fill dir_name if nothing else provided!")
local_dir = get_ayon_appdirs(*subdirs)
if not os.path.isdir(local_dir):
try:
os.makedirs(local_dir)
except Exception: # TODO fix exception
raise RuntimeError(f"Cannot create {local_dir}")
return local_dir
def get_addons_dir():
"""Directory where addon packages are stored.
Path to addons is defined using python module 'appdirs'.
The path is stored in environment variable 'AYON_ADDONS_DIR'.
Value of the environment variable can be overridden, but we highly
recommend to use that option only for development purposes.
Returns:
str: Path to directory where addons should be downloaded.
"""
addons_dir = os.environ.get("AYON_ADDONS_DIR")
if not addons_dir:
addons_dir = get_local_dir("addons")
os.environ["AYON_ADDONS_DIR"] = addons_dir
return addons_dir
def get_dependencies_dir():
"""Directory where dependency packages are stored.
Path to dependency packages is defined using python module 'appdirs'.
The path is stored in environment variable 'AYON_DEPENDENCIES_DIR'.
Value of the environment variable can be overridden, but we highly
recommend to use that option only for development purposes.
Returns:
str: Path to directory where dependency packages should be downloaded.
"""
dependencies_dir = os.environ.get("AYON_DEPENDENCIES_DIR")
if not dependencies_dir:
dependencies_dir = get_local_dir("dependency_packages")
os.environ["AYON_DEPENDENCIES_DIR"] = dependencies_dir
return dependencies_dir
def show_missing_bundle_information(url, bundle_name=None):
"""Show missing bundle information window.
This function should be called when the server does not have a bundle set
for production or staging, or when the bundle that should be used is not
available on the server.
The dialog is shown in a subprocess. The call is blocking and waits until
the dialog is closed.
Args:
url (str): Server url where bundle is not set.
bundle_name (Optional[str]): Name of bundle that was not found.
"""
ui_dir = os.path.join(os.path.dirname(__file__), "ui")
script_path = os.path.join(ui_dir, "missing_bundle_window.py")
args = get_ayon_launch_args(script_path, "--skip-bootstrap", "--url", url)
if bundle_name:
args.extend(["--bundle", bundle_name])
subprocess.call(args)
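if __name__ == "__main__":
    # Illustrative call only; the server url and bundle name are made up.
    # The call blocks until the dialog is closed.
    show_missing_bundle_information(
        "https://ayon.example.com", bundle_name="MyBundle"
    )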

Binary file not shown. (Before: 4.2 KiB)

Binary file not shown. (Before: 16 KiB)

Binary file not shown. (Before: 15 KiB)

View file

@ -1,25 +0,0 @@
import os
from ayon_common.utils import is_staging_enabled
RESOURCES_DIR = os.path.dirname(os.path.abspath(__file__))
def get_resource_path(*args):
path_items = list(args)
path_items.insert(0, RESOURCES_DIR)
return os.path.sep.join(path_items)
def get_icon_path():
if is_staging_enabled():
return get_resource_path("AYON_staging.png")
return get_resource_path("AYON.png")
def load_stylesheet():
stylesheet_path = get_resource_path("stylesheet.css")
with open(stylesheet_path, "r") as stream:
content = stream.read()
return content

Binary file not shown. (Before: 8.9 KiB)

Binary file not shown. (Before: 2.1 KiB)

View file

@ -1,84 +0,0 @@
* {
font-size: 10pt;
font-family: "Noto Sans";
font-weight: 450;
outline: none;
}
QWidget {
color: #D3D8DE;
background: #2C313A;
border-radius: 0px;
}
QWidget:disabled {
color: #5b6779;
}
QLabel {
background: transparent;
}
QPushButton {
text-align:center center;
border: 0px solid transparent;
border-radius: 0.2em;
padding: 3px 5px 3px 5px;
background: #434a56;
}
QPushButton:hover {
background: rgba(168, 175, 189, 0.3);
color: #F0F2F5;
}
QPushButton:pressed {}
QPushButton:disabled {
background: #434a56;
}
QLineEdit {
border: 1px solid #373D48;
border-radius: 0.3em;
background: #21252B;
padding: 0.1em;
}
QLineEdit:disabled {
background: #2C313A;
}
QLineEdit:hover {
border-color: rgba(168, 175, 189, .3);
}
QLineEdit:focus {
border-color: rgb(92, 173, 214);
}
QLineEdit[state="invalid"] {
border-color: #AA5050;
}
#Separator {
background: rgba(75, 83, 98, 127);
}
#PasswordBtn {
border: none;
padding: 0.1em;
background: transparent;
}
#PasswordBtn:hover {
background: #434a56;
}
#LikeDisabledInput {
background: #2C313A;
}
#LikeDisabledInput:hover {
border-color: #373D48;
}
#LikeDisabledInput:focus {
border-color: #373D48;
}

View file

@ -1,36 +0,0 @@
import sys
from qtpy import QtWidgets, QtCore
def set_style_property(widget, property_name, property_value):
"""Set widget's property that may affect style.
Style of widget is polished if current property value is different.
"""
cur_value = widget.property(property_name)
if cur_value == property_value:
return
widget.setProperty(property_name, property_value)
widget.style().polish(widget)
def get_qt_app():
app = QtWidgets.QApplication.instance()
if app is not None:
return app
for attr_name in (
"AA_EnableHighDpiScaling",
"AA_UseHighDpiPixmaps",
):
attr = getattr(QtCore.Qt, attr_name, None)
if attr is not None:
QtWidgets.QApplication.setAttribute(attr)
if hasattr(QtWidgets.QApplication, "setHighDpiScaleFactorRoundingPolicy"):
QtWidgets.QApplication.setHighDpiScaleFactorRoundingPolicy(
QtCore.Qt.HighDpiScaleFactorRoundingPolicy.PassThrough
)
return QtWidgets.QApplication(sys.argv)
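# Minimal usage sketch: make sure a single QApplication exists before any
# widgets are created, mirroring the missing-bundle dialog earlier in this
# changeset ('_demo_qt_app' is a hypothetical helper name).
def _demo_qt_app():
    app = get_qt_app()
    label = QtWidgets.QLabel("AYON")
    label.show()
    app.exec_()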

View file

@ -1,90 +0,0 @@
import os
import sys
import appdirs
IS_BUILT_APPLICATION = getattr(sys, "frozen", False)
def get_ayon_appdirs(*args):
"""Local app data directory of AYON client.
Args:
*args (Iterable[str]): Subdirectories/files in local app data dir.
Returns:
str: Path to directory/file in local app data dir.
"""
return os.path.join(
appdirs.user_data_dir("AYON", "Ynput"),
*args
)
def is_staging_enabled():
"""Check if staging is enabled.
Returns:
bool: True if staging is enabled.
"""
return os.getenv("AYON_USE_STAGING") == "1"
def _create_local_site_id():
"""Create a local site identifier.
Returns:
str: Randomly generated site id.
"""
from coolname import generate_slug
new_id = generate_slug(3)
print("Created local site id \"{}\"".format(new_id))
return new_id
def get_local_site_id():
"""Get local site identifier.
Site id is created if it does not exist yet.
Returns:
str: Site id.
"""
# used for background syncing
site_id = os.environ.get("AYON_SITE_ID")
if site_id:
return site_id
site_id_path = get_ayon_appdirs("site_id")
if os.path.exists(site_id_path):
with open(site_id_path, "r") as stream:
site_id = stream.read()
if not site_id:
site_id = _create_local_site_id()
with open(site_id_path, "w") as stream:
stream.write(site_id)
return site_id
def get_ayon_launch_args(*args):
"""Launch arguments that can be used to launch ayon process.
Args:
*args (str): Additional arguments.
Returns:
list[str]: Launch arguments.
"""
output = [sys.executable]
if not IS_BUILT_APPLICATION:
output.append(sys.argv[0])
output.extend(args)
return output
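# Minimal usage sketch mirroring how 'show_missing_bundle_information'
# builds its subprocess arguments ('_demo_launch_args' and the script path
# are hypothetical, used for illustration only).
def _demo_launch_args():
    import subprocess
    args = get_ayon_launch_args("/path/to/script.py", "--skip-bootstrap")
    return subprocess.list2cmdline(args)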

View file

@ -1,135 +0,0 @@
import warnings
import functools
import pyblish.api
class ActionDeprecatedWarning(DeprecationWarning):
pass
def deprecated(new_destination):
"""Mark functions as deprecated.
It will result in a warning being emitted when the function is used.
"""
func = None
if callable(new_destination):
func = new_destination
new_destination = None
def _decorator(decorated_func):
if new_destination is None:
warning_message = (
" Please check content of deprecated function to figure out"
" possible replacement."
)
else:
warning_message = " Please replace your usage with '{}'.".format(
new_destination
)
@functools.wraps(decorated_func)
def wrapper(*args, **kwargs):
warnings.simplefilter("always", ActionDeprecatedWarning)
warnings.warn(
(
"Call to deprecated function '{}'"
"\nFunction was moved or removed.{}"
).format(decorated_func.__name__, warning_message),
category=ActionDeprecatedWarning,
stacklevel=4
)
return decorated_func(*args, **kwargs)
return wrapper
if func is None:
return _decorator
return _decorator(func)
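# Minimal usage sketch of the decorator above ('old_function' and
# 'new_module.new_function' are hypothetical names used for illustration).
@deprecated("new_module.new_function")
def old_function():
    return 42
# Calling 'old_function()' emits an ActionDeprecatedWarning that points the
# caller to 'new_module.new_function'.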
@deprecated("openpype.pipeline.publish.get_errored_instances_from_context")
def get_errored_instances_from_context(context, plugin=None):
"""
Deprecated:
Since 3.14.* will be removed in 3.16.* or later.
"""
from openpype.pipeline.publish import get_errored_instances_from_context
return get_errored_instances_from_context(context, plugin=plugin)
@deprecated("openpype.pipeline.publish.get_errored_plugins_from_context")
def get_errored_plugins_from_data(context):
"""
Deprecated:
Since 3.14.* will be removed in 3.16.* or later.
"""
from openpype.pipeline.publish import get_errored_plugins_from_context
return get_errored_plugins_from_context(context)
class RepairAction(pyblish.api.Action):
"""Repairs the action
To process the repairing this requires a static `repair(instance)` method
is available on the plugin.
Deprecated:
'RepairAction' and 'RepairContextAction' were moved to
'openpype.pipeline.publish' please change you imports.
There is no "reasonable" way hot mark these classes as deprecated
to show warning of wrong import. Deprecated since 3.14.* will be
removed in 3.16.*
"""
label = "Repair"
on = "failed" # This action is only available on a failed plug-in
icon = "wrench" # Icon from Awesome Icon
def process(self, context, plugin):
if not hasattr(plugin, "repair"):
raise RuntimeError("Plug-in does not have repair method.")
# Get the errored instances
self.log.info("Finding failed instances..")
errored_instances = get_errored_instances_from_context(context,
plugin=plugin)
for instance in errored_instances:
plugin.repair(instance)
class RepairContextAction(pyblish.api.Action):
"""Repairs the action
To process the repairing this requires a static `repair(instance)` method
is available on the plugin.
Deprecated:
'RepairAction' and 'RepairContextAction' were moved to
'openpype.pipeline.publish' please change you imports.
There is no "reasonable" way hot mark these classes as deprecated
to show warning of wrong import. Deprecated since 3.14.* will be
removed in 3.16.*
"""
label = "Repair"
on = "failed" # This action is only available on a failed plug-in
def process(self, context, plugin):
if not hasattr(plugin, "repair"):
raise RuntimeError("Plug-in does not have repair method.")
# Get the errored instances
self.log.info("Finding failed instances..")
errored_plugins = get_errored_plugins_from_data(context)
# Apply pyblish.logic to get the instances for the plug-in
if plugin in errored_plugins:
self.log.info("Attempting fix ...")
plugin.repair(context)

View file

@ -20,6 +20,7 @@ from openpype.pipeline import get_current_asset_name, get_current_task_name
from openpype.tools.utils import host_tools
from .workio import OpenFileCacher
from . import pipeline
PREVIEW_COLLECTIONS: Dict = dict()
@ -344,6 +345,26 @@ class LaunchWorkFiles(LaunchQtApp):
self._window.refresh()
class SetFrameRange(bpy.types.Operator):
bl_idname = "wm.ayon_set_frame_range"
bl_label = "Set Frame Range"
def execute(self, context):
data = pipeline.get_asset_data()
pipeline.set_frame_range(data)
return {"FINISHED"}
class SetResolution(bpy.types.Operator):
bl_idname = "wm.ayon_set_resolution"
bl_label = "Set Resolution"
def execute(self, context):
data = pipeline.get_asset_data()
pipeline.set_resolution(data)
return {"FINISHED"}
class TOPBAR_MT_avalon(bpy.types.Menu):
"""Avalon menu."""
@ -381,9 +402,11 @@ class TOPBAR_MT_avalon(bpy.types.Menu):
layout.operator(LaunchManager.bl_idname, text="Manage...")
layout.operator(LaunchLibrary.bl_idname, text="Library...")
layout.separator()
layout.operator(SetFrameRange.bl_idname, text="Set Frame Range")
layout.operator(SetResolution.bl_idname, text="Set Resolution")
layout.separator()
layout.operator(LaunchWorkFiles.bl_idname, text="Work Files...")
# TODO (jasper): maybe add 'Reload Pipeline', 'Set Frame Range' and
# 'Set Resolution'?
# TODO (jasper): maybe add 'Reload Pipeline'
def draw_avalon_menu(self, context):
@ -399,6 +422,8 @@ classes = [
LaunchManager,
LaunchLibrary,
LaunchWorkFiles,
SetFrameRange,
SetResolution,
TOPBAR_MT_avalon,
]

View file

@ -113,22 +113,21 @@ def message_window(title, message):
_process_app_events()
def set_start_end_frames():
def get_asset_data():
project_name = get_current_project_name()
asset_name = get_current_asset_name()
asset_doc = get_asset_by_name(project_name, asset_name)
return asset_doc.get("data")
def set_frame_range(data):
scene = bpy.context.scene
# Default scene settings
frameStart = scene.frame_start
frameEnd = scene.frame_end
fps = scene.render.fps / scene.render.fps_base
resolution_x = scene.render.resolution_x
resolution_y = scene.render.resolution_y
# Check if settings are set
data = asset_doc.get("data")
if not data:
return
@ -139,26 +138,47 @@ def set_start_end_frames():
frameEnd = data.get("frameEnd")
if data.get("fps"):
fps = data.get("fps")
if data.get("resolutionWidth"):
resolution_x = data.get("resolutionWidth")
if data.get("resolutionHeight"):
resolution_y = data.get("resolutionHeight")
scene.frame_start = frameStart
scene.frame_end = frameEnd
scene.render.fps = round(fps)
scene.render.fps_base = round(fps) / fps
def set_resolution(data):
scene = bpy.context.scene
# Default scene settings
resolution_x = scene.render.resolution_x
resolution_y = scene.render.resolution_y
if not data:
return
if data.get("resolutionWidth"):
resolution_x = data.get("resolutionWidth")
if data.get("resolutionHeight"):
resolution_y = data.get("resolutionHeight")
scene.render.resolution_x = resolution_x
scene.render.resolution_y = resolution_y
def on_new():
set_start_end_frames()
project = os.environ.get("AVALON_PROJECT")
settings = get_project_settings(project)
settings = get_project_settings(project).get("blender")
unit_scale_settings = settings.get("blender").get("unit_scale_settings")
set_resolution_startup = settings.get("set_resolution_startup")
set_frames_startup = settings.get("set_frames_startup")
data = get_asset_data()
if set_resolution_startup:
set_resolution(data)
if set_frames_startup:
set_frame_range(data)
unit_scale_settings = settings.get("unit_scale_settings")
unit_scale_enabled = unit_scale_settings.get("enabled")
if unit_scale_enabled:
unit_scale = unit_scale_settings.get("base_file_unit_scale")
@ -166,12 +186,20 @@ def on_new():
def on_open():
set_start_end_frames()
project = os.environ.get("AVALON_PROJECT")
settings = get_project_settings(project)
settings = get_project_settings(project).get("blender")
unit_scale_settings = settings.get("blender").get("unit_scale_settings")
set_resolution_startup = settings.get("set_resolution_startup")
set_frames_startup = settings.get("set_frames_startup")
data = get_asset_data()
if set_resolution_startup:
set_resolution(data)
if set_frames_startup:
set_frame_range(data)
unit_scale_settings = settings.get("unit_scale_settings")
unit_scale_enabled = unit_scale_settings.get("enabled")
apply_on_opening = unit_scale_settings.get("apply_on_opening")
if unit_scale_enabled and apply_on_opening:

View file

@ -29,6 +29,8 @@ class CollectReview(pyblish.api.InstancePlugin):
camera = cameras[0].name
self.log.debug(f"camera: {camera}")
focal_length = cameras[0].data.lens
# Get isolate objects list from meshes instance members.
isolate_objects = [
obj
@ -40,6 +42,10 @@ class CollectReview(pyblish.api.InstancePlugin):
task = instance.context.data["task"]
# Store focal length in `burninDataMembers`
burninData = instance.data.setdefault("burninDataMembers", {})
burninData["focalLength"] = focal_length
instance.data.update({
"subset": f"{task}Review",
"review_camera": camera,

View file

@ -22,8 +22,6 @@ class ExtractABC(publish.Extractor):
filepath = os.path.join(stagingdir, filename)
context = bpy.context
scene = context.scene
view_layer = context.view_layer
# Perform extraction
self.log.info("Performing extraction..")
@ -31,24 +29,25 @@ class ExtractABC(publish.Extractor):
plugin.deselect_all()
selected = []
asset_group = None
active = None
for obj in instance:
obj.select_set(True)
selected.append(obj)
# Set as active the asset group
if obj.get(AVALON_PROPERTY):
asset_group = obj
active = obj
context = plugin.create_blender_context(
active=asset_group, selected=selected)
active=active, selected=selected)
# We export the abc
bpy.ops.wm.alembic_export(
context,
filepath=filepath,
selected=True,
flatten=False
)
with bpy.context.temp_override(**context):
# We export the abc
bpy.ops.wm.alembic_export(
filepath=filepath,
selected=True,
flatten=False
)
plugin.deselect_all()

View file

@ -0,0 +1,73 @@
import os
import bpy
from openpype.pipeline import publish
from openpype.hosts.blender.api import plugin
from openpype.hosts.blender.api.pipeline import AVALON_PROPERTY
class ExtractCameraABC(publish.Extractor):
"""Extract camera as ABC."""
label = "Extract Camera (ABC)"
hosts = ["blender"]
families = ["camera"]
optional = True
def process(self, instance):
# Define extract output file path
stagingdir = self.staging_dir(instance)
filename = f"{instance.name}.abc"
filepath = os.path.join(stagingdir, filename)
context = bpy.context
# Perform extraction
self.log.info("Performing extraction..")
plugin.deselect_all()
selected = []
active = None
asset_group = None
for obj in instance:
if obj.get(AVALON_PROPERTY):
asset_group = obj
break
assert asset_group, "No asset group found"
# Need to cast to list because children is a tuple
selected = list(asset_group.children)
active = selected[0]
for obj in selected:
obj.select_set(True)
context = plugin.create_blender_context(
active=active, selected=selected)
with bpy.context.temp_override(**context):
# We export the abc
bpy.ops.wm.alembic_export(
filepath=filepath,
selected=True,
flatten=True
)
plugin.deselect_all()
if "representations" not in instance.data:
instance.data["representations"] = []
representation = {
'name': 'abc',
'ext': 'abc',
'files': filename,
"stagingDir": stagingdir,
}
instance.data["representations"].append(representation)
self.log.info("Extracted instance '%s' to: %s",
instance.name, representation)

View file

@ -9,7 +9,7 @@ from openpype.hosts.blender.api import plugin
class ExtractCamera(publish.Extractor):
"""Extract as the camera as FBX."""
label = "Extract Camera"
label = "Extract Camera (FBX)"
hosts = ["blender"]
families = ["camera"]
optional = True

View file

@ -18,8 +18,10 @@ class SelectInvalidAction(pyblish.api.Action):
icon = "search" # Icon from Awesome Icon
def process(self, context, plugin):
errored_instances = get_errored_instances_from_context(context,
plugin=plugin)
errored_instances = get_errored_instances_from_context(
context,
plugin=plugin,
)
# Get the invalid nodes for the plug-ins
self.log.info("Finding invalid nodes..")
@ -51,6 +53,7 @@ class SelectInvalidAction(pyblish.api.Action):
names = set()
for tool in invalid:
flow.Select(tool, True)
comp.SetActiveTool(tool)
names.add(tool.Name)
self.log.info(
"Selecting invalid tools: %s" % ", ".join(sorted(names))

View file

@ -94,15 +94,14 @@ class ExtractRender(pyblish.api.InstancePlugin):
# Generate thumbnail.
thumbnail_path = os.path.join(path, "thumbnail.png")
ffmpeg_path = openpype.lib.get_ffmpeg_tool_path("ffmpeg")
args = [
ffmpeg_path,
args = openpype.lib.get_ffmpeg_tool_args(
"ffmpeg",
"-y",
"-i", os.path.join(path, list(collections[0])[0]),
"-vf", "scale=300:-1",
"-vframes", "1",
thumbnail_path
]
)
process = subprocess.Popen(
args,
stdout=subprocess.PIPE,

View file

@ -2,7 +2,7 @@ import os
import pyblish.api
from openpype.lib import (
get_oiio_tools_path,
get_oiio_tool_args,
run_subprocess,
)
from openpype.pipeline import publish
@ -18,7 +18,7 @@ class ExtractFrames(publish.Extractor):
movie_extensions = ["mov", "mp4"]
def process(self, instance):
oiio_tool_path = get_oiio_tools_path()
oiio_tool_args = get_oiio_tool_args("oiiotool")
staging_dir = self.staging_dir(instance)
output_template = os.path.join(staging_dir, instance.data["name"])
sequence = instance.context.data["activeTimeline"]
@ -36,7 +36,7 @@ class ExtractFrames(publish.Extractor):
output_path = output_template
output_path += ".{:04d}.{}".format(int(frame), output_ext)
args = [oiio_tool_path]
args = list(oiio_tool_args)
ext = os.path.splitext(input_path)[1][1:]
if ext in self.movie_extensions:

View file

@ -1,5 +1,5 @@
from openpype.hosts.houdini.api import plugin
from openpype.lib import EnumDef
from openpype.lib import EnumDef, BoolDef
class CreateArnoldRop(plugin.HoudiniCreator):
@ -24,7 +24,7 @@ class CreateArnoldRop(plugin.HoudiniCreator):
# Add chunk size attribute
instance_data["chunkSize"] = 1
# Submit for job publishing
instance_data["farm"] = True
instance_data["farm"] = pre_create_data.get("farm")
instance = super(CreateArnoldRop, self).create(
subset_name,
@ -64,6 +64,9 @@ class CreateArnoldRop(plugin.HoudiniCreator):
]
return attrs + [
BoolDef("farm",
label="Submitting to Farm",
default=True),
EnumDef("image_format",
image_format_enum,
default=self.ext,

View file

@ -21,7 +21,7 @@ class CreateKarmaROP(plugin.HoudiniCreator):
# Add chunk size attribute
instance_data["chunkSize"] = 10
# Submit for job publishing
instance_data["farm"] = True
instance_data["farm"] = pre_create_data.get("farm")
instance = super(CreateKarmaROP, self).create(
subset_name,
@ -67,6 +67,7 @@ class CreateKarmaROP(plugin.HoudiniCreator):
camera = None
for node in self.selected_nodes:
if node.type().name() == "cam":
camera = node.path()
has_camera = pre_create_data.get("cam_res")
if has_camera:
res_x = node.evalParm("resx")
@ -96,6 +97,9 @@ class CreateKarmaROP(plugin.HoudiniCreator):
]
return attrs + [
BoolDef("farm",
label="Submitting to Farm",
default=True),
EnumDef("image_format",
image_format_enum,
default="exr",

View file

@ -21,7 +21,7 @@ class CreateMantraROP(plugin.HoudiniCreator):
# Add chunk size attribute
instance_data["chunkSize"] = 10
# Submit for job publishing
instance_data["farm"] = True
instance_data["farm"] = pre_create_data.get("farm")
instance = super(CreateMantraROP, self).create(
subset_name,
@ -76,6 +76,9 @@ class CreateMantraROP(plugin.HoudiniCreator):
]
return attrs + [
BoolDef("farm",
label="Submitting to Farm",
default=True),
EnumDef("image_format",
image_format_enum,
default="exr",

View file

@ -3,7 +3,7 @@
import hou # noqa
from openpype.hosts.houdini.api import plugin
from openpype.lib import EnumDef
from openpype.lib import EnumDef, BoolDef
class CreateRedshiftROP(plugin.HoudiniCreator):
@ -23,7 +23,7 @@ class CreateRedshiftROP(plugin.HoudiniCreator):
# Add chunk size attribute
instance_data["chunkSize"] = 10
# Submit for job publishing
instance_data["farm"] = True
instance_data["farm"] = pre_create_data.get("farm")
instance = super(CreateRedshiftROP, self).create(
subset_name,
@ -100,6 +100,9 @@ class CreateRedshiftROP(plugin.HoudiniCreator):
]
return attrs + [
BoolDef("farm",
label="Submitting to Farm",
default=True),
EnumDef("image_format",
image_format_enum,
default=self.ext,

View file

@ -25,7 +25,7 @@ class CreateVrayROP(plugin.HoudiniCreator):
# Add chunk size attribute
instance_data["chunkSize"] = 10
# Submit for job publishing
instance_data["farm"] = True
instance_data["farm"] = pre_create_data.get("farm")
instance = super(CreateVrayROP, self).create(
subset_name,
@ -139,6 +139,9 @@ class CreateVrayROP(plugin.HoudiniCreator):
]
return attrs + [
BoolDef("farm",
label="Submitting to Farm",
default=True),
EnumDef("image_format",
image_format_enum,
default=self.ext,

View file

@ -17,5 +17,5 @@ class CollectPointcacheType(pyblish.api.InstancePlugin):
def process(self, instance):
if instance.data["creator_identifier"] == "io.openpype.creators.houdini.bgeo": # noqa: E501
instance.data["families"] += ["bgeo"]
elif instance.data["creator_identifier"] == "io.openpype.creators.houdini.alembic": # noqa: E501
elif instance.data["creator_identifier"] == "io.openpype.creators.houdini.pointcache": # noqa: E501
instance.data["families"] += ["abc"]

View file

@ -27,20 +27,16 @@ from openpype.settings import get_project_settings
from openpype.pipeline import (
get_current_project_name,
get_current_asset_name,
get_current_task_name,
discover_loader_plugins,
loaders_from_representation,
get_representation_path,
load_container,
registered_host,
registered_host
)
from openpype.lib import NumberDef
from openpype.pipeline.context_tools import get_current_project_asset
from openpype.pipeline.create import CreateContext
from openpype.pipeline.context_tools import (
get_current_asset_name,
get_current_project_name,
get_current_task_name
)
from openpype.lib.profiles_filtering import filter_profiles

View file

@ -8,13 +8,24 @@ from maya import cmds
from maya.app.renderSetup.model import renderSetup
from openpype.lib import BoolDef, Logger
from openpype.pipeline import AVALON_CONTAINER_ID, Anatomy, CreatedInstance
from openpype.pipeline import Creator as NewCreator
from openpype.pipeline import (
CreatorError, LegacyCreator, LoaderPlugin, get_representation_path,
legacy_io)
from openpype.pipeline.load import LoadError
from openpype.settings import get_project_settings
from openpype.pipeline import (
AVALON_CONTAINER_ID,
Anatomy,
CreatedInstance,
Creator as NewCreator,
AutoCreator,
HiddenCreator,
CreatorError,
LegacyCreator,
LoaderPlugin,
get_representation_path,
legacy_io,
)
from openpype.pipeline.load import LoadError
from . import lib
from .lib import imprint, read
@ -177,10 +188,42 @@ class MayaCreatorBase(object):
return node_data
def _default_collect_instances(self):
self.cache_subsets(self.collection_shared_data)
cached_subsets = self.collection_shared_data["maya_cached_subsets"]
for node in cached_subsets.get(self.identifier, []):
node_data = self.read_instance_node(node)
created_instance = CreatedInstance.from_existing(node_data, self)
self._add_instance_to_context(created_instance)
def _default_update_instances(self, update_list):
for created_inst, _changes in update_list:
data = created_inst.data_to_store()
node = data.get("instance_node")
self.imprint_instance_node(node, data)
def _default_remove_instances(self, instances):
"""Remove specified instance from the scene.
This only removes the `id` parameter so the node is no longer an
instance, because it might contain valuable data for the artist.
"""
for instance in instances:
node = instance.data.get("instance_node")
if node:
cmds.delete(node)
self._remove_instance_from_context(instance)
@six.add_metaclass(ABCMeta)
class MayaCreator(NewCreator, MayaCreatorBase):
settings_name = None
def create(self, subset_name, instance_data, pre_create_data):
members = list()
@ -202,34 +245,13 @@ class MayaCreator(NewCreator, MayaCreatorBase):
return instance
def collect_instances(self):
self.cache_subsets(self.collection_shared_data)
cached_subsets = self.collection_shared_data["maya_cached_subsets"]
for node in cached_subsets.get(self.identifier, []):
node_data = self.read_instance_node(node)
created_instance = CreatedInstance.from_existing(node_data, self)
self._add_instance_to_context(created_instance)
return self._default_collect_instances()
def update_instances(self, update_list):
for created_inst, _changes in update_list:
data = created_inst.data_to_store()
node = data.get("instance_node")
self.imprint_instance_node(node, data)
return self._default_update_instances(update_list)
def remove_instances(self, instances):
"""Remove specified instance from the scene.
This is only removing `id` parameter so instance is no longer
instance, because it might contain valuable data for artist.
"""
for instance in instances:
node = instance.data.get("instance_node")
if node:
cmds.delete(node)
self._remove_instance_from_context(instance)
return self._default_remove_instances(instances)
def get_pre_create_attr_defs(self):
return [
@ -238,6 +260,61 @@ class MayaCreator(NewCreator, MayaCreatorBase):
default=True)
]
def apply_settings(self, project_settings, system_settings):
"""Method called on initialization of plugin to apply settings."""
settings_name = self.settings_name
if settings_name is None:
settings_name = self.__class__.__name__
settings = project_settings["maya"]["create"]
settings = settings.get(settings_name)
if settings is None:
self.log.debug(
"No settings found for {}".format(self.__class__.__name__)
)
return
for key, value in settings.items():
setattr(self, key, value)
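# Illustrative sketch of the 'settings_name' override handled above: a
# creator whose settings live under a legacy key can point at it explicitly
# ('CreateExample' and "CreateExampleLegacy" are hypothetical names; compare
# 'CreateArnoldSceneSource' below, which uses settings_name = "CreateAss").
class CreateExample(MayaCreator):
    identifier = "io.openpype.creators.maya.example"
    label = "Example"
    family = "example"
    icon = "cube"
    # Values from project_settings["maya"]["create"]["CreateExampleLegacy"]
    # are applied as attributes by 'apply_settings' above.
    settings_name = "CreateExampleLegacy"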
class MayaAutoCreator(AutoCreator, MayaCreatorBase):
"""Automatically triggered creator for Maya.
The plugin is not visible in the UI, and its 'create' method does not
expect any arguments.
"""
def collect_instances(self):
return self._default_collect_instances()
def update_instances(self, update_list):
return self._default_update_instances(update_list)
def remove_instances(self, instances):
return self._default_remove_instances(instances)
class MayaHiddenCreator(HiddenCreator, MayaCreatorBase):
"""Hidden creator for Maya.
The plugin is not visible in the UI, and it does not have strictly
defined arguments for the 'create' method.
"""
def create(self, *args, **kwargs):
return MayaCreator.create(self, *args, **kwargs)
def collect_instances(self):
return self._default_collect_instances()
def update_instances(self, update_list):
return self._default_update_instances(update_list)
def remove_instances(self, instances):
return self._default_remove_instances(instances)
def ensure_namespace(namespace):
"""Make sure the namespace exists.

View file

@ -51,7 +51,7 @@ class MayaLegacyConvertor(SubsetConvertorPlugin,
# From all current new style manual creators find the mapping
# from family to identifier
family_to_id = {}
for identifier, creator in self.create_context.manual_creators.items():
for identifier, creator in self.create_context.creators.items():
family = getattr(creator, "family", None)
if not family:
continue
@ -70,7 +70,6 @@ class MayaLegacyConvertor(SubsetConvertorPlugin,
# logic was thus to be live to the current task to begin with.
data = dict()
data["task"] = self.create_context.get_current_task_name()
for family, instance_nodes in legacy.items():
if family not in family_to_id:
self.log.warning(
@ -81,7 +80,7 @@ class MayaLegacyConvertor(SubsetConvertorPlugin,
continue
creator_id = family_to_id[family]
creator = self.create_context.manual_creators[creator_id]
creator = self.create_context.creators[creator_id]
data["creator_identifier"] = creator_id
if isinstance(creator, plugin.RenderlayerCreator):

View file

@ -8,15 +8,13 @@ from openpype.lib import (
)
class CreateAnimation(plugin.MayaCreator):
"""Animation output for character rigs"""
# We hide the animation creator from the UI since the creation of it
# is automated upon loading a rig. There's an inventory action to recreate
# it for loaded rigs if by chance someone deleted the animation instance.
# Note: This setting is actually applied from project settings
enabled = False
class CreateAnimation(plugin.MayaHiddenCreator):
"""Animation output for character rigs
We hide the animation creator from the UI since the creation of it is
automated upon loading a rig. There's an inventory action to recreate it
for loaded rigs if by chance someone deleted the animation instance.
"""
identifier = "io.openpype.creators.maya.animation"
name = "animationDefault"
label = "Animation"
@ -28,9 +26,6 @@ class CreateAnimation(plugin.MayaCreator):
include_parent_hierarchy = False
include_user_defined_attributes = False
# TODO: Would be great if we could visually hide this from the creator
# by default but do allow to generate it through code.
def get_instance_attr_defs(self):
defs = lib.collect_animation_defs()
@ -85,3 +80,12 @@ class CreateAnimation(plugin.MayaCreator):
"""
return defs
def apply_settings(self, project_settings, system_settings):
super(CreateAnimation, self).apply_settings(
project_settings, system_settings
)
# Hardcode the creator to be enabled because existing settings would
# disable it, which would make the creator plugin undiscoverable.
self.enabled = True

View file

@ -15,6 +15,7 @@ class CreateArnoldSceneSource(plugin.MayaCreator):
label = "Arnold Scene Source"
family = "ass"
icon = "cube"
settings_name = "CreateAss"
expandProcedurals = False
motionBlur = True

View file

@ -12,7 +12,7 @@ class CreateModel(plugin.MayaCreator):
label = "Model"
family = "model"
icon = "cube"
defaults = ["Main", "Proxy", "_MD", "_HD", "_LD"]
default_variants = ["Main", "Proxy", "_MD", "_HD", "_LD"]
write_color_sets = False
write_face_sets = False

View file

@ -20,6 +20,6 @@ class CreateRig(plugin.MayaCreator):
instance_node = instance.get("instance_node")
self.log.info("Creating Rig instance set up ...")
controls = cmds.sets(name="controls_SET", empty=True)
pointcache = cmds.sets(name="out_SET", empty=True)
controls = cmds.sets(name=subset_name + "_controls_SET", empty=True)
pointcache = cmds.sets(name=subset_name + "_out_SET", empty=True)
cmds.sets([controls, pointcache], forceElement=instance_node)

View file

@ -9,7 +9,7 @@ class CreateSetDress(plugin.MayaCreator):
label = "Set Dress"
family = "setdress"
icon = "cubes"
defaults = ["Main", "Anim"]
default_variants = ["Main", "Anim"]
def get_instance_attr_defs(self):
return [

View file

@ -107,6 +107,9 @@ class CollectReview(pyblish.api.InstancePlugin):
data["displayLights"] = display_lights
data["burninDataMembers"] = burninDataMembers
for key, value in instance.data["publish_attributes"].items():
data["publish_attributes"][key] = value
# The review instance must be active
cmds.setAttr(str(instance) + '.active', 1)

View file

@ -15,8 +15,14 @@ import pyblish.api
from maya import cmds # noqa
from openpype.lib.vendor_bin_utils import find_executable
from openpype.lib import source_hash, run_subprocess, get_oiio_tools_path
from openpype.lib import (
find_executable,
source_hash,
run_subprocess,
get_oiio_tool_args,
ToolNotFoundError,
)
from openpype.pipeline import legacy_io, publish, KnownPublishError
from openpype.hosts.maya.api import lib
@ -267,12 +273,11 @@ class MakeTX(TextureProcessor):
"""
maketx_path = get_oiio_tools_path("maketx")
if not maketx_path:
raise AssertionError(
"OIIO 'maketx' tool not found. Result: {}".format(maketx_path)
)
try:
maketx_args = get_oiio_tool_args("maketx")
except ToolNotFoundError:
raise KnownPublishError(
"OpenImageIO is not available on the machine")
# Define .tx filepath in staging if source file is not .tx
fname, ext = os.path.splitext(os.path.basename(source))
@ -328,8 +333,7 @@ class MakeTX(TextureProcessor):
self.log.info("Generating .tx file for %s .." % source)
subprocess_args = [
maketx_path,
subprocess_args = maketx_args + [
"-v", # verbose
"-u", # update mode
# --checknan doesn't influence the output file but aborts the

View file

@ -3,7 +3,9 @@
from __future__ import absolute_import
import pyblish.api
from openpype.pipeline.publish import ValidateContentsOrder
from openpype.pipeline.publish import (
ValidateContentsOrder, PublishValidationError
)
from maya import cmds
@ -108,4 +110,5 @@ class ValidateInstanceInContext(pyblish.api.InstancePlugin):
asset = instance.data.get("asset")
context_asset = instance.context.data["assetEntity"]["name"]
msg = "{} has asset {}".format(instance.name, asset)
assert asset == context_asset, msg
if asset != context_asset:
raise PublishValidationError(msg)

View file

@ -63,15 +63,10 @@ class ValidateModelContent(pyblish.api.InstancePlugin):
return True
# Top group
assemblies = cmds.ls(content_instance, assemblies=True, long=True)
if len(assemblies) != 1 and cls.validate_top_group:
top_parents = set([x.split("|")[1] for x in content_instance])
if cls.validate_top_group and len(top_parents) != 1:
cls.log.error("Must have exactly one top group")
return assemblies
if len(assemblies) == 0:
cls.log.warning("No top group found. "
"(Are there objects in the instance?"
" Or is it parented in another group?)")
return assemblies or True
return top_parents
def _is_visible(node):
"""Return whether node is visible"""
@ -82,11 +77,11 @@ class ValidateModelContent(pyblish.api.InstancePlugin):
visibility=True)
# The roots must be visible (the assemblies)
for assembly in assemblies:
if not _is_visible(assembly):
cls.log.error("Invisible assembly (root node) is not "
"allowed: {0}".format(assembly))
invalid.add(assembly)
for parent in top_parents:
if not _is_visible(parent):
cls.log.error("Invisible parent (root node) is not "
"allowed: {0}".format(parent))
invalid.add(parent)
# Ensure at least one shape is visible
if not any(_is_visible(shape) for shape in shapes):

View file

@ -47,7 +47,7 @@ class ValidateRigOutputIds(pyblish.api.InstancePlugin):
invalid = {}
if compute:
out_set = next(x for x in instance if x.endswith("out_SET"))
out_set = next(x for x in instance if "out_SET" in x)
instance_nodes = cmds.sets(out_set, query=True, nodesOnly=True)
instance_nodes = cmds.ls(instance_nodes, long=True)

View file

@ -1,10 +1,9 @@
import os
import shutil
from PIL import Image
from openpype.lib import (
run_subprocess,
get_ffmpeg_tool_path,
get_ffmpeg_tool_args,
)
from openpype.pipeline import publish
from openpype.hosts.photoshop import api as photoshop
@ -85,7 +84,7 @@ class ExtractReview(publish.Extractor):
instance.data["representations"].append(repre_skeleton)
processed_img_names = [img_list]
ffmpeg_path = get_ffmpeg_tool_path("ffmpeg")
ffmpeg_args = get_ffmpeg_tool_args("ffmpeg")
instance.data["stagingDir"] = staging_dir
@ -94,13 +93,21 @@ class ExtractReview(publish.Extractor):
source_files_pattern = self._check_and_resize(processed_img_names,
source_files_pattern,
staging_dir)
self._generate_thumbnail(ffmpeg_path, instance, source_files_pattern,
staging_dir)
self._generate_thumbnail(
list(ffmpeg_args),
instance,
source_files_pattern,
staging_dir)
no_of_frames = len(processed_img_names)
if no_of_frames > 1:
self._generate_mov(ffmpeg_path, instance, fps, no_of_frames,
source_files_pattern, staging_dir)
self._generate_mov(
list(ffmpeg_args),
instance,
fps,
no_of_frames,
source_files_pattern,
staging_dir)
self.log.info(f"Extracted {instance} to {staging_dir}")
@ -142,8 +149,9 @@ class ExtractReview(publish.Extractor):
"tags": self.mov_options['tags']
})
def _generate_thumbnail(self, ffmpeg_path, instance, source_files_pattern,
staging_dir):
def _generate_thumbnail(
self, ffmpeg_args, instance, source_files_pattern, staging_dir
):
"""Generates scaled down thumbnail and adds it as representation.
Args:
@ -157,8 +165,7 @@ class ExtractReview(publish.Extractor):
# Generate thumbnail
thumbnail_path = os.path.join(staging_dir, "thumbnail.jpg")
self.log.info(f"Generate thumbnail {thumbnail_path}")
args = [
ffmpeg_path,
args = ffmpeg_args + [
"-y",
"-i", source_files_pattern,
"-vf", "scale=300:-1",

View file

@ -1,8 +1,9 @@
import os
import subprocess
import tempfile
import pyblish.api
from openpype.lib import (
get_ffmpeg_tool_path,
get_ffmpeg_tool_args,
get_ffprobe_streams,
path_to_subprocess_arg,
run_subprocess,
@ -62,12 +63,12 @@ class ExtractThumbnailSP(pyblish.api.InstancePlugin):
instance.context.data["cleanupFullPaths"].append(full_thumbnail_path)
ffmpeg_path = get_ffmpeg_tool_path("ffmpeg")
ffmpeg_executable_args = get_ffmpeg_tool_args("ffmpeg")
ffmpeg_args = self.ffmpeg_args or {}
jpeg_items = [
path_to_subprocess_arg(ffmpeg_path),
subprocess.list2cmdline(ffmpeg_executable_args),
# override file if already exists
"-y"
]

View file

@ -9,7 +9,8 @@ import json
import pyblish.api
from openpype.lib import (
get_oiio_tools_path,
get_oiio_tool_args,
ToolNotFoundError,
run_subprocess,
)
from openpype.pipeline import KnownPublishError
@ -34,11 +35,12 @@ class ExtractConvertToEXR(pyblish.api.InstancePlugin):
if not repres:
return
oiio_path = get_oiio_tools_path()
# Raise an exception when oiiotool is not available
# - this can currently happen on MacOS machines
if not os.path.exists(oiio_path):
KnownPublishError(
try:
oiio_args = get_oiio_tool_args("oiiotool")
except ToolNotFoundError:
# Raise an exception when oiiotool is not available
# - this can currently happen on MacOS machines
raise KnownPublishError(
"OpenImageIO tool is not available on this machine."
)
@ -64,8 +66,8 @@ class ExtractConvertToEXR(pyblish.api.InstancePlugin):
src_filepaths.add(src_filepath)
args = [
oiio_path, src_filepath,
args = oiio_args + [
src_filepath,
"--compression", self.exr_compression,
# TODO how to define color conversion?
"--colorconvert", "sRGB", "linear",

View file

@ -54,7 +54,8 @@ class UnrealAddon(OpenPypeModule, IHostAddon):
# Set default environments if are not set via settings
defaults = {
"OPENPYPE_LOG_NO_COLORS": "True"
"OPENPYPE_LOG_NO_COLORS": "True",
"UE_PYTHONPATH": os.environ.get("PYTHONPATH", ""),
}
for key, value in defaults.items():
if not env.get(key):

View file

@ -3,21 +3,21 @@
import os
import copy
from pathlib import Path
from openpype.widgets.splash_screen import SplashScreen
from qtpy import QtCore
from openpype.hosts.unreal.ue_workers import (
UEProjectGenerationWorker,
UEPluginInstallWorker
)
from openpype import resources
from openpype.lib import (
PreLaunchHook,
ApplicationLaunchFailed,
ApplicationNotFound,
)
from openpype.pipeline.workfile import get_workfile_template_key
import openpype.hosts.unreal.lib as unreal_lib
from openpype.hosts.unreal.ue_workers import (
UEProjectGenerationWorker,
UEPluginInstallWorker
)
from openpype.hosts.unreal.ui import SplashScreen
class UnrealPrelaunchHook(PreLaunchHook):
@ -187,24 +187,36 @@ class UnrealPrelaunchHook(PreLaunchHook):
project_path.mkdir(parents=True, exist_ok=True)
# Set "AYON_UNREAL_PLUGIN" to current process environment for
# execution of `create_unreal_project`
if self.launch_context.env.get("AYON_UNREAL_PLUGIN"):
self.log.info((
f"{self.signature} using Ayon plugin from "
f"{self.launch_context.env.get('AYON_UNREAL_PLUGIN')}"
))
env_key = "AYON_UNREAL_PLUGIN"
if self.launch_context.env.get(env_key):
os.environ[env_key] = self.launch_context.env[env_key]
# engine_path points to the specific Unreal Engine root
# so, we are going up from the executable itself 3 levels.
engine_path: Path = Path(executable).parents[3]
if not unreal_lib.check_plugin_existence(engine_path):
self.exec_plugin_install(engine_path)
# Check if new env variable exists, and if it does, if the path
# actually contains the plugin. If not, install it.
built_plugin_path = self.launch_context.env.get(
"AYON_BUILT_UNREAL_PLUGIN", None)
if unreal_lib.check_built_plugin_existance(built_plugin_path):
self.log.info((
f"{self.signature} using existing built Ayon plugin from "
f"{built_plugin_path}"
))
unreal_lib.copy_built_plugin(engine_path, Path(built_plugin_path))
else:
# Set "AYON_UNREAL_PLUGIN" to current process environment for
# execution of `create_unreal_project`
env_key = "AYON_UNREAL_PLUGIN"
if self.launch_context.env.get(env_key):
self.log.info((
f"{self.signature} using Ayon plugin from "
f"{self.launch_context.env.get(env_key)}"
))
if self.launch_context.env.get(env_key):
os.environ[env_key] = self.launch_context.env[env_key]
if not unreal_lib.check_plugin_existence(engine_path):
self.exec_plugin_install(engine_path)
project_file = project_path / unreal_project_filename

View file

@ -429,6 +429,36 @@ def get_build_id(engine_path: Path, ue_version: str) -> str:
return "{" + loaded_modules.get("BuildId") + "}"
def check_built_plugin_existance(plugin_path) -> bool:
if not plugin_path:
return False
integration_plugin_path = Path(plugin_path)
if not integration_plugin_path.is_dir():
raise RuntimeError("Path to the integration plugin is null!")
if not (integration_plugin_path / "Binaries").is_dir() \
or not (integration_plugin_path / "Intermediate").is_dir():
return False
return True
def copy_built_plugin(engine_path: Path, plugin_path: Path) -> None:
ayon_plugin_path: Path = engine_path / "Engine/Plugins/Marketplace/Ayon"
if not ayon_plugin_path.is_dir():
ayon_plugin_path.mkdir(parents=True, exist_ok=True)
engine_plugin_config_path: Path = ayon_plugin_path / "Config"
engine_plugin_config_path.mkdir(exist_ok=True)
dir_util._path_created = {}
dir_util.copy_tree(plugin_path.as_posix(), ayon_plugin_path.as_posix())
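A minimal sketch of how the two helpers above are meant to be combined by the prelaunch hook shown earlier (both paths are hypothetical; the real hook derives `engine_path` from the launched executable):

from pathlib import Path
import openpype.hosts.unreal.lib as unreal_lib

engine_path = Path("C:/Program Files/Epic Games/UE_5.2")   # hypothetical
built_plugin_path = "C:/ayon/build/Ayon"                   # hypothetical AYON_BUILT_UNREAL_PLUGIN

if unreal_lib.check_built_plugin_existance(built_plugin_path):
    # Reuse the prebuilt plugin instead of compiling it against the engine.
    unreal_lib.copy_built_plugin(engine_path, Path(built_plugin_path))
elif not unreal_lib.check_plugin_existence(engine_path):
    # Otherwise build and install the plugin into the engine (see the hook above).
    pass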
def check_plugin_existence(engine_path: Path, env: dict = None) -> bool:
env = env or os.environ
integration_plugin_path: Path = Path(env.get("AYON_UNREAL_PLUGIN", ""))

View file

@ -0,0 +1,5 @@
from .splash_screen import SplashScreen
__all__ = (
"SplashScreen",
)

View file

@ -1,6 +1,5 @@
from qtpy import QtWidgets, QtCore, QtGui
from openpype import style, resources
from igniter.nice_progress_bar import NiceProgressBar
class SplashScreen(QtWidgets.QDialog):
@ -143,7 +142,7 @@ class SplashScreen(QtWidgets.QDialog):
button_layout.addWidget(self.close_btn)
# Progress Bar
self.progress_bar = NiceProgressBar()
self.progress_bar = QtWidgets.QProgressBar()
self.progress_bar.setValue(0)
self.progress_bar.setAlignment(QtCore.Qt.AlignTop)

View file

@ -22,11 +22,14 @@ from .events import (
)
from .vendor_bin_utils import (
ToolNotFoundError,
find_executable,
get_vendor_bin_path,
get_oiio_tools_path,
get_oiio_tool_args,
get_ffmpeg_tool_path,
is_oiio_supported
get_ffmpeg_tool_args,
is_oiio_supported,
)
from .attribute_definitions import (
@ -53,7 +56,6 @@ from .env_tools import (
from .terminal import Terminal
from .execute import (
get_openpype_execute_args,
get_pype_execute_args,
get_linux_launcher_args,
execute,
run_subprocess,
@ -65,7 +67,6 @@ from .execute import (
)
from .log import (
Logger,
PypeLogger,
)
from .path_templates import (
@ -77,12 +78,6 @@ from .path_templates import (
FormatObject,
)
from .mongo import (
get_default_components,
validate_mongo_connection,
OpenPypeMongoConnection
)
from .dateutils import (
get_datetime_data,
get_timestamp,
@ -115,25 +110,6 @@ from .transcoding import (
convert_ffprobe_fps_value,
convert_ffprobe_fps_to_float,
)
from .avalon_context import (
CURRENT_DOC_SCHEMAS,
create_project,
get_workfile_template_key,
get_workfile_template_key_from_context,
get_last_workfile_with_version,
get_last_workfile,
BuildWorkfile,
get_creator_by_name,
get_custom_workfile_template,
get_custom_workfile_template_by_context,
get_custom_workfile_template_by_string_context,
get_custom_workfile_template
)
from .local_settings import (
IniSettingRegistry,
@ -163,9 +139,6 @@ from .applications import (
)
from .plugin_tools import (
TaskNotSetError,
get_subset_name,
get_subset_name_with_asset_doc,
prepare_template_data,
source_hash,
)
@ -177,9 +150,6 @@ from .path_tools import (
version_up,
get_version_from_path,
get_last_version_from_path,
create_project_folders,
create_workdir_extra_folders,
get_project_basic_paths,
)
from .openpype_version import (
@ -205,9 +175,7 @@ __all__ = [
"emit_event",
"register_event_callback",
"find_executable",
"get_openpype_execute_args",
"get_pype_execute_args",
"get_linux_launcher_args",
"execute",
"run_subprocess",
@ -220,9 +188,13 @@ __all__ = [
"env_value_to_bool",
"get_paths_from_environ",
"ToolNotFoundError",
"find_executable",
"get_vendor_bin_path",
"get_oiio_tools_path",
"get_oiio_tool_args",
"get_ffmpeg_tool_path",
"get_ffmpeg_tool_args",
"is_oiio_supported",
"AbstractAttrDef",
@ -257,22 +229,6 @@ __all__ = [
"convert_ffprobe_fps_value",
"convert_ffprobe_fps_to_float",
"CURRENT_DOC_SCHEMAS",
"create_project",
"get_workfile_template_key",
"get_workfile_template_key_from_context",
"get_last_workfile_with_version",
"get_last_workfile",
"BuildWorkfile",
"get_creator_by_name",
"get_custom_workfile_template_by_context",
"get_custom_workfile_template_by_string_context",
"get_custom_workfile_template",
"IniSettingRegistry",
"JSONSettingRegistry",
"OpenPypeSecureRegistry",
@ -298,9 +254,7 @@ __all__ = [
"filter_profiles",
"TaskNotSetError",
"get_subset_name",
"get_subset_name_with_asset_doc",
"prepare_template_data",
"source_hash",
"format_file_size",
@ -323,15 +277,6 @@ __all__ = [
"get_formatted_current_time",
"Logger",
"PypeLogger",
"get_default_components",
"validate_mongo_connection",
"OpenPypeMongoConnection",
"create_project_folders",
"create_workdir_extra_folders",
"get_project_basic_paths",
"op_version_control_available",
"get_openpype_version",

View file

@ -1640,11 +1640,7 @@ def prepare_context_environments(data, env_group=None, modules_manager=None):
project_doc = data["project_doc"]
asset_doc = data["asset_doc"]
task_name = data["task_name"]
if (
not project_doc
or not asset_doc
or not task_name
):
if not project_doc:
log.info(
"Skipping context environments preparation."
" Launch context does not contain required data."
@ -1657,18 +1653,16 @@ def prepare_context_environments(data, env_group=None, modules_manager=None):
system_settings = get_system_settings()
data["project_settings"] = project_settings
data["system_settings"] = system_settings
# Apply project specific environments on current env value
apply_project_environments_value(
project_name, data["env"], project_settings, env_group
)
app = data["app"]
context_env = {
"AVALON_PROJECT": project_doc["name"],
"AVALON_ASSET": asset_doc["name"],
"AVALON_TASK": task_name,
"AVALON_APP_NAME": app.full_name
}
if asset_doc:
context_env["AVALON_ASSET"] = asset_doc["name"]
if task_name:
context_env["AVALON_TASK"] = task_name
log.debug(
"Context environments set:\n{}".format(
@ -1676,9 +1670,25 @@ def prepare_context_environments(data, env_group=None, modules_manager=None):
)
)
data["env"].update(context_env)
# Apply project specific environments on current env value
# - apply them once the context environments are set
apply_project_environments_value(
project_name, data["env"], project_settings, env_group
)
if not app.is_host:
return
data["env"]["AVALON_APP"] = app.host_name
if not asset_doc or not task_name:
# QUESTION replace with log.info and skip workfile discovery?
# - technically it should be possible to launch host without context
raise ApplicationLaunchFailed(
"Host launch require asset and task context."
)
workdir_data = get_template_data(
project_doc, asset_doc, task_name, app.host_name, system_settings
)
@ -1716,7 +1726,6 @@ def prepare_context_environments(data, env_group=None, modules_manager=None):
"Couldn't create workdir because: {}".format(str(exc))
)
data["env"]["AVALON_APP"] = app.host_name
data["env"]["AVALON_WORKDIR"] = workdir
_prepare_last_workfile(data, workdir, modules_manager)
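To illustrate the behavioral change above with hypothetical values: a full context still sets every AVALON_* key, while a project-only launch now proceeds with the keys it can resolve instead of being skipped entirely (host launches without asset/task still raise ApplicationLaunchFailed later):

# Full context (project + asset + task):
context_env = {
    "AVALON_PROJECT": "demo_project",
    "AVALON_ASSET": "sh010",
    "AVALON_TASK": "compositing",
    "AVALON_APP_NAME": "nuke/14-0",
}
# Project-only launch (no asset or task selected):
context_env = {
    "AVALON_PROJECT": "demo_project",
    "AVALON_APP_NAME": "nuke/14-0",
}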

View file

@ -1,654 +0,0 @@
"""Should be used only inside of hosts."""
import platform
import logging
import functools
import warnings
import six
from openpype.client import (
get_project,
get_asset_by_name,
)
from openpype.client.operations import (
CURRENT_ASSET_DOC_SCHEMA,
CURRENT_PROJECT_SCHEMA,
CURRENT_PROJECT_CONFIG_SCHEMA,
)
from .profiles_filtering import filter_profiles
from .path_templates import StringTemplate
legacy_io = None
log = logging.getLogger("AvalonContext")
# Backwards compatibility - should not be used anymore
# - Will be removed in OP 3.16.*
CURRENT_DOC_SCHEMAS = {
"project": CURRENT_PROJECT_SCHEMA,
"asset": CURRENT_ASSET_DOC_SCHEMA,
"config": CURRENT_PROJECT_CONFIG_SCHEMA
}
class AvalonContextDeprecatedWarning(DeprecationWarning):
pass
def deprecated(new_destination):
"""Mark functions as deprecated.
It will result in a warning being emitted when the function is used.
"""
func = None
if callable(new_destination):
func = new_destination
new_destination = None
def _decorator(decorated_func):
if new_destination is None:
warning_message = (
" Please check content of deprecated function to figure out"
" possible replacement."
)
else:
warning_message = " Please replace your usage with '{}'.".format(
new_destination
)
@functools.wraps(decorated_func)
def wrapper(*args, **kwargs):
warnings.simplefilter("always", AvalonContextDeprecatedWarning)
warnings.warn(
(
"Call to deprecated function '{}'"
"\nFunction was moved or removed.{}"
).format(decorated_func.__name__, warning_message),
category=AvalonContextDeprecatedWarning,
stacklevel=4
)
return decorated_func(*args, **kwargs)
return wrapper
if func is None:
return _decorator
return _decorator(func)
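For reference, the decorator above supports both call styles; a minimal sketch with hypothetical function names (calling either function emits an AvalonContextDeprecatedWarning):

@deprecated("openpype.some.new.module.do_something")
def do_something_old(value):
    """Old entry point; the warning points to the replacement."""
    return value * 2


@deprecated
def helper_without_replacement(value):
    """No replacement is suggested, only a generic warning is emitted."""
    return value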
@deprecated("openpype.client.operations.create_project")
def create_project(
project_name, project_code, library_project=False, dbcon=None
):
"""Create project using OpenPype settings.
This project creation function does not validate the project document on
creation, because the project document is created blindly with only the
minimum required information about the project: its name, code, type
and schema.
Entered project name must be unique and project must not exist yet.
Args:
project_name(str): New project name. Should be unique.
project_code(str): Project's code should be unique too.
library_project(bool): Project is library project.
dbcon(AvalonMongoDB): Object of connection to MongoDB.
Raises:
ValueError: When project name already exists in MongoDB.
Returns:
dict: Created project document.
Deprecated:
Function will be removed after release version 3.16.*
"""
from openpype.client.operations import create_project
return create_project(project_name, project_code, library_project)
def with_pipeline_io(func):
@functools.wraps(func)
def wrapped(*args, **kwargs):
global legacy_io
if legacy_io is None:
from openpype.pipeline import legacy_io
return func(*args, **kwargs)
return wrapped
@deprecated("openpype.client.get_linked_asset_ids")
def get_linked_asset_ids(asset_doc):
"""Return linked asset ids for `asset_doc` from DB
Args:
asset_doc (dict): Asset document from DB.
Returns:
(list): MongoDB ids of input links.
Deprecated:
Function will be removed after release version 3.16.*
"""
from openpype.client import get_linked_asset_ids
from openpype.pipeline import legacy_io
project_name = legacy_io.active_project()
return get_linked_asset_ids(project_name, asset_doc=asset_doc)
@deprecated(
"openpype.pipeline.workfile.get_workfile_template_key_from_context")
def get_workfile_template_key_from_context(
asset_name, task_name, host_name, project_name=None,
dbcon=None, project_settings=None
):
"""Helper function to get template key for workfile template.
Do the same as `get_workfile_template_key` but returns value for "session
context".
It is required to pass one of 'dbcon' with already set project name or
'project_name' arguments.
Args:
asset_name(str): Name of asset document.
task_name(str): Task name for which is template key retrieved.
Must be available on asset document under `data.tasks`.
host_name(str): Name of host implementation for which is workfile
used.
project_name(str): Project name where asset and task is. Not required
when 'dbcon' is passed.
dbcon(AvalonMongoDB): Connection to mongo with already set project
under `AVALON_PROJECT`. Not required when 'project_name' is passed.
project_settings(dict): Project settings for passed 'project_name'.
Not required at all but makes function faster.
Raises:
ValueError: When both 'dbcon' and 'project_name' were not
passed.
Deprecated:
Function will be removed after release version 3.16.*
"""
from openpype.pipeline.workfile import (
get_workfile_template_key_from_context
)
if not project_name:
if not dbcon:
raise ValueError((
"`get_workfile_template_key_from_context` requires to pass"
" one of 'dbcon' or 'project_name' arguments."
))
project_name = dbcon.active_project()
return get_workfile_template_key_from_context(
asset_name, task_name, host_name, project_name, project_settings
)
@deprecated(
"openpype.pipeline.workfile.get_workfile_template_key")
def get_workfile_template_key(
task_type, host_name, project_name=None, project_settings=None
):
"""Workfile template key which should be used to get workfile template.
The function uses profiles from project settings to return the right template
for the passed task type and host name.
One of 'project_name' or 'project_settings' must be passed; it is preferred
to pass the settings if they are already available.
Args:
task_type(str): Name of task type.
host_name(str): Name of host implementation (e.g. "maya", "nuke", ...)
project_name(str): Name of project in which context should look for
settings. Not required if `project_settings` are passed.
project_settings(dict): Prepare project settings for project name.
Not needed if `project_name` is passed.
Raises:
ValueError: When both 'project_name' and 'project_settings' were not
passed.
Deprecated:
Function will be removed after release version 3.16.*
"""
from openpype.pipeline.workfile import get_workfile_template_key
return get_workfile_template_key(
task_type, host_name, project_name, project_settings
)
@deprecated("openpype.pipeline.context_tools.compute_session_changes")
def compute_session_changes(
session, task=None, asset=None, app=None, template_key=None
):
"""Compute the changes for a Session object on asset, task or app switch
This does *NOT* update the Session object, but returns the changes
required for a valid update of the Session.
Args:
session (dict): The initial session to compute changes to.
This is required for computing the full Work Directory, as that
also depends on the values that haven't changed.
task (str, Optional): Name of task to switch to.
asset (str or dict, Optional): Name of asset to switch to.
You can also directly provide the Asset dictionary as returned
from the database to avoid an additional query. (optimization)
app (str, Optional): Name of app to switch to.
Returns:
dict: The required changes in the Session dictionary.
Deprecated:
Function will be removed after release version 3.16.*
"""
from openpype.pipeline import legacy_io
from openpype.pipeline.context_tools import compute_session_changes
if isinstance(asset, six.string_types):
project_name = legacy_io.active_project()
asset = get_asset_by_name(project_name, asset)
return compute_session_changes(
session,
asset,
task,
template_key
)
@deprecated("openpype.pipeline.context_tools.get_workdir_from_session")
def get_workdir_from_session(session=None, template_key=None):
"""Calculate workdir path based on session data.
Args:
session (Union[None, Dict[str, str]]): Session to use. If not passed
current context session is used (from legacy_io).
template_key (Union[str, None]): Precalculate template key to define
workfile template name in Anatomy.
Returns:
str: Workdir path.
Deprecated:
Function will be removed after release version 3.16.*
"""
from openpype.pipeline.context_tools import get_workdir_from_session
return get_workdir_from_session(session, template_key)
@deprecated("openpype.pipeline.context_tools.change_current_context")
def update_current_task(task=None, asset=None, app=None, template_key=None):
"""Update active Session to a new task work area.
This updates the live Session to a different `asset`, `task` or `app`.
Args:
task (str): The task to set.
asset (str): The asset to set.
app (str): The app to set.
Returns:
dict: The changed key, values in the current Session.
Deprecated:
Function will be removed after release version 3.16.*
"""
from openpype.pipeline import legacy_io
from openpype.pipeline.context_tools import change_current_context
project_name = legacy_io.active_project()
if isinstance(asset, six.string_types):
asset = get_asset_by_name(project_name, asset)
return change_current_context(asset, task, template_key)
@deprecated("openpype.pipeline.workfile.BuildWorkfile")
def BuildWorkfile():
"""Build workfile class was moved to workfile pipeline.
Deprecated:
Function will be removed after release version 3.16.*
"""
from openpype.pipeline.workfile import BuildWorkfile
return BuildWorkfile()
@deprecated("openpype.pipeline.create.get_legacy_creator_by_name")
def get_creator_by_name(creator_name, case_sensitive=False):
"""Find creator plugin by name.
Args:
creator_name (str): Name of creator class that should be returned.
case_sensitive (bool): Match of creator plugin name is case sensitive.
Set to `False` by default.
Returns:
Creator: Return first matching plugin or `None`.
Deprecated:
Function will be removed after release version 3.16.*
"""
from openpype.pipeline.create import get_legacy_creator_by_name
return get_legacy_creator_by_name(creator_name, case_sensitive)
def _get_task_context_data_for_anatomy(
project_doc, asset_doc, task_name, anatomy=None
):
"""Prepare Task context for anatomy data.
WARNING: this data structure is currently used only in workfile templates.
Key "task" is currently in rest of pipeline used as string with task
name.
Args:
project_doc (dict): Project document with available "name" and
"data.code" keys.
asset_doc (dict): Asset document from MongoDB.
task_name (str): Name of context task.
anatomy (Anatomy): Optionally Anatomy for passed project name can be
passed as Anatomy creation may be slow.
Returns:
dict: With Anatomy context data.
"""
from openpype.pipeline.template_data import get_general_template_data
if anatomy is None:
from openpype.pipeline import Anatomy
anatomy = Anatomy(project_doc["name"])
asset_name = asset_doc["name"]
project_task_types = anatomy["tasks"]
# get relevant task type from asset doc
assert task_name in asset_doc["data"]["tasks"], (
"Task name \"{}\" not found on asset \"{}\"".format(
task_name, asset_name
)
)
task_type = asset_doc["data"]["tasks"][task_name].get("type")
assert task_type, (
"Task name \"{}\" on asset \"{}\" does not have specified task type."
).format(asset_name, task_name)
# get short name for task type defined in default anatomy settings
project_task_type_data = project_task_types.get(task_type)
assert project_task_type_data, (
"Something went wrong. Default anatomy tasks are not holding"
"requested task type: `{}`".format(task_type)
)
data = {
"project": {
"name": project_doc["name"],
"code": project_doc["data"].get("code")
},
"asset": asset_name,
"task": {
"name": task_name,
"type": task_type,
"short": project_task_type_data["short_name"]
}
}
system_general_data = get_general_template_data()
data.update(system_general_data)
return data
@deprecated(
"openpype.pipeline.workfile.get_custom_workfile_template_by_context")
def get_custom_workfile_template_by_context(
template_profiles, project_doc, asset_doc, task_name, anatomy=None
):
"""Filter and fill workfile template profiles by passed context.
It is expected that passed argument are already queried documents of
project and asset as parents of processing task name.
Existence of formatted path is not validated.
Args:
template_profiles(list): Template profiles from settings.
project_doc(dict): Project document from MongoDB.
asset_doc(dict): Asset document from MongoDB.
task_name(str): Name of task for which templates are filtered.
anatomy(Anatomy): Optionally passed anatomy object for passed project
name.
Returns:
str: Path to template or None if none of profiles match current
context. (Existence of formatted path is not validated.)
Deprecated:
Function will be removed after release version 3.16.*
"""
if anatomy is None:
from openpype.pipeline import Anatomy
anatomy = Anatomy(project_doc["name"])
# get project, asset, task anatomy context data
anatomy_context_data = _get_task_context_data_for_anatomy(
project_doc, asset_doc, task_name, anatomy
)
# add root dict
anatomy_context_data["root"] = anatomy.roots
# get task type for the task in context
current_task_type = anatomy_context_data["task"]["type"]
# get path from matching profile
matching_item = filter_profiles(
template_profiles,
{"task_types": current_task_type}
)
# when path is available try to format it in case
# there are some anatomy template strings
if matching_item:
template = matching_item["path"][platform.system().lower()]
return StringTemplate.format_strict_template(
template, anatomy_context_data
)
return None
@deprecated(
"openpype.pipeline.workfile.get_custom_workfile_template_by_string_context"
)
def get_custom_workfile_template_by_string_context(
template_profiles, project_name, asset_name, task_name,
dbcon=None, anatomy=None
):
"""Filter and fill workfile template profiles by passed context.
Passed context are string representations of project, asset and task.
Function will query documents of project and asset to be able use
`get_custom_workfile_template_by_context` for rest of logic.
Args:
template_profiles(list): Loaded workfile template profiles.
project_name(str): Project name.
asset_name(str): Asset name.
task_name(str): Task name.
dbcon(AvalonMongoDB): Optional avalon implementation of mongo
connection with context Session.
anatomy(Anatomy): Optionally prepared anatomy object for passed
project.
Returns:
str: Path to template or None if none of profiles match current
context. (Existence of formatted path is not validated.)
Deprecated:
Function will be removed after release version 3.16.*
"""
project_name = None
if anatomy is not None:
project_name = anatomy.project_name
if not project_name and dbcon is not None:
project_name = dbcon.active_project()
if not project_name:
raise ValueError("Can't determina project")
project_doc = get_project(project_name, fields=["name", "data.code"])
asset_doc = get_asset_by_name(
project_name, asset_name, fields=["name", "data.tasks"])
return get_custom_workfile_template_by_context(
template_profiles, project_doc, asset_doc, task_name, anatomy
)
@deprecated("openpype.pipeline.context_tools.get_custom_workfile_template")
def get_custom_workfile_template(template_profiles):
"""Filter and fill workfile template profiles by current context.
Current context is defined by `legacy_io.Session`. That's why this
function should be used only inside host where context is set and stable.
Args:
template_profiles(list): Template profiles from settings.
Returns:
str: Path to template or None if none of profiles match current
context. (Existence of formatted path is not validated.)
Deprecated:
Function will be removed after release version 3.16.*
"""
from openpype.pipeline import legacy_io
return get_custom_workfile_template_by_string_context(
template_profiles,
legacy_io.Session["AVALON_PROJECT"],
legacy_io.Session["AVALON_ASSET"],
legacy_io.Session["AVALON_TASK"],
legacy_io
)
@deprecated("openpype.pipeline.workfile.get_last_workfile_with_version")
def get_last_workfile_with_version(
workdir, file_template, fill_data, extensions
):
"""Return last workfile version.
Args:
workdir(str): Path to dir where workfiles are stored.
file_template(str): Template of file name.
fill_data(dict): Data for filling template.
extensions(list, tuple): All allowed file extensions of workfile.
Returns:
tuple: Last workfile<str> with version<int> if there is any otherwise
returns (None, None).
Deprecated:
Function will be removed after release version 3.16.*
"""
from openpype.pipeline.workfile import get_last_workfile_with_version
return get_last_workfile_with_version(
workdir, file_template, fill_data, extensions
)
@deprecated("openpype.pipeline.workfile.get_last_workfile")
def get_last_workfile(
workdir, file_template, fill_data, extensions, full_path=False
):
"""Return last workfile filename.
Returns file with version 1 if there is no workfile yet.
Args:
workdir(str): Path to dir where workfiles are stored.
file_template(str): Template of file name.
fill_data(dict): Data for filling template.
extensions(list, tuple): All allowed file extensions of workfile.
full_path(bool): Full path to file is returned if set to True.
Returns:
str: Last or first workfile as filename of full path to filename.
Deprecated:
Function will be removed after release version 3.16.*
"""
from openpype.pipeline.workfile import get_last_workfile
return get_last_workfile(
workdir, file_template, fill_data, extensions, full_path
)
@deprecated("openpype.client.get_linked_representation_id")
def get_linked_ids_for_representations(
project_name, repre_ids, dbcon=None, link_type=None, max_depth=0
):
"""Returns list of linked ids of particular type (if provided).
Goes from representations to version, back to representations
Args:
project_name (str)
repre_ids (list) or (ObjectId)
dbcon (avalon.mongodb.AvalonMongoDB, optional): Avalon Mongo connection
with Session.
link_type (str): Type of link, e.g. 'reference'.
max_depth (int): limit how many levels of recursion
Returns:
(list) of ObjectId - linked representations
Deprecated:
Function will be removed after release version 3.16.*
"""
from openpype.client import get_linked_representation_id
if not isinstance(repre_ids, list):
repre_ids = [repre_ids]
output = []
for repre_id in repre_ids:
output.extend(get_linked_representation_id(
project_name,
repre_id=repre_id,
link_type=link_type,
max_depth=max_depth
))
return output

View file

@ -1,252 +0,0 @@
"""Functions useful for delivery action or loader"""
import os
import shutil
import functools
import warnings
class DeliveryDeprecatedWarning(DeprecationWarning):
pass
def deprecated(new_destination):
"""Mark functions as deprecated.
It will result in a warning being emitted when the function is used.
"""
func = None
if callable(new_destination):
func = new_destination
new_destination = None
def _decorator(decorated_func):
if new_destination is None:
warning_message = (
" Please check content of deprecated function to figure out"
" possible replacement."
)
else:
warning_message = " Please replace your usage with '{}'.".format(
new_destination
)
@functools.wraps(decorated_func)
def wrapper(*args, **kwargs):
warnings.simplefilter("always", DeliveryDeprecatedWarning)
warnings.warn(
(
"Call to deprecated function '{}'"
"\nFunction was moved or removed.{}"
).format(decorated_func.__name__, warning_message),
category=DeliveryDeprecatedWarning,
stacklevel=4
)
return decorated_func(*args, **kwargs)
return wrapper
if func is None:
return _decorator
return _decorator(func)
@deprecated("openpype.lib.path_tools.collect_frames")
def collect_frames(files):
"""Returns dict of source path and its frame, if from sequence
Uses clique as most precise solution, used when anatomy template that
created files is not known.
Assumption is that frames are separated by '.', negative frames are not
allowed.
Args:
files(list) or (set with single value): list of source paths
Returns:
(dict): {'/asset/subset_v001.0001.png': '0001', ....}
Deprecated:
Function was moved to different location and will be removed
after 3.16.* release.
"""
from .path_tools import collect_frames
return collect_frames(files)
@deprecated("openpype.lib.path_tools.format_file_size")
def sizeof_fmt(num, suffix=None):
"""Returns formatted string with size in appropriate unit
Deprecated:
Function was moved to different location and will be removed
after 3.16.* release.
"""
from .path_tools import format_file_size
return format_file_size(num, suffix)
@deprecated("openpype.pipeline.load.get_representation_path_with_anatomy")
def path_from_representation(representation, anatomy):
"""Get representation path using representation document and anatomy.
Args:
representation (Dict[str, Any]): Representation document.
anatomy (Anatomy): Project anatomy.
Deprecated:
Function was moved to different location and will be removed
after 3.16.* release.
"""
from openpype.pipeline.load import get_representation_path_with_anatomy
return get_representation_path_with_anatomy(representation, anatomy)
@deprecated
def copy_file(src_path, dst_path):
"""Hardlink file if possible(to save space), copy if not"""
from openpype.lib import create_hard_link # safer importing
if os.path.exists(dst_path):
return
try:
create_hard_link(
src_path,
dst_path
)
except OSError:
shutil.copyfile(src_path, dst_path)
@deprecated("openpype.pipeline.delivery.get_format_dict")
def get_format_dict(anatomy, location_path):
"""Returns replaced root values from user provider value.
Args:
anatomy (Anatomy)
location_path (str): user provided value
Returns:
(dict): prepared for formatting of a template
Deprecated:
Function was moved to different location and will be removed
after 3.16.* release.
"""
from openpype.pipeline.delivery import get_format_dict
return get_format_dict(anatomy, location_path)
@deprecated("openpype.pipeline.delivery.check_destination_path")
def check_destination_path(repre_id,
anatomy, anatomy_data,
datetime_data, template_name):
""" Try to create destination path based on 'template_name'.
In case the path cannot be filled because the template contains unmatched
keys, provide an error message so the repre can be filtered out later.
Args:
anatomy (Anatomy)
anatomy_data (dict): context to fill anatomy
datetime_data (dict): values with actual date
template_name (str): to pick correct delivery template
Returns:
(collections.defauldict): {"TYPE_OF_ERROR":"ERROR_DETAIL"}
Deprecated:
Function was moved to different location and will be removed
after 3.16.* release.
"""
from openpype.pipeline.delivery import check_destination_path
return check_destination_path(
repre_id,
anatomy,
anatomy_data,
datetime_data,
template_name
)
@deprecated("openpype.pipeline.delivery.deliver_single_file")
def process_single_file(
src_path, repre, anatomy, template_name, anatomy_data, format_dict,
report_items, log
):
"""Copy single file to calculated path based on template
Args:
src_path(str): path of source representation file
_repre (dict): full repre, used only in process_sequence, here only
as to share same signature
anatomy (Anatomy)
template_name (string): user selected delivery template name
anatomy_data (dict): data from repre to fill anatomy with
format_dict (dict): root dictionary with names and values
report_items (collections.defaultdict): to return error messages
log (Logger): for log printing
Returns:
(collections.defaultdict , int)
Deprecated:
Function was moved to different location and will be removed
after 3.16.* release.
"""
from openpype.pipeline.delivery import deliver_single_file
return deliver_single_file(
src_path, repre, anatomy, template_name, anatomy_data, format_dict,
report_items, log
)
@deprecated("openpype.pipeline.delivery.deliver_sequence")
def process_sequence(
src_path, repre, anatomy, template_name, anatomy_data, format_dict,
report_items, log
):
""" For Pype2(mainly - works in 3 too) where representation might not
contain files.
Uses listing physical files (not 'files' on repre as a)might not be
present, b)might not be reliable for representation and copying them.
TODO Should be refactored when files are sufficient to drive all
representations.
Args:
src_path(str): path of source representation file
repre (dict): full representation
anatomy (Anatomy)
template_name (string): user selected delivery template name
anatomy_data (dict): data from repre to fill anatomy with
format_dict (dict): root dictionary with names and values
report_items (collections.defaultdict): to return error messages
log (Logger): for log printing
Returns:
(collections.defaultdict , int)
Deprecated:
Function was moved to different location and will be removed
after 3.16.* release.
"""
from openpype.pipeline.delivery import deliver_sequence
return deliver_sequence(
src_path, repre, anatomy, template_name, anatomy_data, format_dict,
report_items, log
)

View file

@ -296,18 +296,6 @@ def path_to_subprocess_arg(path):
return subprocess.list2cmdline([path])
def get_pype_execute_args(*args):
"""Backwards compatible function for 'get_openpype_execute_args'."""
import traceback
log = Logger.get_logger("get_pype_execute_args")
stack = "\n".join(traceback.format_stack())
log.warning((
"Using deprecated function 'get_pype_execute_args'. Called from:\n{}"
).format(stack))
return get_openpype_execute_args(*args)
def get_openpype_execute_args(*args):
"""Arguments to run pype command.

View file

@ -492,21 +492,3 @@ class Logger:
cls.initialize()
return OpenPypeMongoConnection.get_mongo_client()
class PypeLogger(Logger):
"""Duplicate of 'Logger'.
Deprecated:
Class will be removed after release version 3.16.*
"""
@classmethod
def get_logger(cls, *args, **kwargs):
logger = Logger.get_logger(*args, **kwargs)
# TODO uncomment when replaced most of places
logger.warning((
"'openpype.lib.PypeLogger' is deprecated class."
" Please use 'openpype.lib.Logger' instead."
))
return logger

View file

@ -1,61 +0,0 @@
import warnings
import functools
from openpype.client.mongo import (
MongoEnvNotSet,
OpenPypeMongoConnection,
)
class MongoDeprecatedWarning(DeprecationWarning):
pass
def mongo_deprecated(func):
"""Mark functions as deprecated.
It will result in a warning being emitted when the function is used.
"""
@functools.wraps(func)
def new_func(*args, **kwargs):
warnings.simplefilter("always", MongoDeprecatedWarning)
warnings.warn(
(
"Call to deprecated function '{}'."
" Function was moved to 'openpype.client.mongo'."
).format(func.__name__),
category=MongoDeprecatedWarning,
stacklevel=2
)
return func(*args, **kwargs)
return new_func
@mongo_deprecated
def get_default_components():
from openpype.client.mongo import get_default_components
return get_default_components()
@mongo_deprecated
def should_add_certificate_path_to_mongo_url(mongo_url):
from openpype.client.mongo import should_add_certificate_path_to_mongo_url
return should_add_certificate_path_to_mongo_url(mongo_url)
@mongo_deprecated
def validate_mongo_connection(mongo_uri):
from openpype.client.mongo import validate_mongo_connection
return validate_mongo_connection(mongo_uri)
__all__ = (
"MongoEnvNotSet",
"OpenPypeMongoConnection",
"get_default_components",
"should_add_certificate_path_to_mongo_url",
"validate_mongo_connection",
)

View file

@ -2,59 +2,12 @@ import os
import re
import logging
import platform
import functools
import warnings
import clique
log = logging.getLogger(__name__)
class PathToolsDeprecatedWarning(DeprecationWarning):
pass
def deprecated(new_destination):
"""Mark functions as deprecated.
It will result in a warning being emitted when the function is used.
"""
func = None
if callable(new_destination):
func = new_destination
new_destination = None
def _decorator(decorated_func):
if new_destination is None:
warning_message = (
" Please check content of deprecated function to figure out"
" possible replacement."
)
else:
warning_message = " Please replace your usage with '{}'.".format(
new_destination
)
@functools.wraps(decorated_func)
def wrapper(*args, **kwargs):
warnings.simplefilter("always", PathToolsDeprecatedWarning)
warnings.warn(
(
"Call to deprecated function '{}'"
"\nFunction was moved or removed.{}"
).format(decorated_func.__name__, warning_message),
category=PathToolsDeprecatedWarning,
stacklevel=4
)
return decorated_func(*args, **kwargs)
return wrapper
if func is None:
return _decorator
return _decorator(func)
def format_file_size(file_size, suffix=None):
"""Returns formatted string with size in appropriate unit.
@ -269,99 +222,3 @@ def get_last_version_from_path(path_dir, filter):
return filtred_files[-1]
return None
@deprecated("openpype.pipeline.project_folders.concatenate_splitted_paths")
def concatenate_splitted_paths(split_paths, anatomy):
"""
Deprecated:
Function will be removed after release version 3.16.*
"""
from openpype.pipeline.project_folders import concatenate_splitted_paths
return concatenate_splitted_paths(split_paths, anatomy)
@deprecated
def get_format_data(anatomy):
"""
Deprecated:
Function will be removed after release version 3.16.*
"""
from openpype.pipeline.template_data import get_project_template_data
data = get_project_template_data(project_name=anatomy.project_name)
data["root"] = anatomy.roots
return data
@deprecated("openpype.pipeline.project_folders.fill_paths")
def fill_paths(path_list, anatomy):
"""
Deprecated:
Function will be removed after release version 3.16.*
"""
from openpype.pipeline.project_folders import fill_paths
return fill_paths(path_list, anatomy)
@deprecated("openpype.pipeline.project_folders.create_project_folders")
def create_project_folders(basic_paths, project_name):
"""
Deprecated:
Function will be removed after release version 3.16.*
"""
from openpype.pipeline.project_folders import create_project_folders
return create_project_folders(project_name, basic_paths)
@deprecated("openpype.pipeline.project_folders.get_project_basic_paths")
def get_project_basic_paths(project_name):
"""
Deprecated:
Function will be removed after release version 3.16.*
"""
from openpype.pipeline.project_folders import get_project_basic_paths
return get_project_basic_paths(project_name)
@deprecated("openpype.pipeline.workfile.create_workdir_extra_folders")
def create_workdir_extra_folders(
workdir, host_name, task_type, task_name, project_name,
project_settings=None
):
"""Create extra folders in work directory based on context.
Args:
workdir (str): Path to workdir where workfiles is stored.
host_name (str): Name of host implementation.
task_type (str): Type of task for which extra folders should be
created.
task_name (str): Name of task for which extra folders should be
created.
project_name (str): Name of project on which task is.
project_settings (dict): Prepared project settings. Are loaded if not
passed.
Deprecated:
Function will be removed after release version 3.16.*
"""
from openpype.pipeline.project_folders import create_workdir_extra_folders
return create_workdir_extra_folders(
workdir,
host_name,
task_type,
task_name,
project_name,
project_settings
)

View file

@ -4,157 +4,9 @@ import os
import logging
import re
import warnings
import functools
from openpype.client import get_asset_by_id
log = logging.getLogger(__name__)
class PluginToolsDeprecatedWarning(DeprecationWarning):
pass
def deprecated(new_destination):
"""Mark functions as deprecated.
It will result in a warning being emitted when the function is used.
"""
func = None
if callable(new_destination):
func = new_destination
new_destination = None
def _decorator(decorated_func):
if new_destination is None:
warning_message = (
" Please check content of deprecated function to figure out"
" possible replacement."
)
else:
warning_message = " Please replace your usage with '{}'.".format(
new_destination
)
@functools.wraps(decorated_func)
def wrapper(*args, **kwargs):
warnings.simplefilter("always", PluginToolsDeprecatedWarning)
warnings.warn(
(
"Call to deprecated function '{}'"
"\nFunction was moved or removed.{}"
).format(decorated_func.__name__, warning_message),
category=PluginToolsDeprecatedWarning,
stacklevel=4
)
return decorated_func(*args, **kwargs)
return wrapper
if func is None:
return _decorator
return _decorator(func)
@deprecated("openpype.pipeline.create.TaskNotSetError")
def TaskNotSetError(*args, **kwargs):
from openpype.pipeline.create import TaskNotSetError
return TaskNotSetError(*args, **kwargs)
@deprecated("openpype.pipeline.create.get_subset_name")
def get_subset_name_with_asset_doc(
family,
variant,
task_name,
asset_doc,
project_name=None,
host_name=None,
default_template=None,
dynamic_data=None
):
"""Calculate subset name based on passed context and OpenPype settings.
Subset name templates are defined in `project_settings/global/tools/creator
/subset_name_profiles`, where profiles are filtered by host name, family,
task name and task type. If the context does not match any profile then
`DEFAULT_SUBSET_TEMPLATE` is used as the default template.
That's the main reason why so many arguments are required to calculate the
subset name.
Args:
family (str): Instance family.
variant (str): In most of cases it is user input during creation.
task_name (str): Task name on which context is instance created.
asset_doc (dict): Queried asset document with it's tasks in data.
Used to get task type.
project_name (str): Name of project on which is instance created.
Important for project settings that are loaded.
host_name (str): One of filtering criteria for template profile
filters.
default_template (str): Default template if any profile does not match
passed context. Constant 'DEFAULT_SUBSET_TEMPLATE' is used if
is not passed.
dynamic_data (dict): Dynamic data specific for a creator which creates
instance.
"""
from openpype.pipeline.create import get_subset_name
return get_subset_name(
family,
variant,
task_name,
asset_doc,
project_name,
host_name,
default_template,
dynamic_data
)
@deprecated
def get_subset_name(
family,
variant,
task_name,
asset_id,
project_name=None,
host_name=None,
default_template=None,
dynamic_data=None,
dbcon=None
):
"""Calculate subset name using OpenPype settings.
This variant of the function expects an asset id as argument.
It is a legacy function and should be replaced with
`get_subset_name_with_asset_doc`, where an asset document is expected.
"""
from openpype.pipeline.create import get_subset_name
if project_name is None:
project_name = dbcon.project_name
asset_doc = get_asset_by_id(project_name, asset_id, fields=["data.tasks"])
return get_subset_name(
family,
variant,
task_name,
asset_doc,
project_name,
host_name,
default_template,
dynamic_data
)
def prepare_template_data(fill_pairs):
"""
Prepares formatted data for filling template.

View file

@ -11,8 +11,8 @@ import xml.etree.ElementTree
from .execute import run_subprocess
from .vendor_bin_utils import (
get_ffmpeg_tool_path,
get_oiio_tools_path,
get_ffmpeg_tool_args,
get_oiio_tool_args,
is_oiio_supported,
)
@ -83,11 +83,11 @@ def get_oiio_info_for_input(filepath, logger=None, subimages=False):
Stdout should contain xml format string.
"""
args = [
get_oiio_tools_path(),
args = get_oiio_tool_args(
"oiiotool",
"--info",
"-v"
]
)
if subimages:
args.append("-a")
@ -486,12 +486,11 @@ def convert_for_ffmpeg(
compression = "none"
# Prepare subprocess arguments
oiio_cmd = [
get_oiio_tools_path(),
oiio_cmd = get_oiio_tool_args(
"oiiotool",
# Don't add any additional attributes
"--nosoftwareattrib",
]
)
# Add input compression if available
if compression:
oiio_cmd.extend(["--compression", compression])
@ -656,12 +655,11 @@ def convert_input_paths_for_ffmpeg(
for input_path in input_paths:
# Prepare subprocess arguments
oiio_cmd = [
get_oiio_tools_path(),
oiio_cmd = get_oiio_tool_args(
"oiiotool",
# Don't add any additional attributes
"--nosoftwareattrib",
]
)
# Add input compression if available
if compression:
oiio_cmd.extend(["--compression", compression])
@ -729,8 +727,8 @@ def get_ffprobe_data(path_to_file, logger=None):
logger.info(
"Getting information about input \"{}\".".format(path_to_file)
)
args = [
get_ffmpeg_tool_path("ffprobe"),
ffprobe_args = get_ffmpeg_tool_args("ffprobe")
args = ffprobe_args + [
"-hide_banner",
"-loglevel", "fatal",
"-show_error",
@ -1084,13 +1082,13 @@ def convert_colorspace(
if logger is None:
logger = logging.getLogger(__name__)
oiio_cmd = [
get_oiio_tools_path(),
oiio_cmd = get_oiio_tool_args(
"oiiotool",
input_path,
# Don't add any additional attributes
"--nosoftwareattrib",
"--colorconfig", config_path
]
)
if all([target_colorspace, view, display]):
raise ValueError("Colorspace and both screen and display"

View file

@ -3,9 +3,15 @@ import logging
import platform
import subprocess
from openpype import AYON_SERVER_ENABLED
log = logging.getLogger("Vendor utils")
class ToolNotFoundError(Exception):
"""Raised when tool arguments are not found."""
class CachedToolPaths:
"""Cache already used and discovered tools and their executables.
@ -252,7 +258,7 @@ def _check_args_returncode(args):
return proc.returncode == 0
def _oiio_executable_validation(filepath):
def _oiio_executable_validation(args):
"""Validate oiio tool executable if can be executed.
Validation has 2 steps. First is using 'find_executable' to fill possible
@ -270,32 +276,63 @@ def _oiio_executable_validation(filepath):
should be used.
Args:
filepath (str): Path to executable.
args (Union[str, list[str]]): Arguments to launch tool or
path to tool executable.
Returns:
bool: Filepath is valid executable.
"""
filepath = find_executable(filepath)
if not filepath:
if not args:
return False
return _check_args_returncode([filepath, "--help"])
if not isinstance(args, list):
filepath = find_executable(args)
if not filepath:
return False
args = [filepath]
return _check_args_returncode(args + ["--help"])
def _get_ayon_oiio_tool_args(tool_name):
try:
# Use 'ayon-third-party' addon to get oiio arguments
from ayon_third_party import get_oiio_arguments
except Exception:
print("!!! Failed to import 'ayon_third_party' addon.")
return None
try:
return get_oiio_arguments(tool_name)
except Exception as exc:
print("!!! Failed to get OpenImageIO args. Reason: {}".format(exc))
return None
def get_oiio_tools_path(tool="oiiotool"):
"""Path to vendorized OpenImageIO tool executables.
"""Path to OpenImageIO tool executables.
On Window it adds .exe extension if missing from tool argument.
On Windows it adds .exe extension if missing from tool argument.
Args:
tool (string): Tool name (oiiotool, maketx, ...).
tool (string): Tool name 'oiiotool', 'maketx', etc.
Default is "oiiotool".
"""
if CachedToolPaths.is_tool_cached(tool):
return CachedToolPaths.get_executable_path(tool)
if AYON_SERVER_ENABLED:
args = _get_ayon_oiio_tool_args(tool)
if args:
if len(args) > 1:
raise ValueError(
"AYON oiio arguments consist of multiple arguments."
)
tool_executable_path = args[0]
CachedToolPaths.cache_executable_path(tool, tool_executable_path)
return tool_executable_path
custom_paths_str = os.environ.get("OPENPYPE_OIIO_PATHS") or ""
tool_executable_path = find_tool_in_custom_paths(
custom_paths_str.split(os.pathsep),
@ -321,7 +358,33 @@ def get_oiio_tools_path(tool="oiiotool"):
return tool_executable_path
def _ffmpeg_executable_validation(filepath):
def get_oiio_tool_args(tool_name, *extra_args):
"""Arguments to launch OpenImageIO tool.
Args:
tool_name (str): Tool name 'oiiotool', 'maketx', etc.
*extra_args (str): Extra arguments to add to after tool arguments.
Returns:
list[str]: List of arguments.
"""
extra_args = list(extra_args)
if AYON_SERVER_ENABLED:
args = _get_ayon_oiio_tool_args(tool_name)
if args:
return args + extra_args
path = get_oiio_tools_path(tool_name)
if path:
return [path] + extra_args
raise ToolNotFoundError(
"OIIO '{}' tool not found.".format(tool_name)
)
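A short usage sketch for the helper above (the input frame is hypothetical; with the AYON third-party addon the returned prefix may contain more than one argument, which is why callers concatenate it instead of treating it as a single path):

from openpype.lib import get_oiio_tool_args, run_subprocess

info_args = get_oiio_tool_args("oiiotool", "--info", "-v")
run_subprocess(info_args + ["review.1001.exr"])   # hypothetical input frame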
def _ffmpeg_executable_validation(args):
"""Validate ffmpeg tool executable if can be executed.
Validation has 2 steps. First is using 'find_executable' to fill possible
@ -338,24 +401,45 @@ def _ffmpeg_executable_validation(filepath):
It does not validate if the executable is really a ffmpeg tool.
Args:
filepath (str): Path to executable.
args (Union[str, list[str]]): Arguments to launch tool or
path to tool executable.
Returns:
bool: Filepath is valid executable.
"""
filepath = find_executable(filepath)
if not filepath:
if not args:
return False
return _check_args_returncode([filepath, "-version"])
if not isinstance(args, list):
filepath = find_executable(args)
if not filepath:
return False
args = [filepath]
return _check_args_returncode(args + ["--help"])
def _get_ayon_ffmpeg_tool_args(tool_name):
try:
# Use 'ayon-third-party' addon to get ffmpeg arguments
from ayon_third_party import get_ffmpeg_arguments
except Exception:
print("!!! Failed to import 'ayon_third_party' addon.")
return None
try:
return get_ffmpeg_arguments(tool_name)
except Exception as exc:
print("!!! Failed to get FFmpeg args. Reason: {}".format(exc))
return None
def get_ffmpeg_tool_path(tool="ffmpeg"):
"""Path to vendorized FFmpeg executable.
Args:
tool (string): Tool name (ffmpeg, ffprobe, ...).
tool (str): Tool name 'ffmpeg', 'ffprobe', etc.
Default is "ffmpeg".
Returns:
@ -365,6 +449,17 @@ def get_ffmpeg_tool_path(tool="ffmpeg"):
if CachedToolPaths.is_tool_cached(tool):
return CachedToolPaths.get_executable_path(tool)
if AYON_SERVER_ENABLED:
args = _get_ayon_ffmpeg_tool_args(tool)
if args is not None:
if len(args) > 1:
raise ValueError(
"AYON ffmpeg arguments consist of multiple arguments."
)
tool_executable_path = args[0]
CachedToolPaths.cache_executable_path(tool, tool_executable_path)
return tool_executable_path
custom_paths_str = os.environ.get("OPENPYPE_FFMPEG_PATHS") or ""
tool_executable_path = find_tool_in_custom_paths(
custom_paths_str.split(os.pathsep),
@ -390,19 +485,44 @@ def get_ffmpeg_tool_path(tool="ffmpeg"):
return tool_executable_path
def get_ffmpeg_tool_args(tool_name, *extra_args):
"""Arguments to launch FFmpeg tool.
Args:
tool_name (str): Tool name 'ffmpeg', 'ffprobe', etc.
*extra_args (str): Extra arguments to add to after tool arguments.
Returns:
list[str]: List of arguments.
"""
extra_args = list(extra_args)
if AYON_SERVER_ENABLED:
args = _get_ayon_ffmpeg_tool_args(tool_name)
if args:
return args + extra_args
executable_path = get_ffmpeg_tool_path(tool_name)
if executable_path:
return [executable_path] + extra_args
raise ToolNotFoundError(
"FFmpeg '{}' tool not found.".format(tool_name)
)
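And the ffmpeg counterpart, including the new ToolNotFoundError contract (the probed file is hypothetical):

from openpype.lib import ToolNotFoundError, get_ffmpeg_tool_args, run_subprocess

try:
    ffprobe_args = get_ffmpeg_tool_args("ffprobe", "-hide_banner")
except ToolNotFoundError:
    ffprobe_args = None   # tool is not configured on this machine

if ffprobe_args is not None:
    run_subprocess(ffprobe_args + ["-show_format", "input.mov"])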
def is_oiio_supported():
"""Checks if oiiotool is configured for this platform.
Returns:
bool: OIIO tool executable is available.
"""
loaded_path = oiio_path = get_oiio_tools_path()
if oiio_path:
oiio_path = find_executable(oiio_path)
if not oiio_path:
log.debug("OIIOTool is not configured or not present at {}".format(
loaded_path
))
try:
args = get_oiio_tool_args("oiiotool")
except ToolNotFoundError:
args = None
if not args:
log.debug("OIIOTool is not configured or not present.")
return False
return True
return _oiio_executable_validation(args)

View file

@ -40,6 +40,7 @@ class SyncToAvalonServer(ServerAction):
#: Action description.
description = "Send data from Ftrack to Avalon"
role_list = {"Pypeclub", "Administrator", "Project Manager"}
settings_key = "sync_to_avalon"
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
@ -48,11 +49,16 @@ class SyncToAvalonServer(ServerAction):
def discover(self, session, entities, event):
""" Validation """
# Check if selection is valid
is_valid = False
for ent in event["data"]["selection"]:
# Ignore entities that are not tasks or projects
if ent["entityType"].lower() in ["show", "task"]:
return True
return False
is_valid = True
break
if is_valid:
is_valid = self.valid_roles(session, entities, event)
return is_valid
def launch(self, session, in_entities, event):
self.log.debug("{}: Creating job".format(self.label))

View file

@ -1,7 +1,5 @@
import os
import pyblish.api
from openpype.lib.mongo import OpenPypeMongoConnection
from openpype.client.mongo import OpenPypeMongoConnection
class CollectShotgridEntities(pyblish.api.ContextPlugin):

View file

@ -13,6 +13,7 @@ from .create import (
BaseCreator,
Creator,
AutoCreator,
HiddenCreator,
CreatedInstance,
CreatorError,
@ -114,6 +115,7 @@ __all__ = (
"BaseCreator",
"Creator",
"AutoCreator",
"HiddenCreator",
"CreatedInstance",
"CreatorError",

View file

@ -237,10 +237,17 @@ def get_data_subprocess(config_path, data_type):
def compatibility_check():
"""Making sure PyOpenColorIO is importable"""
"""checking if user has a compatible PyOpenColorIO >= 2.
It's achieved by checking if PyOpenColorIO is importable
and calling any version 2 specific function
"""
try:
import PyOpenColorIO # noqa: F401
except ImportError:
import PyOpenColorIO
# ocio versions lower than 2 will raise AttributeError
PyOpenColorIO.GetVersion()
except (ImportError, AttributeError):
return False
return True
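A minimal sketch of how this check is typically consumed before touching the OCIO API directly (the caller logic is an assumption, not part of the diff):

if compatibility_check():
    import PyOpenColorIO
    # Safe to call the OCIO v2 API in this interpreter.
    print(PyOpenColorIO.GetVersion())
else:
    # Fall back to the wrapped/subprocess path used elsewhere in this module.
    pass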

View file

@ -1165,8 +1165,8 @@ class CreatedInstance:
Args:
instance_data (Dict[str, Any]): Data in a structure ready for
'CreatedInstance' object.
creator (Creator): Creator plugin which is creating the instance
of for which the instance belong.
creator (BaseCreator): Creator plugin which is creating the
instance, or to which the instance belongs.
"""
instance_data = copy.deepcopy(instance_data)
@ -1979,7 +1979,11 @@ class CreateContext:
if pre_create_data is None:
pre_create_data = {}
precreate_attr_defs = creator.get_pre_create_attr_defs() or []
precreate_attr_defs = []
# Hidden creators do not have or need the pre-create attributes.
if isinstance(creator, Creator):
precreate_attr_defs = creator.get_pre_create_attr_defs()
# Create default values of precreate data
_pre_create_data = get_default_values(precreate_attr_defs)
# Update passed precreate data to default values

View file

@ -24,6 +24,7 @@ log_ = logging.getLogger(__name__)
ValidationError = jsonschema.ValidationError
SchemaError = jsonschema.SchemaError
CURRENT_DIR = os.path.dirname(os.path.abspath(__file__))
_CACHED = False
@ -121,17 +122,14 @@ def _precache():
"""Store available schemas in-memory for reduced disk access"""
global _CACHED
repos_root = os.environ["OPENPYPE_REPOS_ROOT"]
schema_dir = os.path.join(repos_root, "schema")
for schema in os.listdir(schema_dir):
for schema in os.listdir(CURRENT_DIR):
if schema.startswith(("_", ".")):
continue
if not schema.endswith(".json"):
continue
if not os.path.isfile(os.path.join(schema_dir, schema)):
if not os.path.isfile(os.path.join(CURRENT_DIR, schema)):
continue
with open(os.path.join(schema_dir, schema)) as f:
with open(os.path.join(CURRENT_DIR, schema)) as f:
log_.debug("Installing schema '%s'.." % schema)
_cache[schema] = json.load(f)
_CACHED = True

View file

@ -0,0 +1,68 @@
{
"$schema": "http://json-schema.org/draft-04/schema#",
"title": "openpype:application-1.0",
"description": "An application definition.",
"type": "object",
"additionalProperties": true,
"required": [
"schema",
"label",
"application_dir",
"executable"
],
"properties": {
"schema": {
"description": "Schema identifier for payload",
"type": "string"
},
"label": {
"description": "Nice name of application.",
"type": "string"
},
"application_dir": {
"description": "Name of directory used for application resources.",
"type": "string"
},
"executable": {
"description": "Name of callable executable, this is called to launch the application",
"type": "string"
},
"description": {
"description": "Description of application.",
"type": "string"
},
"environment": {
"description": "Key/value pairs for environment variables related to this application. Supports lists for paths, such as PYTHONPATH.",
"type": "object",
"items": {
"oneOf": [
{"type": "string"},
{"type": "array", "items": {"type": "string"}}
]
}
},
"default_dirs": {
"type": "array",
"items": {
"type": "string"
}
},
"copy": {
"type": "object",
"patternProperties": {
"^.*$": {
"anyOf": [
{"type": "string"},
{"type": "null"}
]
}
},
"additionalProperties": false
}
}
}

View file

@ -0,0 +1,35 @@
{
"$schema": "http://json-schema.org/draft-04/schema#",
"title": "openpype:asset-1.0",
"description": "A unit of data",
"type": "object",
"additionalProperties": true,
"required": [
"schema",
"name",
"subsets"
],
"properties": {
"schema": {
"description": "Schema identifier for payload",
"type": "string"
},
"name": {
"description": "Name of directory",
"type": "string"
},
"subsets": {
"type": "array",
"items": {
"$ref": "subset.json"
}
}
},
"definitions": {}
}

View file

@@ -0,0 +1,55 @@
{
"$schema": "http://json-schema.org/draft-04/schema#",
"title": "openpype:asset-2.0",
"description": "A unit of data",
"type": "object",
"additionalProperties": true,
"required": [
"schema",
"type",
"name",
"silo",
"data"
],
"properties": {
"schema": {
"description": "Schema identifier for payload",
"type": "string",
"enum": ["openpype:asset-2.0"],
"example": "openpype:asset-2.0"
},
"type": {
"description": "The type of document",
"type": "string",
"enum": ["asset"],
"example": "asset"
},
"parent": {
"description": "Unique identifier to parent document",
"example": "592c33475f8c1b064c4d1696"
},
"name": {
"description": "Name of asset",
"type": "string",
"pattern": "^[a-zA-Z0-9_.]*$",
"example": "Bruce"
},
"silo": {
"description": "Group or container of asset",
"type": "string",
"example": "assets"
},
"data": {
"description": "Document metadata",
"type": "object",
"example": {"key": "value"}
}
},
"definitions": {}
}

View file

@@ -0,0 +1,55 @@
{
"$schema": "http://json-schema.org/draft-04/schema#",
"title": "openpype:asset-3.0",
"description": "A unit of data",
"type": "object",
"additionalProperties": true,
"required": [
"schema",
"type",
"name",
"data"
],
"properties": {
"schema": {
"description": "Schema identifier for payload",
"type": "string",
"enum": ["openpype:asset-3.0"],
"example": "openpype:asset-3.0"
},
"type": {
"description": "The type of document",
"type": "string",
"enum": ["asset"],
"example": "asset"
},
"parent": {
"description": "Unique identifier to parent document",
"example": "592c33475f8c1b064c4d1696"
},
"name": {
"description": "Name of asset",
"type": "string",
"pattern": "^[a-zA-Z0-9_.]*$",
"example": "Bruce"
},
"silo": {
"description": "Group or container of asset",
"type": "string",
"pattern": "^[a-zA-Z0-9_.]*$",
"example": "assets"
},
"data": {
"description": "Document metadata",
"type": "object",
"example": {"key": "value"}
}
},
"definitions": {}
}
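
Assembled from the inline example values, an asset document for this schema looks roughly like the sketch below; compared to asset-2.0 the only change in the required list is that `silo` is now optional:

# Illustrative asset document built from the schema's example values.
asset_doc = {
    "schema": "openpype:asset-3.0",
    "type": "asset",
    "parent": "592c33475f8c1b064c4d1696",  # an ObjectId in the database
    "name": "Bruce",
    "data": {"key": "value"},
}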

View file

@@ -0,0 +1,85 @@
{
"$schema": "http://json-schema.org/draft-04/schema#",
"title": "openpype:config-1.0",
"description": "A project configuration.",
"type": "object",
"additionalProperties": false,
"required": [
"tasks",
"apps"
],
"properties": {
"schema": {
"description": "Schema identifier for payload",
"type": "string"
},
"template": {
"type": "object",
"additionalProperties": false,
"patternProperties": {
"^.*$": {
"type": "string"
}
}
},
"tasks": {
"type": "array",
"items": {
"type": "object",
"properties": {
"name": {"type": "string"},
"icon": {"type": "string"},
"group": {"type": "string"},
"label": {"type": "string"}
},
"required": ["name"]
}
},
"apps": {
"type": "array",
"items": {
"type": "object",
"properties": {
"name": {"type": "string"},
"icon": {"type": "string"},
"group": {"type": "string"},
"label": {"type": "string"}
},
"required": ["name"]
}
},
"families": {
"type": "array",
"items": {
"type": "object",
"properties": {
"name": {"type": "string"},
"icon": {"type": "string"},
"label": {"type": "string"},
"hideFilter": {"type": "boolean"}
},
"required": ["name"]
}
},
"groups": {
"type": "array",
"items": {
"type": "object",
"properties": {
"name": {"type": "string"},
"icon": {"type": "string"},
"color": {"type": "string"},
"order": {"type": ["integer", "number"]}
},
"required": ["name"]
}
},
"copy": {
"type": "object"
}
}
}

View file

@@ -0,0 +1,87 @@
{
"$schema": "http://json-schema.org/draft-04/schema#",
"title": "openpype:config-1.1",
"description": "A project configuration.",
"type": "object",
"additionalProperties": false,
"required": [
"tasks",
"apps"
],
"properties": {
"schema": {
"description": "Schema identifier for payload",
"type": "string"
},
"template": {
"type": "object",
"additionalProperties": false,
"patternProperties": {
"^.*$": {
"type": "string"
}
}
},
"tasks": {
"type": "object",
"items": {
"type": "object",
"properties": {
"name": {"type": "string"},
"icon": {"type": "string"},
"group": {"type": "string"},
"label": {"type": "string"}
},
"required": [
"short_name"
]
}
},
"apps": {
"type": "array",
"items": {
"type": "object",
"properties": {
"name": {"type": "string"},
"icon": {"type": "string"},
"group": {"type": "string"},
"label": {"type": "string"}
},
"required": ["name"]
}
},
"families": {
"type": "array",
"items": {
"type": "object",
"properties": {
"name": {"type": "string"},
"icon": {"type": "string"},
"label": {"type": "string"},
"hideFilter": {"type": "boolean"}
},
"required": ["name"]
}
},
"groups": {
"type": "array",
"items": {
"type": "object",
"properties": {
"name": {"type": "string"},
"icon": {"type": "string"},
"color": {"type": "string"},
"order": {"type": ["integer", "number"]}
},
"required": ["name"]
}
},
"copy": {
"type": "object"
}
}
}

View file

@@ -0,0 +1,87 @@
{
"$schema": "http://json-schema.org/draft-04/schema#",
"title": "openpype:config-2.0",
"description": "A project configuration.",
"type": "object",
"additionalProperties": false,
"required": [
"tasks",
"apps"
],
"properties": {
"schema": {
"description": "Schema identifier for payload",
"type": "string"
},
"templates": {
"type": "object"
},
"roots": {
"type": "object"
},
"imageio": {
"type": "object"
},
"tasks": {
"type": "object",
"items": {
"type": "object",
"properties": {
"name": {"type": "string"},
"icon": {"type": "string"},
"group": {"type": "string"},
"label": {"type": "string"}
},
"required": [
"short_name"
]
}
},
"apps": {
"type": "array",
"items": {
"type": "object",
"properties": {
"name": {"type": "string"},
"icon": {"type": "string"},
"group": {"type": "string"},
"label": {"type": "string"}
},
"required": ["name"]
}
},
"families": {
"type": "array",
"items": {
"type": "object",
"properties": {
"name": {"type": "string"},
"icon": {"type": "string"},
"label": {"type": "string"},
"hideFilter": {"type": "boolean"}
},
"required": ["name"]
}
},
"groups": {
"type": "array",
"items": {
"type": "object",
"properties": {
"name": {"type": "string"},
"icon": {"type": "string"},
"color": {"type": "string"},
"order": {"type": ["integer", "number"]}
},
"required": ["name"]
}
},
"copy": {
"type": "object"
}
}
}
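
Relative to config-1.0, the 1.1 and 2.0 revisions turn `tasks` into an object keyed by task name (each entry requiring `short_name`), and 2.0 adds the `templates`, `roots` and `imageio` sections; a hypothetical payload with invented values:

# Illustrative project config for openpype:config-2.0.
config = {
    "tasks": {
        "Model": {"short_name": "mdl"},
        "Render": {"short_name": "rnd"},
    },
    "apps": [
        {"name": "maya2016", "label": "Autodesk Maya 2016"},
    ],
    "templates": {
        "work": "{root}/{project}/{silo}/{asset}/work/{task}/{app}",
    },
    "roots": {},
    "imageio": {},
}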

View file

@@ -0,0 +1,100 @@
{
"$schema": "http://json-schema.org/draft-04/schema#",
"title": "openpype:container-1.0",
"description": "A loaded asset",
"type": "object",
"additionalProperties": true,
"required": [
"id",
"objectName",
"name",
"author",
"loader",
"families",
"time",
"subset",
"asset",
"representation",
"version",
"silo",
"path",
"source"
],
"properties": {
"id": {
"description": "Identifier for finding object in host",
"type": "string",
"enum": ["pyblish.mindbender.container"],
"example": "pyblish.mindbender.container"
},
"objectName": {
"description": "Name of internal object, such as the objectSet in Maya.",
"type": "string",
"example": "Bruce_:rigDefault_CON"
},
"name": {
"description": "Full name of application object",
"type": "string",
"example": "modelDefault"
},
"author": {
"description": "Name of the author of the published version",
"type": "string",
"example": "Marcus Ottosson"
},
"loader": {
"description": "Name of loader plug-in used to produce this container",
"type": "string",
"example": "ModelLoader"
},
"families": {
"description": "Families associated with the this subset",
"type": "string",
"example": "mindbender.model"
},
"time": {
"description": "File-system safe, formatted time",
"type": "string",
"example": "20170329T131545Z"
},
"subset": {
"description": "Name of source subset",
"type": "string",
"example": "modelDefault"
},
"asset": {
"description": "Name of source asset",
"type": "string" ,
"example": "Bruce"
},
"representation": {
"description": "Name of source representation",
"type": "string" ,
"example": ".ma"
},
"version": {
"description": "Version number",
"type": "number",
"example": 12
},
"silo": {
"description": "Silo of parent asset",
"type": "string",
"example": "assets"
},
"path": {
"description": "Absolute path on disk",
"type": "string",
"example": "{root}/assets/Bruce/publish/rigDefault/v002"
},
"source": {
"description": "Absolute path to file from which this version was published",
"type": "string",
"example": "{root}/assets/Bruce/work/rigging/maya/scenes/rig_v001.ma"
}
}
}

View file

@@ -0,0 +1,59 @@
{
"$schema": "http://json-schema.org/draft-04/schema#",
"title": "openpype:container-2.0",
"description": "A loaded asset",
"type": "object",
"additionalProperties": true,
"required": [
"schema",
"id",
"objectName",
"name",
"namespace",
"loader",
"representation"
],
"properties": {
"schema": {
"description": "Schema identifier for payload",
"type": "string",
"enum": ["openpype:container-2.0"],
"example": "openpype:container-2.0"
},
"id": {
"description": "Identifier for finding object in host",
"type": "string",
"enum": ["pyblish.avalon.container"],
"example": "pyblish.avalon.container"
},
"objectName": {
"description": "Name of internal object, such as the objectSet in Maya.",
"type": "string",
"example": "Bruce_:rigDefault_CON"
},
"loader": {
"description": "Name of loader plug-in used to produce this container",
"type": "string",
"example": "ModelLoader"
},
"name": {
"description": "Internal object name of container in application",
"type": "string",
"example": "modelDefault_01"
},
"namespace": {
"description": "Internal namespace of container in application",
"type": "string",
"example": "Bruce_"
},
"representation": {
"description": "Unique id of representation in database",
"type": "string",
"example": "59523f355f8c1b5f6c5e8348"
}
}
}
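
Using the inline example values, the metadata a host imprints on a loaded container would resemble the sketch below; compared to container-1.0 the required list is much shorter and `representation` now holds the database id of the representation rather than a file extension:

# Illustrative container data built from the schema's example values.
container = {
    "schema": "openpype:container-2.0",
    "id": "pyblish.avalon.container",
    "objectName": "Bruce_:rigDefault_CON",
    "name": "modelDefault_01",
    "namespace": "Bruce_",
    "loader": "ModelLoader",
    "representation": "59523f355f8c1b5f6c5e8348",
}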

View file

@@ -0,0 +1,44 @@
{
"$schema": "http://json-schema.org/draft-04/schema#",
"title": "openpype:hero_version-1.0",
"description": "Hero version of asset",
"type": "object",
"additionalProperties": true,
"required": [
"version_id",
"schema",
"type",
"parent"
],
"properties": {
"_id": {
"description": "Document's id (database will create it's if not entered)",
"example": "ObjectId(592c33475f8c1b064c4d1696)"
},
"version_id": {
"description": "The version ID from which it was created",
"example": "ObjectId(592c33475f8c1b064c4d1695)"
},
"schema": {
"description": "The schema associated with this document",
"type": "string",
"enum": ["openpype:hero_version-1.0"],
"example": "openpype:hero_version-1.0"
},
"type": {
"description": "The type of document",
"type": "string",
"enum": ["hero_version"],
"example": "hero_version"
},
"parent": {
"description": "Unique identifier to parent document",
"example": "ObjectId(592c33475f8c1b064c4d1697)"
}
}
}
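
A hero version document only needs to point back at the version it mirrors; a sketch built from the example values (the ids are plain strings here, the database stores ObjectId values):

# Illustrative hero version document for openpype:hero_version-1.0.
hero_version = {
    "schema": "openpype:hero_version-1.0",
    "type": "hero_version",
    "version_id": "592c33475f8c1b064c4d1695",  # version it was created from
    "parent": "592c33475f8c1b064c4d1697",      # id of the parent document
}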

View file

@@ -0,0 +1,10 @@
{
"$schema": "http://json-schema.org/draft-04/schema#",
"title": "openpype:config-1.0",
"description": "A project configuration.",
"type": "object",
"additionalProperties": true
}

View file

@@ -0,0 +1,10 @@
{
"$schema": "http://json-schema.org/draft-04/schema#",
"title": "openpype:config-1.1",
"description": "A project configuration.",
"type": "object",
"additionalProperties": true
}

View file

@@ -0,0 +1,86 @@
{
"$schema": "http://json-schema.org/draft-04/schema#",
"title": "openpype:project-2.0",
"description": "A unit of data",
"type": "object",
"additionalProperties": true,
"required": [
"schema",
"type",
"name",
"data",
"config"
],
"properties": {
"schema": {
"description": "Schema identifier for payload",
"type": "string",
"enum": ["openpype:project-2.0"],
"example": "openpype:project-2.0"
},
"type": {
"description": "The type of document",
"type": "string",
"enum": ["project"],
"example": "project"
},
"parent": {
"description": "Unique identifier to parent document",
"example": "592c33475f8c1b064c4d1696"
},
"name": {
"description": "Name of directory",
"type": "string",
"pattern": "^[a-zA-Z0-9_.]*$",
"example": "hulk"
},
"data": {
"description": "Document metadata",
"type": "object",
"example": {
"fps": 24,
"width": 1920,
"height": 1080
}
},
"config": {
"type": "object",
"description": "Document metadata",
"example": {
"schema": "openpype:config-1.0",
"apps": [
{
"name": "maya2016",
"label": "Autodesk Maya 2016"
},
{
"name": "nuke10",
"label": "The Foundry Nuke 10.0"
}
],
"tasks": [
{"name": "model"},
{"name": "render"},
{"name": "animate"},
{"name": "rig"},
{"name": "lookdev"},
{"name": "layout"}
],
"template": {
"work":
"{root}/{project}/{silo}/{asset}/work/{task}/{app}",
"publish":
"{root}/{project}/{silo}/{asset}/publish/{subset}/v{version:0>3}/{subset}.{representation}"
}
},
"$ref": "config-1.0.json"
}
},
"definitions": {}
}

View file

@@ -0,0 +1,86 @@
{
"$schema": "http://json-schema.org/draft-04/schema#",
"title": "openpype:project-2.1",
"description": "A unit of data",
"type": "object",
"additionalProperties": true,
"required": [
"schema",
"type",
"name",
"data",
"config"
],
"properties": {
"schema": {
"description": "Schema identifier for payload",
"type": "string",
"enum": ["openpype:project-2.1"],
"example": "openpype:project-2.1"
},
"type": {
"description": "The type of document",
"type": "string",
"enum": ["project"],
"example": "project"
},
"parent": {
"description": "Unique identifier to parent document",
"example": "592c33475f8c1b064c4d1696"
},
"name": {
"description": "Name of directory",
"type": "string",
"pattern": "^[a-zA-Z0-9_.]*$",
"example": "hulk"
},
"data": {
"description": "Document metadata",
"type": "object",
"example": {
"fps": 24,
"width": 1920,
"height": 1080
}
},
"config": {
"type": "object",
"description": "Document metadata",
"example": {
"schema": "openpype:config-1.1",
"apps": [
{
"name": "maya2016",
"label": "Autodesk Maya 2016"
},
{
"name": "nuke10",
"label": "The Foundry Nuke 10.0"
}
],
"tasks": {
"Model": {"short_name": "mdl"},
"Render": {"short_name": "rnd"},
"Animate": {"short_name": "anim"},
"Rig": {"short_name": "rig"},
"Lookdev": {"short_name": "look"},
"Layout": {"short_name": "lay"}
},
"template": {
"work":
"{root}/{project}/{silo}/{asset}/work/{task}/{app}",
"publish":
"{root}/{project}/{silo}/{asset}/publish/{subset}/v{version:0>3}/{subset}.{representation}"
}
},
"$ref": "config-1.1.json"
}
},
"definitions": {}
}

View file

@@ -0,0 +1,59 @@
{
"$schema": "http://json-schema.org/draft-04/schema#",
"title": "openpype:project-3.0",
"description": "A unit of data",
"type": "object",
"additionalProperties": true,
"required": [
"schema",
"type",
"name",
"data",
"config"
],
"properties": {
"schema": {
"description": "Schema identifier for payload",
"type": "string",
"enum": ["openpype:project-3.0"],
"example": "openpype:project-3.0"
},
"type": {
"description": "The type of document",
"type": "string",
"enum": ["project"],
"example": "project"
},
"parent": {
"description": "Unique identifier to parent document",
"example": "592c33475f8c1b064c4d1696"
},
"name": {
"description": "Name of directory",
"type": "string",
"pattern": "^[a-zA-Z0-9_.]*$",
"example": "hulk"
},
"data": {
"description": "Document metadata",
"type": "object",
"example": {
"fps": 24,
"width": 1920,
"height": 1080
}
},
"config": {
"type": "object",
"description": "Document metadata",
"$ref": "config-2.0.json"
}
},
"definitions": {}
}
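
In the 3.0 revision the project's `config` is validated against config-2.0 above via `$ref`; a hedged sketch of a matching document follows (validating it standalone would require the referenced config-2.0.json to be resolvable):

# Illustrative project document for openpype:project-3.0; the config
# content mirrors the config-2.0 sketch shown earlier.
project_doc = {
    "schema": "openpype:project-3.0",
    "type": "project",
    "name": "hulk",
    "data": {"fps": 24, "width": 1920, "height": 1080},
    "config": {
        "tasks": {"Model": {"short_name": "mdl"}},
        "apps": [{"name": "maya2016", "label": "Autodesk Maya 2016"}],
        "templates": {},
        "roots": {},
        "imageio": {},
    },
}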

View file

@@ -0,0 +1,28 @@
{
"$schema": "http://json-schema.org/draft-04/schema#",
"title": "openpype:representation-1.0",
"description": "The inverse of an instance",
"type": "object",
"additionalProperties": true,
"required": [
"schema",
"format",
"path"
],
"properties": {
"schema": {"type": "string"},
"format": {
"description": "File extension, including '.'",
"type": "string"
},
"path": {
"description": "Unformatted path to version.",
"type": "string"
}
}
}

View file

@@ -0,0 +1,78 @@
{
"$schema": "http://json-schema.org/draft-04/schema#",
"title": "openpype:representation-2.0",
"description": "The inverse of an instance",
"type": "object",
"additionalProperties": true,
"required": [
"schema",
"type",
"parent",
"name",
"data"
],
"properties": {
"schema": {
"description": "Schema identifier for payload",
"type": "string",
"enum": ["openpype:representation-2.0"],
"example": "openpype:representation-2.0"
},
"type": {
"description": "The type of document",
"type": "string",
"enum": ["representation"],
"example": "representation"
},
"parent": {
"description": "Unique identifier to parent document",
"example": "592c33475f8c1b064c4d1696"
},
"name": {
"description": "Name of representation",
"type": "string",
"pattern": "^[a-zA-Z0-9_.]*$",
"example": "abc"
},
"data": {
"description": "Document metadata",
"type": "object",
"example": {
"label": "Alembic"
}
},
"dependencies": {
"description": "Other representation that this representation depends on",
"type": "array",
"items": {"type": "string"},
"example": [
"592d547a5f8c1b388093c145"
]
},
"context": {
"description": "Summary of the context to which this representation belong.",
"type": "object",
"properties": {
"project": {"type": "object"},
"asset": {"type": "string"},
"silo": {"type": ["string", "null"]},
"subset": {"type": "string"},
"version": {"type": "number"},
"representation": {"type": "string"}
},
"example": {
"project": "hulk",
"asset": "Bruce",
"silo": "assets",
"subset": "rigDefault",
"version": 12,
"representation": "ma"
}
}
}
}
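
Built from the inline example values, a representation document looks roughly like this; the `context` block summarises the parent hierarchy of the representation, and the nested `project` value is shown as a small dict per the declared property type:

# Illustrative representation document built from the schema's examples.
representation = {
    "schema": "openpype:representation-2.0",
    "type": "representation",
    "parent": "592c33475f8c1b064c4d1696",
    "name": "abc",
    "data": {"label": "Alembic"},
    "dependencies": ["592d547a5f8c1b388093c145"],
    "context": {
        "project": {"name": "hulk"},  # assumed shape; schema only says "object"
        "asset": "Bruce",
        "silo": "assets",
        "subset": "rigDefault",
        "version": 12,
        "representation": "abc",
    },
}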

View file

@@ -0,0 +1,143 @@
{
"$schema": "http://json-schema.org/draft-04/schema#",
"title": "openpype:session-1.0",
"description": "The Avalon environment",
"type": "object",
"additionalProperties": true,
"required": [
"AVALON_PROJECTS",
"AVALON_PROJECT",
"AVALON_ASSET",
"AVALON_SILO",
"AVALON_CONFIG"
],
"properties": {
"AVALON_PROJECTS": {
"description": "Absolute path to root of project directories",
"type": "string",
"example": "/nas/projects"
},
"AVALON_PROJECT": {
"description": "Name of project",
"type": "string",
"pattern": "^\\w*$",
"example": "Hulk"
},
"AVALON_ASSET": {
"description": "Name of asset",
"type": "string",
"pattern": "^\\w*$",
"example": "Bruce"
},
"AVALON_SILO": {
"description": "Name of asset group or container",
"type": "string",
"pattern": "^\\w*$",
"example": "assets"
},
"AVALON_TASK": {
"description": "Name of task",
"type": "string",
"pattern": "^\\w*$",
"example": "modeling"
},
"AVALON_CONFIG": {
"description": "Name of Avalon configuration",
"type": "string",
"pattern": "^\\w*$",
"example": "polly"
},
"AVALON_APP": {
"description": "Name of application",
"type": "string",
"pattern": "^\\w*$",
"example": "maya2016"
},
"AVALON_MONGO": {
"description": "Address to the asset database",
"type": "string",
"pattern": "^mongodb://[\\w/@:.]*$",
"example": "mongodb://localhost:27017",
"default": "mongodb://localhost:27017"
},
"AVALON_DB": {
"description": "Name of database",
"type": "string",
"pattern": "^\\w*$",
"example": "avalon",
"default": "avalon"
},
"AVALON_LABEL": {
"description": "Nice name of Avalon, used in e.g. graphical user interfaces",
"type": "string",
"example": "Mindbender",
"default": "Avalon"
},
"AVALON_SENTRY": {
"description": "Address to Sentry",
"type": "string",
"pattern": "^http[\\w/@:.]*$",
"example": "https://5b872b280de742919b115bdc8da076a5:8d278266fe764361b8fa6024af004a9c@logs.mindbender.com/2",
"default": null
},
"AVALON_DEADLINE": {
"description": "Address to Deadline",
"type": "string",
"pattern": "^http[\\w/@:.]*$",
"example": "http://192.168.99.101",
"default": null
},
"AVALON_TIMEOUT": {
"description": "Wherever there is a need for a timeout, this is the default value.",
"type": "string",
"pattern": "^[0-9]*$",
"default": "1000",
"example": "1000"
},
"AVALON_UPLOAD": {
"description": "Boolean of whether to upload published material to central asset repository",
"type": "string",
"default": null,
"example": "True"
},
"AVALON_USERNAME": {
"description": "Generic username",
"type": "string",
"pattern": "^\\w*$",
"default": "avalon",
"example": "myself"
},
"AVALON_PASSWORD": {
"description": "Generic password",
"type": "string",
"pattern": "^\\w*$",
"default": "secret",
"example": "abc123"
},
"AVALON_INSTANCE_ID": {
"description": "Unique identifier for instances in a working file",
"type": "string",
"pattern": "^[\\w.]*$",
"default": "avalon.instance",
"example": "avalon.instance"
},
"AVALON_CONTAINER_ID": {
"description": "Unique identifier for a loaded representation in a working file",
"type": "string",
"pattern": "^[\\w.]*$",
"default": "avalon.container",
"example": "avalon.container"
},
"AVALON_DEBUG": {
"description": "Enable debugging mode. Some applications may use this for e.g. extended verbosity or mock plug-ins.",
"type": "string",
"default": null,
"example": "True"
}
}
}
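
Since this schema describes environment variables, a session is just a mapping that can be pushed into the process environment; a sketch using the example values:

import os

# Illustrative session built from the schema's example values; the
# remaining AVALON_* keys are optional and have documented defaults.
session = {
    "AVALON_PROJECTS": "/nas/projects",
    "AVALON_PROJECT": "Hulk",
    "AVALON_ASSET": "Bruce",
    "AVALON_SILO": "assets",
    "AVALON_TASK": "modeling",
    "AVALON_CONFIG": "polly",
}
os.environ.update(session)  # expose it to spawned applications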
