Mirror of https://github.com/ynput/ayon-core.git (synced 2025-12-24 21:04:40 +01:00)

Commit 3fdd55c84a: Merge branch 'develop' into bugfix/AY-3433_Maya-redshift-AOV-mode

815 changed files with 19594 additions and 21235 deletions
.github/workflows/pr_linting.yml (vendored, new file, 24 lines)
@@ -0,0 +1,24 @@
name: 📇 Code Linting

on:
  push:
    branches: [ develop ]
  pull_request:
    branches: [ develop ]

  workflow_dispatch:

concurrency:
  group: ${{ github.workflow }}-${{ github.event.pull_request.number}}
  cancel-in-progress: true

permissions:
  contents: read
  pull-requests: write

jobs:
  linting:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: chartboost/ruff-action@v1
.gitignore (vendored, 1 change)
@@ -77,6 +77,7 @@ dump.sql

# Poetry
########
.poetry/
.python-version
.editorconfig
.pre-commit-config.yaml
@@ -1,3 +1,3 @@
flake8:
  enabled: true
  config_file: setup.cfg
@@ -1,12 +1,27 @@
# See https://pre-commit.com for more information
# See https://pre-commit.com/hooks.html for more hooks
repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
  rev: v4.4.0
  hooks:
    - id: trailing-whitespace
    - id: end-of-file-fixer
    - id: check-yaml
    - id: check-added-large-files
    - id: no-commit-to-branch
      args: [ '--pattern', '^(?!((release|enhancement|feature|bugfix|documentation|tests|local|chore)\/[a-zA-Z0-9\-_]+)$).*' ]
+- repo: https://github.com/codespell-project/codespell
+  rev: v2.2.6
+  hooks:
+    - id: codespell
+      additional_dependencies:
+        - tomli
+
+- repo: https://github.com/astral-sh/ruff-pre-commit
+  # Ruff version.
+  rev: v0.3.3
+  hooks:
+    # Run the linter.
+    - id: ruff
+    # Run the formatter.
+    # - id: ruff-format
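For illustration (not part of the commit), a small Python sketch of how the `no-commit-to-branch` pattern above behaves; branch names that match the pattern are rejected by the hook:

    import re

    # The '--pattern' value from the hook above: it matches any branch
    # name that is NOT of the form "<type>/<name>" with an allowed prefix.
    pattern = re.compile(
        r"^(?!((release|enhancement|feature|bugfix|documentation"
        r"|tests|local|chore)\/[a-zA-Z0-9\-_]+)$).*"
    )

    assert pattern.match("develop")   # blocked: direct commit to branch
    assert pattern.match("my-fix")    # blocked: missing "<type>/" prefix
    # Allowed: the negative lookahead fails, so the pattern does not match.
    assert not pattern.match("bugfix/AY-3433_Maya-redshift-AOV-mode")
    assert not pattern.match("feature/new_loader")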
README.md (24 changes)
@@ -1,8 +1,8 @@
-AYON Core addon
-========
+AYON Core Addon
+===============

AYON core provides the base building blocks for all other AYON addons and integrations and is responsible for discovery and initialization of other addons.

- Some of its key functions include:
- It is used as the main command line handler in [ayon-launcher](https://github.com/ynput/ayon-launcher) application.

@@ -13,8 +13,20 @@ AYON core provides the base building blocks for all other AYON addons and integr
- Defines pipeline API used by other integrations
- Provides all graphical tools for artists
- Defines AYON QT styling
- A bunch more things

Together with [ayon-launcher](https://github.com/ynput/ayon-launcher), they form the base of the AYON pipeline, and it is one of the few compulsory addons needed for the AYON pipeline to be useful in a meaningful way.

-AYON-core is a successor to OpenPype repository (minus all the addons) and still in the process of cleaning up of all references. Please bear with us during this transitional phase.
+AYON-core is a successor to the [OpenPype repository](https://github.com/ynput/OpenPype) (minus all the addons) and is still in the process of cleaning up all references. Please bear with us during this transitional phase.
+
+Development and testing notes
+-----------------------------
+There is a `pyproject.toml` file in the root of the repository. This file is used to define the development environment and is used by `poetry` to create a virtual environment.
+This virtual environment is used to run tests and to develop the code, to help with
+linting and formatting. Dependencies defined here are not used in actual addon
+deployment - for that you need to edit the `./client/pyproject.toml` file. That file
+will then be processed by [ayon-dependencies-tool](https://github.com/ynput/ayon-dependencies-tool)
+to create the dependency package.
+
+Right now, this file needs to be synced with dependencies manually, but in the future
+we plan to automate the process of development environment creation.
@@ -1,12 +1,28 @@
import os
from .version import __version__


AYON_CORE_ROOT = os.path.dirname(os.path.abspath(__file__))

-# TODO remove after '1.x.x'
-# -------------------------
+# DEPRECATED - Remove before '1.x.x' release
+# -------------------------
PACKAGE_DIR = AYON_CORE_ROOT
PLUGINS_DIR = os.path.join(AYON_CORE_ROOT, "plugins")
AYON_SERVER_ENABLED = True

# Indicate if AYON entities should be used instead of OpenPype entities
-USE_AYON_ENTITIES = False
+USE_AYON_ENTITIES = True
# -------------------------


__all__ = (
    "__version__",

    # Deprecated
    "AYON_CORE_ROOT",
    "PACKAGE_DIR",
    "PLUGINS_DIR",
    "AYON_SERVER_ENABLED",
    "USE_AYON_ENTITIES",
)
@@ -27,7 +27,7 @@ AYON addons should contain separated logic of specific kind of implementation, s
- default interfaces are defined in `interfaces.py`

## IPluginPaths
-- addon wants to add directory path/s to avalon or publish plugins
+- addon wants to add directory path/s to publish, load, create or inventory plugins
- addon must implement `get_plugin_paths` which must return dictionary with possible keys `"publish"`, `"load"`, `"create"` or `"actions"`
- each key may contain list or string with a path to directory with plugins
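A minimal sketch (illustrative, not part of this commit) of an addon implementing the `IPluginPaths` contract described above; the addon name and root path are made up:

    import os

    from ayon_core.addon import AYONAddon, IPluginPaths

    # Hypothetical addon root used only for this example.
    MY_ADDON_ROOT = os.path.dirname(os.path.abspath(__file__))


    class MyAddon(AYONAddon, IPluginPaths):
        name = "my_addon"

        def get_plugin_paths(self):
            # Each key may hold a string or a list of directory paths.
            return {
                "publish": [os.path.join(MY_ADDON_ROOT, "plugins", "publish")],
                "load": os.path.join(MY_ADDON_ROOT, "plugins", "load"),
            }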
@@ -89,4 +89,4 @@ AYON addons should contain separated logic of specific kind of implementation, s

### TrayAddonsManager
- inherits from `AddonsManager`
-- has specific implementation for Pype Tray tool and handle `ITrayAddon` methods
+- has specific implementation for AYON Tray and handle `ITrayAddon` methods
@@ -14,9 +14,10 @@ from abc import ABCMeta, abstractmethod

import six
import appdirs
+import ayon_api

from ayon_core import AYON_CORE_ROOT
from ayon_core.lib import Logger, is_dev_mode_enabled
from ayon_core.client import get_ayon_server_api_connection
from ayon_core.settings import get_studio_settings

from .interfaces import (
@@ -147,8 +148,7 @@ def load_addons(force=False):


def _get_ayon_bundle_data():
-    con = get_ayon_server_api_connection()
-    bundles = con.get_bundles()["bundles"]
+    bundles = ayon_api.get_bundles()["bundles"]

    bundle_name = os.getenv("AYON_BUNDLE_NAME")
@@ -176,8 +176,7 @@ def _get_ayon_addons_information(bundle_info):

    output = []
    bundle_addons = bundle_info["addons"]
-    con = get_ayon_server_api_connection()
-    addons = con.get_addons_info()["addons"]
+    addons = ayon_api.get_addons_info()["addons"]
    for addon in addons:
        name = addon["name"]
        versions = addon.get("versions")
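The two hunks above swap per-connection accessors for `ayon_api`'s module-level helpers. A hedged sketch of the resulting bundle lookup (the filtering step is an assumption; the full function body is not shown in this diff):

    import os

    import ayon_api

    # get_bundles() returns {"bundles": [...]}; each bundle has a "name" key.
    bundles = ayon_api.get_bundles()["bundles"]
    bundle_name = os.getenv("AYON_BUNDLE_NAME")
    bundle = next(
        (bundle for bundle in bundles if bundle["name"] == bundle_name),
        None
    )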
@@ -337,14 +336,70 @@ def _load_ayon_addons(openpype_modules, modules_key, log):
    return addons_to_skip_in_core


def _load_ayon_core_addons_dir(
    ignore_addon_names, openpype_modules, modules_key, log
):
    addons_dir = os.path.join(AYON_CORE_ROOT, "addons")
    if not os.path.exists(addons_dir):
        return

    imported_modules = []

    # Make sure that addons which already have client code are not loaded
    #   from core again, with older code
    filtered_paths = []
    for name in os.listdir(addons_dir):
        if name in ignore_addon_names:
            continue
        path = os.path.join(addons_dir, name)
        if os.path.isdir(path):
            filtered_paths.append(path)

    for path in filtered_paths:
        while path in sys.path:
            sys.path.remove(path)
        sys.path.insert(0, path)

        for name in os.listdir(path):
            fullpath = os.path.join(path, name)
            if os.path.isfile(fullpath):
                basename, ext = os.path.splitext(name)
                if ext != ".py":
                    continue
            else:
                basename = name
            try:
                module = __import__(basename, fromlist=("",))
                for attr_name in dir(module):
                    attr = getattr(module, attr_name)
                    if (
                        inspect.isclass(attr)
                        and issubclass(attr, AYONAddon)
                    ):
                        new_import_str = "{}.{}".format(modules_key, basename)
                        sys.modules[new_import_str] = module
                        setattr(openpype_modules, basename, module)
                        imported_modules.append(module)
                        break

            except Exception:
                log.error(
                    "Failed to import addon '{}'.".format(fullpath),
                    exc_info=True
                )
    return imported_modules


def _load_addons_in_core(
    ignore_addon_names, openpype_modules, modules_key, log
):
    _load_ayon_core_addons_dir(
        ignore_addon_names, openpype_modules, modules_key, log
    )
    # Add current directory at first place
    # - has small differences in import logic
-    current_dir = os.path.abspath(os.path.dirname(__file__))
-    hosts_dir = os.path.join(os.path.dirname(current_dir), "hosts")
-    modules_dir = os.path.join(os.path.dirname(current_dir), "modules")
+    hosts_dir = os.path.join(AYON_CORE_ROOT, "hosts")
+    modules_dir = os.path.join(AYON_CORE_ROOT, "modules")

    ignored_host_names = set(IGNORED_HOSTS_IN_AYON)
    ignored_module_dir_filenames = (
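The loader above registers each imported addon module under a second dotted name so legacy imports keep working. A standalone sketch of that aliasing pattern (all names here are illustrative, not from ayon-core):

    import sys
    import types

    # Parent placeholder module, as the loader does with its dynamic
    # "openpype_modules" namespace.
    parent = types.ModuleType("demo_modules")
    sys.modules["demo_modules"] = parent

    # Import a real module and alias it under the dynamic namespace.
    module = __import__("json", fromlist=("",))
    sys.modules["demo_modules.json_alias"] = module
    setattr(parent, "json_alias", module)

    from demo_modules import json_alias
    assert json_alias is module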
@@ -743,7 +798,7 @@ class AddonsManager:

        addon_classes = []
        for module in openpype_modules:
-            # Go through globals in `pype.modules`
+            # Go through globals in `ayon_core.modules`
            for name in dir(module):
                modules_item = getattr(module, name, None)
                # Filter globals that are not classes which inherit from
@@ -1077,7 +1132,7 @@ class AddonsManager:
        """Print out report of time spent on addons initialization parts.

        Reporting is not automated must be implemented for each initialization
-        part separatelly. Reports must be stored to `_report` attribute.
+        part separately. Reports must be stored to `_report` attribute.
        Print is skipped if `_report` is empty.

        Attribute `_report` is dictionary where key is "label" describing
@@ -1269,7 +1324,7 @@ class TrayAddonsManager(AddonsManager):
    def add_doubleclick_callback(self, addon, callback):
        """Register doubleclick callbacks on tray icon.

-        Currently there is no way how to determine which is launched. Name of
+        Currently, there is no way how to determine which is launched. Name of
        callback can be defined with `doubleclick_callback` attribute.

        Missing feature how to define default callback.
@@ -0,0 +1,58 @@
from .constants import (
    APPLICATIONS_ADDON_ROOT,
    DEFAULT_ENV_SUBGROUP,
    PLATFORM_NAMES,
)
from .exceptions import (
    ApplicationNotFound,
    ApplicationExecutableNotFound,
    ApplicationLaunchFailed,
    MissingRequiredKey,
)
from .defs import (
    LaunchTypes,
    ApplicationExecutable,
    UndefinedApplicationExecutable,
    ApplicationGroup,
    Application,
    EnvironmentToolGroup,
    EnvironmentTool,
)
from .hooks import (
    LaunchHook,
    PreLaunchHook,
    PostLaunchHook,
)
from .manager import (
    ApplicationManager,
    ApplicationLaunchContext,
)
from .addon import ApplicationsAddon


__all__ = (
    "DEFAULT_ENV_SUBGROUP",
    "PLATFORM_NAMES",

    "ApplicationNotFound",
    "ApplicationExecutableNotFound",
    "ApplicationLaunchFailed",
    "MissingRequiredKey",

    "LaunchTypes",
    "ApplicationExecutable",
    "UndefinedApplicationExecutable",
    "ApplicationGroup",
    "Application",
    "EnvironmentToolGroup",
    "EnvironmentTool",

    "LaunchHook",
    "PreLaunchHook",
    "PostLaunchHook",

    "ApplicationManager",
    "ApplicationLaunchContext",

    "ApplicationsAddon",
)
client/ayon_core/addons/applications/ayon_applications/addon.py (new file, 173 lines)
@@ -0,0 +1,173 @@
import os
import json

from ayon_core.addon import AYONAddon, IPluginPaths, click_wrap

from .constants import APPLICATIONS_ADDON_ROOT
from .defs import LaunchTypes
from .manager import ApplicationManager


class ApplicationsAddon(AYONAddon, IPluginPaths):
    name = "applications"

    def initialize(self, settings):
        # TODO remove when addon is removed from ayon-core
        self.enabled = self.name in settings

    def get_app_environments_for_context(
        self,
        project_name,
        folder_path,
        task_name,
        full_app_name,
        env_group=None,
        launch_type=None,
        env=None,
    ):
        """Calculate environment variables for launch context.

        Args:
            project_name (str): Project name.
            folder_path (str): Folder path.
            task_name (str): Task name.
            full_app_name (str): Full application name.
            env_group (Optional[str]): Environment group.
            launch_type (Optional[str]): Launch type.
            env (Optional[dict[str, str]]): Environment variables to update.

        Returns:
            dict[str, str]: Environment variables for context.

        """
        from ayon_applications.utils import get_app_environments_for_context

        if not full_app_name:
            return {}

        return get_app_environments_for_context(
            project_name,
            folder_path,
            task_name,
            full_app_name,
            env_group=env_group,
            launch_type=launch_type,
            env=env,
            addons_manager=self.manager
        )

    def get_farm_publish_environment_variables(
        self,
        project_name,
        folder_path,
        task_name,
        full_app_name=None,
        env_group=None,
    ):
        """Calculate environment variables for farm publish.

        Args:
            project_name (str): Project name.
            folder_path (str): Folder path.
            task_name (str): Task name.
            env_group (Optional[str]): Environment group.
            full_app_name (Optional[str]): Full application name. Value from
                environment variable 'AYON_APP_NAME' is used if 'None' is
                passed.

        Returns:
            dict[str, str]: Environment variables for farm publish.

        """
        if full_app_name is None:
            full_app_name = os.getenv("AYON_APP_NAME")

        return self.get_app_environments_for_context(
            project_name,
            folder_path,
            task_name,
            full_app_name,
            env_group=env_group,
            launch_type=LaunchTypes.farm_publish
        )

    def get_applications_manager(self, settings=None):
        """Get applications manager.

        Args:
            settings (Optional[dict]): Studio/project settings.

        Returns:
            ApplicationManager: Applications manager.

        """
        return ApplicationManager(settings)

    def get_plugin_paths(self):
        return {
            "publish": [
                os.path.join(APPLICATIONS_ADDON_ROOT, "plugins", "publish")
            ]
        }

    # --- CLI ---
    def cli(self, addon_click_group):
        main_group = click_wrap.group(
            self._cli_main, name=self.name, help="Applications addon"
        )
        (
            main_group.command(
                self._cli_extract_environments,
                name="extractenvironments",
                help=(
                    "Extract environment variables for context into json file"
                )
            )
            .argument("output_json_path")
            .option("--project", help="Project name", default=None)
            .option("--folder", help="Folder path", default=None)
            .option("--task", help="Task name", default=None)
            .option("--app", help="Application name", default=None)
            .option(
                "--envgroup",
                help="Environment group (e.g. \"farm\")",
                default=None
            )
        )
        # Convert main command to click object and add it to parent group
        addon_click_group.add_command(
            main_group.to_click_obj()
        )

    def _cli_main(self):
        pass

    def _cli_extract_environments(
        self, output_json_path, project, folder, task, app, envgroup
    ):
        """Produces json file with environment based on project and app.

        Called by farm integration to propagate environment into farm jobs.

        Args:
            output_json_path (str): Output json file path.
            project (str): Project name.
            folder (str): Folder path.
            task (str): Task name.
            app (str): Full application name e.g. 'maya/2024'.
            envgroup (str): Environment group.

        """
        if all((project, folder, task, app)):
            env = self.get_farm_publish_environment_variables(
                project, folder, task, app, env_group=envgroup,
            )
        else:
            env = os.environ.copy()

        output_dir = os.path.dirname(output_json_path)
        if not os.path.exists(output_dir):
            os.makedirs(output_dir)

        with open(output_json_path, "w") as file_stream:
            json.dump(env, file_stream, indent=4)
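A sketch of how a farm integration might call the helper above from Python; the project, folder, and task values are made up, and `get_enabled_addon` is assumed to be available on `AddonsManager`:

    from ayon_core.addon import AddonsManager

    manager = AddonsManager()
    applications_addon = manager.get_enabled_addon("applications")
    # Hypothetical context values; 'maya/2024' follows the full-name
    # format documented in the CLI docstring above.
    env = applications_addon.get_farm_publish_environment_variables(
        "my_project", "/assets/hero", "modeling", full_app_name="maya/2024"
    )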
@@ -0,0 +1,6 @@
import os

APPLICATIONS_ADDON_ROOT = os.path.dirname(os.path.abspath(__file__))

PLATFORM_NAMES = {"windows", "linux", "darwin"}
DEFAULT_ENV_SUBGROUP = "standard"
client/ayon_core/addons/applications/ayon_applications/defs.py (new file, 404 lines)
@@ -0,0 +1,404 @@
import os
import platform
import json
import copy

from ayon_core.lib import find_executable


class LaunchTypes:
    """Launch types are filters for pre/post-launch hooks.

    Please use these variables in case they'll change values.
    """

    # Local launch - application is launched on local machine
    local = "local"
    # Farm render job - application is on farm
    farm_render = "farm-render"
    # Farm publish job - integration post-render job
    farm_publish = "farm-publish"
    # Remote launch - application is launched on remote machine from which
    #   can be started publishing
    remote = "remote"
    # Automated launch - application is launched with automated publishing
    automated = "automated"


class ApplicationExecutable:
    """Representation of executable loaded from settings."""

    def __init__(self, executable):
        # Try to format executable with environments
        try:
            executable = executable.format(**os.environ)
        except Exception:
            pass

        # On MacOS check if exists path to executable when ends with `.app`
        # - it is common that path will lead to "/Applications/Blender" but
        #   real path is "/Applications/Blender.app"
        if platform.system().lower() == "darwin":
            executable = self.macos_executable_prep(executable)

        self.executable_path = executable

    def __str__(self):
        return self.executable_path

    def __repr__(self):
        return "<{}> {}".format(self.__class__.__name__, self.executable_path)

    @staticmethod
    def macos_executable_prep(executable):
        """Try to find full path to executable file.

        Real executable is stored in '*.app/Contents/MacOS/<executable>'.

        Having path to '*.app' gives ability to read its plist info and
        use "CFBundleExecutable" key from plist to know what the
        "executable" is.

        Plist is stored in '*.app/Contents/Info.plist'.

        This is because some '*.app' directories don't have same permissions
        as real executable.
        """
        # Try to find if there is `.app` file
        if not os.path.exists(executable):
            _executable = executable + ".app"
            if os.path.exists(_executable):
                executable = _executable

        # Try to find real executable if executable has `Contents` subfolder
        contents_dir = os.path.join(executable, "Contents")
        if os.path.exists(contents_dir):
            executable_filename = None
            # Load plist file and check for bundle executable
            plist_filepath = os.path.join(contents_dir, "Info.plist")
            if os.path.exists(plist_filepath):
                import plistlib

                if hasattr(plistlib, "load"):
                    with open(plist_filepath, "rb") as stream:
                        parsed_plist = plistlib.load(stream)
                else:
                    parsed_plist = plistlib.readPlist(plist_filepath)
                executable_filename = parsed_plist.get("CFBundleExecutable")

            if executable_filename:
                executable = os.path.join(
                    contents_dir, "MacOS", executable_filename
                )

        return executable

    def as_args(self):
        return [self.executable_path]

    def _realpath(self):
        """Check if path is valid executable path."""
        # Check for executable in PATH
        result = find_executable(self.executable_path)
        if result is not None:
            return result

        # This is not 100% validation but it is better than removing the
        #   ability to launch .bat, .sh or extensionless files
        if os.path.exists(self.executable_path):
            return self.executable_path
        return None

    def exists(self):
        if not self.executable_path:
            return False
        return bool(self._realpath())


class UndefinedApplicationExecutable(ApplicationExecutable):
    """Some applications do not require executable path from settings.

    In that case this class is used to "fake" existing executable.
    """
    def __init__(self):
        pass

    def __str__(self):
        return self.__class__.__name__

    def __repr__(self):
        return "<{}>".format(self.__class__.__name__)

    def as_args(self):
        return []

    def exists(self):
        return True


class ApplicationGroup:
    """Hold information about application group.

    Application group wraps different versions(variants) of application.
    e.g. "maya" is group and "maya_2020" is variant.

    Group holds `host_name` which is implementation name used in AYON. Also
    holds `enabled` if whole app group is enabled or `icon` for application
    icon path in resources.

    Group also has `environment` which holds same environments for all
    variants.

    Args:
        name (str): Group's name.
        data (dict): Group defining data loaded from settings.
        manager (ApplicationManager): Manager that created the group.
    """

    def __init__(self, name, data, manager):
        self.name = name
        self.manager = manager
        self._data = data

        self.enabled = data["enabled"]
        self.label = data["label"] or None
        self.icon = data["icon"] or None
        env = {}
        try:
            env = json.loads(data["environment"])
        except Exception:
            pass
        self._environment = env

        host_name = data["host_name"] or None
        self.is_host = host_name is not None
        self.host_name = host_name

        settings_variants = data["variants"]
        variants = {}
        for variant_data in settings_variants:
            app_variant = Application(variant_data, self)
            variants[app_variant.name] = app_variant

        self.variants = variants

    def __repr__(self):
        return "<{}> - {}".format(self.__class__.__name__, self.name)

    def __iter__(self):
        for variant in self.variants.values():
            yield variant

    @property
    def environment(self):
        return copy.deepcopy(self._environment)


class Application:
    """Hold information about application.

    Object by itself does nothing special.

    Args:
        data (dict): Data for the version containing information about
            executables, variant label or if is enabled.
            Only required key is `executables`.
        group (ApplicationGroup): App group object that created the
            application and under which application belongs.

    """
    def __init__(self, data, group):
        self._data = data
        name = data["name"]
        label = data["label"] or name
        enabled = False
        if group.enabled:
            enabled = data.get("enabled", True)

        if group.label:
            full_label = " ".join((group.label, label))
        else:
            full_label = label
        env = {}
        try:
            env = json.loads(data["environment"])
        except Exception:
            pass

        arguments = data["arguments"]
        if isinstance(arguments, dict):
            arguments = arguments.get(platform.system().lower())

        if not arguments:
            arguments = []

        _executables = data["executables"].get(platform.system().lower(), [])
        executables = [
            ApplicationExecutable(executable)
            for executable in _executables
        ]

        self.group = group

        self.name = name
        self.label = label
        self.enabled = enabled
        self.use_python_2 = data.get("use_python_2", False)

        self.full_name = "/".join((group.name, name))
        self.full_label = full_label
        self.arguments = arguments
        self.executables = executables
        self._environment = env

    def __repr__(self):
        return "<{}> - {}".format(self.__class__.__name__, self.full_name)

    @property
    def environment(self):
        return copy.deepcopy(self._environment)

    @property
    def manager(self):
        return self.group.manager

    @property
    def host_name(self):
        return self.group.host_name

    @property
    def icon(self):
        return self.group.icon

    @property
    def is_host(self):
        return self.group.is_host

    def find_executable(self):
        """Try to find existing executable for application.

        Returns (str): Path to executable from `executables` or None if none
            exists.
        """
        for executable in self.executables:
            if executable.exists():
                return executable
        return None

    def launch(self, *args, **kwargs):
        """Launch the application.

        For this purpose is used manager's launch method to keep logic at one
        place.

        Arguments must match with manager's launch method. That's why *args
        **kwargs are used.

        Returns:
            subprocess.Popen: Return executed process as Popen object.
        """
        return self.manager.launch(self.full_name, *args, **kwargs)


class EnvironmentToolGroup:
    """Hold information about environment tool group.

    Environment tool group may hold different variants of same tool and set
    environments that are same for all of them.

    e.g. "mtoa" may have different versions but all environments except one
    are same.

    Args:
        data (dict): Group information with variants.
        manager (ApplicationManager): Manager that creates the group.
    """

    def __init__(self, data, manager):
        name = data["name"]
        label = data["label"]

        self.name = name
        self.label = label
        self._data = data
        self.manager = manager

        environment = {}
        try:
            environment = json.loads(data["environment"])
        except Exception:
            pass
        self._environment = environment

        variants = data.get("variants") or []
        variants_by_name = {}
        for variant_data in variants:
            tool = EnvironmentTool(variant_data, self)
            variants_by_name[tool.name] = tool
        self.variants = variants_by_name

    def __repr__(self):
        return "<{}> - {}".format(self.__class__.__name__, self.name)

    def __iter__(self):
        for variant in self.variants.values():
            yield variant

    @property
    def environment(self):
        return copy.deepcopy(self._environment)


class EnvironmentTool:
    """Hold information about application tool.

    Structure of tool information.

    Args:
        variant_data (dict): Variant data with environments and
            host and app variant filters.
        group (EnvironmentToolGroup): Name of group which wraps tool.
    """

    def __init__(self, variant_data, group):
        # Backwards compatibility 3.9.1 - 3.9.2
        # - 'variant_data' contained only environments but now contains also
        #   host and application variant filters
        name = variant_data["name"]
        label = variant_data["label"]
        host_names = variant_data["host_names"]
        app_variants = variant_data["app_variants"]

        environment = {}
        try:
            environment = json.loads(variant_data["environment"])
        except Exception:
            pass

        self.host_names = host_names
        self.app_variants = app_variants
        self.name = name
        self.variant_label = label
        self.label = " ".join((group.label, label))
        self.group = group

        self._environment = environment
        self.full_name = "/".join((group.name, name))

    def __repr__(self):
        return "<{}> - {}".format(self.__class__.__name__, self.full_name)

    @property
    def environment(self):
        return copy.deepcopy(self._environment)

    def is_valid_for_app(self, app):
        """Is tool valid for application.

        Args:
            app (Application): Application for which are prepared
                environments.
        """
        if self.app_variants and app.full_name not in self.app_variants:
            return False

        if self.host_names and app.host_name not in self.host_names:
            return False
        return True
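A short usage sketch for the executable wrapper above; the path is made up:

    from ayon_applications import ApplicationExecutable

    exe = ApplicationExecutable("/usr/local/bin/blender")
    # 'exists()' resolves via a PATH lookup first, then a plain file check.
    if exe.exists():
        print(exe.as_args())  # ["/usr/local/bin/blender"]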
@@ -0,0 +1,50 @@
class ApplicationNotFound(Exception):
    """Application was not found in ApplicationManager by name."""

    def __init__(self, app_name):
        self.app_name = app_name
        super(ApplicationNotFound, self).__init__(
            "Application \"{}\" was not found.".format(app_name)
        )


class ApplicationExecutableNotFound(Exception):
    """Defined executable paths are not available on the machine."""

    def __init__(self, application):
        self.application = application
        details = None
        if not application.executables:
            msg = (
                "Executable paths for application \"{}\"({}) are not set."
            )
        else:
            msg = (
                "Defined executable paths for application \"{}\"({})"
                " are not available on this machine."
            )
            details = "Defined paths:"
            for executable in application.executables:
                details += "\n- " + executable.executable_path

        self.msg = msg.format(application.full_label, application.full_name)
        self.details = details

        exc_mgs = str(self.msg)
        if details:
            # Is it a good idea to pass a newline symbol into an exception
            #   message?
            exc_mgs += "\n" + details
        self.exc_msg = exc_mgs
        super(ApplicationExecutableNotFound, self).__init__(exc_mgs)


class ApplicationLaunchFailed(Exception):
    """Application launch failed due to known reason.

    Message should be self explanatory as traceback won't be shown.
    """
    pass


class MissingRequiredKey(KeyError):
    pass
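A sketch of handling these exceptions around a launch; the application name and project are hypothetical:

    from ayon_applications import (
        ApplicationManager,
        ApplicationNotFound,
        ApplicationExecutableNotFound,
    )

    manager = ApplicationManager()
    try:
        manager.launch("maya/2024", project_name="my_project")
    except ApplicationNotFound:
        print("No such application is configured.")
    except ApplicationExecutableNotFound as exc:
        # 'exc_msg' combines the message with the list of defined paths.
        print(exc.exc_msg)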
client/ayon_core/addons/applications/ayon_applications/hooks.py (new file, 150 lines)
@@ -0,0 +1,150 @@
import platform
from abc import ABCMeta, abstractmethod

import six

from ayon_core.lib import Logger

from .defs import LaunchTypes


@six.add_metaclass(ABCMeta)
class LaunchHook:
    """Abstract base class of launch hook."""
    # Order of prelaunch hook, will be executed as last if set to None.
    order = None
    # List of host implementations, skipped if empty.
    hosts = set()
    # Set of application groups
    app_groups = set()
    # Set of specific application names
    app_names = set()
    # Set of platform availability
    platforms = set()
    # Set of launch types for which is available
    # - if empty then is available for all launch types
    # - by default has 'local' which is the most common reason for launch
    #   hooks
    launch_types = {LaunchTypes.local}

    def __init__(self, launch_context):
        """Constructor of launch hook.

        Always should be called.
        """
        self.log = Logger.get_logger(self.__class__.__name__)

        self.launch_context = launch_context

        is_valid = self.class_validation(launch_context)
        if is_valid:
            is_valid = self.validate()

        self.is_valid = is_valid

    @classmethod
    def class_validation(cls, launch_context):
        """Validation of class attributes by launch context.

        Args:
            launch_context (ApplicationLaunchContext): Context of launching
                application.

        Returns:
            bool: Is launch hook valid for the context by class attributes.
        """
        if cls.platforms:
            low_platforms = tuple(
                _platform.lower()
                for _platform in cls.platforms
            )
            if platform.system().lower() not in low_platforms:
                return False

        if cls.hosts:
            if launch_context.host_name not in cls.hosts:
                return False

        if cls.app_groups:
            if launch_context.app_group.name not in cls.app_groups:
                return False

        if cls.app_names:
            if launch_context.app_name not in cls.app_names:
                return False

        if cls.launch_types:
            if launch_context.launch_type not in cls.launch_types:
                return False

        return True

    @property
    def data(self):
        return self.launch_context.data

    @property
    def application(self):
        return getattr(self.launch_context, "application", None)

    @property
    def manager(self):
        return getattr(self.application, "manager", None)

    @property
    def host_name(self):
        return getattr(self.application, "host_name", None)

    @property
    def app_group(self):
        return getattr(self.application, "group", None)

    @property
    def app_name(self):
        return getattr(self.application, "full_name", None)

    @property
    def addons_manager(self):
        return getattr(self.launch_context, "addons_manager", None)

    @property
    def modules_manager(self):
        """
        Deprecated:
            Use 'addons_manager' instead.
        """
        return self.addons_manager

    def validate(self):
        """Optional validation of launch hook on initialization.

        Returns:
            bool: Hook is valid (True) or invalid (False).
        """
        # QUESTION Not sure if this method has any usable potential.
        # - maybe result can be based on settings
        return True

    @abstractmethod
    def execute(self, *args, **kwargs):
        """Abstract execute method where logic of hook is."""
        pass


class PreLaunchHook(LaunchHook):
    """Abstract class of prelaunch hook.

    This launch hook will be processed before application is launched.

    If any exception happens during processing, the application won't be
    launched.
    """


class PostLaunchHook(LaunchHook):
    """Abstract class of postlaunch hook.

    This launch hook will be processed after application is launched.

    Nothing will happen if any exception happens during processing. And
    processing of other postlaunch hooks won't stop either.
    """
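A minimal sketch of a custom prelaunch hook built on the classes above; the hook name, environment variable, and "maya" group filter are made up for illustration:

    from ayon_applications import PreLaunchHook, LaunchTypes


    class AddDebugEnv(PreLaunchHook):
        """Hypothetical hook that injects an environment variable."""

        # Hooks with 'order' None run last; lower numbers run earlier.
        order = 10
        # Only valid for local launches of the (hypothetical) "maya" group.
        app_groups = {"maya"}
        launch_types = {LaunchTypes.local}

        def execute(self):
            # Mutates the environment prepared for the new process.
            self.launch_context.env["MY_STUDIO_DEBUG"] = "1"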
@ -0,0 +1,676 @@
|
|||
import os
|
||||
import sys
|
||||
import copy
|
||||
import json
|
||||
import tempfile
|
||||
import platform
|
||||
import inspect
|
||||
import subprocess
|
||||
|
||||
import six
|
||||
|
||||
from ayon_core import AYON_CORE_ROOT
|
||||
from ayon_core.settings import get_studio_settings
|
||||
from ayon_core.lib import (
|
||||
Logger,
|
||||
modules_from_path,
|
||||
classes_from_module,
|
||||
get_linux_launcher_args,
|
||||
)
|
||||
from ayon_core.addon import AddonsManager
|
||||
|
||||
from .constants import DEFAULT_ENV_SUBGROUP
|
||||
from .exceptions import (
|
||||
ApplicationNotFound,
|
||||
ApplicationExecutableNotFound,
|
||||
)
|
||||
from .hooks import PostLaunchHook, PreLaunchHook
|
||||
from .defs import EnvironmentToolGroup, ApplicationGroup, LaunchTypes
|
||||
|
||||
|
||||
class ApplicationManager:
|
||||
"""Load applications and tools and store them by their full name.
|
||||
|
||||
Args:
|
||||
studio_settings (dict): Preloaded studio settings. When passed manager
|
||||
will always use these values. Gives ability to create manager
|
||||
using different settings.
|
||||
"""
|
||||
|
||||
def __init__(self, studio_settings=None):
|
||||
self.log = Logger.get_logger(self.__class__.__name__)
|
||||
|
||||
self.app_groups = {}
|
||||
self.applications = {}
|
||||
self.tool_groups = {}
|
||||
self.tools = {}
|
||||
|
||||
self._studio_settings = studio_settings
|
||||
|
||||
self.refresh()
|
||||
|
||||
def set_studio_settings(self, studio_settings):
|
||||
"""Ability to change init system settings.
|
||||
|
||||
This will trigger refresh of manager.
|
||||
"""
|
||||
self._studio_settings = studio_settings
|
||||
|
||||
self.refresh()
|
||||
|
||||
def refresh(self):
|
||||
"""Refresh applications from settings."""
|
||||
self.app_groups.clear()
|
||||
self.applications.clear()
|
||||
self.tool_groups.clear()
|
||||
self.tools.clear()
|
||||
|
||||
if self._studio_settings is not None:
|
||||
settings = copy.deepcopy(self._studio_settings)
|
||||
else:
|
||||
settings = get_studio_settings(
|
||||
clear_metadata=False, exclude_locals=False
|
||||
)
|
||||
|
||||
applications_addon_settings = settings["applications"]
|
||||
|
||||
# Prepare known applications
|
||||
app_defs = applications_addon_settings["applications"]
|
||||
additional_apps = app_defs.pop("additional_apps")
|
||||
for additional_app in additional_apps:
|
||||
app_name = additional_app.pop("name")
|
||||
if app_name in app_defs:
|
||||
self.log.warning((
|
||||
"Additional application '{}' is already"
|
||||
" in built-in applications."
|
||||
).format(app_name))
|
||||
app_defs[app_name] = additional_app
|
||||
|
||||
for group_name, variant_defs in app_defs.items():
|
||||
group = ApplicationGroup(group_name, variant_defs, self)
|
||||
self.app_groups[group_name] = group
|
||||
for app in group:
|
||||
self.applications[app.full_name] = app
|
||||
|
||||
tools_definitions = applications_addon_settings["tool_groups"]
|
||||
for tool_group_data in tools_definitions:
|
||||
group = EnvironmentToolGroup(tool_group_data, self)
|
||||
self.tool_groups[group.name] = group
|
||||
for tool in group:
|
||||
self.tools[tool.full_name] = tool
|
||||
|
||||
def find_latest_available_variant_for_group(self, group_name):
|
||||
group = self.app_groups.get(group_name)
|
||||
if group is None or not group.enabled:
|
||||
return None
|
||||
|
||||
output = None
|
||||
for _, variant in reversed(sorted(group.variants.items())):
|
||||
executable = variant.find_executable()
|
||||
if executable:
|
||||
output = variant
|
||||
break
|
||||
return output
|
||||
|
||||
def create_launch_context(self, app_name, **data):
|
||||
"""Prepare launch context for application.
|
||||
|
||||
Args:
|
||||
app_name (str): Name of application that should be launched.
|
||||
**data (Any): Any additional data. Data may be used during
|
||||
|
||||
Returns:
|
||||
ApplicationLaunchContext: Launch context for application.
|
||||
|
||||
Raises:
|
||||
ApplicationNotFound: Application was not found by entered name.
|
||||
"""
|
||||
|
||||
app = self.applications.get(app_name)
|
||||
if not app:
|
||||
raise ApplicationNotFound(app_name)
|
||||
|
||||
executable = app.find_executable()
|
||||
|
||||
return ApplicationLaunchContext(
|
||||
app, executable, **data
|
||||
)
|
||||
|
||||
def launch_with_context(self, launch_context):
|
||||
"""Launch application using existing launch context.
|
||||
|
||||
Args:
|
||||
launch_context (ApplicationLaunchContext): Prepared launch
|
||||
context.
|
||||
"""
|
||||
|
||||
if not launch_context.executable:
|
||||
raise ApplicationExecutableNotFound(launch_context.application)
|
||||
return launch_context.launch()
|
||||
|
||||
def launch(self, app_name, **data):
|
||||
"""Launch procedure.
|
||||
|
||||
For host application it's expected to contain "project_name",
|
||||
"folder_path" and "task_name".
|
||||
|
||||
Args:
|
||||
app_name (str): Name of application that should be launched.
|
||||
**data (dict): Any additional data. Data may be used during
|
||||
preparation to store objects usable in multiple places.
|
||||
|
||||
Raises:
|
||||
ApplicationNotFound: Application was not found by entered
|
||||
argument `app_name`.
|
||||
ApplicationExecutableNotFound: Executables in application definition
|
||||
were not found on this machine.
|
||||
ApplicationLaunchFailed: Something important for application launch
|
||||
failed. Exception should contain explanation message,
|
||||
traceback should not be needed.
|
||||
"""
|
||||
|
||||
context = self.create_launch_context(app_name, **data)
|
||||
return self.launch_with_context(context)
|
||||
|
||||
|
||||
class ApplicationLaunchContext:
|
||||
"""Context of launching application.
|
||||
|
||||
Main purpose of context is to prepare launch arguments and keyword
|
||||
arguments for new process. Most important part of keyword arguments
|
||||
preparations are environment variables.
|
||||
|
||||
During the whole process is possible to use `data` attribute to store
|
||||
object usable in multiple places.
|
||||
|
||||
Launch arguments are strings in list. It is possible to "chain" argument
|
||||
when order of them matters. That is possible to do with adding list where
|
||||
order is right and should not change.
|
||||
NOTE: This is recommendation, not requirement.
|
||||
e.g.: `["nuke.exe", "--NukeX"]` -> In this case any part of process may
|
||||
insert argument between `nuke.exe` and `--NukeX`. To keep them together
|
||||
it is better to wrap them in another list: `[["nuke.exe", "--NukeX"]]`.
|
||||
|
||||
Notes:
|
||||
It is possible to use launch context only to prepare environment
|
||||
variables. In that case `executable` may be None and can be used
|
||||
'run_prelaunch_hooks' method to run prelaunch hooks which prepare
|
||||
them.
|
||||
|
||||
Args:
|
||||
application (Application): Application definition.
|
||||
executable (ApplicationExecutable): Object with path to executable.
|
||||
env_group (Optional[str]): Environment variable group. If not set
|
||||
'DEFAULT_ENV_SUBGROUP' is used.
|
||||
launch_type (Optional[str]): Launch type. If not set 'local' is used.
|
||||
**data (dict): Any additional data. Data may be used during
|
||||
preparation to store objects usable in multiple places.
|
||||
"""
|
||||
|
||||
def __init__(
|
||||
self,
|
||||
application,
|
||||
executable,
|
||||
env_group=None,
|
||||
launch_type=None,
|
||||
**data
|
||||
):
|
||||
# Application object
|
||||
self.application = application
|
||||
|
||||
self.addons_manager = AddonsManager()
|
||||
|
||||
# Logger
|
||||
logger_name = "{}-{}".format(self.__class__.__name__,
|
||||
self.application.full_name)
|
||||
self.log = Logger.get_logger(logger_name)
|
||||
|
||||
self.executable = executable
|
||||
|
||||
if launch_type is None:
|
||||
launch_type = LaunchTypes.local
|
||||
self.launch_type = launch_type
|
||||
|
||||
if env_group is None:
|
||||
env_group = DEFAULT_ENV_SUBGROUP
|
||||
|
||||
self.env_group = env_group
|
||||
|
||||
self.data = dict(data)
|
||||
|
||||
launch_args = []
|
||||
if executable is not None:
|
||||
launch_args = executable.as_args()
|
||||
# subprocess.Popen launch arguments (first argument in constructor)
|
||||
self.launch_args = launch_args
|
||||
self.launch_args.extend(application.arguments)
|
||||
if self.data.get("app_args"):
|
||||
self.launch_args.extend(self.data.pop("app_args"))
|
||||
|
||||
# Handle launch environemtns
|
||||
src_env = self.data.pop("env", None)
|
||||
if src_env is not None and not isinstance(src_env, dict):
|
||||
self.log.warning((
|
||||
"Passed `env` kwarg has invalid type: {}. Expected: `dict`."
|
||||
" Using `os.environ` instead."
|
||||
).format(str(type(src_env))))
|
||||
src_env = None
|
||||
|
||||
if src_env is None:
|
||||
src_env = os.environ
|
||||
|
||||
ignored_env = {"QT_API", }
|
||||
env = {
|
||||
key: str(value)
|
||||
for key, value in src_env.items()
|
||||
if key not in ignored_env
|
||||
}
|
||||
# subprocess.Popen keyword arguments
|
||||
self.kwargs = {"env": env}
|
||||
|
||||
if platform.system().lower() == "windows":
|
||||
# Detach new process from currently running process on Windows
|
||||
flags = (
|
||||
subprocess.CREATE_NEW_PROCESS_GROUP
|
||||
| subprocess.DETACHED_PROCESS
|
||||
)
|
||||
self.kwargs["creationflags"] = flags
|
||||
|
||||
if not sys.stdout:
|
||||
self.kwargs["stdout"] = subprocess.DEVNULL
|
||||
self.kwargs["stderr"] = subprocess.DEVNULL
|
||||
|
||||
self.prelaunch_hooks = None
|
||||
self.postlaunch_hooks = None
|
||||
|
||||
self.process = None
|
||||
self._prelaunch_hooks_executed = False
|
||||
|
||||
@property
|
||||
def env(self):
|
||||
if (
|
||||
"env" not in self.kwargs
|
||||
or self.kwargs["env"] is None
|
||||
):
|
||||
self.kwargs["env"] = {}
|
||||
return self.kwargs["env"]
|
||||
|
||||
@env.setter
|
||||
def env(self, value):
|
||||
if not isinstance(value, dict):
|
||||
raise ValueError(
|
||||
"'env' attribute expect 'dict' object. Got: {}".format(
|
||||
str(type(value))
|
||||
)
|
||||
)
|
||||
self.kwargs["env"] = value
|
||||
|
||||
@property
|
||||
def modules_manager(self):
|
||||
"""
|
||||
Deprecated:
|
||||
Use 'addons_manager' instead.
|
||||
|
||||
"""
|
||||
return self.addons_manager
|
||||
|
||||
def _collect_addons_launch_hook_paths(self):
|
||||
"""Helper to collect application launch hooks from addons.
|
||||
|
||||
Module have to have implemented 'get_launch_hook_paths' method which
|
||||
can expect application as argument or nothing.
|
||||
|
||||
Returns:
|
||||
List[str]: Paths to launch hook directories.
|
||||
"""
|
||||
|
||||
expected_types = (list, tuple, set)
|
||||
|
||||
output = []
|
||||
for module in self.addons_manager.get_enabled_addons():
|
||||
# Skip module if does not have implemented 'get_launch_hook_paths'
|
||||
func = getattr(module, "get_launch_hook_paths", None)
|
||||
if func is None:
|
||||
continue
|
||||
|
||||
func = module.get_launch_hook_paths
|
||||
if hasattr(inspect, "signature"):
|
||||
sig = inspect.signature(func)
|
||||
expect_args = len(sig.parameters) > 0
|
||||
else:
|
||||
expect_args = len(inspect.getargspec(func)[0]) > 0
|
||||
|
||||
# Pass application argument if method expect it.
|
||||
try:
|
||||
if expect_args:
|
||||
hook_paths = func(self.application)
|
||||
else:
|
||||
hook_paths = func()
|
||||
except Exception:
|
||||
self.log.warning(
|
||||
"Failed to call 'get_launch_hook_paths'",
|
||||
exc_info=True
|
||||
)
|
||||
continue
|
||||
|
||||
if not hook_paths:
|
||||
continue
|
||||
|
||||
# Convert string to list
|
||||
if isinstance(hook_paths, six.string_types):
|
||||
hook_paths = [hook_paths]
|
||||
|
||||
# Skip invalid types
|
||||
if not isinstance(hook_paths, expected_types):
|
||||
self.log.warning((
|
||||
"Result of `get_launch_hook_paths`"
|
||||
" has invalid type {}. Expected {}"
|
||||
).format(type(hook_paths), expected_types))
|
||||
continue
|
||||
|
||||
output.extend(hook_paths)
|
||||
return output
|
||||
|
||||
def paths_to_launch_hooks(self):
|
||||
"""Directory paths where to look for launch hooks."""
|
||||
# This method has potential to be part of application manager (maybe).
|
||||
paths = []
|
||||
|
||||
# TODO load additional studio paths from settings
|
||||
global_hooks_dir = os.path.join(AYON_CORE_ROOT, "hooks")
|
||||
|
||||
hooks_dirs = [
|
||||
global_hooks_dir
|
||||
]
|
||||
if self.host_name:
|
||||
# If host requires launch hooks and is module then launch hooks
|
||||
# should be collected using 'collect_launch_hook_paths'
|
||||
# - module have to implement 'get_launch_hook_paths'
|
||||
host_module = self.addons_manager.get_host_addon(self.host_name)
|
||||
if not host_module:
|
||||
hooks_dirs.append(os.path.join(
|
||||
AYON_CORE_ROOT, "hosts", self.host_name, "hooks"
|
||||
))
|
||||
|
||||
for path in hooks_dirs:
|
||||
if (
|
||||
os.path.exists(path)
|
||||
and os.path.isdir(path)
|
||||
and path not in paths
|
||||
):
|
||||
paths.append(path)
|
||||
|
||||
# Load modules paths
|
||||
paths.extend(self._collect_addons_launch_hook_paths())
|
||||
|
||||
return paths
|
||||
|
||||
def discover_launch_hooks(self, force=False):
|
||||
"""Load and prepare launch hooks."""
|
||||
if (
|
||||
self.prelaunch_hooks is not None
|
||||
or self.postlaunch_hooks is not None
|
||||
):
|
||||
if not force:
|
||||
self.log.info("Launch hooks were already discovered.")
|
||||
return
|
||||
|
||||
self.prelaunch_hooks.clear()
|
||||
self.postlaunch_hooks.clear()
|
||||
|
||||
self.log.debug("Discovery of launch hooks started.")
|
||||
|
||||
paths = self.paths_to_launch_hooks()
|
||||
self.log.debug("Paths searched for launch hooks:\n{}".format(
|
||||
"\n".join("- {}".format(path) for path in paths)
|
||||
))
|
||||
|
||||
all_classes = {
|
||||
"pre": [],
|
||||
"post": []
|
||||
}
|
||||
for path in paths:
|
||||
if not os.path.exists(path):
|
||||
self.log.info(
|
||||
"Path to launch hooks does not exist: \"{}\"".format(path)
|
||||
)
|
||||
continue
|
||||
|
||||
modules, _crashed = modules_from_path(path)
|
||||
for _filepath, module in modules:
|
||||
all_classes["pre"].extend(
|
||||
classes_from_module(PreLaunchHook, module)
|
||||
)
|
||||
all_classes["post"].extend(
|
||||
classes_from_module(PostLaunchHook, module)
|
||||
)
|
||||
|
||||
for launch_type, classes in all_classes.items():
|
||||
hooks_with_order = []
|
||||
hooks_without_order = []
|
||||
for klass in classes:
|
||||
try:
|
||||
hook = klass(self)
|
||||
if not hook.is_valid:
|
||||
self.log.debug(
|
||||
"Skipped hook invalid for current launch context: "
|
||||
"{}".format(klass.__name__)
|
||||
)
|
||||
continue
|
||||
|
||||
if inspect.isabstract(hook):
|
||||
self.log.debug("Skipped abstract hook: {}".format(
|
||||
klass.__name__
|
||||
))
|
||||
continue
|
||||
|
||||
# Separate hooks by pre/post class
|
||||
if hook.order is None:
|
||||
hooks_without_order.append(hook)
|
||||
else:
|
||||
hooks_with_order.append(hook)
|
||||
|
||||
except Exception:
|
||||
self.log.warning(
|
||||
"Initialization of hook failed: "
|
||||
"{}".format(klass.__name__),
|
||||
exc_info=True
|
||||
)
|
||||
|
||||
# Sort hooks with order by order
|
||||
ordered_hooks = list(sorted(
|
||||
hooks_with_order, key=lambda obj: obj.order
|
||||
))
|
||||
# Extend ordered hooks with hooks without defined order
|
||||
ordered_hooks.extend(hooks_without_order)
|
||||
|
||||
if launch_type == "pre":
|
||||
self.prelaunch_hooks = ordered_hooks
|
||||
else:
|
||||
self.postlaunch_hooks = ordered_hooks
|
||||
|
||||
self.log.debug("Found {} prelaunch and {} postlaunch hooks.".format(
|
||||
len(self.prelaunch_hooks), len(self.postlaunch_hooks)
|
||||
))
|
||||
|
||||
@property
|
||||
def app_name(self):
|
||||
return self.application.name
|
||||
|
||||
@property
|
||||
def host_name(self):
|
||||
return self.application.host_name
|
||||
|
||||
@property
|
||||
def app_group(self):
|
||||
return self.application.group
|
||||
|
||||
@property
|
||||
def manager(self):
|
||||
return self.application.manager
|
||||
|
||||
def _run_process(self):
|
||||
# Windows and MacOS have easier process start
|
||||
low_platform = platform.system().lower()
|
||||
if low_platform in ("windows", "darwin"):
|
||||
return subprocess.Popen(self.launch_args, **self.kwargs)
|
||||
|
||||
# Linux uses mid process
|
||||
# - it is possible that the mid process executable is not
|
||||
# available for this version of AYON in that case use standard
|
||||
# launch
|
||||
launch_args = get_linux_launcher_args()
|
||||
if launch_args is None:
|
||||
return subprocess.Popen(self.launch_args, **self.kwargs)
|
||||
|
||||
# Prepare data that will be passed to midprocess
|
||||
# - store arguments to a json and pass path to json as last argument
|
||||
# - pass environments to set
|
||||
app_env = self.kwargs.pop("env", {})
|
||||
json_data = {
|
||||
"args": self.launch_args,
|
||||
"env": app_env
|
||||
}
|
||||
if app_env:
|
||||
# Filter environments of subprocess
|
||||
self.kwargs["env"] = {
|
||||
key: value
|
||||
for key, value in os.environ.items()
|
||||
if key in app_env
|
||||
}
|
||||
|
||||
# Create temp file
|
||||
json_temp = tempfile.NamedTemporaryFile(
|
||||
mode="w", prefix="op_app_args", suffix=".json", delete=False
|
||||
)
|
||||
json_temp.close()
|
||||
json_temp_filpath = json_temp.name
|
||||
with open(json_temp_filpath, "w") as stream:
|
||||
json.dump(json_data, stream)
|
||||
|
||||
launch_args.append(json_temp_filpath)
|
||||
|
||||
# Create mid-process which will launch application
|
||||
process = subprocess.Popen(launch_args, **self.kwargs)
|
||||
# Wait until the process finishes
|
||||
# - This is important! The process would stay in "open" state.
|
||||
process.wait()
|
||||
# Remove the temp file
|
||||
os.remove(json_temp_filpath)
|
||||
# Return process which is already terminated
|
||||
return process
|
||||
|
||||

    def run_prelaunch_hooks(self):
        """Run prelaunch hooks.

        This method will be executed only once, any future calls will skip
        the processing.
        """

        if self._prelaunch_hooks_executed:
            self.log.warning("Prelaunch hooks were already executed.")
            return
        # Discover launch hooks
        self.discover_launch_hooks()

        # Execute prelaunch hooks
        for prelaunch_hook in self.prelaunch_hooks:
            self.log.debug("Executing prelaunch hook: {}".format(
                str(prelaunch_hook.__class__.__name__)
            ))
            prelaunch_hook.execute()
        self._prelaunch_hooks_executed = True

    def launch(self):
        """Collect data for a new process and then create it.

        This method must not be executed more than once.

        Returns:
            subprocess.Popen: Created process as Popen object.
        """
        if self.process is not None:
            self.log.warning("Application was already launched.")
            return

        if not self._prelaunch_hooks_executed:
            self.run_prelaunch_hooks()

        self.log.debug("All prelaunch hooks executed. Starting new process.")

        # Prepare subprocess args
        args_len_str = ""
        if isinstance(self.launch_args, str):
            args = self.launch_args
        else:
            args = self.clear_launch_args(self.launch_args)
            args_len_str = " ({})".format(len(args))
        self.log.info(
            "Launching \"{}\" with args{}: {}".format(
                self.application.full_name, args_len_str, args
            )
        )
        self.launch_args = args

        # Run process
        self.process = self._run_process()

        # Process postlaunch hooks
        for postlaunch_hook in self.postlaunch_hooks:
            self.log.debug("Executing postlaunch hook: {}".format(
                str(postlaunch_hook.__class__.__name__)
            ))

            # TODO how to handle errors?
            # - store them to a variable to make them accessible?
            try:
                postlaunch_hook.execute()

            except Exception:
                self.log.warning(
                    "After launch procedures were not successful.",
                    exc_info=True
                )

        self.log.debug("Launch of {} finished.".format(
            self.application.full_name
        ))

        return self.process

    @staticmethod
    def clear_launch_args(args):
        """Collect launch arguments to their final order.

        Launch arguments should be a list that may contain other lists; this
        function will unpack the inner lists and keep the ordering.

        ```
        # source
        [ [ arg1, [ arg2, arg3 ] ], arg4, [arg5, arg6]]
        # result
        [ arg1, arg2, arg3, arg4, arg5, arg6]
        ```

        Args:
            args (list): Source arguments in a list that may contain inner
                lists.

        Return:
            list: Unpacked arguments.
        """
        if isinstance(args, str):
            return args
        all_cleared = False
        while not all_cleared:
            all_cleared = True
            new_args = []
            for arg in args:
                if isinstance(arg, (list, tuple, set)):
                    all_cleared = False
                    for _arg in arg:
                        new_args.append(_arg)
                else:
                    new_args.append(arg)
            args = new_args

        return args
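
Because `clear_launch_args` is a staticmethod, the flattening can be checked in
isolation. In the sketch below `LaunchContext` stands in for the surrounding
class, whose name does not appear in this hunk:

    nested = [["maya", ["-batch", "-file"]], "scene.ma", ["-command", "start()"]]
    flat = LaunchContext.clear_launch_args(nested)
    assert flat == ["maya", "-batch", "-file", "scene.ma", "-command", "start()"]
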
@@ -0,0 +1,48 @@
"""
Run after global plugin 'CollectHostName' in ayon_core.

Requires:
    None

Provides:
    context -> hostName (str)
    context -> appName (str)
    context -> appLabel (str)
"""
import os
import pyblish.api

from ayon_applications import ApplicationManager


class CollectAppName(pyblish.api.ContextPlugin):
    """Collect avalon host name to context."""

    label = "Collect App Name"
    order = pyblish.api.CollectorOrder - 0.499999

    def process(self, context):
        host_name = context.data.get("hostName")
        app_name = context.data.get("appName")
        app_label = context.data.get("appLabel")
        # Don't override values if they are already set
        if host_name and app_name and app_label:
            return

        # Use AYON_APP_NAME to get the full app name
        if not app_name:
            app_name = os.environ.get("AYON_APP_NAME")

        # Fill missing values based on the app full name
        if (not host_name or not app_label) and app_name:
            app_manager = ApplicationManager()
            app = app_manager.applications.get(app_name)
            if app:
                if not host_name:
                    host_name = app.host_name
                if not app_label:
                    app_label = app.full_label

        context.data["hostName"] = host_name
        context.data["appName"] = app_name
        context.data["appLabel"] = app_label
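
The `CollectorOrder - 0.499999` offset makes the plugin run right after the
earliest collectors (order -0.5) but before anything at the default collector
order. A rough way to see the collected values outside of a full publish,
assuming the plugin class is importable:

    import pyblish.api
    import pyblish.util

    # Hypothetical standalone run of the collector above.
    pyblish.api.register_plugin(CollectAppName)
    context = pyblish.util.collect()
    print(context.data.get("appName"), context.data.get("hostName"))
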
609 client/ayon_core/addons/applications/ayon_applications/utils.py Normal file
@@ -0,0 +1,609 @@
import os
import copy
import json
import platform
import collections

import six
import acre

from ayon_core import AYON_CORE_ROOT
from ayon_core.settings import get_project_settings
from ayon_core.lib import Logger, get_ayon_username
from ayon_core.addon import AddonsManager
from ayon_core.pipeline import HOST_WORKFILE_EXTENSIONS
from ayon_core.pipeline.template_data import get_template_data
from ayon_core.pipeline.workfile import (
    get_workfile_template_key,
    get_workdir_with_workdir_data,
    get_last_workfile,
    should_use_last_workfile_on_launch,
    should_open_workfiles_tool_on_launch,
)

from .constants import PLATFORM_NAMES, DEFAULT_ENV_SUBGROUP
from .exceptions import MissingRequiredKey, ApplicationLaunchFailed
from .manager import ApplicationManager


def parse_environments(env_data, env_group=None, platform_name=None):
    """Parse environment values from settings by group and platform.

    Data may contain up to 2 hierarchical levels of dictionaries. The end of
    the last level must be a string or a list. Lists are joined using a
    platform specific joiner (';' for windows and ':' for linux and mac).

    Hierarchical levels can contain keys for subgroups and platform name.
    Platform specific values must always be the last level of the dictionary.
    Platform names are "windows" (MS Windows), "linux" (any linux
    distribution) and "darwin" (any MacOS distribution).

    Subgroups are helpers added mainly for standard and on farm usage. Farm
    may require different environments for e.g. licence related values or
    plugins. Default subgroup is "standard".

    Examples:
        ```
        {
            # Unchanged value
            "ENV_KEY1": "value",
            # Empty values are kept (unset environment variable)
            "ENV_KEY2": "",

            # Join list values with ':' or ';'
            "ENV_KEY3": ["value1", "value2"],

            # Environment groups
            "ENV_KEY4": {
                "standard": "DEMO_SERVER_URL",
                "farm": "LICENCE_SERVER_URL"
            },

            # Platform specific (and only for windows and mac)
            "ENV_KEY5": {
                "windows": "windows value",
                "darwin": ["value 1", "value 2"]
            },

            # Environment groups and platform combination
            "ENV_KEY6": {
                "farm": "FARM_VALUE",
                "standard": {
                    "windows": ["value1", "value2"],
                    "linux": "value1",
                    "darwin": ""
                }
            }
        }
        ```
    """
    output = {}
    if not env_data:
        return output

    if not env_group:
        env_group = DEFAULT_ENV_SUBGROUP

    if not platform_name:
        platform_name = platform.system().lower()

    for key, value in env_data.items():
        if isinstance(value, dict):
            # Look if any key is a platform key
            # - expect that it represents an environment group if it does
            #   not contain platform keys
            if not PLATFORM_NAMES.intersection(set(value.keys())):
                # Skip the key if group is not available
                if env_group not in value:
                    continue
                value = value[env_group]

        # Check again if value is a dictionary
        # - this time there should be only platform keys
        if isinstance(value, dict):
            value = value.get(platform_name)

        # Check if value is a list and join its values
        # QUESTION Should empty values be skipped?
        if isinstance(value, (list, tuple)):
            value = os.pathsep.join(value)

        # Set key to output if value is a string
        if isinstance(value, six.string_types):
            output[key] = value
    return output
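
# Example (an assumption, with illustrative values): for a grouped,
# platform-specific value the resolution above works like this:
#
#     env_data = {
#         "ENV_KEY3": ["value1", "value2"],
#         "ENV_KEY6": {
#             "farm": "FARM_VALUE",
#             "standard": {"windows": ["value1", "value2"], "linux": "value1"},
#         },
#     }
#     parse_environments(env_data, env_group="standard", platform_name="linux")
#     # -> {"ENV_KEY3": "value1:value2", "ENV_KEY6": "value1"}
#
# Note that list values are always joined with the current machine's
# os.pathsep, regardless of the passed platform_name.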


class EnvironmentPrepData(dict):
    """Helper dictionary for storing temp data during environment prep.

    Args:
        data (dict): Data must contain required keys.
    """
    required_keys = (
        "project_entity", "folder_entity", "task_entity", "app", "anatomy"
    )

    def __init__(self, data):
        for key in self.required_keys:
            if key not in data:
                raise MissingRequiredKey(key)

        if not data.get("log"):
            data["log"] = Logger.get_logger("EnvironmentPrepData")

        if data.get("env") is None:
            data["env"] = os.environ.copy()

        project_name = data["project_entity"]["name"]
        if "project_settings" not in data:
            data["project_settings"] = get_project_settings(project_name)

        super(EnvironmentPrepData, self).__init__(data)


def get_app_environments_for_context(
    project_name,
    folder_path,
    task_name,
    app_name,
    env_group=None,
    launch_type=None,
    env=None,
    addons_manager=None
):
    """Prepare environment variables by context.

    Args:
        project_name (str): Name of project.
        folder_path (str): Folder path.
        task_name (str): Name of task.
        app_name (str): Name of application that is launched and can be found
            by ApplicationManager.
        env_group (Optional[str]): Name of environment group. If not passed
            the default group is used.
        launch_type (Optional[str]): Type for which prelaunch hooks are
            executed.
        env (Optional[dict[str, str]]): Initial environment variables.
            `os.environ` is used when not passed.
        addons_manager (Optional[AddonsManager]): Initialized modules
            manager.

    Returns:
        dict: Environments for passed context and application.
    """

    # Prepare app object which can be obtained only from ApplicationManager
    app_manager = ApplicationManager()
    context = app_manager.create_launch_context(
        app_name,
        project_name=project_name,
        folder_path=folder_path,
        task_name=task_name,
        env_group=env_group,
        launch_type=launch_type,
        env=env,
        addons_manager=addons_manager,
        modules_manager=addons_manager,
    )
    context.run_prelaunch_hooks()
    return context.env
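
# Example usage (an assumption, with illustrative values): farm or headless
# tooling can ask for the same environment an artist launch would produce:
#
#     env = get_app_environments_for_context(
#         "demo_project", "/shots/sq01/sh010", "animation", "maya/2024"
#     )
#     env.get("AYON_APP_NAME")  # -> "maya/2024"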


def _merge_env(env, current_env):
    """Modified function (merge) from the acre module."""
    result = current_env.copy()
    for key, value in env.items():
        # Keep missing keys by not filling `missing` kwarg
        value = acre.lib.partial_format(value, data=current_env)
        result[key] = value
    return result


def _add_python_version_paths(app, env, logger, addons_manager):
    """Add vendor packages specific for a Python version."""

    for addon in addons_manager.get_enabled_addons():
        addon.modify_application_launch_arguments(app, env)

    # Skip adding if host name is not set
    if not app.host_name:
        return

    # Add Python 2/3 modules
    python_vendor_dir = os.path.join(
        AYON_CORE_ROOT,
        "vendor",
        "python"
    )
    if app.use_python_2:
        pythonpath = os.path.join(python_vendor_dir, "python_2")
    else:
        pythonpath = os.path.join(python_vendor_dir, "python_3")

    if not os.path.exists(pythonpath):
        return

    logger.debug("Adding Python version specific paths to PYTHONPATH")
    python_paths = [pythonpath]

    # Load PYTHONPATH from current launch context
    python_path = env.get("PYTHONPATH")
    if python_path:
        python_paths.append(python_path)

    # Set new PYTHONPATH to launch context environments
    env["PYTHONPATH"] = os.pathsep.join(python_paths)


def prepare_app_environments(
    data, env_group=None, implementation_envs=True, addons_manager=None
):
    """Modify launch environments based on launched app and context.

    Args:
        data (EnvironmentPrepData): Dictionary where result and intermediate
            result will be stored.

    """
    app = data["app"]
    log = data["log"]
    source_env = data["env"].copy()

    if addons_manager is None:
        addons_manager = AddonsManager()

    _add_python_version_paths(app, source_env, log, addons_manager)

    # Use environments from local settings
    filtered_local_envs = {}
    # NOTE Overrides for environment variables are not implemented in AYON.
    # project_settings = data["project_settings"]
    # whitelist_envs = project_settings["general"].get("local_env_white_list")
    # if whitelist_envs:
    #     local_settings = get_local_settings()
    #     local_envs = local_settings.get("environments") or {}
    #     filtered_local_envs = {
    #         key: value
    #         for key, value in local_envs.items()
    #         if key in whitelist_envs
    #     }

    # Apply local environment variables for already existing values
    for key, value in filtered_local_envs.items():
        if key in source_env:
            source_env[key] = value

    # `app_and_tool_labels` has debug purpose
    app_and_tool_labels = [app.full_name]
    # Environments for application
    environments = [
        app.group.environment,
        app.environment
    ]

    folder_entity = data.get("folder_entity")
    # Add tools environments
    groups_by_name = {}
    tool_by_group_name = collections.defaultdict(dict)
    if folder_entity:
        # Make sure each tool group can be added only once
        for key in folder_entity["attrib"].get("tools") or []:
            tool = app.manager.tools.get(key)
            if not tool or not tool.is_valid_for_app(app):
                continue
            groups_by_name[tool.group.name] = tool.group
            tool_by_group_name[tool.group.name][tool.name] = tool

        for group_name in sorted(groups_by_name.keys()):
            group = groups_by_name[group_name]
            environments.append(group.environment)
            for tool_name in sorted(tool_by_group_name[group_name].keys()):
                tool = tool_by_group_name[group_name][tool_name]
                environments.append(tool.environment)
                app_and_tool_labels.append(tool.full_name)

    log.debug(
        "Will add environments for apps and tools: {}".format(
            ", ".join(app_and_tool_labels)
        )
    )

    env_values = {}
    for _env_values in environments:
        if not _env_values:
            continue

        # Choose the right platform
        tool_env = parse_environments(_env_values, env_group)

        # Apply local environment variables
        # - must happen between all values because they may be used during
        #   merge
        for key, value in filtered_local_envs.items():
            if key in tool_env:
                tool_env[key] = value

        # Merge dictionaries
        env_values = _merge_env(tool_env, env_values)

    merged_env = _merge_env(env_values, source_env)

    loaded_env = acre.compute(merged_env, cleanup=False)

    final_env = None
    # Add host specific environments
    if app.host_name and implementation_envs:
        host_addon = addons_manager.get_host_addon(app.host_name)
        add_implementation_envs = None
        if host_addon:
            add_implementation_envs = getattr(
                host_addon, "add_implementation_envs", None
            )
        if add_implementation_envs:
            # Function may only modify passed dict without returning value
            final_env = add_implementation_envs(loaded_env, app)

    if final_env is None:
        final_env = loaded_env

    keys_to_remove = set(source_env.keys()) - set(final_env.keys())

    # Update env
    data["env"].update(final_env)
    for key in keys_to_remove:
        data["env"].pop(key, None)


def apply_project_environments_value(
    project_name, env, project_settings=None, env_group=None
):
    """Apply project specific environments on passed environments.

    The environments are applied on the passed `env` argument value so it is
    not required to apply changes back.

    Args:
        project_name (str): Name of project for which environments should be
            received.
        env (dict): Environment values on which project specific environments
            will be applied.
        project_settings (dict): Project settings for passed project name.
            Optional if project settings are already prepared.

    Returns:
        dict: Passed env values with applied project environments.

    Raises:
        KeyError: If project settings do not contain keys for project specific
            environments.

    """
    if project_settings is None:
        project_settings = get_project_settings(project_name)

    env_value = project_settings["core"]["project_environments"]
    if env_value:
        env_value = json.loads(env_value)
        parsed_value = parse_environments(env_value, env_group)
        env.update(acre.compute(
            _merge_env(parsed_value, env),
            cleanup=False
        ))
    return env
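
# Example (an assumption, with illustrative values): the setting is stored as
# a JSON string, so a payload as small as this injects one variable for a
# project:
#
#     settings = {"core": {"project_environments": '{"STUDIO_ROOT": "/pipe"}'}}
#     env = {}
#     apply_project_environments_value("demo_project", env, settings)
#     env["STUDIO_ROOT"]  # -> "/pipe"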


def prepare_context_environments(data, env_group=None, addons_manager=None):
    """Modify launch environments with context data for launched host.

    Args:
        data (EnvironmentPrepData): Dictionary where result and intermediate
            result will be stored.

    """
    # Context environments
    log = data["log"]

    project_entity = data["project_entity"]
    folder_entity = data["folder_entity"]
    task_entity = data["task_entity"]
    if not project_entity:
        log.info(
            "Skipping context environments preparation."
            " Launch context does not contain required data."
        )
        return

    # Load project specific environments
    project_name = project_entity["name"]
    project_settings = get_project_settings(project_name)
    data["project_settings"] = project_settings

    app = data["app"]
    context_env = {
        "AYON_PROJECT_NAME": project_entity["name"],
        "AYON_APP_NAME": app.full_name
    }
    if folder_entity:
        folder_path = folder_entity["path"]
        context_env["AYON_FOLDER_PATH"] = folder_path

        if task_entity:
            context_env["AYON_TASK_NAME"] = task_entity["name"]

    log.debug(
        "Context environments set:\n{}".format(
            json.dumps(context_env, indent=4)
        )
    )
    data["env"].update(context_env)

    # Apply project specific environments on current env value
    # - apply them once the context environments are set
    apply_project_environments_value(
        project_name, data["env"], project_settings, env_group
    )

    if not app.is_host:
        return

    data["env"]["AYON_HOST_NAME"] = app.host_name

    if not folder_entity or not task_entity:
        # QUESTION replace with log.info and skip workfile discovery?
        # - technically it should be possible to launch host without context
        raise ApplicationLaunchFailed(
            "Host launch requires folder and task context."
        )

    workdir_data = get_template_data(
        project_entity,
        folder_entity,
        task_entity,
        app.host_name,
        project_settings
    )
    data["workdir_data"] = workdir_data

    anatomy = data["anatomy"]

    task_type = workdir_data["task"]["type"]
    # Temp solution how to pass task type to `_prepare_last_workfile`
    data["task_type"] = task_type

    try:
        workdir = get_workdir_with_workdir_data(
            workdir_data,
            anatomy.project_name,
            anatomy,
            project_settings=project_settings
        )

    except Exception as exc:
        raise ApplicationLaunchFailed(
            "Error in anatomy.format: {}".format(str(exc))
        )

    if not os.path.exists(workdir):
        log.debug(
            "Creating workdir folder: \"{}\"".format(workdir)
        )
        try:
            os.makedirs(workdir)
        except Exception as exc:
            raise ApplicationLaunchFailed(
                "Couldn't create workdir because: {}".format(str(exc))
            )

    data["env"]["AYON_WORKDIR"] = workdir

    _prepare_last_workfile(data, workdir, addons_manager)


def _prepare_last_workfile(data, workdir, addons_manager):
    """Last workfile workflow preparation.

    The function checks whether the last workfile workflow should be used and
    tries to find the last workfile. Both pieces of information are stored to
    `data` and to environments.

    The last workfile path is always filled (with version 1) even if no
    workfile exists yet.

    Args:
        data (EnvironmentPrepData): Dictionary where result and intermediate
            result will be stored.
        workdir (str): Path to folder where workfiles should be stored.

    """
    if not addons_manager:
        addons_manager = AddonsManager()

    log = data["log"]

    _workdir_data = data.get("workdir_data")
    if not _workdir_data:
        log.info(
            "Skipping last workfile preparation."
            " Key `workdir_data` not filled."
        )
        return

    app = data["app"]
    workdir_data = copy.deepcopy(_workdir_data)
    project_name = data["project_name"]
    task_name = data["task_name"]
    task_type = data["task_type"]

    start_last_workfile = data.get("start_last_workfile")
    if start_last_workfile is None:
        start_last_workfile = should_use_last_workfile_on_launch(
            project_name, app.host_name, task_name, task_type
        )
    else:
        log.info("Opening of last workfile was disabled by user")

    data["start_last_workfile"] = start_last_workfile

    workfile_startup = should_open_workfiles_tool_on_launch(
        project_name, app.host_name, task_name, task_type
    )
    data["workfile_startup"] = workfile_startup

    # Store boolean as "0"(False) or "1"(True)
    data["env"]["AVALON_OPEN_LAST_WORKFILE"] = (
        str(int(bool(start_last_workfile)))
    )
    data["env"]["AYON_WORKFILE_TOOL_ON_START"] = (
        str(int(bool(workfile_startup)))
    )

    _sub_msg = "" if start_last_workfile else " not"
    log.debug(
        "Last workfile should{} be opened on start.".format(_sub_msg)
    )

    # Last workfile path
    last_workfile_path = data.get("last_workfile_path") or ""
    if not last_workfile_path:
        host_addon = addons_manager.get_host_addon(app.host_name)
        if host_addon:
            extensions = host_addon.get_workfile_extensions()
        else:
            extensions = HOST_WORKFILE_EXTENSIONS.get(app.host_name)

        if extensions:
            anatomy = data["anatomy"]
            project_settings = data["project_settings"]
            task_type = workdir_data["task"]["type"]
            template_key = get_workfile_template_key(
                project_name,
                task_type,
                app.host_name,
                project_settings=project_settings
            )
            # Find last workfile
            file_template = anatomy.get_template_item(
                "work", template_key, "file"
            ).template

            workdir_data.update({
                "version": 1,
                "user": get_ayon_username(),
                "ext": extensions[0]
            })

            last_workfile_path = get_last_workfile(
                workdir, file_template, workdir_data, extensions, True
            )

    if not os.path.exists(last_workfile_path):
        log.debug((
            "Workfiles for launch context do not exist"
            " yet but path will be set."
        ))
    log.debug(
        "Setting last workfile path: {}".format(last_workfile_path)
    )

    data["env"]["AYON_LAST_WORKFILE"] = last_workfile_path
    data["last_workfile_path"] = last_workfile_path
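
For reference, the two flags written by `_prepare_last_workfile` travel to the
host as plain strings. A sketch of a host-side startup check follows; only the
variable names come from the code above, the rest is illustrative:

    import os

    def open_last_workfile_on_start():
        # "0"/"1" strings written by _prepare_last_workfile
        if os.environ.get("AVALON_OPEN_LAST_WORKFILE") != "1":
            return None
        last_workfile = os.environ.get("AYON_LAST_WORKFILE")
        if last_workfile and os.path.exists(last_workfile):
            return last_workfile
        return None
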
@@ -4,6 +4,7 @@ import os
 import sys
 import code
 import traceback
+from pathlib import Path

 import click
 import acre
@@ -11,6 +12,7 @@ import acre
 from ayon_core import AYON_CORE_ROOT
 from ayon_core.addon import AddonsManager
 from ayon_core.settings import get_general_environments
+from ayon_core.lib import initialize_ayon_connection, is_running_from_build

 from .cli_commands import Commands

@@ -80,7 +82,7 @@ main_cli.set_alias("addon", "module")
 @main_cli.command()
 @click.argument("output_json_path")
 @click.option("--project", help="Project name", default=None)
-@click.option("--asset", help="Asset name", default=None)
+@click.option("--asset", help="Folder path", default=None)
 @click.option("--task", help="Task name", default=None)
 @click.option("--app", help="Application name", default=None)
 @click.option(
@@ -95,6 +97,10 @@ def extractenvironments(output_json_path, project, asset, task, app, envgroup):
     environments will be extracted.

     Context options are "project", "asset", "task", "app"
+
+    Deprecated:
+        This function is deprecated and will be removed in future. Please use
+        'addon applications extractenvironments ...' instead.
     """
     Commands.extractenvironments(
         output_json_path, project, asset, task, app, envgroup
@@ -102,19 +108,18 @@ def extractenvironments(output_json_path, project, asset, task, app, envgroup):


 @main_cli.command()
-@click.argument("paths", nargs=-1)
-@click.option("-t", "--targets", help="Targets module", default=None,
+@click.argument("path", required=True)
+@click.option("-t", "--targets", help="Targets", default=None,
               multiple=True)
 @click.option("-g", "--gui", is_flag=True,
               help="Show Publish UI", default=False)
-def publish(paths, targets, gui):
+def publish(path, targets, gui):
     """Start CLI publishing.

-    Publish collects json from paths provided as an argument.
-    More than one path is allowed.
+    Publish collects json from path provided as an argument.
     """
-    Commands.publish(list(paths), targets, gui)
+    Commands.publish(path, targets, gui)


 @main_cli.command(context_settings={"ignore_unknown_options": True})
@@ -127,7 +132,7 @@ def publish_report_viewer():
 @main_cli.command()
 @click.argument("output_path")
 @click.option("--project", help="Define project context")
-@click.option("--asset", help="Define asset in project (project must be set)")
+@click.option("--folder", help="Define folder in project (project must be set)")
 @click.option(
     "--strict",
     is_flag=True,
@@ -136,18 +141,18 @@ def publish_report_viewer():
 def contextselection(
     output_path,
     project,
-    asset,
+    folder,
     strict
 ):
     """Show Qt dialog to select context.

-    Context is project name, asset name and task name. The result is stored
+    Context is project name, folder path and task name. The result is stored
     into json file which path is passed in first argument.
     """
     Commands.contextselection(
         output_path,
         project,
-        asset,
+        folder,
         strict
     )
@@ -163,16 +168,27 @@ def run(script):

     if not script:
         print("Error: missing path to script file.")
         return

+    # Remove first argument if it is the same as AYON executable
+    # - Forward compatibility with future AYON versions.
+    # - Current AYON launcher keeps the arguments with first argument but
+    #   future versions might remove it.
+    first_arg = sys.argv[0]
+    if is_running_from_build():
+        comp_path = os.path.join(os.environ["AYON_ROOT"], "start.py")
+    else:
+        comp_path = os.getenv("AYON_EXECUTABLE")
+    # Compare paths and remove first argument if it is the same as AYON
+    if Path(first_arg).resolve() == Path(comp_path).resolve():
+        sys.argv.pop(0)

-    args = sys.argv
-    args.remove("run")
-    args.remove(script)
-    sys.argv = args
+    # Remove 'run' command from sys.argv
+    sys.argv.remove("run")

-    args_string = " ".join(args[1:])
-    print(f"... running: {script} {args_string}")
-    runpy.run_path(script, run_name="__main__", )
+    args_string = " ".join(sys.argv[1:])
+    print(f"... running: {script} {args_string}")
+    runpy.run_path(script, run_name="__main__")


 @main_cli.command()
@@ -243,6 +259,7 @@ def _set_addons_environments():


 def main(*args, **kwargs):
+    initialize_ayon_connection()
     python_path = os.getenv("PYTHONPATH", "")
     split_paths = python_path.split(os.pathsep)
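
After this change `ayon publish` accepts exactly one JSON path instead of a
variadic list. A hedged sketch of the new invocation from Python; the
executable name and paths are placeholders:

    import subprocess

    # Hypothetical invocation of the reworked command.
    subprocess.run(
        ["ayon", "publish", "/tmp/publish_job.json", "--targets", "farm"],
        check=True,
    )
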
@@ -2,7 +2,7 @@
 """Implementation of AYON commands."""
 import os
 import sys
-import json
+import warnings


 class Commands:
@@ -41,38 +41,35 @@ class Commands:
         return click_func

     @staticmethod
-    def publish(paths, targets=None, gui=False):
+    def publish(path: str, targets: list = None, gui: bool = False) -> None:
         """Start headless publishing.

-        Publish use json from passed paths argument.
+        Publish uses json from the passed path argument.

         Args:
-            paths (list): Paths to jsons.
-            targets (string): What module should be targeted
-                (to choose validator for example)
+            path (str): Path to JSON.
+            targets (list of str): List of pyblish targets.
+            gui (bool): Show publish UI.

         Raises:
-            RuntimeError: When there is no path to process.
-        """
+            RuntimeError: When executed with a list of JSON paths.

+        """
         from ayon_core.lib import Logger
-        from ayon_core.lib.applications import (
-            get_app_environments_for_context,
-            LaunchTypes,
-        )
         from ayon_core.addon import AddonsManager
         from ayon_core.pipeline import (
             install_ayon_plugins,
             get_global_context,
         )
-        from ayon_core.tools.utils.host_tools import show_publish
-        from ayon_core.tools.utils.lib import qt_app_context

         # Register target and host
         import pyblish.api
         import pyblish.util

+        if not isinstance(path, str):
+            raise RuntimeError("Path to JSON must be a string.")
+
         # Fix older jobs
         for src_key, dst_key in (
             ("AVALON_PROJECT", "AYON_PROJECT_NAME"),
@@ -95,21 +92,16 @@ class Commands:

         publish_paths = manager.collect_plugin_paths()["publish"]

-        for path in publish_paths:
-            pyblish.api.register_plugin_path(path)
+        for plugin_path in publish_paths:
+            pyblish.api.register_plugin_path(plugin_path)

-        if not any(paths):
-            raise RuntimeError("No publish paths specified")
-
         app_full_name = os.getenv("AYON_APP_NAME")
         if app_full_name:
-            context = get_global_context()
-            env = get_app_environments_for_context(
-                context["project_name"],
-                context["folder_path"],
-                context["task_name"],
-                app_full_name,
-                launch_type=LaunchTypes.farm_publish,
-            )
-            os.environ.update(env)
+            applications_addon = manager.get_enabled_addon("applications")
+            if applications_addon is not None:
+                context = get_global_context()
+                env = applications_addon.get_farm_publish_environment_variables(
+                    context["project_name"],
+                    context["folder_path"],
+                    context["task_name"],
+                    app_full_name,
+                )
+                os.environ.update(env)
@@ -122,7 +114,7 @@ class Commands:
         else:
             pyblish.api.register_target("farm")

-        os.environ["AYON_PUBLISH_DATA"] = os.pathsep.join(paths)
+        os.environ["AYON_PUBLISH_DATA"] = path
         os.environ["HEADLESS_PUBLISH"] = 'true'  # to use in app lib

         log.info("Running publish ...")
@@ -133,6 +125,8 @@ class Commands:
             print(plugin)

         if gui:
+            from ayon_core.tools.utils.host_tools import show_publish
+            from ayon_core.tools.utils.lib import qt_app_context
             with qt_app_context():
                 show_publish()
         else:
@@ -149,39 +143,39 @@ class Commands:
         log.info("Publish finished.")

     @staticmethod
-    def extractenvironments(output_json_path, project, asset, task, app,
-                            env_group):
+    def extractenvironments(
+        output_json_path, project, asset, task, app, env_group
+    ):
         """Produces json file with environment based on project and app.

         Called by Deadline plugin to propagate environment into render jobs.
         """
-
-        from ayon_core.lib.applications import (
-            get_app_environments_for_context,
-            LaunchTypes,
-        )
+        from ayon_core.addon import AddonsManager
+
+        warnings.warn(
+            (
+                "Command 'extractenvironments' is deprecated and will be"
+                " removed in future. Please use "
+                "'addon applications extractenvironments ...' instead."
+            ),
+            DeprecationWarning
+        )

-        if all((project, asset, task, app)):
-            env = get_app_environments_for_context(
-                project,
-                asset,
-                task,
-                app,
-                env_group=env_group,
-                launch_type=LaunchTypes.farm_render
-            )
-        else:
-            env = os.environ.copy()
-
-        output_dir = os.path.dirname(output_json_path)
-        if not os.path.exists(output_dir):
-            os.makedirs(output_dir)
-
-        with open(output_json_path, "w") as file_stream:
-            json.dump(env, file_stream, indent=4)
+        addons_manager = AddonsManager()
+        applications_addon = addons_manager.get_enabled_addon("applications")
+        if applications_addon is None:
+            raise RuntimeError(
+                "Applications addon is not available or enabled."
+            )
+
+        # Please ignore the fact this is using a private method
+        applications_addon._cli_extract_environments(
+            output_json_path, project, asset, task, app, env_group
+        )

     @staticmethod
-    def contextselection(output_path, project_name, asset_name, strict):
+    def contextselection(output_path, project_name, folder_path, strict):
         from ayon_core.tools.context_dialog import main

-        main(output_path, project_name, asset_name, strict)
+        main(output_path, project_name, folder_path, strict)
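
The shim above only forwards to the applications addon. A sketch of both call
paths, assuming the replacement subcommand keeps the same options as the
deprecated one (the '...' in the warning leaves them unspecified, and the
values here are placeholders):

    import subprocess

    # Deprecated path - still works but emits a DeprecationWarning:
    subprocess.run(
        ["ayon", "extractenvironments", "/tmp/env.json",
         "--project", "demo", "--app", "maya/2024"],
        check=True,
    )
    # Replacement suggested by the warning:
    subprocess.run(
        ["ayon", "addon", "applications", "extractenvironments",
         "/tmp/env.json", "--project", "demo", "--app", "maya/2024"],
        check=True,
    )
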
@@ -1,110 +0,0 @@
from .utils import get_ayon_server_api_connection

from .entities import (
    get_projects,
    get_project,
    get_whole_project,

    get_asset_by_id,
    get_asset_by_name,
    get_assets,
    get_archived_assets,
    get_asset_ids_with_subsets,

    get_subset_by_id,
    get_subset_by_name,
    get_subsets,
    get_subset_families,

    get_version_by_id,
    get_version_by_name,
    get_versions,
    get_hero_version_by_id,
    get_hero_version_by_subset_id,
    get_hero_versions,
    get_last_versions,
    get_last_version_by_subset_id,
    get_last_version_by_subset_name,
    get_output_link_versions,

    version_is_latest,

    get_representation_by_id,
    get_representation_by_name,
    get_representations,
    get_representation_parents,
    get_representations_parents,
    get_archived_representations,

    get_thumbnail,
    get_thumbnails,
    get_thumbnail_id_from_source,

    get_workfile_info,

    get_asset_name_identifier,
)

from .entity_links import (
    get_linked_asset_ids,
    get_linked_assets,
    get_linked_representation_id,
)

from .operations import (
    create_project,
)


__all__ = (
    "get_ayon_server_api_connection",

    "get_projects",
    "get_project",
    "get_whole_project",

    "get_asset_by_id",
    "get_asset_by_name",
    "get_assets",
    "get_archived_assets",
    "get_asset_ids_with_subsets",

    "get_subset_by_id",
    "get_subset_by_name",
    "get_subsets",
    "get_subset_families",

    "get_version_by_id",
    "get_version_by_name",
    "get_versions",
    "get_hero_version_by_id",
    "get_hero_version_by_subset_id",
    "get_hero_versions",
    "get_last_versions",
    "get_last_version_by_subset_id",
    "get_last_version_by_subset_name",
    "get_output_link_versions",

    "version_is_latest",

    "get_representation_by_id",
    "get_representation_by_name",
    "get_representations",
    "get_representation_parents",
    "get_representations_parents",
    "get_archived_representations",

    "get_thumbnail",
    "get_thumbnails",
    "get_thumbnail_id_from_source",

    "get_workfile_info",

    "get_linked_asset_ids",
    "get_linked_assets",
    "get_linked_representation_id",

    "create_project",

    "get_asset_name_identifier",
)
@@ -1,28 +0,0 @@
# --- Folders ---
DEFAULT_FOLDER_FIELDS = {
    "id",
    "name",
    "path",
    "parentId",
    "active",
    "parents",
    "thumbnailId"
}

REPRESENTATION_FILES_FIELDS = {
    "files.name",
    "files.hash",
    "files.id",
    "files.path",
    "files.size",
}

CURRENT_PROJECT_SCHEMA = "openpype:project-3.0"
CURRENT_PROJECT_CONFIG_SCHEMA = "openpype:config-2.0"
CURRENT_ASSET_DOC_SCHEMA = "openpype:asset-3.0"
CURRENT_SUBSET_SCHEMA = "openpype:subset-3.0"
CURRENT_VERSION_SCHEMA = "openpype:version-3.0"
CURRENT_HERO_VERSION_SCHEMA = "openpype:hero_version-1.0"
CURRENT_REPRESENTATION_SCHEMA = "openpype:representation-2.0"
CURRENT_WORKFILE_INFO_SCHEMA = "openpype:workfile-1.0"
CURRENT_THUMBNAIL_SCHEMA = "openpype:thumbnail-1.0"
File diff suppressed because it is too large
@@ -1,741 +0,0 @@
import collections

from .constants import CURRENT_THUMBNAIL_SCHEMA
from .utils import get_ayon_server_api_connection
from .openpype_comp import get_folders_with_tasks
from .conversion_utils import (
    project_fields_v3_to_v4,
    convert_v4_project_to_v3,

    folder_fields_v3_to_v4,
    convert_v4_folder_to_v3,

    subset_fields_v3_to_v4,
    convert_v4_subset_to_v3,

    version_fields_v3_to_v4,
    convert_v4_version_to_v3,

    representation_fields_v3_to_v4,
    convert_v4_representation_to_v3,

    workfile_info_fields_v3_to_v4,
    convert_v4_workfile_info_to_v3,
)


def get_asset_name_identifier(asset_doc):
    """Get asset name identifier by asset document.

    This function is added because of the AYON implementation where the name
    identifier is not just a name but a full path.

    Asset document must have "name" key, and "data.parents" when in AYON
    mode.

    Args:
        asset_doc (dict[str, Any]): Asset document.
    """

    parents = list(asset_doc["data"]["parents"])
    parents.append(asset_doc["name"])
    return "/" + "/".join(parents)


def get_projects(active=True, inactive=False, library=None, fields=None):
    if not active and not inactive:
        return

    if active and inactive:
        active = None
    elif active:
        active = True
    elif inactive:
        active = False

    con = get_ayon_server_api_connection()
    fields = project_fields_v3_to_v4(fields, con)
    for project in con.get_projects(active, library, fields=fields):
        yield convert_v4_project_to_v3(project)


def get_project(project_name, active=True, inactive=False, fields=None):
    # Skip if both are disabled
    con = get_ayon_server_api_connection()
    fields = project_fields_v3_to_v4(fields, con)
    return convert_v4_project_to_v3(
        con.get_project(project_name, fields=fields)
    )


def get_whole_project(*args, **kwargs):
    raise NotImplementedError("'get_whole_project' not implemented")


def _get_subsets(
    project_name,
    subset_ids=None,
    subset_names=None,
    folder_ids=None,
    names_by_folder_ids=None,
    archived=False,
    fields=None
):
    # Convert fields and add minimum required fields
    con = get_ayon_server_api_connection()
    fields = subset_fields_v3_to_v4(fields, con)
    if fields is not None:
        for key in (
            "id",
            "active"
        ):
            fields.add(key)

    active = True
    if archived:
        active = None

    for subset in con.get_products(
        project_name,
        product_ids=subset_ids,
        product_names=subset_names,
        folder_ids=folder_ids,
        names_by_folder_ids=names_by_folder_ids,
        active=active,
        fields=fields,
    ):
        yield convert_v4_subset_to_v3(subset)


def _get_versions(
    project_name,
    version_ids=None,
    subset_ids=None,
    versions=None,
    hero=True,
    standard=True,
    latest=None,
    active=None,
    fields=None
):
    con = get_ayon_server_api_connection()

    fields = version_fields_v3_to_v4(fields, con)

    # Make sure 'productId' and 'version' are available when hero versions
    #   are queried
    if fields and hero:
        fields = set(fields)
        fields |= {"productId", "version"}

    queried_versions = con.get_versions(
        project_name,
        version_ids=version_ids,
        product_ids=subset_ids,
        versions=versions,
        hero=hero,
        standard=standard,
        latest=latest,
        active=active,
        fields=fields
    )

    version_entities = []
    hero_versions = []
    for version in queried_versions:
        if version["version"] < 0:
            hero_versions.append(version)
        else:
            version_entities.append(convert_v4_version_to_v3(version))

    if hero_versions:
        subset_ids = set()
        versions_nums = set()
        for hero_version in hero_versions:
            versions_nums.add(abs(hero_version["version"]))
            subset_ids.add(hero_version["productId"])

        hero_eq_versions = con.get_versions(
            project_name,
            product_ids=subset_ids,
            versions=versions_nums,
            hero=False,
            fields=["id", "version", "productId"]
        )
        hero_eq_by_subset_id = collections.defaultdict(list)
        for version in hero_eq_versions:
            hero_eq_by_subset_id[version["productId"]].append(version)

        for hero_version in hero_versions:
            abs_version = abs(hero_version["version"])
            subset_id = hero_version["productId"]
            version_id = None
            for version in hero_eq_by_subset_id.get(subset_id, []):
                if version["version"] == abs_version:
                    version_id = version["id"]
                    break
            conv_hero = convert_v4_version_to_v3(hero_version)
            conv_hero["version_id"] = version_id
            version_entities.append(conv_hero)

    return version_entities


def get_asset_by_id(project_name, asset_id, fields=None):
    assets = get_assets(
        project_name, asset_ids=[asset_id], fields=fields
    )
    for asset in assets:
        return asset
    return None


def get_asset_by_name(project_name, asset_name, fields=None):
    assets = get_assets(
        project_name, asset_names=[asset_name], fields=fields
    )
    for asset in assets:
        return asset
    return None


def _folders_query(project_name, con, fields, **kwargs):
    if fields is None or "tasks" in fields:
        folders = get_folders_with_tasks(
            con, project_name, fields=fields, **kwargs
        )

    else:
        folders = con.get_folders(project_name, fields=fields, **kwargs)

    for folder in folders:
        yield folder


def get_assets(
    project_name,
    asset_ids=None,
    asset_names=None,
    parent_ids=None,
    archived=False,
    fields=None
):
    if not project_name:
        return

    active = True
    if archived:
        active = None

    con = get_ayon_server_api_connection()
    fields = folder_fields_v3_to_v4(fields, con)
    kwargs = dict(
        folder_ids=asset_ids,
        parent_ids=parent_ids,
        active=active,
    )
    if not asset_names:
        for folder in _folders_query(project_name, con, fields, **kwargs):
            yield convert_v4_folder_to_v3(folder, project_name)
        return

    new_asset_names = set()
    folder_paths = set()
    for name in asset_names:
        if "/" in name:
            folder_paths.add(name)
        else:
            new_asset_names.add(name)

    yielded_ids = set()
    if folder_paths:
        for folder in _folders_query(
            project_name, con, fields, folder_paths=folder_paths, **kwargs
        ):
            yielded_ids.add(folder["id"])
            yield convert_v4_folder_to_v3(folder, project_name)

    if not new_asset_names:
        return

    for folder in _folders_query(
        project_name, con, fields, folder_names=new_asset_names, **kwargs
    ):
        if folder["id"] not in yielded_ids:
            yielded_ids.add(folder["id"])
            yield convert_v4_folder_to_v3(folder, project_name)


def get_archived_assets(
    project_name,
    asset_ids=None,
    asset_names=None,
    parent_ids=None,
    fields=None
):
    return get_assets(
        project_name,
        asset_ids,
        asset_names,
        parent_ids,
        True,
        fields
    )


def get_asset_ids_with_subsets(project_name, asset_ids=None):
    con = get_ayon_server_api_connection()
    return con.get_folder_ids_with_products(project_name, asset_ids)


def get_subset_by_id(project_name, subset_id, fields=None):
    subsets = get_subsets(
        project_name, subset_ids=[subset_id], fields=fields
    )
    for subset in subsets:
        return subset
    return None


def get_subset_by_name(project_name, subset_name, asset_id, fields=None):
    subsets = get_subsets(
        project_name,
        subset_names=[subset_name],
        asset_ids=[asset_id],
        fields=fields
    )
    for subset in subsets:
        return subset
    return None


def get_subsets(
    project_name,
    subset_ids=None,
    subset_names=None,
    asset_ids=None,
    names_by_asset_ids=None,
    archived=False,
    fields=None
):
    return _get_subsets(
        project_name,
        subset_ids,
        subset_names,
        asset_ids,
        names_by_asset_ids,
        archived,
        fields=fields
    )


def get_subset_families(project_name, subset_ids=None):
    con = get_ayon_server_api_connection()
    return con.get_product_type_names(project_name, subset_ids)


def get_version_by_id(project_name, version_id, fields=None):
    versions = get_versions(
        project_name,
        version_ids=[version_id],
        fields=fields,
        hero=True
    )
    for version in versions:
        return version
    return None


def get_version_by_name(project_name, version, subset_id, fields=None):
    versions = get_versions(
        project_name,
        subset_ids=[subset_id],
        versions=[version],
        fields=fields
    )
    for version in versions:
        return version
    return None


def get_versions(
    project_name,
    version_ids=None,
    subset_ids=None,
    versions=None,
    hero=False,
    fields=None
):
    return _get_versions(
        project_name,
        version_ids,
        subset_ids,
        versions,
        hero=hero,
        standard=True,
        fields=fields
    )


def get_hero_version_by_id(project_name, version_id, fields=None):
    versions = get_hero_versions(
        project_name,
        version_ids=[version_id],
        fields=fields
    )
    for version in versions:
        return version
    return None


def get_hero_version_by_subset_id(
    project_name, subset_id, fields=None
):
    versions = get_hero_versions(
        project_name,
        subset_ids=[subset_id],
        fields=fields
    )
    for version in versions:
        return version
    return None


def get_hero_versions(
    project_name, subset_ids=None, version_ids=None, fields=None
):
    return _get_versions(
        project_name,
        version_ids=version_ids,
        subset_ids=subset_ids,
        hero=True,
        standard=False,
        fields=fields
    )


def get_last_versions(project_name, subset_ids, active=None, fields=None):
    if fields:
        fields = set(fields)
        fields.add("parent")

    versions = _get_versions(
        project_name,
        subset_ids=subset_ids,
        latest=True,
        hero=False,
        active=active,
        fields=fields
    )
    return {
        version["parent"]: version
        for version in versions
    }


def get_last_version_by_subset_id(project_name, subset_id, fields=None):
    versions = _get_versions(
        project_name,
        subset_ids=[subset_id],
        latest=True,
        hero=False,
        fields=fields
    )
    if not versions:
        return None
    return versions[0]


def get_last_version_by_subset_name(
    project_name,
    subset_name,
    asset_id=None,
    asset_name=None,
    fields=None
):
    if not asset_id and not asset_name:
        return None

    if not asset_id:
        asset = get_asset_by_name(
            project_name, asset_name, fields=["_id"]
        )
        if not asset:
            return None
        asset_id = asset["_id"]

    subset = get_subset_by_name(
        project_name, subset_name, asset_id, fields=["_id"]
    )
    if not subset:
        return None
    return get_last_version_by_subset_id(
        project_name, subset["_id"], fields=fields
    )


def get_output_link_versions(project_name, version_id, fields=None):
    if not version_id:
        return []

    con = get_ayon_server_api_connection()
    version_links = con.get_version_links(
        project_name, version_id, link_direction="out")

    version_ids = {
        link["entityId"]
        for link in version_links
        if link["entityType"] == "version"
    }
    if not version_ids:
        return []

    return get_versions(project_name, version_ids=version_ids, fields=fields)


def version_is_latest(project_name, version_id):
    con = get_ayon_server_api_connection()
    return con.version_is_latest(project_name, version_id)


def get_representation_by_id(project_name, representation_id, fields=None):
    representations = get_representations(
        project_name,
        representation_ids=[representation_id],
        fields=fields
    )
    for representation in representations:
        return representation
    return None


def get_representation_by_name(
    project_name, representation_name, version_id, fields=None
):
    representations = get_representations(
        project_name,
        representation_names=[representation_name],
        version_ids=[version_id],
        fields=fields
    )
    for representation in representations:
        return representation
    return None


def get_representations(
    project_name,
    representation_ids=None,
    representation_names=None,
    version_ids=None,
    context_filters=None,
    names_by_version_ids=None,
    archived=False,
    standard=True,
    fields=None
):
    if context_filters is not None:
        # TODO should we add the support?
        # - there was an ability to filter using regex
        raise ValueError("OP v4 can't filter by representation context.")

    if not archived and not standard:
        return

    if archived and not standard:
        active = False
    elif not archived and standard:
        active = True
    else:
        active = None

    con = get_ayon_server_api_connection()
    fields = representation_fields_v3_to_v4(fields, con)
    if fields and active is not None:
        fields.add("active")

    representations = con.get_representations(
        project_name,
        representation_ids=representation_ids,
        representation_names=representation_names,
        version_ids=version_ids,
        names_by_version_ids=names_by_version_ids,
        active=active,
        fields=fields
    )
    for representation in representations:
        yield convert_v4_representation_to_v3(representation)


def get_representation_parents(project_name, representation):
    if not representation:
        return None

    repre_id = representation["_id"]
    parents_by_repre_id = get_representations_parents(
        project_name, [representation]
    )
    return parents_by_repre_id[repre_id]


def get_representations_parents(project_name, representations):
    repre_ids = {
        repre["_id"]
        for repre in representations
    }
    con = get_ayon_server_api_connection()
    parents_by_repre_id = con.get_representations_parents(
        project_name, repre_ids
    )
    folder_ids = set()
    for parents in parents_by_repre_id.values():
        folder_ids.add(parents[2]["id"])

    tasks_by_folder_id = {}

    new_parents = {}
    for repre_id, parents in parents_by_repre_id.items():
        version, subset, folder, project = parents
        folder_tasks = tasks_by_folder_id.get(folder["id"]) or {}
        folder["tasks"] = folder_tasks
        new_parents[repre_id] = (
            convert_v4_version_to_v3(version),
            convert_v4_subset_to_v3(subset),
            convert_v4_folder_to_v3(folder, project_name),
            project
        )
    return new_parents


def get_archived_representations(
    project_name,
    representation_ids=None,
    representation_names=None,
    version_ids=None,
    context_filters=None,
    names_by_version_ids=None,
    fields=None
):
    return get_representations(
        project_name,
        representation_ids=representation_ids,
        representation_names=representation_names,
        version_ids=version_ids,
        context_filters=context_filters,
        names_by_version_ids=names_by_version_ids,
        archived=True,
        standard=False,
        fields=fields
    )


def get_thumbnail(
    project_name, thumbnail_id, entity_type, entity_id, fields=None
):
    """Receive thumbnail entity data.

    Args:
        project_name (str): Name of project where to look for queried
            entities.
        thumbnail_id (Union[str, ObjectId]): Id of thumbnail entity.
        entity_type (str): Type of entity for which the thumbnail should be
            received.
        entity_id (str): Id of entity for which the thumbnail should be
            received.
        fields (Iterable[str]): Fields that should be returned. All fields
            are returned if 'None' is passed.

    Returns:
        None: If thumbnail with specified id was not found.
        Dict: Thumbnail entity data which can be reduced to specified
            'fields'.
    """

    if not thumbnail_id or not entity_type or not entity_id:
        return None

    if entity_type == "asset":
        entity_type = "folder"

    elif entity_type == "hero_version":
        entity_type = "version"

    return {
        "_id": thumbnail_id,
        "type": "thumbnail",
        "schema": CURRENT_THUMBNAIL_SCHEMA,
        "data": {
            "entity_type": entity_type,
            "entity_id": entity_id
        }
    }


def get_thumbnails(project_name, thumbnail_contexts, fields=None):
    """Get thumbnail entities.

    Warning:
        This function is not OpenPype compatible. There is no usage of this
        function in the codebase so there is nothing to convert. The previous
        implementation cannot be AYON compatible without entity types.
    """

    thumbnail_items = set()
    for thumbnail_context in thumbnail_contexts:
        thumbnail_id, entity_type, entity_id = thumbnail_context
        thumbnail_item = get_thumbnail(
            project_name, thumbnail_id, entity_type, entity_id
        )
        if thumbnail_item:
            thumbnail_items.add(thumbnail_item)
    return list(thumbnail_items)


def get_thumbnail_id_from_source(project_name, src_type, src_id):
    """Receive thumbnail id from source entity.

    Args:
        project_name (str): Name of project where to look for queried
            entities.
        src_type (str): Type of source entity ('asset', 'version').
        src_id (Union[str, ObjectId]): Id of source entity.

    Returns:
        ObjectId: Thumbnail id assigned to entity.
        None: If source entity does not have any thumbnail id assigned.
    """

    if not src_type or not src_id:
        return None

    if src_type == "version":
        version = get_version_by_id(
            project_name, src_id, fields=["data.thumbnail_id"]
        ) or {}
        return version.get("data", {}).get("thumbnail_id")

    if src_type == "asset":
        asset = get_asset_by_id(
            project_name, src_id, fields=["data.thumbnail_id"]
        ) or {}
        return asset.get("data", {}).get("thumbnail_id")

    return None


def get_workfile_info(
    project_name, asset_id, task_name, filename, fields=None
):
    if not asset_id or not task_name or not filename:
        return None

    con = get_ayon_server_api_connection()
    task = con.get_task_by_name(
        project_name, asset_id, task_name, fields=["id", "name", "folderId"]
    )
    if not task:
        return None

    fields = workfile_info_fields_v3_to_v4(fields)

    for workfile_info in con.get_workfiles_info(
        project_name, task_ids=[task["id"]], fields=fields
    ):
        if workfile_info["name"] == filename:
            return convert_v4_workfile_info_to_v3(workfile_info, task)
    return None
@@ -1,157 +0,0 @@
from .utils import get_ayon_server_api_connection
from .entities import get_assets, get_representation_by_id


def get_linked_asset_ids(project_name, asset_doc=None, asset_id=None):
    """Extract linked asset ids from asset document.

    One of asset document or asset id must be passed.

    Note:
        Asset links now work only from assets to assets.

    Args:
        project_name (str): Project where to look for asset.
        asset_doc (dict): Asset document from DB.
        asset_id (str): Asset id to find its document.

    Returns:
        List[Union[ObjectId, str]]: Asset ids of input links.
    """

    output = []
    if not asset_doc and not asset_id:
        return output

    if not asset_id:
        asset_id = asset_doc["_id"]

    con = get_ayon_server_api_connection()
    links = con.get_folder_links(project_name, asset_id, link_direction="in")
    return [
        link["entityId"]
        for link in links
        if link["entityType"] == "folder"
    ]


def get_linked_assets(
    project_name, asset_doc=None, asset_id=None, fields=None
):
    """Return linked assets based on passed asset document.

    One of asset document or asset id must be passed.

    Args:
        project_name (str): Name of project where to look for queried
            entities.
        asset_doc (Dict[str, Any]): Asset document from database.
        asset_id (Union[ObjectId, str]): Asset id. Can be used instead of
            asset document.
        fields (Iterable[str]): Fields that should be returned. All fields are
            returned if 'None' is passed.

    Returns:
        List[Dict[str, Any]]: Asset documents of input links for passed
            asset doc.
    """

    link_ids = get_linked_asset_ids(project_name, asset_doc, asset_id)
    if not link_ids:
        return []
    return list(get_assets(project_name, asset_ids=link_ids, fields=fields))


def get_linked_representation_id(
    project_name, repre_doc=None, repre_id=None, link_type=None, max_depth=None
):
    """Return a list of linked ids of a particular type (if provided).

    One of representation document or representation id must be passed.

    Note:
        Representation links now work only from a representation through its
        version back to representations.

    Todos:
        Missing depth query. Not sure how it did find more representations in
        depth, probably links to version?

    Args:
        project_name (str): Name of project where to look for links.
        repre_doc (Dict[str, Any]): Representation document.
        repre_id (Union[ObjectId, str]): Representation id.
        link_type (str): Type of link (e.g. 'reference', ...).
        max_depth (int): Limit recursion level. Default: 0

    Returns:
        List[ObjectId]: Linked representation ids.
    """

    if repre_doc:
        repre_id = repre_doc["_id"]

    if not repre_id and not repre_doc:
        return []

    version_id = None
    if repre_doc:
        version_id = repre_doc.get("parent")

    if not version_id:
        repre_doc = get_representation_by_id(
            project_name, repre_id, fields=["parent"]
        )
        if repre_doc:
            version_id = repre_doc["parent"]

    if not version_id:
        return []

    if max_depth is None or max_depth == 0:
        max_depth = 1

    link_types = None
    if link_type:
        link_types = [link_type]

    con = get_ayon_server_api_connection()
    # Store already found version ids to avoid recursion, and also to store
    #   output -> Don't forget to remove 'version_id' at the end!!!
    linked_version_ids = {version_id}
    # Each loop of depth will reset this variable
    versions_to_check = {version_id}
    for _ in range(max_depth):
        if not versions_to_check:
            break

        versions_links = con.get_versions_links(
            project_name,
            versions_to_check,
            link_types=link_types,
            link_direction="out")

        versions_to_check = set()
        for links in versions_links.values():
            for link in links:
                # Care only about version links
                if link["entityType"] != "version":
                    continue
                entity_id = link["entityId"]
                # Skip already found linked version ids
                if entity_id in linked_version_ids:
                    continue
                linked_version_ids.add(entity_id)
                versions_to_check.add(entity_id)

    linked_version_ids.remove(version_id)
    if not linked_version_ids:
        return []
    con = get_ayon_server_api_connection()
    representations = con.get_representations(
        project_name,
        version_ids=linked_version_ids,
        fields=["id"])
    return [
        repre["id"]
        for repre in representations
    ]
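
# A hedged usage sketch of 'get_linked_representation_id' above
# (illustration only, not part of the original module): collect
# representation ids reachable through 'reference' links, following
# version links up to two levels deep. The project name and
# representation id are hypothetical.
linked_repre_ids = get_linked_representation_id(
    "my_project",
    repre_id="0123456789abcdef0123456789abcdef",
    link_type="reference",
    max_depth=2,
)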
@@ -1,39 +0,0 @@
# Client functionality
## Reason
Preparation for the OpenPype v4 server. The goal is to remove direct mongo calls from the codebase and to prepare, at least a little, for a different source of data: to start thinking about database calls less as mongo calls and more universally. To do so, a simple wrapper around database calls was implemented so pymongo specific code is not used directly.

The current goal is not to build a universal database model that could be easily replaced with any different source of data, but to get as close to that as possible. The current implementation of OpenPype is too tightly connected to pymongo and its abilities, so we're trying to get closer with long term changes that can be used even in the current state.

## Queries
Query functions don't use the full potential of mongo queries, like very specific queries based on subdictionaries or unknown structures. We try to avoid these calls as much as possible because they probably won't be available in the future. If one is really necessary a new function can be added, but only if it's reasonable for the overall logic. All query functions were moved to `~/client/entities.py`. Each function has arguments with the available filters and a possible reduction of returned keys for each entity.
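
For illustration, a minimal sketch of how such a query function is called, assuming `get_representations` is importable from the client package as the modules in this diff suggest; the project name and ids are hypothetical:

```python
from ayon_core.client import get_representations

project_name = "my_project"                         # hypothetical project
version_ids = ["0123456789abcdef0123456789abcdef"]  # hypothetical ids

# Filters are plain keyword arguments and 'fields' reduces returned keys,
# so no pymongo specific query dictionaries leak into the calling code.
for repre in get_representations(
    project_name,
    version_ids=version_ids,
    fields=["_id", "name", "context"],
):
    print(repre["name"])
```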

## Changes
Changes are a little bit complicated. Mongo has many options for how an update can happen, which had to be reduced; also, at this stage it would be complicated to validate created or updated values, so automation is at this point almost none. Changes can be made using the operations available in `~/client/operations.py`. Each operation requires a project name and entity type, but may also require operation specific data.
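
For illustration, a hedged sketch of the session flow using `OperationsSession` from `~/client/operations.py` (shown later in this diff); the project name and document values are made up:

```python
from ayon_core.client.operations import (
    OperationsSession,
    new_subset_document,
)

project_name = "my_project"                    # hypothetical project
asset_id = "0123456789abcdef0123456789abcdef"  # hypothetical asset id

session = OperationsSession()

# Create expects an already prepared document (see 'Create' below).
subset_doc = new_subset_document("modelMain", "model", asset_id)
session.create_entity(project_name, "subset", subset_doc)

# Update expects the entity id and a {key: value} change dictionary.
session.update_entity(
    project_name, "subset", subset_doc["_id"], {"data.group": "Main"}
)

# Nothing is sent to the server until the session is committed.
session.commit()
```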

### Create
Create operations expect already prepared document data. For that, there are prepared functions creating skeletal structures of documents (they do not fill all required data); except for `_id`, all data should be right. Existence of the entity is not validated, so if the same creation operation is sent n times it will create the entity n times, which can cause issues.

### Update
Update operations require an entity id and the keys that should be changed; the update dictionary must have the form {"key": value}. If a value should be set in a nested dictionary, the key must contain all subkeys joined with a dot `.` (e.g. `{"data": {"fps": 25}}` -> `{"data.fps": 25}`). To simplify building update dictionaries there are prepared functions which do that for you; their names follow the template `prepare_<entity type>_update_data` and they work by comparing the previous document with the new document. If a function for a requested entity type is missing, it is because we didn't need it yet and it requires implementation.
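
A small sketch of that convention with made-up keys:

```python
# Setting nested values uses dot-joined keys...
update_data = {"data.fps": 25, "data.resolutionWidth": 1920}

# ...instead of the nested dictionary they originate from:
# {"data": {"fps": 25, "resolutionWidth": 1920}}
```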

### Delete
Delete operations need an entity id. The entity will be deleted from mongo.


## What (probably) won't be replaced
Some parts of the code are still using direct mongo calls. In most cases these are very specific calls that are module specific or whose usage will completely change in the future.
- Mongo calls that are not project specific (out of the `avalon` collection) will be removed or will have to use a different mechanism for how the data are stored. At this moment this is related to OpenPype settings and logs, ftrack server events and some other data.
- Sync server queries. They're complex and very specific to the sync server module. Their replacement will require specific calls to the OpenPype server in v4, thus their abstraction with a wrapper is irrelevant and would complicate production in v3.
- Project managers (ftrack, kitsu, shotgrid, embedded Project Manager, etc.). Project managers are creating, updating or removing assets in v3, but in v4 they will create folders with a different structure. Wrapping creation of assets would not help to prepare for v4 because of the new data structures. The same can be said about the editorial Extract Hierarchy Avalon plugin which creates the project structure.
- Code parts that are marked as deprecated in v3 or will be deprecated in v4.
- integrate asset legacy publish plugin - already legacy, kept for safety
- integrate thumbnail - thumbnails will be stored in a different way in v4
- input links - links will be stored in a different way and will have a different mechanism of linking. In v3 links are limited to the same entity type: "asset <-> asset" or "representation <-> representation".

## Known missing replacements
- change subset group in loader tool
- integrate subset group
- query input links in openpype lib
- create project in openpype lib
- save/create workfile doc in openpype lib
- integrate hero version
@@ -1,159 +0,0 @@
import collections
import json

import six
from ayon_api.graphql import GraphQlQuery, FIELD_VALUE, fields_to_dict

from .constants import DEFAULT_FOLDER_FIELDS


def folders_tasks_graphql_query(fields):
    query = GraphQlQuery("FoldersQuery")
    project_name_var = query.add_variable("projectName", "String!")
    folder_ids_var = query.add_variable("folderIds", "[String!]")
    parent_folder_ids_var = query.add_variable("parentFolderIds", "[String!]")
    folder_paths_var = query.add_variable("folderPaths", "[String!]")
    folder_names_var = query.add_variable("folderNames", "[String!]")
    has_products_var = query.add_variable("folderHasProducts", "Boolean!")

    project_field = query.add_field("project")
    project_field.set_filter("name", project_name_var)

    folders_field = project_field.add_field_with_edges("folders")
    folders_field.set_filter("ids", folder_ids_var)
    folders_field.set_filter("parentIds", parent_folder_ids_var)
    folders_field.set_filter("names", folder_names_var)
    folders_field.set_filter("paths", folder_paths_var)
    folders_field.set_filter("hasProducts", has_products_var)

    fields = set(fields)
    fields.discard("tasks")
    tasks_field = folders_field.add_field_with_edges("tasks")
    tasks_field.add_field("name")
    tasks_field.add_field("taskType")

    nested_fields = fields_to_dict(fields)

    query_queue = collections.deque()
    for key, value in nested_fields.items():
        query_queue.append((key, value, folders_field))

    while query_queue:
        item = query_queue.popleft()
        key, value, parent = item
        field = parent.add_field(key)
        if value is FIELD_VALUE:
            continue

        for k, v in value.items():
            query_queue.append((k, v, field))
    return query


def get_folders_with_tasks(
    con,
    project_name,
    folder_ids=None,
    folder_paths=None,
    folder_names=None,
    parent_ids=None,
    active=True,
    fields=None
):
    """Query folders with tasks from server.

    This exists for compatibility, because tasks used to be stored on
    assets. It is an inefficient way of querying folders and tasks, so it
    was added only as a compatibility function.

    Todos:
        Folder name won't be a unique identifier, so we should add folder
        path filtering.

    Notes:
        The 'active' filter does not have a direct filter in GraphQl.

    Args:
        con (ServerAPI): Connection to server.
        project_name (str): Name of project where folders are.
        folder_ids (Iterable[str]): Folder ids to filter.
        folder_paths (Iterable[str]): Folder paths used for filtering.
        folder_names (Iterable[str]): Folder names used for filtering.
        parent_ids (Iterable[str]): Ids of folder parents. Use 'None'
            if folder is direct child of project.
        active (Union[bool, None]): Filter active/inactive folders. Both
            are returned if is set to None.
        fields (Union[Iterable(str), None]): Fields to be queried
            for folder. All possible folder fields are returned if 'None'
            is passed.

    Yields:
        Dict[str, Any]: Queried folder entities.
    """

    if not project_name:
        return

    filters = {
        "projectName": project_name
    }
    if folder_ids is not None:
        folder_ids = set(folder_ids)
        if not folder_ids:
            return
        filters["folderIds"] = list(folder_ids)

    if folder_paths is not None:
        folder_paths = set(folder_paths)
        if not folder_paths:
            return
        filters["folderPaths"] = list(folder_paths)

    if folder_names is not None:
        folder_names = set(folder_names)
        if not folder_names:
            return
        filters["folderNames"] = list(folder_names)

    if parent_ids is not None:
        parent_ids = set(parent_ids)
        if not parent_ids:
            return
        if None in parent_ids:
            # Replace 'None' with '"root"' which is used during GraphQl
            #   query for parent ids filter for folders without folder
            #   parent
            parent_ids.remove(None)
            parent_ids.add("root")

        if project_name in parent_ids:
            # Replace project name with '"root"' which is used during
            #   GraphQl query for parent ids filter for folders without
            #   folder parent
            parent_ids.remove(project_name)
            parent_ids.add("root")

        filters["parentFolderIds"] = list(parent_ids)

    if fields:
        fields = set(fields)
    else:
        fields = con.get_default_fields_for_type("folder")
    fields |= DEFAULT_FOLDER_FIELDS

    if active is not None:
        fields.add("active")

    query = folders_tasks_graphql_query(fields)
    for attr, filter_value in filters.items():
        query.set_variable_value(attr, filter_value)

    parsed_data = query.query(con)
    folders = parsed_data["project"]["folders"]
    for folder in folders:
        if active is not None and folder["active"] is not active:
            continue
        folder_data = folder.get("data")
        if isinstance(folder_data, six.string_types):
            folder["data"] = json.loads(folder_data)
        yield folder
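
# A hedged usage sketch of the compatibility query above (illustration
# only, not part of the original module), assuming the connection helper
# from "client/utils.py" shown later in this diff; the project name and
# folder path are hypothetical.
from ayon_core.client.utils import get_ayon_server_api_connection

con = get_ayon_server_api_connection()
for folder in get_folders_with_tasks(
    con,
    "my_project",
    folder_paths=["/assets/characters/hero"],
):
    print(folder["name"])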

@@ -1,880 +0,0 @@
import copy
import json
import collections
import uuid
import datetime

from ayon_api.server_api import (
    PROJECT_NAME_ALLOWED_SYMBOLS,
    PROJECT_NAME_REGEX,
)

from .constants import (
    CURRENT_PROJECT_SCHEMA,
    CURRENT_PROJECT_CONFIG_SCHEMA,
    CURRENT_ASSET_DOC_SCHEMA,
    CURRENT_SUBSET_SCHEMA,
    CURRENT_VERSION_SCHEMA,
    CURRENT_HERO_VERSION_SCHEMA,
    CURRENT_REPRESENTATION_SCHEMA,
    CURRENT_WORKFILE_INFO_SCHEMA,
    CURRENT_THUMBNAIL_SCHEMA,
)
from .operations_base import (
    REMOVED_VALUE,
    CreateOperation,
    UpdateOperation,
    DeleteOperation,
    BaseOperationsSession
)
from .conversion_utils import (
    convert_create_asset_to_v4,
    convert_create_task_to_v4,
    convert_create_subset_to_v4,
    convert_create_version_to_v4,
    convert_create_hero_version_to_v4,
    convert_create_representation_to_v4,
    convert_create_workfile_info_to_v4,

    convert_update_folder_to_v4,
    convert_update_subset_to_v4,
    convert_update_version_to_v4,
    convert_update_hero_version_to_v4,
    convert_update_representation_to_v4,
    convert_update_workfile_info_to_v4,
)
from .utils import create_entity_id, get_ayon_server_api_connection


def _create_or_convert_to_id(entity_id=None):
    if entity_id is None:
        return create_entity_id()

    # Validate if can be converted to uuid
    uuid.UUID(entity_id)
    return entity_id


def new_project_document(
    project_name, project_code, config, data=None, entity_id=None
):
    """Create skeleton data of project document.

    Args:
        project_name (str): Name of project. Used as identifier of a project.
        project_code (str): Shorter version of project name without spaces and
            special characters (in most cases). Should also be considered
            unique across projects.
        config (Dict[str, Any]): Project config consisting of roots,
            templates, applications and other project Anatomy related data.
        data (Dict[str, Any]): Project data with information about its
            attributes (e.g. 'fps' etc.) or integration specific keys.
        entity_id (Union[str, ObjectId]): Predefined id of document. New id is
            created if not passed.

    Returns:
        Dict[str, Any]: Skeleton of project document.
    """

    if data is None:
        data = {}

    data["code"] = project_code

    return {
        "_id": _create_or_convert_to_id(entity_id),
        "name": project_name,
        "type": CURRENT_PROJECT_SCHEMA,
        "entity_data": data,
        "config": config
    }


def new_asset_document(
    name, project_id, parent_id, parents, data=None, entity_id=None
):
    """Create skeleton data of asset document.

    Args:
        name (str): Is considered as unique identifier of asset in project.
        project_id (Union[str, ObjectId]): Id of project document.
        parent_id (Union[str, ObjectId]): Id of parent asset.
        parents (List[str]): List of parent assets names.
        data (Dict[str, Any]): Asset document data. Empty dictionary is used
            if not passed. Value of 'parent_id' is used to fill
            'visualParent'.
        entity_id (Union[str, ObjectId]): Predefined id of document. New id is
            created if not passed.

    Returns:
        Dict[str, Any]: Skeleton of asset document.
    """

    if data is None:
        data = {}
    if parent_id is not None:
        parent_id = _create_or_convert_to_id(parent_id)
    data["visualParent"] = parent_id
    data["parents"] = parents

    return {
        "_id": _create_or_convert_to_id(entity_id),
        "type": "asset",
        "name": name,
        # This will be ignored
        "parent": project_id,
        "data": data,
        "schema": CURRENT_ASSET_DOC_SCHEMA
    }


def new_subset_document(name, family, asset_id, data=None, entity_id=None):
    """Create skeleton data of subset document.

    Args:
        name (str): Is considered as unique identifier of subset under asset.
        family (str): Subset's family.
        asset_id (Union[str, ObjectId]): Id of parent asset.
        data (Dict[str, Any]): Subset document data. Empty dictionary is used
            if not passed. Value of 'family' is used to fill 'family'.
        entity_id (Union[str, ObjectId]): Predefined id of document. New id is
            created if not passed.

    Returns:
        Dict[str, Any]: Skeleton of subset document.
    """

    if data is None:
        data = {}
    data["family"] = family
    return {
        "_id": _create_or_convert_to_id(entity_id),
        "schema": CURRENT_SUBSET_SCHEMA,
        "type": "subset",
        "name": name,
        "data": data,
        "parent": _create_or_convert_to_id(asset_id)
    }


def new_version_doc(version, subset_id, data=None, entity_id=None):
    """Create skeleton data of version document.

    Args:
        version (int): Is considered as unique identifier of version
            under subset.
        subset_id (Union[str, ObjectId]): Id of parent subset.
        data (Dict[str, Any]): Version document data.
        entity_id (Union[str, ObjectId]): Predefined id of document. New id is
            created if not passed.

    Returns:
        Dict[str, Any]: Skeleton of version document.
    """

    if data is None:
        data = {}

    return {
        "_id": _create_or_convert_to_id(entity_id),
        "schema": CURRENT_VERSION_SCHEMA,
        "type": "version",
        "name": int(version),
        "parent": _create_or_convert_to_id(subset_id),
        "data": data
    }


def new_hero_version_doc(subset_id, data, version=None, entity_id=None):
    """Create skeleton data of hero version document.

    Args:
        subset_id (Union[str, ObjectId]): Id of parent subset.
        data (Dict[str, Any]): Version document data.
        version (int): Version of source version.
        entity_id (Union[str, ObjectId]): Predefined id of document. New id is
            created if not passed.

    Returns:
        Dict[str, Any]: Skeleton of version document.
    """

    if version is None:
        version = -1
    elif version > 0:
        version = -version

    return {
        "_id": _create_or_convert_to_id(entity_id),
        "schema": CURRENT_HERO_VERSION_SCHEMA,
        "type": "hero_version",
        "version": version,
        "parent": _create_or_convert_to_id(subset_id),
        "data": data
    }


def new_representation_doc(
    name, version_id, context, data=None, entity_id=None
):
    """Create skeleton data of representation document.

    Args:
        name (str): Representation name considered as unique identifier
            of representation under version.
        version_id (Union[str, ObjectId]): Id of parent version.
        context (Dict[str, Any]): Representation context used to fill
            templates or to query.
        data (Dict[str, Any]): Representation document data.
        entity_id (Union[str, ObjectId]): Predefined id of document. New id is
            created if not passed.

    Returns:
        Dict[str, Any]: Skeleton of representation document.
    """

    if data is None:
        data = {}

    return {
        "_id": _create_or_convert_to_id(entity_id),
        "schema": CURRENT_REPRESENTATION_SCHEMA,
        "type": "representation",
        "parent": _create_or_convert_to_id(version_id),
        "name": name,
        "data": data,

        # Imprint shortcut to context for performance reasons.
        "context": context
    }


def new_thumbnail_doc(data=None, entity_id=None):
    """Create skeleton data of thumbnail document.

    Args:
        data (Dict[str, Any]): Thumbnail document data.
        entity_id (Union[str, ObjectId]): Predefined id of document. New id is
            created if not passed.

    Returns:
        Dict[str, Any]: Skeleton of thumbnail document.
    """

    if data is None:
        data = {}

    return {
        "_id": _create_or_convert_to_id(entity_id),
        "type": "thumbnail",
        "schema": CURRENT_THUMBNAIL_SCHEMA,
        "data": data
    }


def new_workfile_info_doc(
    filename, asset_id, task_name, files, data=None, entity_id=None
):
    """Create skeleton data of workfile info document.

    Workfile document is at this moment used primarily for artist notes.

    Args:
        filename (str): Filename of workfile.
        asset_id (Union[str, ObjectId]): Id of asset under which workfile
            lives.
        task_name (str): Task under which the workfile was created.
        files (List[str]): List of rootless filepaths related to workfile.
        data (Dict[str, Any]): Additional metadata.
        entity_id (Union[str, ObjectId]): Predefined id of document. New id is
            created if not passed.

    Returns:
        Dict[str, Any]: Skeleton of workfile info document.
    """

    if not data:
        data = {}

    return {
        "_id": _create_or_convert_to_id(entity_id),
        "type": "workfile",
        "parent": _create_or_convert_to_id(asset_id),
        "task_name": task_name,
        "filename": filename,
        "data": data,
        "files": files
    }


def _prepare_update_data(old_doc, new_doc, replace):
    changes = {}
    for key, value in new_doc.items():
        if key not in old_doc or value != old_doc[key]:
            changes[key] = value

    if replace:
        for key in old_doc.keys():
            if key not in new_doc:
                changes[key] = REMOVED_VALUE
    return changes


def prepare_subset_update_data(old_doc, new_doc, replace=True):
    """Compare two subset documents and prepare update data.

    Based on compared values will create update data for
    'UpdateOperation'.

    Empty output means that documents are identical.

    Returns:
        Dict[str, Any]: Changes between old and new document.
    """

    return _prepare_update_data(old_doc, new_doc, replace)


def prepare_version_update_data(old_doc, new_doc, replace=True):
    """Compare two version documents and prepare update data.

    Based on compared values will create update data for
    'UpdateOperation'.

    Empty output means that documents are identical.

    Returns:
        Dict[str, Any]: Changes between old and new document.
    """

    return _prepare_update_data(old_doc, new_doc, replace)


def prepare_hero_version_update_data(old_doc, new_doc, replace=True):
    """Compare two hero version documents and prepare update data.

    Based on compared values will create update data for 'UpdateOperation'.

    Empty output means that documents are identical.

    Returns:
        Dict[str, Any]: Changes between old and new document.
    """

    changes = _prepare_update_data(old_doc, new_doc, replace)
    changes.pop("version_id", None)
    return changes


def prepare_representation_update_data(old_doc, new_doc, replace=True):
    """Compare two representation documents and prepare update data.

    Based on compared values will create update data for
    'UpdateOperation'.

    Empty output means that documents are identical.

    Returns:
        Dict[str, Any]: Changes between old and new document.
    """

    changes = _prepare_update_data(old_doc, new_doc, replace)
    context = changes.get("data", {}).get("context")
    # Make sure that both 'family' and 'subset' are in changes if
    #   one of them changed (they'll both become 'product').
    if (
        context
        and ("family" in context or "subset" in context)
    ):
        context["family"] = new_doc["data"]["context"]["family"]
        context["subset"] = new_doc["data"]["context"]["subset"]

    return changes


def prepare_workfile_info_update_data(old_doc, new_doc, replace=True):
    """Compare two workfile info documents and prepare update data.

    Based on compared values will create update data for
    'UpdateOperation'.

    Empty output means that documents are identical.

    Returns:
        Dict[str, Any]: Changes between old and new document.
    """

    return _prepare_update_data(old_doc, new_doc, replace)


class FailedOperations(Exception):
    pass


def entity_data_json_default(value):
    if isinstance(value, datetime.datetime):
        return int(value.timestamp())

    raise TypeError(
        "Object of type {} is not JSON serializable".format(str(type(value)))
    )


def failed_json_default(value):
    return "< Failed value {} > {}".format(type(value), str(value))


class ServerCreateOperation(CreateOperation):
    """Operation to create an entity.

    Args:
        project_name (str): On which project operation will happen.
        entity_type (str): Type of entity on which change happens.
            e.g. 'asset', 'representation' etc.
        data (Dict[str, Any]): Data of entity that will be created.
    """

    def __init__(self, project_name, entity_type, data, session):
        self._session = session

        if not data:
            data = {}
        data = copy.deepcopy(data)
        if entity_type == "project":
            raise ValueError("Project cannot be created using operations")

        tasks = None
        if entity_type == "asset":
            # TODO handle tasks
            entity_type = "folder"
            if "data" in data:
                tasks = data["data"].get("tasks")

            project = self._session.get_project(project_name)
            new_data = convert_create_asset_to_v4(data, project, self.con)

        elif entity_type == "task":
            project = self._session.get_project(project_name)
            new_data = convert_create_task_to_v4(data, project, self.con)

        elif entity_type == "subset":
            new_data = convert_create_subset_to_v4(data, self.con)
            entity_type = "product"

        elif entity_type == "version":
            new_data = convert_create_version_to_v4(data, self.con)

        elif entity_type == "hero_version":
            new_data = convert_create_hero_version_to_v4(
                data, project_name, self.con
            )
            entity_type = "version"

        elif entity_type in ("representation", "archived_representation"):
            new_data = convert_create_representation_to_v4(data, self.con)
            entity_type = "representation"

        elif entity_type == "workfile":
            new_data = convert_create_workfile_info_to_v4(
                data, project_name, self.con
            )

        else:
            raise ValueError(
                "Unhandled entity type \"{}\"".format(entity_type)
            )

        # Simple check if data can be dumped into json
        # - should raise error on 'ObjectId' object
        try:
            new_data = json.loads(
                json.dumps(new_data, default=entity_data_json_default)
            )

        except Exception:
            raise ValueError("Couldn't json parse body: {}".format(
                json.dumps(new_data, default=failed_json_default)
            ))

        super(ServerCreateOperation, self).__init__(
            project_name, entity_type, new_data
        )

        if "id" not in self._data:
            self._data["id"] = create_entity_id()

        if tasks:
            copied_tasks = copy.deepcopy(tasks)
            for task_name, task in copied_tasks.items():
                task["name"] = task_name
                task["folderId"] = self._data["id"]
                self.session.create_entity(
                    project_name, "task", task, nested_id=self.id
                )

    @property
    def con(self):
        return self.session.con

    @property
    def session(self):
        return self._session

    @property
    def entity_id(self):
        return self._data["id"]

    def to_server_operation(self):
        return {
            "id": self.id,
            "type": "create",
            "entityType": self.entity_type,
            "entityId": self.entity_id,
            "data": self._data
        }


class ServerUpdateOperation(UpdateOperation):
    """Operation to update an entity.

    Args:
        project_name (str): On which project operation will happen.
        entity_type (str): Type of entity on which change happens.
            e.g. 'asset', 'representation' etc.
        entity_id (Union[str, ObjectId]): Identifier of an entity.
        update_data (Dict[str, Any]): Key -> value changes that will be set in
            database. If value is set to 'REMOVED_VALUE' the key will be
            removed. Only first level of dictionary is checked (on purpose).
    """

    def __init__(
        self, project_name, entity_type, entity_id, update_data, session
    ):
        self._session = session

        update_data = copy.deepcopy(update_data)
        if entity_type == "project":
            raise ValueError("Project cannot be updated using operations")

        if entity_type in ("asset", "archived_asset"):
            new_update_data = convert_update_folder_to_v4(
                project_name, entity_id, update_data, self.con
            )
            entity_type = "folder"

        elif entity_type == "subset":
            new_update_data = convert_update_subset_to_v4(
                project_name, entity_id, update_data, self.con
            )
            entity_type = "product"

        elif entity_type == "version":
            new_update_data = convert_update_version_to_v4(
                project_name, entity_id, update_data, self.con
            )

        elif entity_type == "hero_version":
            new_update_data = convert_update_hero_version_to_v4(
                project_name, entity_id, update_data, self.con
            )
            entity_type = "version"

        elif entity_type in ("representation", "archived_representation"):
            new_update_data = convert_update_representation_to_v4(
                project_name, entity_id, update_data, self.con
            )
            entity_type = "representation"

        elif entity_type == "workfile":
            new_update_data = convert_update_workfile_info_to_v4(
                project_name, entity_id, update_data, self.con
            )

        else:
            raise ValueError(
                "Unhandled entity type \"{}\"".format(entity_type)
            )

        try:
            new_update_data = json.loads(
                json.dumps(new_update_data, default=entity_data_json_default)
            )

        except Exception:
            raise ValueError("Couldn't json parse body: {}".format(
                json.dumps(new_update_data, default=failed_json_default)
            ))

        super(ServerUpdateOperation, self).__init__(
            project_name, entity_type, entity_id, new_update_data
        )

    @property
    def con(self):
        return self.session.con

    @property
    def session(self):
        return self._session

    def to_server_operation(self):
        if not self._update_data:
            return None

        update_data = {}
        for key, value in self._update_data.items():
            if value is REMOVED_VALUE:
                value = None
            update_data[key] = value

        return {
            "id": self.id,
            "type": "update",
            "entityType": self.entity_type,
            "entityId": self.entity_id,
            "data": update_data
        }


class ServerDeleteOperation(DeleteOperation):
    """Operation to delete an entity.

    Args:
        project_name (str): On which project operation will happen.
        entity_type (str): Type of entity on which change happens.
            e.g. 'asset', 'representation' etc.
        entity_id (Union[str, ObjectId]): Entity id that will be removed.
    """

    def __init__(self, project_name, entity_type, entity_id, session):
        self._session = session

        if entity_type == "asset":
            entity_type = "folder"

        elif entity_type == "hero_version":
            entity_type = "version"

        elif entity_type == "subset":
            entity_type = "product"

        super(ServerDeleteOperation, self).__init__(
            project_name, entity_type, entity_id
        )

    @property
    def con(self):
        return self.session.con

    @property
    def session(self):
        return self._session

    def to_server_operation(self):
        return {
            "id": self.id,
            "type": self.operation_name,
            "entityId": self.entity_id,
            "entityType": self.entity_type,
        }


class OperationsSession(BaseOperationsSession):
    def __init__(self, con=None, *args, **kwargs):
        super(OperationsSession, self).__init__(*args, **kwargs)
        if con is None:
            con = get_ayon_server_api_connection()
        self._con = con
        self._project_cache = {}
        self._nested_operations = collections.defaultdict(list)

    @property
    def con(self):
        return self._con

    def get_project(self, project_name):
        if project_name not in self._project_cache:
            self._project_cache[project_name] = self.con.get_project(
                project_name)
        return copy.deepcopy(self._project_cache[project_name])

    def commit(self):
        """Commit session operations."""

        operations, self._operations = self._operations, []
        if not operations:
            return

        operations_by_project = collections.defaultdict(list)
        for operation in operations:
            operations_by_project[operation.project_name].append(operation)

        body_by_id = {}
        results = []
        for project_name, operations in operations_by_project.items():
            operations_body = []
            for operation in operations:
                body = operation.to_server_operation()
                if body is not None:
                    try:
                        json.dumps(body)
                    except Exception:
                        raise ValueError(
                            "Couldn't json parse body: {}".format(
                                json.dumps(
                                    body, indent=4, default=failed_json_default
                                )
                            )
                        )

                    body_by_id[operation.id] = body
                    operations_body.append(body)

            if operations_body:
                result = self._con.post(
                    "projects/{}/operations".format(project_name),
                    operations=operations_body,
                    canFail=False
                )
                results.append(result.data)

        for result in results:
            if result.get("success"):
                continue

            if "operations" not in result:
                raise FailedOperations(
                    "Operation failed. Content: {}".format(str(result))
                )

            for op_result in result["operations"]:
                if not op_result["success"]:
                    operation_id = op_result["id"]
                    raise FailedOperations((
                        "Operation \"{}\" failed with data:\n{}\nError: {}."
                    ).format(
                        operation_id,
                        json.dumps(body_by_id[operation_id], indent=4),
                        op_result.get("error", "unknown"),
                    ))

    def create_entity(self, project_name, entity_type, data, nested_id=None):
        """Fast access to 'ServerCreateOperation'.

        Args:
            project_name (str): On which project the creation happens.
            entity_type (str): Which entity type will be created.
            data (Dict[str, Any]): Entity data.
            nested_id (str): Id of other operation which triggered this
                operation. Operations can trigger suboperations, but they
                must be added to the operations list after their parent
                is added.

        Returns:
            ServerCreateOperation: Object of create operation.
        """

        operation = ServerCreateOperation(
            project_name, entity_type, data, self
        )

        if nested_id:
            self._nested_operations[nested_id].append(operation)
        else:
            self.add(operation)
            if operation.id in self._nested_operations:
                self.extend(self._nested_operations.pop(operation.id))

        return operation

    def update_entity(
        self, project_name, entity_type, entity_id, update_data, nested_id=None
    ):
        """Fast access to 'ServerUpdateOperation'.

        Returns:
            ServerUpdateOperation: Object of update operation.
        """

        operation = ServerUpdateOperation(
            project_name, entity_type, entity_id, update_data, self
        )
        if nested_id:
            self._nested_operations[nested_id].append(operation)
        else:
            self.add(operation)
            if operation.id in self._nested_operations:
                self.extend(self._nested_operations.pop(operation.id))
        return operation

    def delete_entity(
        self, project_name, entity_type, entity_id, nested_id=None
    ):
        """Fast access to 'ServerDeleteOperation'.

        Returns:
            ServerDeleteOperation: Object of delete operation.
        """

        operation = ServerDeleteOperation(
            project_name, entity_type, entity_id, self
        )
        if nested_id:
            self._nested_operations[nested_id].append(operation)
        else:
            self.add(operation)
            if operation.id in self._nested_operations:
                self.extend(self._nested_operations.pop(operation.id))
        return operation


def create_project(
    project_name,
    project_code,
    library_project=False,
    preset_name=None,
    con=None
):
    """Create project using OpenPype settings.

    This project creation function does not validate the project document on
    creation. The project document is created blindly with only the minimum
    required information about the project, which is its name, code, type
    and schema.

    The entered project name must be unique and the project must not exist
    yet.

    Note:
        This function is here to be OP v4 ready, but in v3 it has more logic
        to do. That's why inner imports are in the body.

    Args:
        project_name (str): New project name. Should be unique.
        project_code (str): Project's code should be unique too.
        library_project (bool): Project is library project.
        preset_name (str): Name of anatomy preset. Default is used if not
            passed.
        con (ServerAPI): Connection to server with logged user.

    Raises:
        ValueError: When project name already exists in MongoDB.

    Returns:
        dict: Created project document.
    """

    if con is None:
        con = get_ayon_server_api_connection()

    return con.create_project(
        project_name,
        project_code,
        library_project,
        preset_name
    )


def delete_project(project_name, con=None):
    if con is None:
        con = get_ayon_server_api_connection()

    return con.delete_project(project_name)


def create_thumbnail(project_name, src_filepath, thumbnail_id=None, con=None):
    if con is None:
        con = get_ayon_server_api_connection()
    return con.create_thumbnail(project_name, src_filepath, thumbnail_id)

@@ -1,289 +0,0 @@
import uuid
import copy
from abc import ABCMeta, abstractmethod, abstractproperty

import six

REMOVED_VALUE = object()


@six.add_metaclass(ABCMeta)
class AbstractOperation(object):
    """Base operation class.

    An operation represents a call into the database. The call can create,
    change or remove data.

    Args:
        project_name (str): On which project operation will happen.
        entity_type (str): Type of entity on which change happens.
            e.g. 'asset', 'representation' etc.
    """

    def __init__(self, project_name, entity_type):
        self._project_name = project_name
        self._entity_type = entity_type
        self._id = str(uuid.uuid4())

    @property
    def project_name(self):
        return self._project_name

    @property
    def id(self):
        """Identifier of operation."""

        return self._id

    @property
    def entity_type(self):
        return self._entity_type

    @abstractproperty
    def operation_name(self):
        """Stringified type of operation."""

        pass

    def to_data(self):
        """Convert operation to data that can be converted to json or others.

        Warning:
            Current state returns ObjectId objects which cannot be parsed by
            json.

        Returns:
            Dict[str, Any]: Description of operation.
        """

        return {
            "id": self._id,
            "entity_type": self.entity_type,
            "project_name": self.project_name,
            "operation": self.operation_name
        }


class CreateOperation(AbstractOperation):
    """Operation to create an entity.

    Args:
        project_name (str): On which project operation will happen.
        entity_type (str): Type of entity on which change happens.
            e.g. 'asset', 'representation' etc.
        data (Dict[str, Any]): Data of entity that will be created.
    """

    operation_name = "create"

    def __init__(self, project_name, entity_type, data):
        super(CreateOperation, self).__init__(project_name, entity_type)

        if not data:
            data = {}
        else:
            data = copy.deepcopy(dict(data))
        self._data = data

    def __setitem__(self, key, value):
        self.set_value(key, value)

    def __getitem__(self, key):
        return self.data[key]

    def set_value(self, key, value):
        self.data[key] = value

    def get(self, key, *args, **kwargs):
        return self.data.get(key, *args, **kwargs)

    @abstractproperty
    def entity_id(self):
        pass

    @property
    def data(self):
        return self._data

    def to_data(self):
        output = super(CreateOperation, self).to_data()
        output["data"] = copy.deepcopy(self.data)
        return output


class UpdateOperation(AbstractOperation):
    """Operation to update an entity.

    Args:
        project_name (str): On which project operation will happen.
        entity_type (str): Type of entity on which change happens.
            e.g. 'asset', 'representation' etc.
        entity_id (Union[str, ObjectId]): Identifier of an entity.
        update_data (Dict[str, Any]): Key -> value changes that will be set in
            database. If value is set to 'REMOVED_VALUE' the key will be
            removed. Only first level of dictionary is checked (on purpose).
    """

    operation_name = "update"

    def __init__(self, project_name, entity_type, entity_id, update_data):
        super(UpdateOperation, self).__init__(project_name, entity_type)

        self._entity_id = entity_id
        self._update_data = update_data

    @property
    def entity_id(self):
        return self._entity_id

    @property
    def update_data(self):
        return self._update_data

    def to_data(self):
        changes = {}
        for key, value in self._update_data.items():
            if value is REMOVED_VALUE:
                value = None
            changes[key] = value

        output = super(UpdateOperation, self).to_data()
        output.update({
            "entity_id": self.entity_id,
            "changes": changes
        })
        return output


class DeleteOperation(AbstractOperation):
    """Operation to delete an entity.

    Args:
        project_name (str): On which project operation will happen.
        entity_type (str): Type of entity on which change happens.
            e.g. 'asset', 'representation' etc.
        entity_id (Union[str, ObjectId]): Entity id that will be removed.
    """

    operation_name = "delete"

    def __init__(self, project_name, entity_type, entity_id):
        super(DeleteOperation, self).__init__(project_name, entity_type)

        self._entity_id = entity_id

    @property
    def entity_id(self):
        return self._entity_id

    def to_data(self):
        output = super(DeleteOperation, self).to_data()
        output["entity_id"] = self.entity_id
        return output


class BaseOperationsSession(object):
    """Session storing operations that should happen in an order.

    At this moment the session does not handle anything special and can be
    considered a plain list of operations that will happen one after another.
    If creation of the same entity is there multiple times it is not handled
    in any special way and document values are not validated.
    """

    def __init__(self):
        self._operations = []

    def __len__(self):
        return len(self._operations)

    def add(self, operation):
        """Add operation to be processed.

        Args:
            operation (BaseOperation): Operation that should be processed.
        """
        if not isinstance(
            operation,
            (CreateOperation, UpdateOperation, DeleteOperation)
        ):
            raise TypeError("Expected Operation object got {}".format(
                str(type(operation))
            ))

        self._operations.append(operation)

    def append(self, operation):
        """Add operation to be processed.

        Args:
            operation (BaseOperation): Operation that should be processed.
        """

        self.add(operation)

    def extend(self, operations):
        """Add operations to be processed.

        Args:
            operations (List[BaseOperation]): Operations that should be
                processed.
        """

        for operation in operations:
            self.add(operation)

    def remove(self, operation):
        """Remove operation."""

        self._operations.remove(operation)

    def clear(self):
        """Clear all registered operations."""

        self._operations = []

    def to_data(self):
        return [
            operation.to_data()
            for operation in self._operations
        ]

    @abstractmethod
    def commit(self):
        """Commit session operations."""
        pass

    def create_entity(self, project_name, entity_type, data):
        """Fast access to 'CreateOperation'.

        Returns:
            CreateOperation: Object of create operation.
        """

        operation = CreateOperation(project_name, entity_type, data)
        self.add(operation)
        return operation

    def update_entity(self, project_name, entity_type, entity_id, update_data):
        """Fast access to 'UpdateOperation'.

        Returns:
            UpdateOperation: Object of update operation.
        """

        operation = UpdateOperation(
            project_name, entity_type, entity_id, update_data
        )
        self.add(operation)
        return operation

    def delete_entity(self, project_name, entity_type, entity_id):
        """Fast access to 'DeleteOperation'.

        Returns:
            DeleteOperation: Object of delete operation.
        """

        operation = DeleteOperation(project_name, entity_type, entity_id)
        self.add(operation)
        return operation

@@ -1,134 +0,0 @@
-import os
-import uuid
-
-import ayon_api
-
-from ayon_core.client.operations_base import REMOVED_VALUE
-
-
-class _GlobalCache:
-    initialized = False
-
-
-def get_ayon_server_api_connection():
-    if _GlobalCache.initialized:
-        con = ayon_api.get_server_api_connection()
-    else:
-        from ayon_core.lib.local_settings import get_local_site_id
-
-        _GlobalCache.initialized = True
-        site_id = get_local_site_id()
-        version = os.getenv("AYON_VERSION")
-        if ayon_api.is_connection_created():
-            con = ayon_api.get_server_api_connection()
-            con.set_site_id(site_id)
-            con.set_client_version(version)
-        else:
-            con = ayon_api.create_connection(site_id, version)
-    return con
-
-
-def create_entity_id():
-    return uuid.uuid1().hex
-
-
-def prepare_attribute_changes(old_entity, new_entity, replace=False):
-    """Prepare changes of attributes on entities.
-
-    Compare 'attrib' of old and new entity data to prepare only changed
-    values that should be sent to server for update.
-
-    Example:
-        >>> # Limited entity data to 'attrib'
-        >>> old_entity = {
-        ...     "attrib": {"attr_1": 1, "attr_2": "MyString", "attr_3": True}
-        ... }
-        >>> new_entity = {
-        ...     "attrib": {"attr_1": 2, "attr_3": True, "attr_4": 3}
-        ... }
-        >>> # Changes if replacement should not happen
-        >>> expected_changes = {
-        ...     "attr_1": 2,
-        ...     "attr_4": 3
-        ... }
-        >>> changes = prepare_attribute_changes(old_entity, new_entity)
-        >>> changes == expected_changes
-        True
-
-        >>> # Changes if replacement should happen
-        >>> expected_changes_replace = {
-        ...     "attr_1": 2,
-        ...     "attr_2": REMOVED_VALUE,
-        ...     "attr_4": 3
-        ... }
-        >>> changes_replace = prepare_attribute_changes(
-        ...     old_entity, new_entity, True)
-        >>> changes_replace == expected_changes_replace
-        True
-
-    Args:
-        old_entity (dict[str, Any]): Data of entity queried from server.
-        new_entity (dict[str, Any]): Entity data with applied changes.
-        replace (bool): New entity should fully replace all old entity values.
-
-    Returns:
-        Dict[str, Any]: Values from new entity only if value has changed.
-    """
-
-    attrib_changes = {}
-    new_attrib = new_entity.get("attrib")
-    old_attrib = old_entity.get("attrib")
-    if new_attrib is None:
-        if not replace:
-            return attrib_changes
-        new_attrib = {}
-
-    if old_attrib is None:
-        return new_attrib
-
-    for attr, new_attr_value in new_attrib.items():
-        old_attr_value = old_attrib.get(attr)
-        if old_attr_value != new_attr_value:
-            attrib_changes[attr] = new_attr_value
-
-    if replace:
-        for attr in old_attrib:
-            if attr not in new_attrib:
-                attrib_changes[attr] = REMOVED_VALUE
-
-    return attrib_changes
-
-
-def prepare_entity_changes(old_entity, new_entity, replace=False):
-    """Prepare changes of AYON entities.
-
-    Compare old and new entity to filter values from new data that changed.
-
-    Args:
-        old_entity (dict[str, Any]): Data of entity queried from server.
-        new_entity (dict[str, Any]): Entity data with applied changes.
-        replace (bool): All attributes should be replaced by new values. So
-            all attribute values that are not on new entity will be removed.
-
-    Returns:
-        Dict[str, Any]: Only values from new entity that changed.
-    """
-
-    changes = {}
-    for key, new_value in new_entity.items():
-        if key == "attrib":
-            continue
-
-        old_value = old_entity.get(key)
-        if old_value != new_value:
-            changes[key] = new_value
-
-    if replace:
-        for key in old_entity:
-            if key not in new_entity:
-                changes[key] = REMOVED_VALUE
-
-    attr_changes = prepare_attribute_changes(old_entity, new_entity, replace)
-    if attr_changes:
-        changes["attrib"] = attr_changes
-    return changes
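A quick usage sketch of the removed helpers (hypothetical values, mirroring the doctest above): prepare_entity_changes diffs top-level keys and delegates the "attrib" key to prepare_attribute_changes.

# Minimal sketch, assuming the helpers above are importable.
old_entity = {"label": "Shot 10", "attrib": {"fps": 24}}
new_entity = {"label": "Shot 010", "attrib": {"fps": 25}}

changes = prepare_entity_changes(old_entity, new_entity)
# Only changed values survive the diff:
# {"label": "Shot 010", "attrib": {"fps": 25}}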
@@ -1,6 +1,6 @@
 import os
 
-from ayon_core.lib.applications import PreLaunchHook, LaunchTypes
+from ayon_applications import PreLaunchHook, LaunchTypes
 
 
 class AddLastWorkfileToLaunchArgs(PreLaunchHook):

@@ -1,7 +1,7 @@
 import os
 import shutil
 from ayon_core.settings import get_project_settings
-from ayon_core.lib.applications import PreLaunchHook, LaunchTypes
+from ayon_applications import PreLaunchHook, LaunchTypes
 from ayon_core.pipeline.workfile import (
     get_custom_workfile_template,
     get_custom_workfile_template_by_string_context
@@ -54,21 +54,22 @@ class CopyTemplateWorkfile(PreLaunchHook):
             self.log.info("Last workfile does not exist.")
 
         project_name = self.data["project_name"]
-        asset_name = self.data["folder_path"]
+        folder_path = self.data["folder_path"]
         task_name = self.data["task_name"]
         host_name = self.application.host_name
 
         project_settings = get_project_settings(project_name)
 
-        project_doc = self.data.get("project_doc")
-        asset_doc = self.data.get("asset_doc")
+        project_entity = self.data.get("project_entity")
+        folder_entity = self.data.get("folder_entity")
+        task_entity = self.data.get("task_entity")
         anatomy = self.data.get("anatomy")
-        if project_doc and asset_doc:
+        if project_entity and folder_entity and task_entity:
             self.log.debug("Started filtering of custom template paths.")
             template_path = get_custom_workfile_template(
-                project_doc,
-                asset_doc,
-                task_name,
+                project_entity,
+                folder_entity,
+                task_entity,
                 host_name,
                 anatomy,
                 project_settings

@@ -81,7 +82,7 @@ class CopyTemplateWorkfile(PreLaunchHook):
             ))
             template_path = get_custom_workfile_template_by_string_context(
                 project_name,
-                asset_name,
+                folder_path,
                 task_name,
                 host_name,
                 anatomy,
@@ -1,5 +1,5 @@
 import os
-from ayon_core.lib.applications import PreLaunchHook, LaunchTypes
+from ayon_applications import PreLaunchHook, LaunchTypes
 from ayon_core.pipeline.workfile import create_workdir_extra_folders
@@ -1,6 +1,7 @@
-from ayon_core.client import get_project, get_asset_by_name
-from ayon_core.lib.applications import (
-    PreLaunchHook,
+from ayon_api import get_project, get_folder_by_path, get_task_by_name
+
+from ayon_applications import PreLaunchHook
+from ayon_applications.utils import (
     EnvironmentPrepData,
     prepare_app_environments,
     prepare_context_environments

@@ -16,7 +17,7 @@ class GlobalHostDataHook(PreLaunchHook):
         """Prepare global objects to `data` that will be used for sure."""
         self.prepare_global_data()
 
-        if not self.data.get("asset_doc"):
+        if not self.data.get("folder_entity"):
             return
 
         app = self.launch_context.application

@@ -27,8 +28,9 @@ class GlobalHostDataHook(PreLaunchHook):
             "app": app,
 
-            "project_doc": self.data["project_doc"],
-            "asset_doc": self.data["asset_doc"],
+            "project_entity": self.data["project_entity"],
+            "folder_entity": self.data["folder_entity"],
+            "task_entity": self.data["task_entity"],
 
             "anatomy": self.data["anatomy"],
|
@ -59,19 +61,37 @@ class GlobalHostDataHook(PreLaunchHook):
|
|||
return
|
||||
|
||||
self.log.debug("Project name is set to \"{}\"".format(project_name))
|
||||
|
||||
# Project Entity
|
||||
project_entity = get_project(project_name)
|
||||
self.data["project_entity"] = project_entity
|
||||
|
||||
# Anatomy
|
||||
self.data["anatomy"] = Anatomy(project_name)
|
||||
self.data["anatomy"] = Anatomy(
|
||||
project_name, project_entity=project_entity
|
||||
)
|
||||
|
||||
# Project document
|
||||
project_doc = get_project(project_name)
|
||||
self.data["project_doc"] = project_doc
|
||||
|
||||
asset_name = self.data.get("folder_path")
|
||||
if not asset_name:
|
||||
folder_path = self.data.get("folder_path")
|
||||
if not folder_path:
|
||||
self.log.warning(
|
||||
"Asset name was not set. Skipping asset document query."
|
||||
"Folder path is not set. Skipping folder query."
|
||||
)
|
||||
return
|
||||
|
||||
asset_doc = get_asset_by_name(project_name, asset_name)
|
||||
self.data["asset_doc"] = asset_doc
|
||||
folder_entity = get_folder_by_path(project_name, folder_path)
|
||||
self.data["folder_entity"] = folder_entity
|
||||
|
||||
task_name = self.data.get("task_name")
|
||||
if not task_name:
|
||||
self.log.warning(
|
||||
"Task name is not set. Skipping task query."
|
||||
)
|
||||
return
|
||||
|
||||
if not folder_entity:
|
||||
return
|
||||
|
||||
task_entity = get_task_by_name(
|
||||
project_name, folder_entity["id"], task_name
|
||||
)
|
||||
self.data["task_entity"] = task_entity
|
||||
|
|
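The migrated hook above follows the entity-query chain that repeats throughout this commit: project by name, folder by path, then task by name under the folder id. A minimal sketch of that chain in isolation (project and context names are hypothetical):

import ayon_api

project_entity = ayon_api.get_project("my_project")
folder_entity = ayon_api.get_folder_by_path("my_project", "/shots/sh010")
task_entity = ayon_api.get_task_by_name(
    "my_project", folder_entity["id"], "compositing"
)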
@@ -1,5 +1,5 @@
 import os
-from ayon_core.lib.applications import PreLaunchHook, LaunchTypes
+from ayon_applications import PreLaunchHook, LaunchTypes
 
 
 class LaunchWithTerminal(PreLaunchHook):

@@ -1,5 +1,5 @@
 import subprocess
-from ayon_core.lib.applications import PreLaunchHook, LaunchTypes
+from ayon_applications import PreLaunchHook, LaunchTypes
 
 
 class LaunchNewConsoleApps(PreLaunchHook):
@@ -1,58 +0,0 @@
-import os
-
-from ayon_core.lib import get_ayon_launcher_args
-from ayon_core.lib.applications import (
-    get_non_python_host_kwargs,
-    PreLaunchHook,
-    LaunchTypes,
-)
-
-from ayon_core import AYON_CORE_ROOT
-
-
-class NonPythonHostHook(PreLaunchHook):
-    """Launch arguments preparation.
-
-    Non python host implementation do not launch host directly but use
-    python script which launch the host. For these cases it is necessary to
-    prepend python (or ayon) executable and script path before application's.
-    """
-    app_groups = {"harmony", "photoshop", "aftereffects"}
-
-    order = 20
-    launch_types = {LaunchTypes.local}
-
-    def execute(self):
-        # Pop executable
-        executable_path = self.launch_context.launch_args.pop(0)
-
-        # Pop rest of launch arguments - There should not be other arguments!
-        remainders = []
-        while self.launch_context.launch_args:
-            remainders.append(self.launch_context.launch_args.pop(0))
-
-        script_path = os.path.join(
-            AYON_CORE_ROOT,
-            "scripts",
-            "non_python_host_launch.py"
-        )
-
-        new_launch_args = get_ayon_launcher_args(
-            "run", script_path, executable_path
-        )
-        # Add workfile path if exists
-        workfile_path = self.data["last_workfile_path"]
-        if (
-            self.data.get("start_last_workfile")
-            and workfile_path
-            and os.path.exists(workfile_path)):
-            new_launch_args.append(workfile_path)
-
-        # Append as whole list as these arguments should not be separated
-        self.launch_context.launch_args.append(new_launch_args)
-
-        if remainders:
-            self.launch_context.launch_args.extend(remainders)
-
-        self.launch_context.kwargs = \
-            get_non_python_host_kwargs(self.launch_context.kwargs)
@@ -1,4 +1,4 @@
-from ayon_core.lib.applications import PreLaunchHook
+from ayon_applications import PreLaunchHook
 
 from ayon_core.pipeline.colorspace import get_imageio_config
 from ayon_core.pipeline.template_data import get_template_data_with_names

@@ -28,7 +28,7 @@ class OCIOEnvHook(PreLaunchHook):
 
         template_data = get_template_data_with_names(
             project_name=self.data["project_name"],
-            asset_name=self.data["folder_path"],
+            folder_path=self.data["folder_path"],
             task_name=self.data["task_name"],
             host_name=self.host_name,
             settings=self.data["project_settings"]
@@ -36,23 +36,23 @@ class HostDirmap(object):
         host_name,
         project_name,
         project_settings=None,
-        sync_module=None
+        sitesync_addon=None
     ):
         self.host_name = host_name
         self.project_name = project_name
         self._project_settings = project_settings
-        self._sync_module = sync_module
+        self._sitesync_addon = sitesync_addon
         # to limit reinit of Modules
-        self._sync_module_discovered = sync_module is not None
+        self._sitesync_addon_discovered = sitesync_addon is not None
         self._log = None
 
     @property
-    def sync_module(self):
-        if not self._sync_module_discovered:
-            self._sync_module_discovered = True
+    def sitesync_addon(self):
+        if not self._sitesync_addon_discovered:
+            self._sitesync_addon_discovered = True
             manager = AddonsManager()
-            self._sync_module = manager.get("sync_server")
-        return self._sync_module
+            self._sitesync_addon = manager.get("sitesync")
+        return self._sitesync_addon
 
     @property
     def project_settings(self):
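The renamed property keeps the lazy-discovery pattern: the addon is looked up through AddonsManager only on first access and the result is cached, even when the lookup returns None. A sketch of the same pattern in isolation, assuming ayon_core's AddonsManager is importable:

from ayon_core.addon import AddonsManager

class Example:
    _sitesync_addon = None
    _discovered = False

    @property
    def sitesync_addon(self):
        # Discover once; "sitesync" replaces the old "sync_server" name.
        if not self._discovered:
            self._discovered = True
            self._sitesync_addon = AddonsManager().get("sitesync")
        return self._sitesync_addon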
@@ -158,25 +158,25 @@ class HostDirmap(object):
         """
         project_name = self.project_name
 
-        sync_module = self.sync_module
+        sitesync_addon = self.sitesync_addon
         mapping = {}
         if (
-            sync_module is None
-            or not sync_module.enabled
-            or project_name not in sync_module.get_enabled_projects()
+            sitesync_addon is None
+            or not sitesync_addon.enabled
+            or project_name not in sitesync_addon.get_enabled_projects()
        ):
             return mapping
 
-        active_site = sync_module.get_local_normalized_site(
-            sync_module.get_active_site(project_name))
-        remote_site = sync_module.get_local_normalized_site(
-            sync_module.get_remote_site(project_name))
+        active_site = sitesync_addon.get_local_normalized_site(
+            sitesync_addon.get_active_site(project_name))
+        remote_site = sitesync_addon.get_local_normalized_site(
+            sitesync_addon.get_remote_site(project_name))
         self.log.debug(
             "active {} - remote {}".format(active_site, remote_site)
         )
 
         if active_site == "local" and active_site != remote_site:
-            sync_settings = sync_module.get_sync_project_setting(
+            sync_settings = sitesync_addon.get_sync_project_setting(
                 project_name,
                 exclude_locals=False,
                 cached=False)

@@ -194,7 +194,7 @@ class HostDirmap(object):
             self.log.debug("remote overrides {}".format(remote_overrides))
 
         current_platform = platform.system().lower()
-        remote_provider = sync_module.get_provider_for_site(
+        remote_provider = sitesync_addon.get_provider_for_site(
             project_name, remote_site
         )
         # dirmap has sense only with regular disk provider, in the workfile
@@ -18,7 +18,7 @@ class HostBase(object):
     Compared to 'avalon' concept:
     What was before considered as functions in host implementation folder. The
     host implementation should primarily care about adding ability of creation
-    (mark subsets to be published) and optionally about referencing published
+    (mark products to be published) and optionally about referencing published
     representations as containers.
 
     Host may need extend some functionality like working with workfiles

@@ -108,7 +108,7 @@ class HostBase(object):
 
         return os.environ.get("AYON_PROJECT_NAME")
 
-    def get_current_asset_name(self):
+    def get_current_folder_path(self):
         """
         Returns:
             Union[str, None]: Current asset name.

@@ -139,7 +139,7 @@ class HostBase(object):
 
         return {
             "project_name": self.get_current_project_name(),
-            "folder_path": self.get_current_asset_name(),
+            "folder_path": self.get_current_folder_path(),
             "task_name": self.get_current_task_name()
         }
 

@@ -161,13 +161,13 @@ class HostBase(object):
         # Use current context to fill the context title
         current_context = self.get_current_context()
         project_name = current_context["project_name"]
-        asset_name = current_context["folder_path"]
+        folder_path = current_context["folder_path"]
         task_name = current_context["task_name"]
         items = []
         if project_name:
             items.append(project_name)
-        if asset_name:
-            items.append(asset_name.lstrip("/"))
+        if folder_path:
+            items.append(folder_path.lstrip("/"))
         if task_name:
             items.append(task_name)
         if items:
@@ -1,6 +1,12 @@
-from .addon import AfterEffectsAddon
+from .addon import (
+    AFTEREFFECTS_ADDON_ROOT,
+    AfterEffectsAddon,
+    get_launch_script_path,
+)
 
 
 __all__ = (
+    "AFTEREFFECTS_ADDON_ROOT",
     "AfterEffectsAddon",
+    "get_launch_script_path",
 )

@@ -1,5 +1,9 @@
+import os
+
 from ayon_core.addon import AYONAddon, IHostAddon
 
+AFTEREFFECTS_ADDON_ROOT = os.path.dirname(os.path.abspath(__file__))
+
 
 class AfterEffectsAddon(AYONAddon, IHostAddon):
     name = "aftereffects"

@@ -17,3 +21,16 @@ class AfterEffectsAddon(AYONAddon, IHostAddon):
 
     def get_workfile_extensions(self):
         return [".aep"]
+
+    def get_launch_hook_paths(self, app):
+        if app.host_name != self.host_name:
+            return []
+        return [
+            os.path.join(AFTEREFFECTS_ADDON_ROOT, "hooks")
+        ]
+
+
+def get_launch_script_path():
+    return os.path.join(
+        AFTEREFFECTS_ADDON_ROOT, "api", "launch_script.py"
+    )
@@ -17,7 +17,7 @@ from .pipeline import (
 from .lib import (
     maintained_selection,
     get_extension_manifest_path,
-    get_asset_settings,
+    get_folder_settings,
     set_settings
 )
 

@@ -31,13 +31,14 @@ __all__ = [
     "get_stub",
 
     # pipeline
     "AfterEffectsHost",
     "ls",
     "containerise",
 
     # lib
     "maintained_selection",
     "get_extension_manifest_path",
-    "get_asset_settings",
+    "get_folder_settings",
     "set_settings",
 
     # plugin
@@ -7,7 +7,6 @@ import asyncio
 import functools
 import traceback
-
 
 from wsrpc_aiohttp import (
     WebSocketRoute,
     WebSocketAsync

@@ -286,20 +285,21 @@ class AfterEffectsRoute(WebSocketRoute):
 
     # This method calls function on the client side
     # client functions
-    async def set_context(self, project, asset, task):
+    async def set_context(self, project, folder, task):
         """
-            Sets 'project' and 'asset' to envs, eg. setting context
+            Sets 'project', 'folder' and 'task' to envs, eg. setting context
 
             Args:
                 project (str)
-                asset (str)
+                folder (str)
+                task (str)
         """
         log.info("Setting context change")
-        log.info("project {} asset {} ".format(project, asset))
+        log.info("project {} folder {} ".format(project, folder))
         if project:
             os.environ["AYON_PROJECT_NAME"] = project
-        if asset:
-            os.environ["AYON_FOLDER_PATH"] = asset
+        if folder:
+            os.environ["AYON_FOLDER_PATH"] = folder
         if task:
             os.environ["AYON_TASK_NAME"] = task
 
@@ -1,4 +1,4 @@
-"""Script wraps launch mechanism of non python host implementations.
+"""Script wraps launch mechanism of AfterEffects implementations.
 
 Arguments passed to the script are passed to launch function in host
 implementation. In all cases requires host app executable and may contain

@@ -8,6 +8,8 @@ workfile or others.
 import os
 import sys
 
+from ayon_core.hosts.aftereffects.api.launch_logic import main as host_main
+
 # Get current file to locate start point of sys.argv
 CURRENT_FILE = os.path.abspath(__file__)
 
@@ -79,26 +81,9 @@ def main(argv):
     if after_script_idx is not None:
         launch_args = sys_args[after_script_idx:]
 
-    host_name = os.environ["AYON_HOST_NAME"].lower()
-    if host_name == "photoshop":
-        # TODO refactor launch logic according to AE
-        from ayon_core.hosts.photoshop.api.lib import main
-    elif host_name == "aftereffects":
-        from ayon_core.hosts.aftereffects.api.launch_logic import main
-    elif host_name == "harmony":
-        from ayon_core.hosts.harmony.api.lib import main
-    else:
-        title = "Unknown host name"
-        message = (
-            "BUG: Environment variable AYON_HOST_NAME contains unknown"
-            " host name \"{}\""
-        ).format(host_name)
-        show_error_messagebox(title, message)
-        return
-
     if launch_args:
         # Launch host implementation
-        main(*launch_args)
+        host_main(*launch_args)
     else:
         # Show message box
         on_invalid_args(after_script_idx is None)
@@ -4,8 +4,10 @@ import json
 import contextlib
 import logging
 
+import ayon_api
+
 from ayon_core.pipeline.context_tools import get_current_context
-from ayon_core.client import get_asset_by_name
 
 from .ws_stub import get_stub
 
 log = logging.getLogger(__name__)

@@ -85,21 +87,21 @@ def get_background_layers(file_url):
     return layers
 
 
-def get_asset_settings(asset_doc):
-    """Get settings on current asset from database.
+def get_folder_settings(folder_entity):
+    """Get settings of current folder.
 
     Returns:
         dict: Scene data.
 
     """
-    asset_data = asset_doc["data"]
-    fps = asset_data.get("fps", 0)
-    frame_start = asset_data.get("frameStart", 0)
-    frame_end = asset_data.get("frameEnd", 0)
-    handle_start = asset_data.get("handleStart", 0)
-    handle_end = asset_data.get("handleEnd", 0)
-    resolution_width = asset_data.get("resolutionWidth", 0)
-    resolution_height = asset_data.get("resolutionHeight", 0)
+    folder_attributes = folder_entity["attrib"]
+    fps = folder_attributes.get("fps", 0)
+    frame_start = folder_attributes.get("frameStart", 0)
+    frame_end = folder_attributes.get("frameEnd", 0)
+    handle_start = folder_attributes.get("handleStart", 0)
+    handle_end = folder_attributes.get("handleEnd", 0)
+    resolution_width = folder_attributes.get("resolutionWidth", 0)
+    resolution_height = folder_attributes.get("resolutionHeight", 0)
     duration = (frame_end - frame_start + 1) + handle_start + handle_end
 
     return {
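The duration formula is inclusive on both frame ends and pads with handles. A worked example with hypothetical folder attributes:

# frameStart=1001, frameEnd=1050, handleStart=10, handleEnd=10
duration = (1050 - 1001 + 1) + 10 + 10
# 50 frames of content plus 20 handle frames -> 70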
@@ -127,9 +129,11 @@ def set_settings(frames, resolution, comp_ids=None, print_msg=True):
     frame_start = frames_duration = fps = width = height = None
     current_context = get_current_context()
 
-    asset_doc = get_asset_by_name(current_context["project_name"],
-                                  current_context["folder_path"])
-    settings = get_asset_settings(asset_doc)
+    folder_entity = ayon_api.get_folder_by_path(
+        current_context["project_name"],
+        current_context["folder_path"]
+    )
+    settings = get_folder_settings(folder_entity)
 
     msg = ''
     if frames:
@@ -271,7 +271,7 @@ def containerise(name,
         "name": name,
         "namespace": namespace,
         "loader": str(loader),
-        "representation": str(context["representation"]["_id"]),
+        "representation": context["representation"]["id"],
         "members": comp.members or [comp.id]
     }
 
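This is the container-metadata change applied across hosts in this commit: the MongoDB ObjectId under "_id" is replaced by the AYON entity id, which is already a plain string under "id", so the str() wrapper is dropped. A sketch of the resulting container data (values are hypothetical):

container_data = {
    "schema": "openpype:container-2.0",
    "name": "renderMain",
    "loader": "FileLoader",
    # AYON server returns ids as hex strings already, no str() needed:
    "representation": "0f1e2d3c4b5a69788796a5b4c3d2e1f0",
}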
88
client/ayon_core/hosts/aftereffects/hooks/pre_launch_args.py
Normal file

@@ -0,0 +1,88 @@
+import os
+import platform
+import subprocess
+
+from ayon_core.lib import (
+    get_ayon_launcher_args,
+    is_using_ayon_console,
+)
+from ayon_applications import PreLaunchHook, LaunchTypes
+from ayon_core.hosts.aftereffects import get_launch_script_path
+
+
+def get_launch_kwargs(kwargs):
+    """Explicit setting of kwargs for Popen for AfterEffects.
+
+    Expected behavior
+    - ayon_console opens window with logs
+    - ayon has stdout/stderr available for capturing
+
+    Args:
+        kwargs (Union[dict, None]): Current kwargs or None.
+
+    """
+    if kwargs is None:
+        kwargs = {}
+
+    if platform.system().lower() != "windows":
+        return kwargs
+
+    if is_using_ayon_console():
+        kwargs.update({
+            "creationflags": subprocess.CREATE_NEW_CONSOLE
+        })
+    else:
+        kwargs.update({
+            "creationflags": subprocess.CREATE_NO_WINDOW,
+            "stdout": subprocess.DEVNULL,
+            "stderr": subprocess.DEVNULL
+        })
+    return kwargs
+
+
+class AEPrelaunchHook(PreLaunchHook):
+    """Launch arguments preparation.
+
+    Hook add python executable and script path to AE implementation before
+    AE executable and add last workfile path to launch arguments.
+
+    Existence of last workfile is checked. If workfile does not exists tries
+    to copy templated workfile from predefined path.
+    """
+    app_groups = {"aftereffects"}
+
+    order = 20
+    launch_types = {LaunchTypes.local}
+
+    def execute(self):
+        # Pop executable
+        executable_path = self.launch_context.launch_args.pop(0)
+
+        # Pop rest of launch arguments - There should not be other arguments!
+        remainders = []
+        while self.launch_context.launch_args:
+            remainders.append(self.launch_context.launch_args.pop(0))
+
+        script_path = get_launch_script_path()
+
+        new_launch_args = get_ayon_launcher_args(
+            "run", script_path, executable_path
+        )
+        # Add workfile path if exists
+        workfile_path = self.data["last_workfile_path"]
+        if (
+            self.data.get("start_last_workfile")
+            and workfile_path
+            and os.path.exists(workfile_path)
+        ):
+            new_launch_args.append(workfile_path)
+
+        # Append as whole list as these arguments should not be separated
+        self.launch_context.launch_args.append(new_launch_args)
+
+        if remainders:
+            self.launch_context.launch_args.extend(remainders)
+
+        self.launch_context.kwargs = get_launch_kwargs(
+            self.launch_context.kwargs
+        )
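The new hook mirrors the removed generic NonPythonHostHook, scoped to AfterEffects and with the local get_launch_kwargs above replacing get_non_python_host_kwargs. A sketch of what the Windows branch produces (flag values come from the standard subprocess module; which branch runs depends on is_using_ayon_console()):

import subprocess

kwargs = get_launch_kwargs(None)
# With the AYON console executable: a new console window for logs.
# -> {"creationflags": subprocess.CREATE_NEW_CONSOLE}
# Without it: no window, output silenced.
# -> {"creationflags": subprocess.CREATE_NO_WINDOW,
#     "stdout": subprocess.DEVNULL, "stderr": subprocess.DEVNULL}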
@@ -218,7 +218,13 @@ class RenderCreator(Creator):
         """
 
     def get_dynamic_data(
-        self, project_name, asset_doc, task_name, variant, host_name, instance
+        self,
+        project_name,
+        folder_entity,
+        task_entity,
+        variant,
+        host_name,
+        instance
     ):
         dynamic_data = {}
         if instance is not None:

@@ -1,5 +1,6 @@
+import ayon_api
+
 import ayon_core.hosts.aftereffects.api as api
-from ayon_core.client import get_asset_by_name
 from ayon_core.pipeline import (
     AutoCreator,
     CreatedInstance
@@ -39,32 +40,37 @@ class AEWorkfileCreator(AutoCreator):
 
         context = self.create_context
         project_name = context.get_current_project_name()
-        asset_name = context.get_current_asset_name()
+        folder_path = context.get_current_folder_path()
         task_name = context.get_current_task_name()
         host_name = context.host_name
 
-        existing_asset_name = None
+        existing_folder_path = None
         if existing_instance is not None:
-            existing_asset_name = existing_instance.get("folderPath")
+            existing_folder_path = existing_instance.get("folderPath")
 
         if existing_instance is None:
-            asset_doc = get_asset_by_name(project_name, asset_name)
+            folder_entity = ayon_api.get_folder_by_path(
+                project_name, folder_path
+            )
+            task_entity = ayon_api.get_task_by_name(
+                project_name, folder_entity["id"], task_name
+            )
             product_name = self.get_product_name(
                 project_name,
-                asset_doc,
-                task_name,
+                folder_entity,
+                task_entity,
                 self.default_variant,
                 host_name,
             )
             data = {
-                "folderPath": asset_name,
+                "folderPath": folder_path,
                 "task": task_name,
                 "variant": self.default_variant,
             }
             data.update(self.get_dynamic_data(
                 project_name,
-                asset_doc,
-                task_name,
+                folder_entity,
+                task_entity,
                 self.default_variant,
                 host_name,
                 None,
@@ -79,17 +85,22 @@ class AEWorkfileCreator(AutoCreator):
                 new_instance.data_to_store())
 
         elif (
-            existing_asset_name != asset_name
+            existing_folder_path != folder_path
             or existing_instance["task"] != task_name
         ):
-            asset_doc = get_asset_by_name(project_name, asset_name)
+            folder_entity = ayon_api.get_folder_by_path(
+                project_name, folder_path
+            )
+            task_entity = ayon_api.get_task_by_name(
+                project_name, folder_entity["id"], task_name
+            )
             product_name = self.get_product_name(
                 project_name,
-                asset_doc,
-                task_name,
+                folder_entity,
+                task_entity,
                 self.default_variant,
                 host_name,
             )
-            existing_instance["folderPath"] = asset_name
+            existing_instance["folderPath"] = folder_path
             existing_instance["task"] = task_name
             existing_instance["productName"] = product_name
@@ -20,7 +20,7 @@ class BackgroundLoader(api.AfterEffectsLoader):
         metadata
     """
     label = "Load JSON Background"
-    families = ["background"]
+    product_types = {"background"}
     representations = ["json"]
 
     def load(self, context, name=None, namespace=None, data=None):

@@ -31,7 +31,7 @@ class BackgroundLoader(api.AfterEffectsLoader):
 
         comp_name = get_unique_layer_name(
             existing_items,
-            "{}_{}".format(context["asset"]["name"], name))
+            "{}_{}".format(context["folder"]["name"], name))
 
         path = self.filepath_from_context(context)
         layers = get_background_layers(path)
@@ -59,12 +59,10 @@ class BackgroundLoader(api.AfterEffectsLoader):
     def update(self, container, context):
         """ Switch asset or change version """
         stub = self.get_stub()
-        asset_doc = context["asset"]
-        subset_doc = context["subset"]
-        repre_doc = context["representation"]
+        folder_name = context["folder"]["name"]
+        product_name = context["product"]["name"]
+        repre_entity = context["representation"]
 
-        folder_name = asset_doc["name"]
-        product_name = subset_doc["name"]
         _ = container.pop("layer")
 
         # without iterator number (_001, 002...)

@@ -82,7 +80,7 @@ class BackgroundLoader(api.AfterEffectsLoader):
         else:  # switching version - keep same name
             comp_name = container["namespace"]
 
-        path = get_representation_path(repre_doc)
+        path = get_representation_path(repre_entity)
 
         layers = get_background_layers(path)
         comp = stub.reload_background(container["members"][1],

@@ -90,7 +88,7 @@ class BackgroundLoader(api.AfterEffectsLoader):
                                       layers)
 
         # update container
-        container["representation"] = str(repre_doc["_id"])
+        container["representation"] = repre_entity["id"]
         container["name"] = product_name
         container["namespace"] = comp_name
         container["members"] = comp.members
@@ -12,12 +12,14 @@ class FileLoader(api.AfterEffectsLoader):
     """
     label = "Load file"
 
-    families = ["image",
-                "plate",
-                "render",
-                "prerender",
-                "review",
-                "audio"]
+    product_types = {
+        "image",
+        "plate",
+        "render",
+        "prerender",
+        "review",
+        "audio",
+    }
     representations = ["*"]
 
     def load(self, context, name=None, namespace=None, data=None):

@@ -25,7 +27,10 @@ class FileLoader(api.AfterEffectsLoader):
         layers = stub.get_items(comps=True, folders=True, footages=True)
         existing_layers = [layer.name for layer in layers]
         comp_name = get_unique_layer_name(
-            existing_layers, "{}_{}".format(context["asset"]["name"], name))
+            existing_layers, "{}_{}".format(
+                context["folder"]["name"], name
+            )
+        )
 
         import_options = {}
 

@@ -35,7 +40,7 @@ class FileLoader(api.AfterEffectsLoader):
             import_options['sequence'] = True
 
         if not path:
-            repr_id = context["representation"]["_id"]
+            repr_id = context["representation"]["id"]
             self.log.warning(
                 "Representation id `{}` is failing to load".format(repr_id))
             return
@@ -69,12 +74,9 @@ class FileLoader(api.AfterEffectsLoader):
         stub = self.get_stub()
         layer = container.pop("layer")
 
-        asset_doc = context["asset"]
-        subset_doc = context["subset"]
-        repre_doc = context["representation"]
-
-        folder_name = asset_doc["name"]
-        product_name = subset_doc["name"]
+        folder_name = context["folder"]["name"]
+        product_name = context["product"]["name"]
+        repre_entity = context["representation"]
 
         namespace_from_container = re.sub(r'_\d{3}$', '',
                                           container["namespace"])

@@ -88,11 +90,11 @@ class FileLoader(api.AfterEffectsLoader):
                 "{}_{}".format(folder_name, product_name))
         else:  # switching version - keep same name
             layer_name = container["namespace"]
-        path = get_representation_path(repre_doc)
+        path = get_representation_path(repre_entity)
         # with aftereffects.maintained_selection():  # TODO
         stub.replace_item(layer.id, path, stub.LOADED_ICON + layer_name)
         stub.imprint(
-            layer.id, {"representation": str(repre_doc["_id"]),
+            layer.id, {"representation": repre_entity["id"],
                        "name": product_name,
                        "namespace": layer_name}
         )
@@ -1,14 +1,11 @@
 import os
 import re
 import tempfile
-import attr
 
+import attr
 import pyblish.api
 
-from ayon_core.settings import get_project_settings
 from ayon_core.pipeline import publish
 from ayon_core.pipeline.publish import RenderInstance
 
 from ayon_core.hosts.aftereffects.api import get_stub
@@ -1,7 +1,7 @@
 <?xml version="1.0" encoding="UTF-8"?>
 <root>
     <error id="main">
-        <title>Subset context</title>
+        <title>Product context</title>
         <description>
         ## Invalid product context
 

@@ -15,7 +15,7 @@ You can fix this with "repair" button on the right and refresh Publish at the bottom.
         ### __Detailed Info__ (optional)
 
         This might happen if you reuse an old workfile and open it in a different context.
-        (Eg. you created product name "renderCompositingDefault" from folder "Robot" in "your_project_Robot_compositing.aep", now you opened this workfile in a context "Sloth" but existing product for "Robot" asset stayed in the workfile.)
+        (Eg. you created product name "renderCompositingDefault" from folder "Robot" in "your_project_Robot_compositing.aep", now you opened this workfile in a context "Sloth" but existing product for "Robot" folder stayed in the workfile.)
         </detail>
     </error>
 </root>
@@ -5,20 +5,20 @@
         <description>
         ## Invalid scene setting found
 
-        One of the settings in a scene doesn't match to asset settings in database.
+        One of the settings in a scene doesn't match to folder settings in database.
 
         {invalid_setting_str}
 
         ### How to repair?
 
-        Change values for {invalid_keys_str} in the scene OR change them in the asset database if they are wrong there.
+        Change values for {invalid_keys_str} in the scene OR change them in the folder database if they are wrong there.
 
         In the scene it is right mouse click on published composition > `Composition Settings`.
         </description>
         <detail>
         ### __Detailed Info__ (optional)
 
-        This error is shown when for example resolution in the scene doesn't match to resolution set on the asset in the database.
+        This error is shown when for example resolution in the scene doesn't match to resolution set on the folder in the database.
         Either value in the database or in the scene is wrong.
         </detail>
     </error>
@@ -1,6 +1,6 @@
 import pyblish.api
 
-from ayon_core.pipeline import get_current_asset_name
+from ayon_core.pipeline import get_current_folder_path
 from ayon_core.pipeline.publish import (
     ValidateContentsOrder,
     PublishXmlValidationError,

@@ -8,8 +8,8 @@ from ayon_core.pipeline.publish import (
 from ayon_core.hosts.aftereffects.api import get_stub
 
 
-class ValidateInstanceAssetRepair(pyblish.api.Action):
-    """Repair the instance asset with value from Context."""
+class ValidateInstanceFolderRepair(pyblish.api.Action):
+    """Repair the instance folder with value from Context."""
 
     label = "Repair"
     icon = "wrench"
@@ -30,35 +30,35 @@ class ValidateInstanceFolderRepair(pyblish.api.Action):
         for instance in instances:
             data = stub.read(instance[0])
 
-            data["folderPath"] = get_current_asset_name()
+            data["folderPath"] = get_current_folder_path()
             stub.imprint(instance[0].instance_id, data)
 
 
-class ValidateInstanceAsset(pyblish.api.InstancePlugin):
-    """Validate the instance asset is the current selected context asset.
+class ValidateInstanceFolder(pyblish.api.InstancePlugin):
+    """Validate the instance folder is the current selected context folder.
 
     As it might happen that multiple workfiles are opened at same time,
     switching between them would mess with selected context. (From Launcher
     or Ftrack).
 
-    In that case outputs might be output under wrong asset!
+    In that case outputs might be output under wrong folder!
 
-    Repair action will use Context asset value (from Workfiles or Launcher)
+    Repair action will use Context folder value (from Workfiles or Launcher)
     Closing and reopening with Workfiles will refresh Context value.
     """
 
-    label = "Validate Instance Asset"
+    label = "Validate Instance Folder"
     hosts = ["aftereffects"]
-    actions = [ValidateInstanceAssetRepair]
+    actions = [ValidateInstanceFolderRepair]
     order = ValidateContentsOrder
 
     def process(self, instance):
-        instance_asset = instance.data["folderPath"]
-        current_asset = get_current_asset_name()
+        instance_folder = instance.data["folderPath"]
+        current_folder = get_current_folder_path()
         msg = (
-            f"Instance asset {instance_asset} is not the same "
-            f"as current context {current_asset}."
+            f"Instance folder {instance_folder} is not the same "
+            f"as current context {current_folder}."
         )
 
-        if instance_asset != current_asset:
+        if instance_folder != current_folder:
             raise PublishXmlValidationError(self, msg)
@@ -1,7 +1,7 @@
 # -*- coding: utf-8 -*-
 """Validate scene settings.
 Requires:
-    instance -> assetEntity
+    instance -> folderEntity
     instance -> anatomyData
 """
 import os

@@ -13,7 +13,7 @@ from ayon_core.pipeline import (
     PublishXmlValidationError,
     OptionalPyblishPluginMixin
 )
-from ayon_core.hosts.aftereffects.api import get_asset_settings
+from ayon_core.hosts.aftereffects.api import get_folder_settings
 
 
 class ValidateSceneSettings(OptionalPyblishPluginMixin,

@@ -48,7 +48,7 @@ class ValidateSceneSettings(OptionalPyblishPluginMixin,
         fps
         handleStart
         handleEnd
-        skip_resolution_check - fill entity type ('asset') to skip validation
+        skip_resolution_check - fill entity type ('folder') to skip validation
             resolutionWidth
             resolutionHeight
             TODO support in extension is missing for now
@@ -71,11 +71,11 @@ class ValidateSceneSettings(OptionalPyblishPluginMixin,
         if not self.is_active(instance.data):
             return
 
-        asset_doc = instance.data["assetEntity"]
-        expected_settings = get_asset_settings(asset_doc)
+        folder_entity = instance.data["folderEntity"]
+        expected_settings = get_folder_settings(folder_entity)
         self.log.info("config from DB::{}".format(expected_settings))
 
-        task_name = instance.data["anatomyData"]["task"]["name"]
+        task_name = instance.data["task"]
         if any(re.search(pattern, task_name)
                 for pattern in self.skip_resolution_check):
             expected_settings.pop("resolutionWidth")
@@ -16,7 +16,7 @@ import bpy
 import bpy.utils.previews
 
 from ayon_core import style
-from ayon_core.pipeline import get_current_asset_name, get_current_task_name
+from ayon_core.pipeline import get_current_folder_path, get_current_task_name
 from ayon_core.tools.utils import host_tools
 
 from .workio import OpenFileCacher

@@ -191,7 +191,7 @@ def _process_app_events() -> Optional[float]:
 
 
 class LaunchQtApp(bpy.types.Operator):
-    """A Base class for opertors to launch a Qt app."""
+    """A Base class for operators to launch a Qt app."""
 
     _app: QtWidgets.QApplication
     _window = Union[QtWidgets.QDialog, ModuleType]
@@ -355,7 +355,7 @@ class SetFrameRange(bpy.types.Operator):
     bl_label = "Set Frame Range"
 
     def execute(self, context):
-        data = pipeline.get_asset_data()
+        data = pipeline.get_folder_attributes()
         pipeline.set_frame_range(data)
         return {"FINISHED"}
 

@@ -365,7 +365,7 @@ class SetResolution(bpy.types.Operator):
     bl_label = "Set Resolution"
 
     def execute(self, context):
-        data = pipeline.get_asset_data()
+        data = pipeline.get_folder_attributes()
         pipeline.set_resolution(data)
         return {"FINISHED"}
 

@@ -388,9 +388,9 @@ class TOPBAR_MT_avalon(bpy.types.Menu):
         else:
             pyblish_menu_icon_id = 0
 
-        asset = get_current_asset_name()
-        task = get_current_task_name()
-        context_label = f"{asset}, {task}"
+        folder_path = get_current_folder_path()
+        task_name = get_current_task_name()
+        context_label = f"{folder_path}, {task_name}"
         context_label_item = layout.row()
         context_label_item.operator(
             LaunchWorkFiles.bl_idname, text=context_label
@@ -9,6 +9,7 @@ from . import lib
 from . import ops
 
 import pyblish.api
+import ayon_api
 
 from ayon_core.host import (
     HostBase,

@@ -16,11 +17,10 @@ from ayon_core.host import (
     IPublishHost,
     ILoadHost
 )
-from ayon_core.client import get_asset_by_name
 from ayon_core.pipeline import (
     schema,
     get_current_project_name,
-    get_current_asset_name,
+    get_current_folder_path,
     register_loader_plugin_path,
     register_creator_plugin_path,
     deregister_loader_plugin_path,
@@ -221,12 +221,12 @@ def message_window(title, message):
         _process_app_events()
 
 
-def get_asset_data():
+def get_folder_attributes():
     project_name = get_current_project_name()
-    asset_name = get_current_asset_name()
-    asset_doc = get_asset_by_name(project_name, asset_name)
+    folder_path = get_current_folder_path()
+    folder_entity = ayon_api.get_folder_by_path(project_name, folder_path)
 
-    return asset_doc.get("data")
+    return folder_entity["attrib"]
 
 
 def set_frame_range(data):
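The data shape changes with the query: the old asset document nested values under "data", while an AYON folder entity exposes them under "attrib". A side-by-side sketch with hypothetical values:

# Old asset document (MongoDB era):
asset_doc = {"name": "sh010", "data": {"fps": 24, "resolutionWidth": 1920}}
fps = asset_doc["data"]["fps"]

# New AYON folder entity:
folder_entity = {"path": "/shots/sh010", "attrib": {"fps": 24, "resolutionWidth": 1920}}
fps = folder_entity["attrib"]["fps"]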
@@ -279,7 +279,7 @@ def on_new():
     set_resolution_startup = settings.get("set_resolution_startup")
     set_frames_startup = settings.get("set_frames_startup")
 
-    data = get_asset_data()
+    data = get_folder_attributes()
 
     if set_resolution_startup:
         set_resolution(data)

@@ -300,7 +300,7 @@ def on_open():
     set_resolution_startup = settings.get("set_resolution_startup")
     set_frames_startup = settings.get("set_frames_startup")
 
-    data = get_asset_data()
+    data = get_folder_attributes()
 
     if set_resolution_startup:
         set_resolution(data)
@@ -468,7 +468,7 @@ def containerise(name: str,
 
     """
 
-    node_name = f"{context['asset']['name']}_{name}"
+    node_name = f"{context['folder']['name']}_{name}"
     if namespace:
         node_name = f"{namespace}:{node_name}"
     if suffix:

@@ -484,7 +484,7 @@ def containerise(name: str,
         "name": name,
         "namespace": namespace or '',
         "loader": str(loader),
-        "representation": str(context["representation"]["_id"]),
+        "representation": context["representation"]["id"],
     }
 
     metadata_update(container, data)

@@ -523,7 +523,7 @@ def containerise_existing(
         "name": name,
         "namespace": namespace or '',
         "loader": str(loader),
-        "representation": str(context["representation"]["_id"]),
+        "representation": context["representation"]["id"],
     }
 
     metadata_update(container, data)
@@ -49,7 +49,7 @@ def prepare_scene_name(
 def get_unique_number(
     folder_name: str, product_name: str
 ) -> str:
-    """Return a unique number based on the asset name."""
+    """Return a unique number based on the folder name."""
     avalon_container = bpy.data.collections.get(AVALON_CONTAINERS)
     if not avalon_container:
         return "01"
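Only the docstring changes here, but the helper pair is used by every Blender loader below: get_unique_number finds a free two-digit suffix and prepare_scene_name joins folder, product and suffix. A usage sketch with hypothetical names:

unique_number = get_unique_number("sh010", "modelMain")  # e.g. "02"
name = prepare_scene_name("sh010", "modelMain", unique_number)
# -> something like "sh010_modelMain_02", per the naming used in this file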
@@ -220,9 +220,9 @@ class BaseCreator(Creator):
         Create new instance and store it.
 
         Args:
-            product_name(str): Subset name of created instance.
-            instance_data(dict): Instance base data.
-            pre_create_data(dict): Data based on pre creation attributes.
+            product_name (str): Product name of created instance.
+            instance_data (dict): Instance base data.
+            pre_create_data (dict): Data based on pre creation attributes.
                 Those may affect how creator works.
         """
         # Get Instance Container or create it if it does not exist

@@ -232,9 +232,9 @@ class BaseCreator(Creator):
             bpy.context.scene.collection.children.link(instances)
 
         # Create asset group
-        asset_name = instance_data["folderPath"].split("/")[-1]
+        folder_name = instance_data["folderPath"].split("/")[-1]
 
-        name = prepare_scene_name(asset_name, product_name)
+        name = prepare_scene_name(folder_name, product_name)
         if self.create_as_asset_group:
             # Create instance as empty
             instance_node = bpy.data.objects.new(name=name, object_data=None)

@@ -312,9 +312,9 @@ class BaseCreator(Creator):
             "productName" in changes.changed_keys
             or "folderPath" in changes.changed_keys
         ) and created_instance.product_type != "workfile":
-            asset_name = data["folderPath"].split("/")[-1]
+            folder_name = data["folderPath"].split("/")[-1]
             name = prepare_scene_name(
-                asset_name, data["productName"]
+                folder_name, data["productName"]
             )
             node.name = name
 
@@ -346,7 +346,7 @@ class BaseCreator(Creator):
         """Fill instance data with required items.
 
         Args:
-            product_name(str): Subset name of created instance.
+            product_name(str): Product name of created instance.
             instance_data(dict): Instance base data.
             instance_node(bpy.types.ID): Instance node in blender scene.
         """

@@ -465,8 +465,8 @@ class AssetLoader(LoaderPlugin):
         filepath = self.filepath_from_context(context)
         assert Path(filepath).exists(), f"{filepath} doesn't exist."
 
-        folder_name = context["asset"]["name"]
-        product_name = context["subset"]["name"]
+        folder_name = context["folder"]["name"]
+        product_name = context["product"]["name"]
         unique_number = get_unique_number(
             folder_name, product_name
         )

@@ -498,8 +498,8 @@ class AssetLoader(LoaderPlugin):
         #     loader=self.__class__.__name__,
         # )
 
-        # folder_name = context["asset"]["name"]
-        # product_name = context["subset"]["name"]
+        # folder_name = context["folder"]["name"]
+        # product_name = context["product"]["name"]
         # instance_name = prepare_scene_name(
         #     folder_name, product_name, unique_number
         # ) + '_CON'
@@ -1,6 +1,6 @@
 from pathlib import Path
 
-from ayon_core.lib.applications import PreLaunchHook, LaunchTypes
+from ayon_applications import PreLaunchHook, LaunchTypes
 
 
 class AddPythonScriptToLaunchArgs(PreLaunchHook):

@@ -2,7 +2,7 @@ import os
 import re
 import subprocess
 from platform import system
-from ayon_core.lib.applications import PreLaunchHook, LaunchTypes
+from ayon_applications import PreLaunchHook, LaunchTypes
 
 
 class InstallPySideToBlender(PreLaunchHook):

@@ -1,5 +1,5 @@
 import subprocess
-from ayon_core.lib.applications import PreLaunchHook, LaunchTypes
+from ayon_applications import PreLaunchHook, LaunchTypes
 
 
 class BlenderConsoleWindows(PreLaunchHook):
@@ -1,10 +1,10 @@
 # -*- coding: utf-8 -*-
 """Converter for legacy Houdini products."""
-from ayon_core.pipeline.create.creator_plugins import SubsetConvertorPlugin
+from ayon_core.pipeline.create.creator_plugins import ProductConvertorPlugin
 from ayon_core.hosts.blender.api.lib import imprint
 
 
-class BlenderLegacyConvertor(SubsetConvertorPlugin):
+class BlenderLegacyConvertor(ProductConvertorPlugin):
     """Find and convert any legacy products in the scene.
 
     This Converter will find all legacy products in the scene and will
@@ -1,7 +1,7 @@
 import bpy
+import ayon_api
 
 from ayon_core.pipeline import CreatedInstance, AutoCreator
-from ayon_core.client import get_asset_by_name
 from ayon_core.hosts.blender.api.plugin import BaseCreator
 from ayon_core.hosts.blender.api.pipeline import (
     AVALON_PROPERTY,
@@ -33,33 +33,38 @@ class CreateWorkfile(BaseCreator, AutoCreator):
         )
 
         project_name = self.project_name
-        asset_name = self.create_context.get_current_asset_name()
+        folder_path = self.create_context.get_current_folder_path()
         task_name = self.create_context.get_current_task_name()
         host_name = self.create_context.host_name
 
-        existing_asset_name = None
+        existing_folder_path = None
         if workfile_instance is not None:
-            existing_asset_name = workfile_instance.get("folderPath")
+            existing_folder_path = workfile_instance.get("folderPath")
 
         if not workfile_instance:
-            asset_doc = get_asset_by_name(project_name, asset_name)
+            folder_entity = ayon_api.get_folder_by_path(
+                project_name, folder_path
+            )
+            task_entity = ayon_api.get_task_by_name(
+                project_name, folder_entity["id"], task_name
+            )
             product_name = self.get_product_name(
                 project_name,
-                asset_doc,
-                task_name,
+                folder_entity,
+                task_entity,
                 task_name,
                 host_name,
             )
             data = {
-                "folderPath": asset_name,
+                "folderPath": folder_path,
                 "task": task_name,
                 "variant": task_name,
             }
             data.update(
                 self.get_dynamic_data(
                     project_name,
-                    asset_doc,
-                    task_name,
+                    folder_entity,
+                    task_entity,
                     task_name,
                     host_name,
                     workfile_instance,
@@ -72,20 +77,25 @@ class CreateWorkfile(BaseCreator, AutoCreator):
             self._add_instance_to_context(workfile_instance)
 
         elif (
-            existing_asset_name != asset_name
+            existing_folder_path != folder_path
             or workfile_instance["task"] != task_name
         ):
             # Update instance context if it's different
-            asset_doc = get_asset_by_name(project_name, asset_name)
+            folder_entity = ayon_api.get_folder_by_path(
+                project_name, folder_path
+            )
+            task_entity = ayon_api.get_task_by_name(
+                project_name, folder_entity["id"], task_name
+            )
             product_name = self.get_product_name(
                 project_name,
-                asset_doc,
-                task_name,
+                folder_entity,
+                task_entity,
                 self.default_variant,
                 host_name,
             )
 
-            workfile_instance["folderPath"] = asset_name
+            workfile_instance["folderPath"] = folder_path
             workfile_instance["task"] = task_name
             workfile_instance["productName"] = product_name
 
@@ -4,8 +4,8 @@ from ayon_core.hosts.blender.api import plugin
 
 
 def append_workfile(context, fname, do_import):
-    folder_name = context['asset']['name']
-    product_name = context['subset']['name']
+    folder_name = context["folder"]["name"]
+    product_name = context["product"]["name"]
 
     group_name = plugin.prepare_scene_name(folder_name, product_name)
 

@@ -44,7 +44,7 @@ class AppendBlendLoader(plugin.AssetLoader):
     """
 
     representations = ["blend"]
-    families = ["workfile"]
+    product_types = {"workfile"}
 
     label = "Append Workfile"
     order = 9

@@ -69,7 +69,7 @@ class ImportBlendLoader(plugin.AssetLoader):
     """
 
     representations = ["blend"]
-    families = ["workfile"]
+    product_types = {"workfile"}
 
     label = "Import Workfile"
     order = 9
@@ -26,7 +26,7 @@ class CacheModelLoader(plugin.AssetLoader):
     Note:
         At least for now it only supports Alembic files.
     """
-    families = ["model", "pointcache", "animation"]
+    product_types = {"model", "pointcache", "animation"}
    representations = ["abc"]
 
     label = "Load Alembic"

@@ -134,8 +134,8 @@ class CacheModelLoader(plugin.AssetLoader):
         """
 
         libpath = self.filepath_from_context(context)
-        folder_name = context["asset"]["name"]
-        product_name = context["subset"]["name"]
+        folder_name = context["folder"]["name"]
+        product_name = context["product"]["name"]
 
         asset_name = plugin.prepare_scene_name(folder_name, product_name)
         unique_number = plugin.get_unique_number(folder_name, product_name)
@@ -161,17 +161,17 @@ class CacheModelLoader(plugin.AssetLoader):
 
         self._link_objects(objects, asset_group, containers, asset_group)
 
-        product_type = context["subset"]["data"]["family"]
+        product_type = context["product"]["productType"]
         asset_group[AVALON_PROPERTY] = {
             "schema": "openpype:container-2.0",
             "id": AVALON_CONTAINER_ID,
             "name": name,
             "namespace": namespace or '',
             "loader": str(self.__class__.__name__),
-            "representation": str(context["representation"]["_id"]),
+            "representation": context["representation"]["id"],
             "libpath": libpath,
             "asset_name": asset_name,
-            "parent": str(context["representation"]["parent"]),
+            "parent": context["representation"]["versionId"],
             "productType": product_type,
             "objectName": group_name
         }

@@ -191,16 +191,16 @@ class CacheModelLoader(plugin.AssetLoader):
         Warning:
             No nested collections are supported at the moment!
         """
-        repre_doc = context["representation"]
+        repre_entity = context["representation"]
         object_name = container["objectName"]
         asset_group = bpy.data.objects.get(object_name)
-        libpath = Path(get_representation_path(repre_doc))
+        libpath = Path(get_representation_path(repre_entity))
         extension = libpath.suffix.lower()
 
         self.log.info(
             "Container: %s\nRepresentation: %s",
             pformat(container, indent=2),
-            pformat(repre_doc, indent=2),
+            pformat(repre_entity, indent=2),
         )
 
         assert asset_group, (
@@ -245,7 +245,7 @@ class CacheModelLoader(plugin.AssetLoader):
             asset_group.matrix_basis = mat
 
         metadata["libpath"] = str(libpath)
-        metadata["representation"] = str(repre_doc["_id"])
+        metadata["representation"] = repre_entity["id"]
 
     def exec_remove(self, container: Dict) -> bool:
         """Remove an existing container from a Blender scene.

@@ -24,7 +24,7 @@ class BlendActionLoader(plugin.AssetLoader):
     moment.
     """
 
-    families = ["action"]
+    product_types = {"action"}
     representations = ["blend"]
 
     label = "Link Action"
@@ -44,8 +44,8 @@ class BlendActionLoader(plugin.AssetLoader):
         """
 
         libpath = self.filepath_from_context(context)
-        folder_name = context["asset"]["name"]
-        product_name = context["subset"]["name"]
+        folder_name = context["folder"]["name"]
+        product_name = context["product"]["name"]
         lib_container = plugin.prepare_scene_name(folder_name, product_name)
         container_name = plugin.prepare_scene_name(
             folder_name, product_name, namespace

@@ -126,18 +126,18 @@ class BlendActionLoader(plugin.AssetLoader):
         Warning:
             No nested collections are supported at the moment!
         """
-        repre_doc = context["representation"]
+        repre_entity = context["representation"]
         collection = bpy.data.collections.get(
             container["objectName"]
         )
 
-        libpath = Path(get_representation_path(repre_doc))
+        libpath = Path(get_representation_path(repre_entity))
         extension = libpath.suffix.lower()
 
         logger.info(
             "Container: %s\nRepresentation: %s",
             pformat(container, indent=2),
-            pformat(repre_doc, indent=2),
+            pformat(repre_entity, indent=2),
         )
 
         assert collection, (
@@ -241,7 +241,7 @@ class BlendActionLoader(plugin.AssetLoader):
         # Save the list of objects in the metadata container
         collection_metadata["objects"] = objects_list
         collection_metadata["libpath"] = str(libpath)
-        collection_metadata["representation"] = str(repre_doc["_id"])
+        collection_metadata["representation"] = repre_entity["id"]
 
         bpy.ops.object.select_all(action='DESELECT')
 

@@ -16,7 +16,7 @@ class BlendAnimationLoader(plugin.AssetLoader):
     moment.
     """
 
-    families = ["animation"]
+    product_types = {"animation"}
     representations = ["blend"]
 
     label = "Link Animation"
@@ -20,7 +20,7 @@ from ayon_core.hosts.blender.api.pipeline import (
 class AudioLoader(plugin.AssetLoader):
     """Load audio in Blender."""
 
-    families = ["audio"]
+    product_types = {"audio"}
     representations = ["wav"]
 
     label = "Load Audio"
@@ -39,8 +39,8 @@ class AudioLoader(plugin.AssetLoader):
             options: Additional settings dictionary
         """
         libpath = self.filepath_from_context(context)
-        folder_name = context["asset"]["name"]
-        product_name = context["subset"]["name"]
+        folder_name = context["folder"]["name"]
+        product_name = context["product"]["name"]
 
         asset_name = plugin.prepare_scene_name(folder_name, product_name)
         unique_number = plugin.get_unique_number(folder_name, product_name)
@@ -83,11 +83,11 @@ class AudioLoader(plugin.AssetLoader):
             "name": name,
             "namespace": namespace or '',
             "loader": str(self.__class__.__name__),
-            "representation": str(context["representation"]["_id"]),
+            "representation": context["representation"]["id"],
             "libpath": libpath,
             "asset_name": asset_name,
-            "parent": str(context["representation"]["parent"]),
-            "productType": context["subset"]["data"]["family"],
+            "parent": context["representation"]["versionId"],
+            "productType": context["product"]["productType"],
             "objectName": group_name,
             "audio": audio
         }
@@ -105,15 +105,15 @@ class AudioLoader(plugin.AssetLoader):
             representation (openpype:representation-1.0): Representation to
                 update, from `host.ls()`.
         """
-        repre_doc = context["representation"]
+        repre_entity = context["representation"]
         object_name = container["objectName"]
         asset_group = bpy.data.objects.get(object_name)
-        libpath = Path(get_representation_path(repre_doc))
+        libpath = Path(get_representation_path(repre_entity))
 
         self.log.info(
             "Container: %s\nRepresentation: %s",
             pformat(container, indent=2),
-            pformat(repre_doc, indent=2),
+            pformat(repre_entity, indent=2),
         )
 
         assert asset_group, (
@@ -176,8 +176,8 @@ class AudioLoader(plugin.AssetLoader):
         window_manager.windows[-1].screen.areas[0].type = old_type
 
         metadata["libpath"] = str(libpath)
-        metadata["representation"] = str(repre_doc["_id"])
-        metadata["parent"] = str(repre_doc["parent"])
+        metadata["representation"] = repre_entity["id"]
+        metadata["parent"] = repre_entity["versionId"]
         metadata["audio"] = new_audio
 
     def exec_remove(self, container: Dict) -> bool:
@@ -20,7 +20,7 @@ from ayon_core.hosts.blender.api.pipeline import (
 class BlendLoader(plugin.AssetLoader):
     """Load assets from a .blend file."""
 
-    families = ["model", "rig", "layout", "camera"]
+    product_types = {"model", "rig", "layout", "camera"}
     representations = ["blend"]
 
     label = "Append Blend"
@@ -127,15 +127,15 @@ class BlendLoader(plugin.AssetLoader):
             options: Additional settings dictionary
         """
         libpath = self.filepath_from_context(context)
-        folder_name = context["asset"]["name"]
-        product_name = context["subset"]["name"]
+        folder_name = context["folder"]["name"]
+        product_name = context["product"]["name"]
 
         try:
-            product_type = context["subset"]["data"]["family"]
+            product_type = context["product"]["productType"]
         except ValueError:
             product_type = "model"
 
-        representation = str(context["representation"]["_id"])
+        representation = context["representation"]["id"]
 
         asset_name = plugin.prepare_scene_name(folder_name, product_name)
         unique_number = plugin.get_unique_number(folder_name, product_name)
@@ -162,11 +162,11 @@ class BlendLoader(plugin.AssetLoader):
             "name": name,
             "namespace": namespace or '',
             "loader": str(self.__class__.__name__),
-            "representation": str(context["representation"]["_id"]),
+            "representation": context["representation"]["id"],
             "libpath": libpath,
             "asset_name": asset_name,
-            "parent": str(context["representation"]["parent"]),
-            "productType": context["subset"]["data"]["family"],
+            "parent": context["representation"]["versionId"],
+            "productType": context["product"]["productType"],
             "objectName": group_name,
             "members": members,
         }
@@ -185,10 +185,10 @@ class BlendLoader(plugin.AssetLoader):
         """
         Update the loaded asset.
         """
-        repre_doc = context["representation"]
+        repre_entity = context["representation"]
         group_name = container["objectName"]
         asset_group = bpy.data.objects.get(group_name)
-        libpath = Path(get_representation_path(repre_doc)).as_posix()
+        libpath = Path(get_representation_path(repre_entity)).as_posix()
 
         assert asset_group, (
             f"The asset is not loaded: {container['objectName']}"
@@ -227,7 +227,7 @@ class BlendLoader(plugin.AssetLoader):
                 obj.animation_data_create()
                 obj.animation_data.action = actions[obj.name]
 
-        # Restore the old data, but reset memebers, as they don't exist anymore
+        # Restore the old data, but reset members, as they don't exist anymore
         # This avoids a crash, because the memory addresses of those members
         # are not valid anymore
         old_data["members"] = []
@@ -235,8 +235,8 @@ class BlendLoader(plugin.AssetLoader):
 
         new_data = {
             "libpath": libpath,
-            "representation": str(repre_doc["_id"]),
-            "parent": str(repre_doc["parent"]),
+            "representation": repre_entity["id"],
+            "parent": repre_entity["versionId"],
             "members": members,
         }
 
@@ -18,7 +18,7 @@ from ayon_core.hosts.blender.api.pipeline import (
 class BlendSceneLoader(plugin.AssetLoader):
     """Load assets from a .blend file."""
 
-    families = ["blendScene"]
+    product_types = {"blendScene"}
     representations = ["blend"]
 
     label = "Append Blend"
@@ -82,11 +82,11 @@ class BlendSceneLoader(plugin.AssetLoader):
             options: Additional settings dictionary
         """
         libpath = self.filepath_from_context(context)
-        folder_name = context["asset"]["name"]
-        product_name = context["subset"]["name"]
+        folder_name = context["folder"]["name"]
+        product_name = context["product"]["name"]
 
         try:
-            product_type = context["subset"]["data"]["family"]
+            product_type = context["product"]["productType"]
         except ValueError:
             product_type = "model"
 
@@ -114,11 +114,11 @@ class BlendSceneLoader(plugin.AssetLoader):
             "name": name,
             "namespace": namespace or '',
             "loader": str(self.__class__.__name__),
-            "representation": str(context["representation"]["_id"]),
+            "representation": context["representation"]["id"],
             "libpath": libpath,
             "asset_name": asset_name,
-            "parent": str(context["representation"]["parent"]),
-            "productType": context["subset"]["data"]["family"],
+            "parent": context["representation"]["versionId"],
+            "productType": context["product"]["productType"],
             "objectName": group_name,
             "members": members,
         }
@@ -137,10 +137,10 @@ class BlendSceneLoader(plugin.AssetLoader):
         """
         Update the loaded asset.
         """
-        repre_doc = context["representation"]
+        repre_entity = context["representation"]
         group_name = container["objectName"]
         asset_group = bpy.data.collections.get(group_name)
-        libpath = Path(get_representation_path(repre_doc)).as_posix()
+        libpath = Path(get_representation_path(repre_entity)).as_posix()
 
         assert asset_group, (
             f"The asset is not loaded: {container['objectName']}"
@@ -202,8 +202,8 @@ class BlendSceneLoader(plugin.AssetLoader):
 
         new_data = {
             "libpath": libpath,
-            "representation": str(repre_doc["_id"]),
-            "parent": str(repre_doc["parent"]),
+            "representation": repre_entity["id"],
+            "parent": repre_entity["versionId"],
             "members": members,
         }
 
@@ -23,7 +23,7 @@ class AbcCameraLoader(plugin.AssetLoader):
     Stores the imported asset in an empty named after the asset.
     """
 
-    families = ["camera"]
+    product_types = {"camera"}
     representations = ["abc"]
 
     label = "Load Camera (ABC)"
@@ -84,8 +84,8 @@ class AbcCameraLoader(plugin.AssetLoader):
 
         libpath = self.filepath_from_context(context)
 
-        folder_name = context["asset"]["name"]
-        product_name = context["subset"]["name"]
+        folder_name = context["folder"]["name"]
+        product_name = context["product"]["name"]
 
         asset_name = plugin.prepare_scene_name(folder_name, product_name)
         unique_number = plugin.get_unique_number(folder_name, product_name)
@@ -119,11 +119,11 @@ class AbcCameraLoader(plugin.AssetLoader):
             "name": name,
             "namespace": namespace or "",
             "loader": str(self.__class__.__name__),
-            "representation": str(context["representation"]["_id"]),
+            "representation": context["representation"]["id"],
             "libpath": libpath,
             "asset_name": asset_name,
-            "parent": str(context["representation"]["parent"]),
-            "productType": context["subset"]["data"]["family"],
+            "parent": context["representation"]["versionId"],
+            "productType": context["product"]["productType"],
             "objectName": group_name,
         }
 
@@ -142,16 +142,16 @@ class AbcCameraLoader(plugin.AssetLoader):
         Warning:
             No nested collections are supported at the moment!
         """
-        repre_doc = context["representation"]
+        repre_entity = context["representation"]
         object_name = container["objectName"]
         asset_group = bpy.data.objects.get(object_name)
-        libpath = Path(get_representation_path(repre_doc))
+        libpath = Path(get_representation_path(repre_entity))
         extension = libpath.suffix.lower()
 
         self.log.info(
             "Container: %s\nRepresentation: %s",
             pformat(container, indent=2),
-            pformat(repre_doc, indent=2),
+            pformat(repre_entity, indent=2),
         )
 
         assert asset_group, (
@@ -186,7 +186,7 @@ class AbcCameraLoader(plugin.AssetLoader):
         asset_group.matrix_basis = mat
 
         metadata["libpath"] = str(libpath)
-        metadata["representation"] = str(repre_doc["_id"])
+        metadata["representation"] = repre_entity["id"]
 
     def exec_remove(self, container: Dict) -> bool:
         """Remove an existing container from a Blender scene.
@@ -23,7 +23,7 @@ class FbxCameraLoader(plugin.AssetLoader):
     Stores the imported asset in an empty named after the asset.
     """
 
-    families = ["camera"]
+    product_types = {"camera"}
     representations = ["fbx"]
 
     label = "Load Camera (FBX)"
@@ -87,8 +87,8 @@ class FbxCameraLoader(plugin.AssetLoader):
             options: Additional settings dictionary
         """
         libpath = self.filepath_from_context(context)
-        folder_name = context["asset"]["name"]
-        product_name = context["subset"]["name"]
+        folder_name = context["folder"]["name"]
+        product_name = context["product"]["name"]
 
         asset_name = plugin.prepare_scene_name(folder_name, product_name)
         unique_number = plugin.get_unique_number(folder_name, product_name)
@@ -122,11 +122,11 @@ class FbxCameraLoader(plugin.AssetLoader):
             "name": name,
             "namespace": namespace or '',
             "loader": str(self.__class__.__name__),
-            "representation": str(context["representation"]["_id"]),
+            "representation": context["representation"]["id"],
             "libpath": libpath,
             "asset_name": asset_name,
-            "parent": str(context["representation"]["parent"]),
-            "productType": context["subset"]["data"]["family"],
+            "parent": context["representation"]["versionId"],
+            "productType": context["product"]["productType"],
             "objectName": group_name
         }
 
@@ -145,16 +145,16 @@ class FbxCameraLoader(plugin.AssetLoader):
         Warning:
             No nested collections are supported at the moment!
         """
-        repre_doc = context["representation"]
+        repre_entity = context["representation"]
         object_name = container["objectName"]
         asset_group = bpy.data.objects.get(object_name)
-        libpath = Path(get_representation_path(repre_doc))
+        libpath = Path(get_representation_path(repre_entity))
         extension = libpath.suffix.lower()
 
         self.log.info(
             "Container: %s\nRepresentation: %s",
             pformat(container, indent=2),
-            pformat(repre_doc, indent=2),
+            pformat(repre_entity, indent=2),
         )
 
         assert asset_group, (
@@ -196,7 +196,7 @@ class FbxCameraLoader(plugin.AssetLoader):
         asset_group.matrix_basis = mat
 
         metadata["libpath"] = str(libpath)
-        metadata["representation"] = str(repre_doc["_id"])
+        metadata["representation"] = repre_entity["id"]
 
     def exec_remove(self, container: Dict) -> bool:
         """Remove an existing container from a Blender scene.
|
|||
|
|
@ -23,7 +23,7 @@ class FbxModelLoader(plugin.AssetLoader):
|
|||
Stores the imported asset in an empty named after the asset.
|
||||
"""
|
||||
|
||||
families = ["model", "rig"]
|
||||
product_types = {"model", "rig"}
|
||||
representations = ["fbx"]
|
||||
|
||||
label = "Load FBX"
|
||||
|
|
@@ -131,8 +131,8 @@ class FbxModelLoader(plugin.AssetLoader):
             options: Additional settings dictionary
         """
         libpath = self.filepath_from_context(context)
-        folder_name = context["asset"]["name"]
-        product_name = context["subset"]["name"]
+        folder_name = context["folder"]["name"]
+        product_name = context["product"]["name"]
 
         asset_name = plugin.prepare_scene_name(folder_name, product_name)
         unique_number = plugin.get_unique_number(folder_name, product_name)
@@ -166,11 +166,11 @@ class FbxModelLoader(plugin.AssetLoader):
             "name": name,
             "namespace": namespace or '',
             "loader": str(self.__class__.__name__),
-            "representation": str(context["representation"]["_id"]),
+            "representation": context["representation"]["id"],
             "libpath": libpath,
             "asset_name": asset_name,
-            "parent": str(context["representation"]["parent"]),
-            "productType": context["subset"]["data"]["family"],
+            "parent": context["representation"]["versionId"],
+            "productType": context["product"]["productType"],
             "objectName": group_name
         }
 
@@ -189,16 +189,16 @@ class FbxModelLoader(plugin.AssetLoader):
         Warning:
             No nested collections are supported at the moment!
         """
-        repre_doc = context["representation"]
+        repre_entity = context["representation"]
         object_name = container["objectName"]
         asset_group = bpy.data.objects.get(object_name)
-        libpath = Path(get_representation_path(repre_doc))
+        libpath = Path(get_representation_path(repre_entity))
         extension = libpath.suffix.lower()
 
         self.log.info(
             "Container: %s\nRepresentation: %s",
             pformat(container, indent=2),
-            pformat(repre_doc, indent=2),
+            pformat(repre_entity, indent=2),
         )
 
         assert asset_group, (
@@ -251,7 +251,7 @@ class FbxModelLoader(plugin.AssetLoader):
         asset_group.matrix_basis = mat
 
         metadata["libpath"] = str(libpath)
-        metadata["representation"] = str(repre_doc["_id"])
+        metadata["representation"] = repre_entity["id"]
 
     def exec_remove(self, container: Dict) -> bool:
         """Remove an existing container from a Blender scene.
@@ -26,7 +26,7 @@ from ayon_core.hosts.blender.api import plugin
 class JsonLayoutLoader(plugin.AssetLoader):
     """Load layout published from Unreal."""
 
-    families = ["layout"]
+    product_types = {"layout"}
     representations = ["json"]
 
     label = "Load Layout"
@@ -132,7 +132,7 @@ class JsonLayoutLoader(plugin.AssetLoader):
         # # name=f"{unique_number}_{product[name]}_animation",
         # asset=asset,
         # options={"useSelection": False}
-        # # data={"dependencies": str(context["representation"]["_id"])}
+        # # data={"dependencies": context["representation"]["id"]}
         # )
 
     def process_asset(self,
@@ -148,8 +148,8 @@ class JsonLayoutLoader(plugin.AssetLoader):
             options: Additional settings dictionary
         """
         libpath = self.filepath_from_context(context)
-        folder_name = context["asset"]["name"]
-        product_name = context["subset"]["name"]
+        folder_name = context["folder"]["name"]
+        product_name = context["product"]["name"]
 
         asset_name = plugin.prepare_scene_name(folder_name, product_name)
         unique_number = plugin.get_unique_number(folder_name, product_name)
@@ -177,11 +177,11 @@ class JsonLayoutLoader(plugin.AssetLoader):
             "name": name,
             "namespace": namespace or '',
             "loader": str(self.__class__.__name__),
-            "representation": str(context["representation"]["_id"]),
+            "representation": context["representation"]["id"],
             "libpath": libpath,
             "asset_name": asset_name,
-            "parent": str(context["representation"]["parent"]),
-            "productType": context["subset"]["data"]["family"],
+            "parent": context["representation"]["versionId"],
+            "productType": context["product"]["productType"],
             "objectName": group_name
         }
 
@@ -197,16 +197,16 @@ class JsonLayoutLoader(plugin.AssetLoader):
         will not be removed, only unlinked. Normally this should not be the
         case though.
         """
-        repre_doc = context["representation"]
+        repre_entity = context["representation"]
         object_name = container["objectName"]
         asset_group = bpy.data.objects.get(object_name)
-        libpath = Path(get_representation_path(repre_doc))
+        libpath = Path(get_representation_path(repre_entity))
         extension = libpath.suffix.lower()
 
         self.log.info(
             "Container: %s\nRepresentation: %s",
             pformat(container, indent=2),
-            pformat(repre_doc, indent=2),
+            pformat(repre_entity, indent=2),
         )
 
         assert asset_group, (
@@ -270,7 +270,7 @@ class JsonLayoutLoader(plugin.AssetLoader):
         asset_group.matrix_basis = mat
 
         metadata["libpath"] = str(libpath)
-        metadata["representation"] = str(repre_doc["_id"])
+        metadata["representation"] = repre_entity["id"]
 
     def exec_remove(self, container: Dict) -> bool:
         """Remove an existing container from a Blender scene.
|
@ -23,7 +23,7 @@ class BlendLookLoader(plugin.AssetLoader):
|
|||
contains the model. There is no further need to 'containerise' it.
|
||||
"""
|
||||
|
||||
families = ["look"]
|
||||
product_types = {"look"}
|
||||
representations = ["json"]
|
||||
|
||||
label = "Load Look"
|
||||
|
|
@ -93,8 +93,8 @@ class BlendLookLoader(plugin.AssetLoader):
|
|||
"""
|
||||
|
||||
libpath = self.filepath_from_context(context)
|
||||
folder_name = context["asset"]["name"]
|
||||
product_name = context["subset"]["name"]
|
||||
folder_name = context["folder"]["name"]
|
||||
product_name = context["product"]["name"]
|
||||
|
||||
lib_container = plugin.prepare_scene_name(
|
||||
folder_name, product_name
|
||||
|
|
@ -130,8 +130,8 @@ class BlendLookLoader(plugin.AssetLoader):
|
|||
metadata["objects"] = objects
|
||||
metadata["materials"] = materials
|
||||
|
||||
metadata["parent"] = str(context["representation"]["parent"])
|
||||
metadata["product_type"] = context["subset"]["data"]["family"]
|
||||
metadata["parent"] = context["representation"]["versionId"]
|
||||
metadata["product_type"] = context["product"]["productType"]
|
||||
|
||||
nodes = list(container.objects)
|
||||
nodes.append(container)
|
||||
|
|
@@ -140,14 +140,14 @@ class BlendLookLoader(plugin.AssetLoader):
 
     def update(self, container: Dict, context: Dict):
         collection = bpy.data.collections.get(container["objectName"])
-        repre_doc = context["representation"]
-        libpath = Path(get_representation_path(repre_doc))
+        repre_entity = context["representation"]
+        libpath = Path(get_representation_path(repre_entity))
         extension = libpath.suffix.lower()
 
         self.log.info(
             "Container: %s\nRepresentation: %s",
             pformat(container, indent=2),
-            pformat(repre_doc, indent=2),
+            pformat(repre_entity, indent=2),
         )
 
         assert collection, (
@@ -202,7 +202,7 @@ class BlendLookLoader(plugin.AssetLoader):
         collection_metadata["objects"] = objects
         collection_metadata["materials"] = materials
         collection_metadata["libpath"] = str(libpath)
-        collection_metadata["representation"] = str(repre_doc["_id"])
+        collection_metadata["representation"] = repre_entity["id"]
 
     def remove(self, container: Dict) -> bool:
         collection = bpy.data.collections.get(container["objectName"])
@@ -19,7 +19,7 @@ class ExtractABC(publish.Extractor, publish.OptionalPyblishPluginMixin):
 
         # Define extract output file path
         stagingdir = self.staging_dir(instance)
-        folder_name = instance.data["assetEntity"]["name"]
+        folder_name = instance.data["folderEntity"]["name"]
         product_name = instance.data["productName"]
         instance_name = f"{folder_name}_{product_name}"
         filename = f"{instance_name}.abc"
@@ -23,7 +23,7 @@ class ExtractAnimationABC(
 
         # Define extract output file path
         stagingdir = self.staging_dir(instance)
-        folder_name = instance.data["assetEntity"]["name"]
+        folder_name = instance.data["folderEntity"]["name"]
         product_name = instance.data["productName"]
         instance_name = f"{folder_name}_{product_name}"
         filename = f"{instance_name}.abc"
@@ -23,7 +23,7 @@ class ExtractBlend(publish.Extractor, publish.OptionalPyblishPluginMixin):
         # Define extract output file path
 
         stagingdir = self.staging_dir(instance)
-        folder_name = instance.data["assetEntity"]["name"]
+        folder_name = instance.data["folderEntity"]["name"]
         product_name = instance.data["productName"]
         instance_name = f"{folder_name}_{product_name}"
         filename = f"{instance_name}.blend"
@@ -26,7 +26,7 @@ class ExtractBlendAnimation(
         # Define extract output file path
 
         stagingdir = self.staging_dir(instance)
-        folder_name = instance.data["assetEntity"]["name"]
+        folder_name = instance.data["folderEntity"]["name"]
         product_name = instance.data["productName"]
         instance_name = f"{folder_name}_{product_name}"
         filename = f"{instance_name}.blend"
@@ -4,7 +4,6 @@ import bpy
 
 from ayon_core.pipeline import publish
-from ayon_core.hosts.blender.api import plugin
 from ayon_core.hosts.blender.api.pipeline import AVALON_PROPERTY
 
 
 class ExtractCameraABC(publish.Extractor, publish.OptionalPyblishPluginMixin):
@@ -21,7 +20,7 @@ class ExtractCameraABC(publish.Extractor, publish.OptionalPyblishPluginMixin):
 
         # Define extract output file path
         stagingdir = self.staging_dir(instance)
-        folder_name = instance.data["assetEntity"]["name"]
+        folder_name = instance.data["folderEntity"]["name"]
         product_name = instance.data["productName"]
         instance_name = f"{folder_name}_{product_name}"
         filename = f"{instance_name}.abc"
@@ -20,7 +20,7 @@ class ExtractCamera(publish.Extractor, publish.OptionalPyblishPluginMixin):
 
         # Define extract output file path
         stagingdir = self.staging_dir(instance)
-        folder_name = instance.data["assetEntity"]["name"]
+        folder_name = instance.data["folderEntity"]["name"]
         product_name = instance.data["productName"]
         instance_name = f"{folder_name}_{product_name}"
         filename = f"{instance_name}.fbx"
@@ -4,7 +4,6 @@ import bpy
 
 from ayon_core.pipeline import publish
-from ayon_core.hosts.blender.api import plugin
 from ayon_core.hosts.blender.api.pipeline import AVALON_PROPERTY
 
 
 class ExtractFBX(publish.Extractor, publish.OptionalPyblishPluginMixin):
@@ -21,7 +20,7 @@ class ExtractFBX(publish.Extractor, publish.OptionalPyblishPluginMixin):
 
         # Define extract output file path
         stagingdir = self.staging_dir(instance)
-        folder_name = instance.data["assetEntity"]["name"]
+        folder_name = instance.data["folderEntity"]["name"]
         product_name = instance.data["productName"]
         instance_name = f"{folder_name}_{product_name}"
         filename = f"{instance_name}.fbx"
@@ -145,7 +145,7 @@ class ExtractAnimationFBX(
 
         root.select_set(True)
         armature.select_set(True)
-        folder_name = instance.data["assetEntity"]["name"]
+        folder_name = instance.data["folderEntity"]["name"]
         product_name = instance.data["productName"]
         instance_name = f"{folder_name}_{product_name}"
         fbx_filename = f"{instance_name}_{armature.name}.fbx"
@@ -5,7 +5,8 @@ import bpy
 import bpy_extras
 import bpy_extras.anim_utils
 
-from ayon_core.client import get_representation_by_name
+from ayon_api import get_representations
+
 from ayon_core.pipeline import publish
 from ayon_core.hosts.blender.api import plugin
 from ayon_core.hosts.blender.api.pipeline import AVALON_PROPERTY
@@ -134,6 +135,8 @@ class ExtractLayout(publish.Extractor, publish.OptionalPyblishPluginMixin):
         fbx_count = 0
 
         project_name = instance.context.data["projectName"]
+        version_ids = set()
+        filtered_assets = []
         for asset in asset_group.children:
             metadata = asset.get(AVALON_PROPERTY)
             if not metadata:
@@ -146,42 +149,47 @@ class ExtractLayout(publish.Extractor, publish.OptionalPyblishPluginMixin):
                 )
                 continue
 
+            filtered_assets.append((asset, metadata))
+            version_ids.add(metadata["parent"])
+
+        repre_entities = get_representations(
+            project_name,
+            representation_names={"blend", "fbx", "abc"},
+            version_ids=version_ids,
+            fields={"id", "versionId", "name"}
+        )
+        repre_mapping_by_version_id = {
+            version_id: {}
+            for version_id in version_ids
+        }
+        for repre_entity in repre_entities:
+            version_id = repre_entity["versionId"]
+            repre_mapping_by_version_id[version_id][repre_entity["name"]] = (
+                repre_entity
+            )
+
+        for asset, metadata in filtered_assets:
             version_id = metadata["parent"]
             product_type = metadata.get("product_type")
             if product_type is None:
                 product_type = metadata["family"]
 
+            repres_by_name = repre_mapping_by_version_id[version_id]
+
             self.log.debug("Parent: {}".format(version_id))
-            # Get blend reference
-            blend = get_representation_by_name(
-                project_name, "blend", version_id, fields=["_id"]
-            )
-            blend_id = None
-            if blend:
-                blend_id = blend["_id"]
-            # Get fbx reference
-            fbx = get_representation_by_name(
-                project_name, "fbx", version_id, fields=["_id"]
-            )
-            fbx_id = None
-            if fbx:
-                fbx_id = fbx["_id"]
-            # Get abc reference
-            abc = get_representation_by_name(
-                project_name, "abc", version_id, fields=["_id"]
-            )
-            abc_id = None
-            if abc:
-                abc_id = abc["_id"]
-
-            json_element = {}
-            if blend_id:
-                json_element["reference"] = str(blend_id)
-            if fbx_id:
-                json_element["reference_fbx"] = str(fbx_id)
-            if abc_id:
-                json_element["reference_abc"] = str(abc_id)
-
+            # Get blend, fbx and abc reference
+            blend_id = repres_by_name.get("blend", {}).get("id")
+            fbx_id = repres_by_name.get("fbx", {}).get("id")
+            abc_id = repres_by_name.get("abc", {}).get("id")
+            json_element = {
+                key: value
+                for key, value in (
+                    ("reference", blend_id),
+                    ("reference_fbx", fbx_id),
+                    ("reference_abc", abc_id),
+                )
+                if value
+            }
             json_element["product_type"] = product_type
             json_element["instance_name"] = asset.name
             json_element["asset_name"] = metadata["asset_name"]
@@ -228,7 +236,7 @@ class ExtractLayout(publish.Extractor, publish.OptionalPyblishPluginMixin):
 
             json_data.append(json_element)
 
-        folder_name = instance.data["assetEntity"]["name"]
+        folder_name = instance.data["folderEntity"]["name"]
         product_name = instance.data["productName"]
         instance_name = f"{folder_name}_{product_name}"
         json_filename = f"{instance_name}.json"
@@ -55,7 +55,7 @@ class ExtractPlayblast(publish.Extractor, publish.OptionalPyblishPluginMixin):
 
         # get output path
         stagingdir = self.staging_dir(instance)
-        folder_name = instance.data["assetEntity"]["name"]
+        folder_name = instance.data["folderEntity"]["name"]
         product_name = instance.data["productName"]
         filename = f"{folder_name}_{product_name}"
 
@@ -32,7 +32,7 @@ class ExtractThumbnail(publish.Extractor):
             return
 
         stagingdir = self.staging_dir(instance)
-        folder_name = instance.data["assetEntity"]["name"]
+        folder_name = instance.data["folderEntity"]["name"]
         product_name = instance.data["productName"]
         filename = f"{folder_name}_{product_name}"
 
@@ -44,7 +44,7 @@ class IntegrateAnimation(
                 break
             if not rep:
                 continue
-            obj_id = rep["representation"]["_id"]
+            obj_id = rep["representation"]["id"]
 
             if obj_id:
                 json_dict["representation_id"] = str(obj_id)
@@ -32,7 +32,7 @@ class ValidateDeadlinePublish(pyblish.api.InstancePlugin,
         tree = bpy.context.scene.node_tree
         output_type = "CompositorNodeOutputFile"
         output_node = None
-        # Remove all output nodes that inlcude "AYON" in the name.
+        # Remove all output nodes that include "AYON" in the name.
         # There should be only one.
         for node in tree.nodes:
             if node.bl_idname == output_type and "AYON" in node.name:
@@ -37,7 +37,8 @@ class ValidateFileSaved(pyblish.api.ContextPlugin,
         if not context.data["currentFile"]:
             # File has not been saved at all and has no filename
             raise PublishValidationError(
-                "Current file is empty. Save the file before continuing."
+                "Current workfile has not been saved yet.\n"
+                "Save the workfile before continuing."
             )
 
         # Do not validate workfile has unsaved changes if only instances
@@ -12,7 +12,7 @@ from ayon_core.pipeline.publish import (
 import ayon_core.hosts.blender.api.action
 
 
-class ValidateMeshNoNegativeScale(pyblish.api.Validator,
+class ValidateMeshNoNegativeScale(pyblish.api.InstancePlugin,
                                   OptionalPyblishPluginMixin):
     """Ensure that meshes don't have a negative scale."""
 
@@ -0,0 +1,94 @@
+import inspect
+from typing import List
+
+import bpy
+
+import pyblish.api
+
+from ayon_core.pipeline.publish import (
+    ValidateContentsOrder,
+    OptionalPyblishPluginMixin,
+    PublishValidationError,
+    RepairAction
+)
+import ayon_core.hosts.blender.api.action
+
+
+class ValidateModelMeshUvMap1(
+    pyblish.api.InstancePlugin,
+    OptionalPyblishPluginMixin,
+):
+    """Validate model mesh uvs are named `map1`.
+
+    This is solely to get them to work nicely for the Maya pipeline.
+    """
+
+    order = ValidateContentsOrder
+    hosts = ["blender"]
+    families = ["model"]
+    label = "Mesh UVs named map1"
+    actions = [ayon_core.hosts.blender.api.action.SelectInvalidAction,
+               RepairAction]
+    optional = True
+    enabled = False
+
+    @classmethod
+    def get_invalid(cls, instance) -> List:
+
+        invalid = []
+        for obj in instance:
+            if obj.mode != "OBJECT":
+                cls.log.warning(
+                    f"Mesh object {obj.name} should be in 'OBJECT' mode"
+                    " to be properly checked."
+                )
+
+            obj_data = obj.data
+            if isinstance(obj_data, bpy.types.Mesh):
+                mesh = obj_data
+
+                # Ignore mesh without UVs
+                if not mesh.uv_layers:
+                    continue
+
+                # If mesh has map1 all is ok
+                if mesh.uv_layers.get("map1"):
+                    continue
+
+                cls.log.warning(
+                    f"Mesh object {obj.name} should be in 'OBJECT' mode"
+                    " to be properly checked."
+                )
+                invalid.append(obj)
+
+        return invalid
+
+    @classmethod
+    def repair(cls, instance):
+        for obj in cls.get_invalid(instance):
+            mesh = obj.data
+
+            # Rename the first UV set to map1
+            mesh.uv_layers[0].name = "map1"
+
+    def process(self, instance):
+        if not self.is_active(instance.data):
+            return
+
+        invalid = self.get_invalid(instance)
+        if invalid:
+            raise PublishValidationError(
+                f"Meshes found in instance without valid UV's: {invalid}",
+                description=self.get_description()
+            )
+
+    def get_description(self):
+        return inspect.cleandoc(
+            """## Meshes must have map1 uv set
+
+            To accompany a better Maya-focused pipeline with Alembics it is
+            expected that a Mesh has a `map1` UV set. Blender defaults to
+            a UV set named `UVMap` and thus needs to be renamed.
+
+            """
+        )
@@ -1,3 +1,4 @@
+import inspect
 from typing import List
 
 import mathutils
@@ -5,29 +6,26 @@ import bpy
 
 import pyblish.api
 
+from ayon_core.hosts.blender.api import plugin, lib
 import ayon_core.hosts.blender.api.action
 from ayon_core.pipeline.publish import (
     ValidateContentsOrder,
     OptionalPyblishPluginMixin,
-    PublishValidationError
+    PublishValidationError,
+    RepairAction
 )
 
 
 class ValidateTransformZero(pyblish.api.InstancePlugin,
                             OptionalPyblishPluginMixin):
-    """Transforms can't have any values
-
-    To solve this issue, try freezing the transforms. So long
-    as the transforms, rotation and scale values are zero,
-    you're all good.
-
-    """
+    """Transforms can't have any values"""
 
     order = ValidateContentsOrder
     hosts = ["blender"]
     families = ["model"]
     label = "Transform Zero"
-    actions = [ayon_core.hosts.blender.api.action.SelectInvalidAction]
+    actions = [ayon_core.hosts.blender.api.action.SelectInvalidAction,
+               RepairAction]
 
     _identity = mathutils.Matrix()
 
@@ -51,5 +49,46 @@ class ValidateTransformZero(pyblish.api.InstancePlugin,
         names = ", ".join(obj.name for obj in invalid)
         raise PublishValidationError(
             "Objects found in instance which do not"
-            f" have transform set to zero: {names}"
+            f" have transform set to zero: {names}",
+            description=self.get_description()
         )
+
+    @classmethod
+    def repair(cls, instance):
+
+        invalid = cls.get_invalid(instance)
+        if not invalid:
+            return
+
+        context = plugin.create_blender_context(
+            active=invalid[0], selected=invalid
+        )
+        with lib.maintained_selection():
+            with bpy.context.temp_override(**context):
+                plugin.deselect_all()
+                for obj in invalid:
+                    obj.select_set(True)
+
+                # TODO: Preferably this does allow custom pivot point locations
+                # and if so, this should likely apply to the delta instead
+                # using `bpy.ops.object.transforms_to_deltas(mode="ALL")`
+                bpy.ops.object.transform_apply(location=True,
+                                               rotation=True,
+                                               scale=True)
+
+    def get_description(self):
+        return inspect.cleandoc(
+            """## Transforms can't have any values.
+
+            The location, rotation and scale on the transform must be at
+            the default values. This also goes for the delta transforms.
+
+            To solve this issue, try freezing the transforms:
+            - `Object` > `Apply` > `All Transforms`
+
+            Using the Repair action directly will do the same.
+
+            So long as the transforms, rotation and scale values are zero,
+            you're all good.
+            """
+        )
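The new repair action wraps the freeze in AYON's `plugin.create_blender_context` helper. Outside the pipeline, the same effect can be approximated with Blender's own context override (available since Blender 3.2); the override keys below are my assumption of a sufficient minimal context, not taken from ayon-core:

```python
# Bare-bones sketch of the repair above without the AYON helpers.
import bpy

objects_to_freeze = list(bpy.context.selected_objects)
if objects_to_freeze:
    # Assumed minimal override: an active object plus the editable selection.
    override = {
        "active_object": objects_to_freeze[0],
        "selected_editable_objects": objects_to_freeze,
    }
    with bpy.context.temp_override(**override):
        # Freeze location, rotation and scale, i.e.
        # Object > Apply > All Transforms.
        bpy.ops.object.transform_apply(
            location=True, rotation=True, scale=True
        )
```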
@@ -3,7 +3,7 @@ import shutil
 import winreg
 import subprocess
 from ayon_core.lib import get_ayon_launcher_args
-from ayon_core.lib.applications import PreLaunchHook, LaunchTypes
+from ayon_applications import PreLaunchHook, LaunchTypes
 from ayon_core.hosts.celaction import CELACTION_ROOT_DIR
 
 
@@ -16,9 +16,9 @@ class CelactionPrelaunchHook(PreLaunchHook):
     launch_types = {LaunchTypes.local}
 
     def execute(self):
-        asset_doc = self.data["asset_doc"]
-        width = asset_doc["data"]["resolutionWidth"]
-        height = asset_doc["data"]["resolutionHeight"]
+        folder_attributes = self.data["folder_entity"]["attrib"]
+        width = folder_attributes["resolutionWidth"]
+        height = folder_attributes["resolutionHeight"]
 
         # Add workfile path to launch arguments
         workfile_path = self.workfile_path()
@@ -118,7 +118,7 @@ class CelactionPrelaunchHook(PreLaunchHook):
     def workfile_path(self):
         workfile_path = self.data["last_workfile_path"]
 
-        # copy workfile from template if doesnt exist any on path
+        # copy workfile from template if doesn't exist any on path
         if not os.path.exists(workfile_path):
             # TODO add ability to set different template workfile path via
             # settings
@@ -3,11 +3,11 @@ import sys
 from pprint import pformat
 
 
-class CollectCelactionCliKwargs(pyblish.api.Collector):
+class CollectCelactionCliKwargs(pyblish.api.ContextPlugin):
     """ Collects all keyword arguments passed from the terminal """
 
     label = "Collect Celaction Cli Kwargs"
-    order = pyblish.api.Collector.order - 0.1
+    order = pyblish.api.CollectorOrder - 0.1
 
     def process(self, context):
         args = list(sys.argv[1:])
@@ -1,8 +1,6 @@
 import os
 import pyblish.api
 
-from ayon_core.client import get_asset_name_identifier
-
 
 class CollectCelactionInstances(pyblish.api.ContextPlugin):
     """ Adds the celaction render instances """
@@ -16,24 +14,20 @@ class CollectCelactionInstances(pyblish.api.ContextPlugin):
         staging_dir = os.path.dirname(current_file)
         scene_file = os.path.basename(current_file)
         version = context.data["version"]
-        asset_entity = context.data["assetEntity"]
-        project_entity = context.data["projectEntity"]
-
-        asset_name = get_asset_name_identifier(asset_entity)
+        folder_entity = context.data["folderEntity"]
 
+        folder_attributes = folder_entity["attrib"]
+
         shared_instance_data = {
-            "folderPath": asset_name,
-            "frameStart": asset_entity["data"]["frameStart"],
-            "frameEnd": asset_entity["data"]["frameEnd"],
-            "handleStart": asset_entity["data"]["handleStart"],
-            "handleEnd": asset_entity["data"]["handleEnd"],
-            "fps": asset_entity["data"]["fps"],
-            "resolutionWidth": asset_entity["data"].get(
-                "resolutionWidth",
-                project_entity["data"]["resolutionWidth"]),
-            "resolutionHeight": asset_entity["data"].get(
-                "resolutionHeight",
-                project_entity["data"]["resolutionHeight"]),
+            "folderPath": folder_entity["path"],
+            "frameStart": folder_attributes["frameStart"],
+            "frameEnd": folder_attributes["frameEnd"],
+            "handleStart": folder_attributes["handleStart"],
+            "handleEnd": folder_attributes["handleEnd"],
+            "fps": folder_attributes["fps"],
+            "resolutionWidth": folder_attributes["resolutionWidth"],
+            "resolutionHeight": folder_attributes["resolutionHeight"],
             "pixelAspect": 1,
             "step": 1,
             "version": version
@@ -83,7 +77,7 @@ class CollectCelactionInstances(pyblish.api.ContextPlugin):
             # getting instance state
             instance.data["publish"] = True
 
-            # add assetEntity data into instance
+            # add folderEntity data into instance
             instance.data.update({
                 "label": "{} - farm".format(product_name),
                 "productType": product_type,
@@ -18,7 +18,7 @@ class CollectRenderPath(pyblish.api.InstancePlugin):
     def process(self, instance):
         anatomy = instance.context.data["anatomy"]
         anatomy_data = copy.deepcopy(instance.data["anatomyData"])
-        padding = anatomy.templates.get("frame_padding", 4)
+        padding = anatomy.templates_obj.frame_padding
         product_type = "render"
         anatomy_data.update({
             "frame": f"%0{padding}d",
@@ -28,18 +28,17 @@ class CollectRenderPath(pyblish.api.InstancePlugin):
         })
         anatomy_data["product"]["type"] = product_type
 
-        anatomy_filled = anatomy.format(anatomy_data)
-
         # get anatomy rendering keys
         r_anatomy_key = self.anatomy_template_key_render_files
         m_anatomy_key = self.anatomy_template_key_metadata
 
         # get folder and path for rendering images from celaction
-        render_dir = anatomy_filled[r_anatomy_key]["folder"]
-        render_path = anatomy_filled[r_anatomy_key]["path"]
+        r_template_item = anatomy.get_template_item("publish", r_anatomy_key)
+        render_dir = r_template_item["directory"].format_strict(anatomy_data)
+        render_path = r_template_item["path"].format_strict(anatomy_data)
         self.log.debug("__ render_path: `{}`".format(render_path))
 
-        # create dir if it doesnt exists
+        # create dir if it doesn't exists
         try:
             if not os.path.isdir(render_dir):
                 os.makedirs(render_dir, exist_ok=True)
@@ -51,11 +50,14 @@ class CollectRenderPath(pyblish.api.InstancePlugin):
         instance.data["path"] = render_path
 
         # get anatomy for published renders folder path
-        if anatomy_filled.get(m_anatomy_key):
-            instance.data["publishRenderMetadataFolder"] = anatomy_filled[
-                m_anatomy_key]["folder"]
-            self.log.info("Metadata render path: `{}`".format(
-                instance.data["publishRenderMetadataFolder"]
-            ))
+        m_template_item = anatomy.get_template_item(
+            "publish", m_anatomy_key, default=None
+        )
+        if m_template_item is not None:
+            metadata_path = m_template_item["directory"].format_strict(
+                anatomy_data
+            )
+            instance.data["publishRenderMetadataFolder"] = metadata_path
+            self.log.info("Metadata render path: `{}`".format(metadata_path))
 
         self.log.info(f"Render output path set to: `{render_path}`")
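Both CollectRenderPath hunks swap the eagerly formatted `anatomy.format(...)` result for lazy `get_template_item(...)` lookups whose templates are filled with `format_strict`, which fails on missing keys instead of leaving placeholders behind. A toy stand-in illustrating that strict behaviour (the `StrictTemplate` class is illustrative; the real template items come from ayon-core's anatomy classes):

```python
# Toy stand-in demonstrating strict template formatting: missing keys
# raise instead of producing half-filled paths.
class StrictTemplate:
    def __init__(self, template: str):
        self.template = template

    def format_strict(self, data: dict) -> str:
        # str.format_map raises KeyError on any missing key.
        return self.template.format_map(data)


item = {
    "directory": StrictTemplate("{root}/{project}/renders"),
    "path": StrictTemplate("{root}/{project}/renders/{product}.{frame}.exr"),
}
data = {
    "root": "/mnt/prj", "project": "demo",
    "product": "render", "frame": "%04d",
}
print(item["path"].format_strict(data))
# -> /mnt/prj/demo/renders/render.%04d.exr
```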
@@ -1,5 +1,5 @@
 """
-OpenPype Autodesk Flame api
+AYON Autodesk Flame api
 """
 from .constants import (
     COLOR_MAP,
@@ -23,7 +23,7 @@ from .lib import (
     reset_segment_selection,
     get_segment_attributes,
     get_clips_in_reels,
-    get_reformated_filename,
+    get_reformatted_filename,
     get_frame_from_filename,
     get_padding_from_filename,
     maintained_object_duplication,
@@ -101,7 +101,7 @@ __all__ = [
     "reset_segment_selection",
     "get_segment_attributes",
     "get_clips_in_reels",
-    "get_reformated_filename",
+    "get_reformatted_filename",
     "get_frame_from_filename",
     "get_padding_from_filename",
     "maintained_object_duplication",
@@ -1,14 +1,14 @@
 
 """
-OpenPype Flame api constances
+AYON Flame api constances
 """
-# OpenPype marker workflow variables
+# AYON marker workflow variables
 MARKER_NAME = "OpenPypeData"
 MARKER_DURATION = 0
 MARKER_COLOR = "cyan"
 MARKER_PUBLISH_DEFAULT = False
 
-# OpenPype color definitions
+# AYON color definitions
 COLOR_MAP = {
     "red": (1.0, 0.0, 0.0),
     "orange": (1.0, 0.5, 0.0),
@@ -607,7 +607,7 @@ def get_clips_in_reels(project):
     return output_clips
 
 
-def get_reformated_filename(filename, padded=True):
+def get_reformatted_filename(filename, padded=True):
     """
     Return fixed python expression path
 
@@ -615,10 +615,10 @@ def get_reformated_filename(filename, padded=True):
         filename (str): file name
 
     Returns:
-        type: string with reformated path
+        type: string with reformatted path
 
     Example:
-        get_reformated_filename("plate.1001.exr") > plate.%04d.exr
+        get_reformatted_filename("plate.1001.exr") > plate.%04d.exr
 
     """
     found = FRAME_PATTERN.search(filename)
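Per its docstring, the renamed `get_reformatted_filename` turns an embedded frame number into a printf-style padding token. A minimal regex sketch of that behaviour (this is an approximation for illustration, not Flame's actual `FRAME_PATTERN` or implementation):

```python
# Sketch: replace the frame number in "name.1001.ext" with "%04d".
import re

FRAME_PATTERN = re.compile(r"\.(\d+)\.")  # assumed pattern for this sketch


def reformat_filename(filename: str, padded: bool = True) -> str:
    found = FRAME_PATTERN.search(filename)
    if not found:
        return filename
    frame = found.group(1)
    # Padded: keep the original digit count; unpadded: bare %d token.
    token = "%0{}d".format(len(frame)) if padded else "%d"
    return filename.replace(".{}.".format(frame), ".{}.".format(token))


print(reformat_filename("plate.1001.exr"))  # -> plate.%04d.exr
```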
@@ -980,7 +980,7 @@ class MediaInfoFile(object):
 
     @property
     def file_pattern(self):
-        """Clips file patter
+        """Clips file pattern.
 
         Returns:
             str: file pattern. ex. file.[1-2].exr
|||
Some files were not shown because too many files have changed in this diff Show more
Loading…
Add table
Add a link
Reference in a new issue