Mirror of https://github.com/ynput/ayon-core.git (synced 2025-12-25 05:14:40 +01:00)

commit 80b94fe807

Merge branch 'develop' of https://github.com/ynput/ayon-core into enhancement/maya_validate_rig_out_set_node_ids_message

# Conflicts:
#   client/ayon_core/hosts/maya/plugins/publish/validate_rig_out_set_node_ids.py

76 changed files with 2471 additions and 493 deletions
24
.github/workflows/pr_linting.yml
vendored
Normal file
|
|
@ -0,0 +1,24 @@
|
|||
name: 📇 Code Linting
|
||||
|
||||
on:
|
||||
push:
|
||||
branches: [ develop ]
|
||||
pull_request:
|
||||
branches: [ develop ]
|
||||
|
||||
workflow_dispatch:
|
||||
|
||||
concurrency:
|
||||
group: ${{ github.workflow }}-${{ github.event.pull_request.number}}
|
||||
cancel-in-progress: true
|
||||
|
||||
permissions:
|
||||
contents: read
|
||||
pull-requests: write
|
||||
|
||||
jobs:
|
||||
linting:
|
||||
runs-on: ubuntu-latest
|
||||
steps:
|
||||
- uses: actions/checkout@v4
|
||||
- uses: chartboost/ruff-action@v1
|
||||
1
.gitignore
vendored
|
|
@ -77,6 +77,7 @@ dump.sql
|
|||
|
||||
# Poetry
|
||||
########
|
||||
.poetry/
|
||||
.python-version
|
||||
.editorconfig
|
||||
.pre-commit-config.yaml
|
||||
|
|
|
|||
|
|
@ -1,3 +1,3 @@
|
|||
flake8:
|
||||
enabled: true
|
||||
config_file: setup.cfg
|
||||
flake8:
|
||||
enabled: true
|
||||
config_file: setup.cfg
|
||||
|
|
|
|||
|
|
@ -1,12 +1,27 @@
|
|||
# See https://pre-commit.com for more information
|
||||
# See https://pre-commit.com/hooks.html for more hooks
|
||||
repos:
|
||||
- repo: https://github.com/pre-commit/pre-commit-hooks
|
||||
rev: v4.4.0
|
||||
hooks:
|
||||
- id: trailing-whitespace
|
||||
- id: end-of-file-fixer
|
||||
- id: check-yaml
|
||||
- id: check-added-large-files
|
||||
- id: no-commit-to-branch
|
||||
args: [ '--pattern', '^(?!((release|enhancement|feature|bugfix|documentation|tests|local|chore)\/[a-zA-Z0-9\-_]+)$).*' ]
|
||||
- repo: https://github.com/pre-commit/pre-commit-hooks
|
||||
rev: v4.4.0
|
||||
hooks:
|
||||
- id: trailing-whitespace
|
||||
- id: end-of-file-fixer
|
||||
- id: check-yaml
|
||||
- id: check-added-large-files
|
||||
- id: no-commit-to-branch
|
||||
args: [ '--pattern', '^(?!((release|enhancement|feature|bugfix|documentation|tests|local|chore)\/[a-zA-Z0-9\-_]+)$).*' ]
|
||||
- repo: https://github.com/codespell-project/codespell
|
||||
rev: v2.2.6
|
||||
hooks:
|
||||
- id: codespell
|
||||
additional_dependencies:
|
||||
- tomli
|
||||
|
||||
- repo: https://github.com/astral-sh/ruff-pre-commit
|
||||
# Ruff version.
|
||||
rev: v0.3.3
|
||||
hooks:
|
||||
# Run the linter.
|
||||
- id: ruff
|
||||
# Run the formatter.
|
||||
# - id: ruff-format
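# Hedged sketch (not part of the diff): what the `no-commit-to-branch`
# pattern above does. The hook blocks a commit when the current branch name
# MATCHES the pattern, i.e. when it does not follow the "<prefix>/<name>"
# convention listed in the regex. Branch names below are made up.
import re

pattern = (
    r"^(?!((release|enhancement|feature|bugfix|documentation"
    r"|tests|local|chore)\/[a-zA-Z0-9\-_]+)$).*"
)
for branch in ("develop", "feature/new-loader", "bugfix/fix_ids"):
    blocked = bool(re.match(pattern, branch))
    print(branch, "blocked" if blocked else "allowed")
# develop blocked
# feature/new-loader allowed
# bugfix/fix_ids allowed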
|
||||
|
24
README.md
@@ -1,8 +1,8 @@
AYON Core addon
========
AYON Core Addon
===============

AYON core provides the base building blocks for all other AYON addons and integrations and is responsible for discovery and initialization of other addons.

Some of its key functions include:
- It is used as the main command line handler in the [ayon-launcher](https://github.com/ynput/ayon-launcher) application.
@@ -13,8 +13,20 @@ AYON core provides the base building blocks for all other AYON addons and integr
- Defines pipeline API used by other integrations
- Provides all graphical tools for artists
- Defines AYON QT styling
- A bunch more things

Together with [ayon-launcher](https://github.com/ynput/ayon-launcher), they form the base of the AYON pipeline and are among the few compulsory addons needed for the AYON pipeline to be useful in a meaningful way.

AYON-core is a successor to OpenPype repository (minus all the addons) and still in the process of cleaning up of all references. Please bear with us during this transitional phase.
AYON-core is a successor to the [OpenPype repository](https://github.com/ynput/OpenPype) (minus all the addons) and is still in the process of cleaning up all references. Please bear with us during this transitional phase.

Development and testing notes
-----------------------------
There is a `pyproject.toml` file in the root of the repository. This file is used to define the development environment and is used by `poetry` to create a virtual environment.
This virtual environment is used to run tests and to develop the code, and to help with
linting and formatting. Dependencies defined here are not used in actual addon
deployment - for that you need to edit the `./client/pyproject.toml` file. That file
will then be processed by [ayon-dependencies-tool](https://github.com/ynput/ayon-dependencies-tool)
to create the dependency package.

Right now, this file needs to be synced with dependencies manually, but in the future
we plan to automate the process of development environment creation.
|
|||
|
|
@ -1,58 +0,0 @@
|
|||
import os
|
||||
|
||||
from ayon_core.lib import get_ayon_launcher_args
|
||||
from ayon_core.lib.applications import (
|
||||
get_non_python_host_kwargs,
|
||||
PreLaunchHook,
|
||||
LaunchTypes,
|
||||
)
|
||||
|
||||
from ayon_core import AYON_CORE_ROOT
|
||||
|
||||
|
||||
class NonPythonHostHook(PreLaunchHook):
|
||||
"""Launch arguments preparation.
|
||||
|
||||
Non python host implementation do not launch host directly but use
|
||||
python script which launch the host. For these cases it is necessary to
|
||||
prepend python (or ayon) executable and script path before application's.
|
||||
"""
|
||||
app_groups = {"harmony", "photoshop", "aftereffects"}
|
||||
|
||||
order = 20
|
||||
launch_types = {LaunchTypes.local}
|
||||
|
||||
def execute(self):
|
||||
# Pop executable
|
||||
executable_path = self.launch_context.launch_args.pop(0)
|
||||
|
||||
# Pop rest of launch arguments - There should not be other arguments!
|
||||
remainders = []
|
||||
while self.launch_context.launch_args:
|
||||
remainders.append(self.launch_context.launch_args.pop(0))
|
||||
|
||||
script_path = os.path.join(
|
||||
AYON_CORE_ROOT,
|
||||
"scripts",
|
||||
"non_python_host_launch.py"
|
||||
)
|
||||
|
||||
new_launch_args = get_ayon_launcher_args(
|
||||
"run", script_path, executable_path
|
||||
)
|
||||
# Add workfile path if exists
|
||||
workfile_path = self.data["last_workfile_path"]
|
||||
if (
|
||||
self.data.get("start_last_workfile")
|
||||
and workfile_path
|
||||
and os.path.exists(workfile_path)):
|
||||
new_launch_args.append(workfile_path)
|
||||
|
||||
# Append as whole list as these areguments should not be separated
|
||||
self.launch_context.launch_args.append(new_launch_args)
|
||||
|
||||
if remainders:
|
||||
self.launch_context.launch_args.extend(remainders)
|
||||
|
||||
self.launch_context.kwargs = \
|
||||
get_non_python_host_kwargs(self.launch_context.kwargs)
|
||||
|
|
@ -1,6 +1,12 @@
|
|||
from .addon import AfterEffectsAddon
|
||||
from .addon import (
|
||||
AFTEREFFECTS_ADDON_ROOT,
|
||||
AfterEffectsAddon,
|
||||
get_launch_script_path,
|
||||
)
|
||||
|
||||
|
||||
__all__ = (
|
||||
"AFTEREFFECTS_ADDON_ROOT",
|
||||
"AfterEffectsAddon",
|
||||
"get_launch_script_path",
|
||||
)
|
||||
|
|
|
|||
|
|
@ -1,5 +1,9 @@
|
|||
import os
|
||||
|
||||
from ayon_core.addon import AYONAddon, IHostAddon
|
||||
|
||||
AFTEREFFECTS_ADDON_ROOT = os.path.dirname(os.path.abspath(__file__))
|
||||
|
||||
|
||||
class AfterEffectsAddon(AYONAddon, IHostAddon):
|
||||
name = "aftereffects"
|
||||
|
|
@ -17,3 +21,16 @@ class AfterEffectsAddon(AYONAddon, IHostAddon):
|
|||
|
||||
def get_workfile_extensions(self):
|
||||
return [".aep"]
|
||||
|
||||
def get_launch_hook_paths(self, app):
|
||||
if app.host_name != self.host_name:
|
||||
return []
|
||||
return [
|
||||
os.path.join(AFTEREFFECTS_ADDON_ROOT, "hooks")
|
||||
]
|
||||
|
||||
|
||||
def get_launch_script_path():
|
||||
return os.path.join(
|
||||
AFTEREFFECTS_ADDON_ROOT, "api", "launch_script.py"
|
||||
)
|
||||
|
|
|
|||
|
|
@ -7,7 +7,6 @@ import asyncio
|
|||
import functools
|
||||
import traceback
|
||||
|
||||
|
||||
from wsrpc_aiohttp import (
|
||||
WebSocketRoute,
|
||||
WebSocketAsync
|
||||
|
|
|
|||
|
|
@ -1,4 +1,4 @@
|
|||
"""Script wraps launch mechanism of non python host implementations.
|
||||
"""Script wraps launch mechanism of AfterEffects implementations.
|
||||
|
||||
Arguments passed to the script are passed to launch function in host
|
||||
implementation. In all cases requires host app executable and may contain
|
||||
|
|
@ -8,6 +8,8 @@ workfile or others.
|
|||
import os
|
||||
import sys
|
||||
|
||||
from ayon_core.hosts.aftereffects.api.launch_logic import main as host_main
|
||||
|
||||
# Get current file to locate start point of sys.argv
|
||||
CURRENT_FILE = os.path.abspath(__file__)
|
||||
|
||||
|
|
@ -79,26 +81,9 @@ def main(argv):
|
|||
if after_script_idx is not None:
|
||||
launch_args = sys_args[after_script_idx:]
|
||||
|
||||
host_name = os.environ["AYON_HOST_NAME"].lower()
|
||||
if host_name == "photoshop":
|
||||
# TODO refactor launch logic according to AE
|
||||
from ayon_core.hosts.photoshop.api.lib import main
|
||||
elif host_name == "aftereffects":
|
||||
from ayon_core.hosts.aftereffects.api.launch_logic import main
|
||||
elif host_name == "harmony":
|
||||
from ayon_core.hosts.harmony.api.lib import main
|
||||
else:
|
||||
title = "Unknown host name"
|
||||
message = (
|
||||
"BUG: Environment variable AYON_HOST_NAME contains unknown"
|
||||
" host name \"{}\""
|
||||
).format(host_name)
|
||||
show_error_messagebox(title, message)
|
||||
return
|
||||
|
||||
if launch_args:
|
||||
# Launch host implementation
|
||||
main(*launch_args)
|
||||
host_main(*launch_args)
|
||||
else:
|
||||
# Show message box
|
||||
on_invalid_args(after_script_idx is None)
|
||||
91
client/ayon_core/hosts/aftereffects/hooks/pre_launch_args.py
Normal file
|
|
@ -0,0 +1,91 @@
|
|||
import os
|
||||
import platform
|
||||
import subprocess
|
||||
|
||||
from ayon_core.lib import (
|
||||
get_ayon_launcher_args,
|
||||
is_using_ayon_console,
|
||||
)
|
||||
from ayon_core.lib.applications import (
|
||||
PreLaunchHook,
|
||||
LaunchTypes,
|
||||
)
|
||||
from ayon_core.hosts.aftereffects import get_launch_script_path
|
||||
|
||||
|
||||
def get_launch_kwargs(kwargs):
|
||||
"""Explicit setting of kwargs for Popen for AfterEffects.
|
||||
|
||||
Expected behavior
|
||||
- ayon_console opens window with logs
|
||||
- ayon has stdout/stderr available for capturing
|
||||
|
||||
Args:
|
||||
kwargs (Union[dict, None]): Current kwargs or None.
|
||||
|
||||
"""
|
||||
if kwargs is None:
|
||||
kwargs = {}
|
||||
|
||||
if platform.system().lower() != "windows":
|
||||
return kwargs
|
||||
|
||||
if is_using_ayon_console():
|
||||
kwargs.update({
|
||||
"creationflags": subprocess.CREATE_NEW_CONSOLE
|
||||
})
|
||||
else:
|
||||
kwargs.update({
|
||||
"creationflags": subprocess.CREATE_NO_WINDOW,
|
||||
"stdout": subprocess.DEVNULL,
|
||||
"stderr": subprocess.DEVNULL
|
||||
})
|
||||
return kwargs
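# Hedged sketch (not part of the diff): how the kwargs built above are meant
# to be consumed. The executable path is made up; in practice the launcher
# supplies the real argument list.
if __name__ == "__main__":
    demo_kwargs = get_launch_kwargs(None)
    print(demo_kwargs)
    # e.g. on Windows without ayon_console:
    # {'creationflags': ..., 'stdout': DEVNULL, 'stderr': DEVNULL}
    # subprocess.Popen(["C:/Program Files/Adobe/AfterFX.exe"], **demo_kwargs)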
|
||||
|
||||
|
||||
class AEPrelaunchHook(PreLaunchHook):
|
||||
"""Launch arguments preparation.
|
||||
|
||||
Hook add python executable and script path to AE implementation before
|
||||
AE executable and add last workfile path to launch arguments.
|
||||
|
||||
Existence of last workfile is checked. If workfile does not exists tries
|
||||
to copy templated workfile from predefined path.
|
||||
"""
|
||||
app_groups = {"aftereffects"}
|
||||
|
||||
order = 20
|
||||
launch_types = {LaunchTypes.local}
|
||||
|
||||
def execute(self):
|
||||
# Pop executable
|
||||
executable_path = self.launch_context.launch_args.pop(0)
|
||||
|
||||
# Pop rest of launch arguments - There should not be other arguments!
|
||||
remainders = []
|
||||
while self.launch_context.launch_args:
|
||||
remainders.append(self.launch_context.launch_args.pop(0))
|
||||
|
||||
script_path = get_launch_script_path()
|
||||
|
||||
new_launch_args = get_ayon_launcher_args(
|
||||
"run", script_path, executable_path
|
||||
)
|
||||
# Add workfile path if exists
|
||||
workfile_path = self.data["last_workfile_path"]
|
||||
if (
|
||||
self.data.get("start_last_workfile")
|
||||
and workfile_path
|
||||
and os.path.exists(workfile_path)
|
||||
):
|
||||
new_launch_args.append(workfile_path)
|
||||
|
||||
# Append as whole list as these arguments should not be separated
|
||||
self.launch_context.launch_args.append(new_launch_args)
|
||||
|
||||
if remainders:
|
||||
self.launch_context.launch_args.extend(remainders)
|
||||
|
||||
self.launch_context.kwargs = get_launch_kwargs(
|
||||
self.launch_context.kwargs
|
||||
)
|
||||
|
|
@ -1,10 +1,12 @@
|
|||
from .addon import (
|
||||
HARMONY_HOST_DIR,
|
||||
HARMONY_ADDON_ROOT,
|
||||
HarmonyAddon,
|
||||
get_launch_script_path,
|
||||
)
|
||||
|
||||
|
||||
__all__ = (
|
||||
"HARMONY_HOST_DIR",
|
||||
"HARMONY_ADDON_ROOT",
|
||||
"HarmonyAddon",
|
||||
"get_launch_script_path",
|
||||
)
|
||||
|
|
|
|||
|
|
@ -1,7 +1,7 @@
|
|||
import os
|
||||
from ayon_core.addon import AYONAddon, IHostAddon
|
||||
|
||||
HARMONY_HOST_DIR = os.path.dirname(os.path.abspath(__file__))
|
||||
HARMONY_ADDON_ROOT = os.path.dirname(os.path.abspath(__file__))
|
||||
|
||||
|
||||
class HarmonyAddon(AYONAddon, IHostAddon):
|
||||
|
|
@ -11,10 +11,23 @@ class HarmonyAddon(AYONAddon, IHostAddon):
|
|||
def add_implementation_envs(self, env, _app):
|
||||
"""Modify environments to contain all required for implementation."""
|
||||
openharmony_path = os.path.join(
|
||||
HARMONY_HOST_DIR, "vendor", "OpenHarmony"
|
||||
HARMONY_ADDON_ROOT, "vendor", "OpenHarmony"
|
||||
)
|
||||
# TODO check if is already set? What to do if is already set?
|
||||
env["LIB_OPENHARMONY_PATH"] = openharmony_path
|
||||
|
||||
def get_workfile_extensions(self):
|
||||
return [".zip"]
|
||||
|
||||
def get_launch_hook_paths(self, app):
|
||||
if app.host_name != self.host_name:
|
||||
return []
|
||||
return [
|
||||
os.path.join(HARMONY_ADDON_ROOT, "hooks")
|
||||
]
|
||||
|
||||
|
||||
def get_launch_script_path():
|
||||
return os.path.join(
|
||||
HARMONY_ADDON_ROOT, "api", "launch_script.py"
|
||||
)
|
||||
|
|
|
|||
93
client/ayon_core/hosts/harmony/api/launch_script.py
Normal file
|
|
@ -0,0 +1,93 @@
|
|||
"""Script wraps launch mechanism of Harmony implementations.
|
||||
|
||||
Arguments passed to the script are passed to launch function in host
|
||||
implementation. In all cases requires host app executable and may contain
|
||||
workfile or others.
|
||||
"""
|
||||
|
||||
import os
|
||||
import sys
|
||||
|
||||
from ayon_core.hosts.harmony.api.lib import main as host_main
|
||||
|
||||
# Get current file to locate start point of sys.argv
|
||||
CURRENT_FILE = os.path.abspath(__file__)
|
||||
|
||||
|
||||
def show_error_messagebox(title, message, detail_message=None):
|
||||
"""Function will show message and process ends after closing it."""
|
||||
from qtpy import QtWidgets, QtCore
|
||||
from ayon_core import style
|
||||
|
||||
app = QtWidgets.QApplication([])
|
||||
app.setStyleSheet(style.load_stylesheet())
|
||||
|
||||
msgbox = QtWidgets.QMessageBox()
|
||||
msgbox.setWindowTitle(title)
|
||||
msgbox.setText(message)
|
||||
|
||||
if detail_message:
|
||||
msgbox.setDetailedText(detail_message)
|
||||
|
||||
msgbox.setWindowModality(QtCore.Qt.ApplicationModal)
|
||||
msgbox.show()
|
||||
|
||||
sys.exit(app.exec_())
|
||||
|
||||
|
||||
def on_invalid_args(script_not_found):
|
||||
"""Show to user message box saying that something went wrong.
|
||||
|
||||
Tell user that arguments to launch implementation are invalid with
|
||||
arguments details.
|
||||
|
||||
Args:
|
||||
script_not_found (bool): Use different message based on this value.
|
||||
"""
|
||||
|
||||
title = "Invalid arguments"
|
||||
joined_args = ", ".join("\"{}\"".format(arg) for arg in sys.argv)
|
||||
if script_not_found:
|
||||
submsg = "Where couldn't find script path:\n\"{}\""
|
||||
else:
|
||||
submsg = "Expected Host executable after script path:\n\"{}\""
|
||||
|
||||
message = "BUG: Got invalid arguments so can't launch Host application."
|
||||
detail_message = "Process was launched with arguments:\n{}\n\n{}".format(
|
||||
joined_args,
|
||||
submsg.format(CURRENT_FILE)
|
||||
)
|
||||
|
||||
show_error_messagebox(title, message, detail_message)
|
||||
|
||||
|
||||
def main(argv):
|
||||
# Modify current file path to find match in sys.argv which may be different
|
||||
# on windows (different letter cases and slashes).
|
||||
modified_current_file = CURRENT_FILE.replace("\\", "/").lower()
|
||||
|
||||
# Create a copy of sys argv
|
||||
sys_args = list(argv)
|
||||
after_script_idx = None
|
||||
# Find script path in sys.argv to know index of argv where host
|
||||
# executable should be.
|
||||
for idx, item in enumerate(sys_args):
|
||||
if item.replace("\\", "/").lower() == modified_current_file:
|
||||
after_script_idx = idx + 1
|
||||
break
|
||||
|
||||
# Validate that there is at least one argument after script path
|
||||
launch_args = None
|
||||
if after_script_idx is not None:
|
||||
launch_args = sys_args[after_script_idx:]
|
||||
|
||||
if launch_args:
|
||||
# Launch host implementation
|
||||
host_main(*launch_args)
|
||||
else:
|
||||
# Show message box
|
||||
on_invalid_args(after_script_idx is None)
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
main(sys.argv)
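# Hedged sketch (not part of the diff): how the argv scan in main() above
# resolves the host executable and workfile. All paths are made up.
def _demo_argv_scan():
    current = "C:/ayon/harmony/api/launch_script.py".replace("\\", "/").lower()
    argv = [
        "ayon", "run",
        "C:/ayon/harmony/api/launch_script.py",        # matches CURRENT_FILE
        "C:/Toon Boom Harmony 22/HarmonyPremium.exe",  # host executable
        "C:/work/scene.zip",                           # optional workfile
    ]
    after_script_idx = None
    for idx, item in enumerate(argv):
        if item.replace("\\", "/").lower() == current:
            after_script_idx = idx + 1
            break
    # Everything after the script path is handed to host_main()
    return argv[after_script_idx:]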
|
||||
|
|
@ -1,5 +1,6 @@
|
|||
# -*- coding: utf-8 -*-
|
||||
"""Utility functions used for Avalon - Harmony integration."""
|
||||
import platform
|
||||
import subprocess
|
||||
import threading
|
||||
import os
|
||||
|
|
@ -14,15 +15,16 @@ import json
|
|||
import signal
|
||||
import time
|
||||
from uuid import uuid4
|
||||
from qtpy import QtWidgets, QtCore, QtGui
|
||||
import collections
|
||||
|
||||
from .server import Server
|
||||
from qtpy import QtWidgets, QtCore, QtGui
|
||||
|
||||
from ayon_core.lib import is_using_ayon_console
|
||||
from ayon_core.tools.stdout_broker.app import StdOutBroker
|
||||
from ayon_core.tools.utils import host_tools
|
||||
from ayon_core import style
|
||||
from ayon_core.lib.applications import get_non_python_host_kwargs
|
||||
|
||||
from .server import Server
|
||||
|
||||
# Setup logging.
|
||||
log = logging.getLogger(__name__)
|
||||
|
|
@ -324,7 +326,18 @@ def launch_zip_file(filepath):
|
|||
return
|
||||
|
||||
print("Launching {}".format(scene_path))
|
||||
kwargs = get_non_python_host_kwargs({}, False)
|
||||
# QUESTION Could we use 'run_detached_process' from 'ayon_core.lib'?
|
||||
kwargs = {}
|
||||
if (
|
||||
platform.system().lower() == "windows"
|
||||
and not is_using_ayon_console()
|
||||
):
|
||||
kwargs.update({
|
||||
"creationflags": subprocess.CREATE_NO_WINDOW,
|
||||
"stdout": subprocess.DEVNULL,
|
||||
"stderr": subprocess.DEVNULL
|
||||
})
|
||||
|
||||
process = subprocess.Popen(
|
||||
[ProcessContext.application_path, scene_path],
|
||||
**kwargs
|
||||
|
|
|
|||
|
|
@ -15,13 +15,13 @@ from ayon_core.pipeline import (
|
|||
from ayon_core.pipeline.load import get_outdated_containers
|
||||
from ayon_core.pipeline.context_tools import get_current_project_folder
|
||||
|
||||
from ayon_core.hosts.harmony import HARMONY_HOST_DIR
|
||||
from ayon_core.hosts.harmony import HARMONY_ADDON_ROOT
|
||||
import ayon_core.hosts.harmony.api as harmony
|
||||
|
||||
|
||||
log = logging.getLogger("ayon_core.hosts.harmony")
|
||||
|
||||
PLUGINS_DIR = os.path.join(HARMONY_HOST_DIR, "plugins")
|
||||
PLUGINS_DIR = os.path.join(HARMONY_ADDON_ROOT, "plugins")
|
||||
PUBLISH_PATH = os.path.join(PLUGINS_DIR, "publish")
|
||||
LOAD_PATH = os.path.join(PLUGINS_DIR, "load")
|
||||
CREATE_PATH = os.path.join(PLUGINS_DIR, "create")
|
||||
|
|
|
|||
91
client/ayon_core/hosts/harmony/hooks/pre_launch_args.py
Normal file
|
|
@ -0,0 +1,91 @@
|
|||
import os
|
||||
import platform
|
||||
import subprocess
|
||||
|
||||
from ayon_core.lib import (
|
||||
get_ayon_launcher_args,
|
||||
is_using_ayon_console,
|
||||
)
|
||||
from ayon_core.lib.applications import (
|
||||
PreLaunchHook,
|
||||
LaunchTypes,
|
||||
)
|
||||
from ayon_core.hosts.harmony import get_launch_script_path
|
||||
|
||||
|
||||
def get_launch_kwargs(kwargs):
|
||||
"""Explicit setting of kwargs for Popen for Harmony.
|
||||
|
||||
Expected behavior
|
||||
- ayon_console opens window with logs
|
||||
- ayon has stdout/stderr available for capturing
|
||||
|
||||
Args:
|
||||
kwargs (Union[dict, None]): Current kwargs or None.
|
||||
|
||||
"""
|
||||
if kwargs is None:
|
||||
kwargs = {}
|
||||
|
||||
if platform.system().lower() != "windows":
|
||||
return kwargs
|
||||
|
||||
if is_using_ayon_console():
|
||||
kwargs.update({
|
||||
"creationflags": subprocess.CREATE_NEW_CONSOLE
|
||||
})
|
||||
else:
|
||||
kwargs.update({
|
||||
"creationflags": subprocess.CREATE_NO_WINDOW,
|
||||
"stdout": subprocess.DEVNULL,
|
||||
"stderr": subprocess.DEVNULL
|
||||
})
|
||||
return kwargs
|
||||
|
||||
|
||||
class HarmonyPrelaunchHook(PreLaunchHook):
|
||||
"""Launch arguments preparation.
|
||||
|
||||
Hook add python executable and script path to Harmony implementation
|
||||
before Harmony executable and add last workfile path to launch arguments.
|
||||
|
||||
Existence of last workfile is checked. If workfile does not exists tries
|
||||
to copy templated workfile from predefined path.
|
||||
"""
|
||||
app_groups = {"harmony"}
|
||||
|
||||
order = 20
|
||||
launch_types = {LaunchTypes.local}
|
||||
|
||||
def execute(self):
|
||||
# Pop executable
|
||||
executable_path = self.launch_context.launch_args.pop(0)
|
||||
|
||||
# Pop rest of launch arguments - There should not be other arguments!
|
||||
remainders = []
|
||||
while self.launch_context.launch_args:
|
||||
remainders.append(self.launch_context.launch_args.pop(0))
|
||||
|
||||
script_path = get_launch_script_path()
|
||||
|
||||
new_launch_args = get_ayon_launcher_args(
|
||||
"run", script_path, executable_path
|
||||
)
|
||||
# Add workfile path if exists
|
||||
workfile_path = self.data["last_workfile_path"]
|
||||
if (
|
||||
self.data.get("start_last_workfile")
|
||||
and workfile_path
|
||||
and os.path.exists(workfile_path)
|
||||
):
|
||||
new_launch_args.append(workfile_path)
|
||||
|
||||
# Append as whole list as these arguments should not be separated
|
||||
self.launch_context.launch_args.append(new_launch_args)
|
||||
|
||||
if remainders:
|
||||
self.launch_context.launch_args.extend(remainders)
|
||||
|
||||
self.launch_context.kwargs = get_launch_kwargs(
|
||||
self.launch_context.kwargs
|
||||
)
|
||||
|
|
@ -11,7 +11,8 @@ class AbcLoader(load.LoaderPlugin):
|
|||
|
||||
product_types = {"model", "animation", "pointcache", "gpuCache"}
|
||||
label = "Load Alembic"
|
||||
representations = ["abc"]
|
||||
representations = ["*"]
|
||||
extensions = {"abc"}
|
||||
order = -10
|
||||
icon = "code-fork"
|
||||
color = "orange"
|
||||
|
|
|
|||
|
|
@ -11,7 +11,8 @@ class AbcArchiveLoader(load.LoaderPlugin):
|
|||
|
||||
product_types = {"model", "animation", "pointcache", "gpuCache"}
|
||||
label = "Load Alembic as Archive"
|
||||
representations = ["abc"]
|
||||
representations = ["*"]
|
||||
extensions = {"abc"}
|
||||
order = -5
|
||||
icon = "code-fork"
|
||||
color = "orange"
|
||||
|
|
|
|||
|
|
@ -167,6 +167,9 @@ class CameraLoader(load.LoaderPlugin):
|
|||
|
||||
temp_camera.destroy()
|
||||
|
||||
def switch(self, container, context):
|
||||
self.update(container, context)
|
||||
|
||||
def remove(self, container):
|
||||
|
||||
node = container["node"]
|
||||
|
|
@ -195,7 +198,6 @@ class CameraLoader(load.LoaderPlugin):
|
|||
def _match_maya_render_mask(self, camera):
|
||||
"""Workaround to match Maya render mask in Houdini"""
|
||||
|
||||
# print("Setting match maya render mask ")
|
||||
parm = camera.parm("aperture")
|
||||
expression = parm.expression()
|
||||
expression = expression.replace("return ", "aperture = ")
|
||||
|
|
|
|||
129
client/ayon_core/hosts/houdini/plugins/load/load_filepath.py
Normal file
|
|
@ -0,0 +1,129 @@
|
|||
import os
|
||||
import re
|
||||
|
||||
from ayon_core.pipeline import load
|
||||
from openpype.hosts.houdini.api import pipeline
|
||||
|
||||
import hou
|
||||
|
||||
|
||||
class FilePathLoader(load.LoaderPlugin):
|
||||
"""Load a managed filepath to a null node.
|
||||
|
||||
This is useful if there is no existing loader for a particular workflow
yet. A Houdini artist can load with this generic filepath loader and then
reference the relevant Houdini parm to use the exact value. The benefit
is that this filepath will be managed and can be updated as usual.
|
||||
|
||||
"""
|
||||
|
||||
label = "Load filepath to node"
|
||||
order = 9
|
||||
icon = "link"
|
||||
color = "white"
|
||||
product_types = {"*"}
|
||||
representations = ["*"]
|
||||
|
||||
def load(self, context, name=None, namespace=None, data=None):
|
||||
|
||||
# Get the root node
|
||||
obj = hou.node("/obj")
|
||||
|
||||
# Define node name
|
||||
namespace = namespace if namespace else context["folder"]["name"]
|
||||
node_name = "{}_{}".format(namespace, name) if namespace else name
|
||||
|
||||
# Create a null node
|
||||
container = obj.createNode("null", node_name=node_name)
|
||||
|
||||
# Destroy any children
|
||||
for node in container.children():
|
||||
node.destroy()
|
||||
|
||||
# Add filepath attribute, set value as default value
|
||||
filepath = self.format_path(
|
||||
path=self.filepath_from_context(context),
|
||||
representation=context["representation"]
|
||||
)
|
||||
parm_template_group = container.parmTemplateGroup()
|
||||
attr_folder = hou.FolderParmTemplate("attributes_folder", "Attributes")
|
||||
parm = hou.StringParmTemplate(name="filepath",
|
||||
label="Filepath",
|
||||
num_components=1,
|
||||
default_value=(filepath,))
|
||||
attr_folder.addParmTemplate(parm)
|
||||
parm_template_group.append(attr_folder)
|
||||
|
||||
# Hide some default labels
|
||||
for folder_label in ["Transform", "Render", "Misc", "Redshift OBJ"]:
|
||||
folder = parm_template_group.findFolder(folder_label)
|
||||
if not folder:
|
||||
continue
|
||||
parm_template_group.hideFolder(folder_label, True)
|
||||
|
||||
container.setParmTemplateGroup(parm_template_group)
|
||||
|
||||
container.setDisplayFlag(False)
|
||||
container.setSelectableInViewport(False)
|
||||
container.useXray(False)
|
||||
|
||||
nodes = [container]
|
||||
|
||||
self[:] = nodes
|
||||
|
||||
return pipeline.containerise(
|
||||
node_name,
|
||||
namespace,
|
||||
nodes,
|
||||
context,
|
||||
self.__class__.__name__,
|
||||
suffix="",
|
||||
)
|
||||
|
||||
def update(self, container, context):
|
||||
|
||||
# Update the file path
|
||||
representation_entity = context["representation"]
|
||||
file_path = self.format_path(
|
||||
path=self.filepath_from_context(context),
|
||||
representation=representation_entity
|
||||
)
|
||||
|
||||
node = container["node"]
|
||||
node.setParms({
|
||||
"filepath": file_path,
|
||||
"representation": str(representation_entity["id"])
|
||||
})
|
||||
|
||||
# Update the parameter default value (cosmetics)
|
||||
parm_template_group = node.parmTemplateGroup()
|
||||
parm = parm_template_group.find("filepath")
|
||||
parm.setDefaultValue((file_path,))
|
||||
parm_template_group.replace(parm_template_group.find("filepath"),
|
||||
parm)
|
||||
node.setParmTemplateGroup(parm_template_group)
|
||||
|
||||
def switch(self, container, representation):
|
||||
self.update(container, representation)
|
||||
|
||||
def remove(self, container):
|
||||
|
||||
node = container["node"]
|
||||
node.destroy()
|
||||
|
||||
@staticmethod
|
||||
def format_path(path: str, representation: dict) -> str:
|
||||
"""Format file path for sequence with $F."""
|
||||
if not os.path.exists(path):
|
||||
raise RuntimeError("Path does not exist: %s" % path)
|
||||
|
||||
# The path is either a single file or sequence in a folder.
|
||||
frame = representation["context"].get("frame")
|
||||
if frame is not None:
|
||||
# Substitute frame number in sequence with $F with padding
|
||||
ext = representation.get("ext", representation["name"])
|
||||
token = "$F{}".format(len(frame)) # e.g. $F4
|
||||
pattern = r"\.(\d+)\.{ext}$".format(ext=re.escape(ext))
|
||||
path = re.sub(pattern, ".{}.{}".format(token, ext), path)
|
||||
|
||||
return os.path.normpath(path).replace("\\", "/")
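# Hedged sketch (not part of the diff): the "$F" substitution performed by
# format_path above, on a made-up sequence path with a zero-padded frame.
def _demo_format_path():
    path = "/proj/renders/beauty.1001.exr"   # hypothetical sequence member
    frame = "1001"
    ext = "exr"
    token = "$F{}".format(len(frame))        # -> "$F4"
    pattern = r"\.(\d+)\.{ext}$".format(ext=re.escape(ext))
    return re.sub(pattern, ".{}.{}".format(token, ext), path)
    # -> "/proj/renders/beauty.$F4.exr"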
|
||||
|
|
@ -1,4 +1,5 @@
|
|||
import os
|
||||
import re
|
||||
|
||||
from ayon_core.pipeline import (
|
||||
load,
|
||||
|
|
@ -44,7 +45,14 @@ def get_image_avalon_container():
|
|||
class ImageLoader(load.LoaderPlugin):
|
||||
"""Load images into COP2"""
|
||||
|
||||
product_types = {"imagesequence"}
|
||||
product_types = {
|
||||
"imagesequence",
|
||||
"review",
|
||||
"render",
|
||||
"plate",
|
||||
"image",
|
||||
"online",
|
||||
}
|
||||
label = "Load Image (COP2)"
|
||||
representations = ["*"]
|
||||
order = -10
|
||||
|
|
@ -55,10 +63,8 @@ class ImageLoader(load.LoaderPlugin):
|
|||
def load(self, context, name=None, namespace=None, data=None):
|
||||
|
||||
# Format file name, Houdini only wants forward slashes
|
||||
file_path = self.filepath_from_context(context)
|
||||
file_path = os.path.normpath(file_path)
|
||||
file_path = file_path.replace("\\", "/")
|
||||
file_path = self._get_file_sequence(file_path)
|
||||
path = self.filepath_from_context(context)
|
||||
path = self.format_path(path, representation=context["representation"])
|
||||
|
||||
# Get the root node
|
||||
parent = get_image_avalon_container()
|
||||
|
|
@ -70,7 +76,10 @@ class ImageLoader(load.LoaderPlugin):
|
|||
node = parent.createNode("file", node_name=node_name)
|
||||
node.moveToGoodPosition()
|
||||
|
||||
node.setParms({"filename1": file_path})
|
||||
parms = {"filename1": path}
|
||||
parms.update(self.get_colorspace_parms(context["representation"]))
|
||||
|
||||
node.setParms(parms)
|
||||
|
||||
# Imprint it manually
|
||||
data = {
|
||||
|
|
@ -93,16 +102,17 @@ class ImageLoader(load.LoaderPlugin):
|
|||
|
||||
# Update the file path
|
||||
file_path = get_representation_path(repre_entity)
|
||||
file_path = file_path.replace("\\", "/")
|
||||
file_path = self._get_file_sequence(file_path)
|
||||
file_path = self.format_path(file_path, repre_entity)
|
||||
|
||||
parms = {
|
||||
"filename1": file_path,
|
||||
"representation": repre_entity["id"],
|
||||
}
|
||||
|
||||
parms.update(self.get_colorspace_parms(repre_entity))
|
||||
|
||||
# Update attributes
|
||||
node.setParms(
|
||||
{
|
||||
"filename1": file_path,
|
||||
"representation": repre_entity["id"],
|
||||
}
|
||||
)
|
||||
node.setParms(parms)
|
||||
|
||||
def remove(self, container):
|
||||
|
||||
|
|
@ -119,14 +129,58 @@ class ImageLoader(load.LoaderPlugin):
|
|||
if not parent.children():
|
||||
parent.destroy()
|
||||
|
||||
def _get_file_sequence(self, file_path):
|
||||
root = os.path.dirname(file_path)
|
||||
files = sorted(os.listdir(root))
|
||||
@staticmethod
|
||||
def format_path(path, representation):
|
||||
"""Format file path correctly for single image or sequence."""
|
||||
if not os.path.exists(path):
|
||||
raise RuntimeError("Path does not exist: %s" % path)
|
||||
|
||||
first_fname = files[0]
|
||||
prefix, padding, suffix = first_fname.rsplit(".", 2)
|
||||
fname = ".".join([prefix, "$F{}".format(len(padding)), suffix])
|
||||
return os.path.join(root, fname).replace("\\", "/")
|
||||
ext = os.path.splitext(path)[-1]
|
||||
|
||||
def switch(self, container, context):
|
||||
self.update(container, context)
|
||||
is_sequence = bool(representation["context"].get("frame"))
|
||||
# The path is either a single file or sequence in a folder.
|
||||
if not is_sequence:
|
||||
filename = path
|
||||
else:
|
||||
filename = re.sub(r"(.*)\.(\d+){}$".format(re.escape(ext)),
|
||||
"\\1.$F4{}".format(ext),
|
||||
path)
|
||||
|
||||
filename = os.path.join(path, filename)
|
||||
|
||||
filename = os.path.normpath(filename)
|
||||
filename = filename.replace("\\", "/")
|
||||
|
||||
return filename
|
||||
|
||||
def get_colorspace_parms(self, representation: dict) -> dict:
|
||||
"""Return the color space parameters.
|
||||
|
||||
Returns the values for the colorspace parameters on the node if there
|
||||
is colorspace data on the representation.
|
||||
|
||||
Arguments:
|
||||
representation (dict): The representation entity.
|
||||
|
||||
Returns:
|
||||
dict: Parm to value mapping if colorspace data is defined.
|
||||
|
||||
"""
|
||||
# Using OCIO colorspace on COP2 File node is only supported in Hou 20+
|
||||
major, _, _ = hou.applicationVersion()
|
||||
if major < 20:
|
||||
return {}
|
||||
|
||||
data = representation.get("data", {}).get("colorspaceData", {})
|
||||
if not data:
|
||||
return {}
|
||||
|
||||
colorspace = data["colorspace"]
|
||||
if colorspace:
|
||||
return {
|
||||
"colorspace": 3, # Use OpenColorIO
|
||||
"ocio_space": colorspace
|
||||
}
|
||||
|
||||
def switch(self, container, representation):
|
||||
self.update(container, representation)
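# Hedged sketch (not part of the diff): what ImageLoader.get_colorspace_parms
# above yields for a made-up representation entity on Houdini 20 or newer.
def _demo_colorspace_parms():
    representation = {"data": {"colorspaceData": {"colorspace": "ACEScg"}}}
    data = representation.get("data", {}).get("colorspaceData", {})
    if data.get("colorspace"):
        return {"colorspace": 3, "ocio_space": data["colorspace"]}
    return {}
    # -> {"colorspace": 3, "ocio_space": "ACEScg"}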
|
||||
|
|
|
|||
77
client/ayon_core/hosts/houdini/plugins/load/load_usd_sop.py
Normal file
|
|
@ -0,0 +1,77 @@
|
|||
import os
|
||||
|
||||
from ayon_core.pipeline import load
|
||||
from ayon_core.hosts.houdini.api import pipeline
|
||||
|
||||
|
||||
class SopUsdImportLoader(load.LoaderPlugin):
|
||||
"""Load USD to SOPs via `usdimport`"""
|
||||
|
||||
label = "Load USD to SOPs"
|
||||
product_types = {"*"}
|
||||
representations = ["usd"]
|
||||
order = -6
|
||||
icon = "code-fork"
|
||||
color = "orange"
|
||||
|
||||
def load(self, context, name=None, namespace=None, data=None):
|
||||
import hou
|
||||
|
||||
# Format file name, Houdini only wants forward slashes
|
||||
file_path = self.filepath_from_context(context)
|
||||
file_path = os.path.normpath(file_path)
|
||||
file_path = file_path.replace("\\", "/")
|
||||
|
||||
# Get the root node
|
||||
obj = hou.node("/obj")
|
||||
|
||||
# Define node name
|
||||
namespace = namespace if namespace else context["folder"]["name"]
|
||||
node_name = "{}_{}".format(namespace, name) if namespace else name
|
||||
|
||||
# Create a new geo node
|
||||
container = obj.createNode("geo", node_name=node_name)
|
||||
|
||||
# Create a usdimport node
|
||||
usdimport = container.createNode("usdimport", node_name=node_name)
|
||||
usdimport.setParms({"filepath1": file_path})
|
||||
|
||||
# Set new position for unpack node else it gets cluttered
|
||||
nodes = [container, usdimport]
|
||||
|
||||
return pipeline.containerise(
|
||||
node_name,
|
||||
namespace,
|
||||
nodes,
|
||||
context,
|
||||
self.__class__.__name__,
|
||||
suffix="",
|
||||
)
|
||||
|
||||
def update(self, container, context):
|
||||
|
||||
node = container["node"]
|
||||
try:
|
||||
usdimport_node = next(
|
||||
n for n in node.children() if n.type().name() == "usdimport"
|
||||
)
|
||||
except StopIteration:
|
||||
self.log.error("Could not find node of type `usdimport`")
|
||||
return
|
||||
|
||||
# Update the file path
|
||||
file_path = self.filepath_from_context(context)
|
||||
file_path = file_path.replace("\\", "/")
|
||||
|
||||
usdimport_node.setParms({"filepath1": file_path})
|
||||
|
||||
# Update attribute
|
||||
node.setParms({"representation": context["representation"]["id"]})
|
||||
|
||||
def remove(self, container):
|
||||
|
||||
node = container["node"]
|
||||
node.destroy()
|
||||
|
||||
def switch(self, container, representation):
|
||||
self.update(container, representation)
|
||||
|
|
@ -1,100 +0,0 @@
|
|||
import hou
|
||||
|
||||
import pyblish.api
|
||||
|
||||
from ayon_core.pipeline import AYON_INSTANCE_ID, AVALON_INSTANCE_ID
|
||||
from ayon_core.hosts.houdini.api import lib
|
||||
|
||||
|
||||
class CollectInstances(pyblish.api.ContextPlugin):
|
||||
"""Gather instances by all node in out graph and pre-defined attributes
|
||||
|
||||
This collector takes into account folders that are associated with
|
||||
an specific node and marked with a unique identifier;
|
||||
|
||||
Identifier:
|
||||
id (str): "ayon.create.instance"
|
||||
|
||||
Specific node:
|
||||
The specific node is important because it dictates in which way the
|
||||
product is being exported.
|
||||
|
||||
alembic: will export Alembic file which supports cascading attributes
|
||||
like 'cbId' and 'path'
|
||||
geometry: Can export a wide range of file types, default out
|
||||
|
||||
"""
|
||||
|
||||
order = pyblish.api.CollectorOrder - 0.01
|
||||
label = "Collect Instances"
|
||||
hosts = ["houdini"]
|
||||
|
||||
def process(self, context):
|
||||
|
||||
nodes = hou.node("/out").children()
|
||||
nodes += hou.node("/obj").children()
|
||||
|
||||
# Include instances in USD stage only when it exists so it
|
||||
# remains backwards compatible with version before houdini 18
|
||||
stage = hou.node("/stage")
|
||||
if stage:
|
||||
nodes += stage.recursiveGlob("*", filter=hou.nodeTypeFilter.Rop)
|
||||
|
||||
for node in nodes:
|
||||
|
||||
if not node.parm("id"):
|
||||
continue
|
||||
|
||||
if node.evalParm("id") not in {
|
||||
AYON_INSTANCE_ID, AVALON_INSTANCE_ID
|
||||
}:
|
||||
continue
|
||||
|
||||
# instance was created by new creator code, skip it as
|
||||
# it is already collected.
|
||||
if node.parm("creator_identifier"):
|
||||
continue
|
||||
|
||||
has_family = node.evalParm("family")
|
||||
assert has_family, "'%s' is missing 'family'" % node.name()
|
||||
|
||||
self.log.info(
|
||||
"Processing legacy instance node {}".format(node.path())
|
||||
)
|
||||
|
||||
data = lib.read(node)
|
||||
# Check bypass state and reverse
|
||||
if hasattr(node, "isBypassed"):
|
||||
data.update({"active": not node.isBypassed()})
|
||||
|
||||
# temporarily translation of `active` to `publish` till issue has
|
||||
# been resolved.
|
||||
# https://github.com/pyblish/pyblish-base/issues/307
|
||||
if "active" in data:
|
||||
data["publish"] = data["active"]
|
||||
|
||||
# Create nice name if the instance has a frame range.
|
||||
label = data.get("name", node.name())
|
||||
label += " (%s)" % data["folderPath"] # include folder in name
|
||||
|
||||
instance = context.create_instance(label)
|
||||
|
||||
# Include `families` using `family` data
|
||||
product_type = data["family"]
|
||||
data["productType"] = product_type
|
||||
instance.data["families"] = [product_type]
|
||||
|
||||
instance[:] = [node]
|
||||
instance.data["instance_node"] = node.path()
|
||||
instance.data.update(data)
|
||||
|
||||
def sort_by_family(instance):
|
||||
"""Sort by family"""
|
||||
return instance.data.get(
|
||||
"families", instance.data.get("productType")
|
||||
)
|
||||
|
||||
# Sort/grouped by family (preserving local index)
|
||||
context[:] = sorted(context, key=sort_by_family)
|
||||
|
||||
return context
|
||||
|
|
@ -41,23 +41,23 @@ class CollectKarmaROPRenderProducts(pyblish.api.InstancePlugin):
|
|||
instance.data["chunkSize"] = chunk_size
|
||||
self.log.debug("Chunk Size: %s" % chunk_size)
|
||||
|
||||
default_prefix = evalParmNoFrame(rop, "picture")
|
||||
render_products = []
|
||||
default_prefix = evalParmNoFrame(rop, "picture")
|
||||
render_products = []
|
||||
|
||||
# Default beauty AOV
|
||||
beauty_product = self.get_render_product_name(
|
||||
prefix=default_prefix, suffix=None
|
||||
)
|
||||
render_products.append(beauty_product)
|
||||
# Default beauty AOV
|
||||
beauty_product = self.get_render_product_name(
|
||||
prefix=default_prefix, suffix=None
|
||||
)
|
||||
render_products.append(beauty_product)
|
||||
|
||||
files_by_aov = {
|
||||
"beauty": self.generate_expected_files(instance,
|
||||
beauty_product)
|
||||
}
|
||||
files_by_aov = {
|
||||
"beauty": self.generate_expected_files(instance,
|
||||
beauty_product)
|
||||
}
|
||||
|
||||
filenames = list(render_products)
|
||||
instance.data["files"] = filenames
|
||||
instance.data["renderProducts"] = colorspace.ARenderProduct()
|
||||
filenames = list(render_products)
|
||||
instance.data["files"] = filenames
|
||||
instance.data["renderProducts"] = colorspace.ARenderProduct()
|
||||
|
||||
for product in render_products:
|
||||
self.log.debug("Found render product: %s" % product)
|
||||
|
|
|
|||
|
|
@ -41,57 +41,57 @@ class CollectMantraROPRenderProducts(pyblish.api.InstancePlugin):
|
|||
instance.data["chunkSize"] = chunk_size
|
||||
self.log.debug("Chunk Size: %s" % chunk_size)
|
||||
|
||||
default_prefix = evalParmNoFrame(rop, "vm_picture")
|
||||
render_products = []
|
||||
default_prefix = evalParmNoFrame(rop, "vm_picture")
|
||||
render_products = []
|
||||
|
||||
# Store whether we are splitting the render job (export + render)
|
||||
split_render = bool(rop.parm("soho_outputmode").eval())
|
||||
instance.data["splitRender"] = split_render
|
||||
export_prefix = None
|
||||
export_products = []
|
||||
if split_render:
|
||||
export_prefix = evalParmNoFrame(
|
||||
rop, "soho_diskfile", pad_character="0"
|
||||
)
|
||||
beauty_export_product = self.get_render_product_name(
|
||||
prefix=export_prefix,
|
||||
suffix=None)
|
||||
export_products.append(beauty_export_product)
|
||||
self.log.debug(
|
||||
"Found export product: {}".format(beauty_export_product)
|
||||
)
|
||||
instance.data["ifdFile"] = beauty_export_product
|
||||
instance.data["exportFiles"] = list(export_products)
|
||||
|
||||
# Default beauty AOV
|
||||
beauty_product = self.get_render_product_name(
|
||||
prefix=default_prefix, suffix=None
|
||||
# Store whether we are splitting the render job (export + render)
|
||||
split_render = bool(rop.parm("soho_outputmode").eval())
|
||||
instance.data["splitRender"] = split_render
|
||||
export_prefix = None
|
||||
export_products = []
|
||||
if split_render:
|
||||
export_prefix = evalParmNoFrame(
|
||||
rop, "soho_diskfile", pad_character="0"
|
||||
)
|
||||
render_products.append(beauty_product)
|
||||
beauty_export_product = self.get_render_product_name(
|
||||
prefix=export_prefix,
|
||||
suffix=None)
|
||||
export_products.append(beauty_export_product)
|
||||
self.log.debug(
|
||||
"Found export product: {}".format(beauty_export_product)
|
||||
)
|
||||
instance.data["ifdFile"] = beauty_export_product
|
||||
instance.data["exportFiles"] = list(export_products)
|
||||
|
||||
files_by_aov = {
|
||||
"beauty": self.generate_expected_files(instance,
|
||||
beauty_product)
|
||||
}
|
||||
# Default beauty AOV
|
||||
beauty_product = self.get_render_product_name(
|
||||
prefix=default_prefix, suffix=None
|
||||
)
|
||||
render_products.append(beauty_product)
|
||||
|
||||
aov_numbers = rop.evalParm("vm_numaux")
|
||||
if aov_numbers > 0:
|
||||
# get the filenames of the AOVs
|
||||
for i in range(1, aov_numbers + 1):
|
||||
var = rop.evalParm("vm_variable_plane%d" % i)
|
||||
if var:
|
||||
aov_name = "vm_filename_plane%d" % i
|
||||
aov_boolean = "vm_usefile_plane%d" % i
|
||||
aov_enabled = rop.evalParm(aov_boolean)
|
||||
has_aov_path = rop.evalParm(aov_name)
|
||||
if has_aov_path and aov_enabled == 1:
|
||||
aov_prefix = evalParmNoFrame(rop, aov_name)
|
||||
aov_product = self.get_render_product_name(
|
||||
prefix=aov_prefix, suffix=None
|
||||
)
|
||||
render_products.append(aov_product)
|
||||
files_by_aov = {
|
||||
"beauty": self.generate_expected_files(instance,
|
||||
beauty_product)
|
||||
}
|
||||
|
||||
files_by_aov[var] = self.generate_expected_files(instance, aov_product) # noqa
|
||||
aov_numbers = rop.evalParm("vm_numaux")
|
||||
if aov_numbers > 0:
|
||||
# get the filenames of the AOVs
|
||||
for i in range(1, aov_numbers + 1):
|
||||
var = rop.evalParm("vm_variable_plane%d" % i)
|
||||
if var:
|
||||
aov_name = "vm_filename_plane%d" % i
|
||||
aov_boolean = "vm_usefile_plane%d" % i
|
||||
aov_enabled = rop.evalParm(aov_boolean)
|
||||
has_aov_path = rop.evalParm(aov_name)
|
||||
if has_aov_path and aov_enabled == 1:
|
||||
aov_prefix = evalParmNoFrame(rop, aov_name)
|
||||
aov_product = self.get_render_product_name(
|
||||
prefix=aov_prefix, suffix=None
|
||||
)
|
||||
render_products.append(aov_product)
|
||||
|
||||
files_by_aov[var] = self.generate_expected_files(instance, aov_product) # noqa
|
||||
|
||||
for product in render_products:
|
||||
self.log.debug("Found render product: %s" % product)
|
||||
|
|
|
|||
|
|
@ -68,12 +68,15 @@ class CollectRedshiftROPRenderProducts(pyblish.api.InstancePlugin):
|
|||
files_by_aov = {
|
||||
"_": self.generate_expected_files(instance,
|
||||
beauty_product)}
|
||||
|
||||
|
||||
aovs_rop = rop.parm("RS_aovGetFromNode").evalAsNode()
|
||||
if aovs_rop:
|
||||
rop = aovs_rop
|
||||
|
||||
num_aovs = rop.evalParm("RS_aov")
|
||||
num_aovs = 0
|
||||
if not rop.evalParm('RS_aovAllAOVsDisabled'):
|
||||
num_aovs = rop.evalParm("RS_aov")
|
||||
|
||||
for index in range(num_aovs):
|
||||
i = index + 1
|
||||
|
||||
|
|
|
|||
|
|
@ -32,10 +32,7 @@ class RedshiftProxyLoader(load.LoaderPlugin):
|
|||
|
||||
def load(self, context, name=None, namespace=None, options=None):
|
||||
"""Plugin entry point."""
|
||||
try:
|
||||
product_type = context["representation"]["context"]["family"]
|
||||
except ValueError:
|
||||
product_type = "redshiftproxy"
|
||||
product_type = context["product"]["productType"]
|
||||
|
||||
folder_name = context["folder"]["name"]
|
||||
namespace = namespace or unique_namespace(
|
||||
|
|
|
|||
|
|
@ -117,11 +117,7 @@ class ReferenceLoader(plugin.ReferenceLoader):
|
|||
def process_reference(self, context, name, namespace, options):
|
||||
import maya.cmds as cmds
|
||||
|
||||
try:
|
||||
product_type = context["representation"]["context"]["family"]
|
||||
except ValueError:
|
||||
product_type = "model"
|
||||
|
||||
product_type = context["product"]["productType"]
|
||||
project_name = context["project"]["name"]
|
||||
# True by default to keep legacy behaviours
|
||||
attach_to_root = options.get("attach_to_root", True)
|
||||
|
|
|
|||
|
|
@ -25,10 +25,7 @@ class LoadVDBtoArnold(load.LoaderPlugin):
|
|||
from ayon_core.hosts.maya.api.pipeline import containerise
|
||||
from ayon_core.hosts.maya.api.lib import unique_namespace
|
||||
|
||||
try:
|
||||
product_type = context["representation"]["context"]["family"]
|
||||
except ValueError:
|
||||
product_type = "vdbcache"
|
||||
product_type = context["product"]["productType"]
|
||||
|
||||
# Check if the plugin for arnold is available on the pc
|
||||
try:
|
||||
|
|
@ -64,7 +61,7 @@ class LoadVDBtoArnold(load.LoaderPlugin):
|
|||
path = self.filepath_from_context(context)
|
||||
self._set_path(grid_node,
|
||||
path=path,
|
||||
representation=context["representation"])
|
||||
repre_entity=context["representation"])
|
||||
|
||||
# Lock the shape node so the user can't delete the transform/shape
|
||||
# as if it was referenced
|
||||
|
|
@ -94,7 +91,7 @@ class LoadVDBtoArnold(load.LoaderPlugin):
|
|||
assert len(grid_nodes) == 1, "This is a bug"
|
||||
|
||||
# Update the VRayVolumeGrid
|
||||
self._set_path(grid_nodes[0], path=path, representation=repre_entity)
|
||||
self._set_path(grid_nodes[0], path=path, repre_entity=repre_entity)
|
||||
|
||||
# Update container representation
|
||||
cmds.setAttr(container["objectName"] + ".representation",
|
||||
|
|
|
|||
|
|
@ -31,10 +31,7 @@ class LoadVDBtoRedShift(load.LoaderPlugin):
|
|||
from ayon_core.hosts.maya.api.pipeline import containerise
|
||||
from ayon_core.hosts.maya.api.lib import unique_namespace
|
||||
|
||||
try:
|
||||
product_type = context["representation"]["context"]["family"]
|
||||
except ValueError:
|
||||
product_type = "vdbcache"
|
||||
product_type = context["product"]["productType"]
|
||||
|
||||
# Check if the plugin for redshift is available on the pc
|
||||
try:
|
||||
|
|
|
|||
|
|
@ -94,10 +94,7 @@ class LoadVDBtoVRay(load.LoaderPlugin):
|
|||
"Path does not exist: %s" % path
|
||||
)
|
||||
|
||||
try:
|
||||
product_type = context["representation"]["context"]["family"]
|
||||
except ValueError:
|
||||
product_type = "vdbcache"
|
||||
product_type = context["product"]["productType"]
|
||||
|
||||
# Ensure V-ray is loaded with the vrayvolumegrid
|
||||
if not cmds.pluginInfo("vrayformaya", query=True, loaded=True):
|
||||
|
|
|
|||
|
|
@ -47,10 +47,7 @@ class VRayProxyLoader(load.LoaderPlugin):
|
|||
|
||||
"""
|
||||
|
||||
try:
|
||||
product_type = context["representation"]["context"]["family"]
|
||||
except ValueError:
|
||||
product_type = "vrayproxy"
|
||||
product_type = context["product"]["productType"]
|
||||
|
||||
# get all representations for this version
|
||||
filename = self._get_abc(
|
||||
|
|
|
|||
|
|
@ -26,10 +26,7 @@ class VRaySceneLoader(load.LoaderPlugin):
|
|||
color = "orange"
|
||||
|
||||
def load(self, context, name, namespace, data):
|
||||
try:
|
||||
product_type = context["representation"]["context"]["family"]
|
||||
except ValueError:
|
||||
product_type = "vrayscene_layer"
|
||||
product_type = context["product"]["productType"]
|
||||
|
||||
folder_name = context["folder"]["name"]
|
||||
namespace = namespace or unique_namespace(
|
||||
|
|
|
|||
|
|
@ -56,10 +56,7 @@ class YetiCacheLoader(load.LoaderPlugin):
|
|||
|
||||
"""
|
||||
|
||||
try:
|
||||
product_type = context["representation"]["context"]["family"]
|
||||
except ValueError:
|
||||
product_type = "yeticache"
|
||||
product_type = context["product"]["productType"]
|
||||
|
||||
# Build namespace
|
||||
folder_name = context["folder"]["name"]
|
||||
|
|
|
|||
|
|
@ -7,7 +7,9 @@ from ayon_core.pipeline.publish import (
|
|||
RepairAction,
|
||||
ValidateContentsOrder,
|
||||
PublishValidationError,
|
||||
OptionalPyblishPluginMixin
|
||||
OptionalPyblishPluginMixin,
|
||||
get_plugin_settings,
|
||||
apply_plugin_settings_automatically
|
||||
)
|
||||
|
||||
|
||||
|
|
@ -32,6 +34,20 @@ class ValidateOutRelatedNodeIds(pyblish.api.InstancePlugin,
|
|||
]
|
||||
optional = False
|
||||
|
||||
@classmethod
|
||||
def apply_settings(cls, project_settings):
|
||||
# Preserve automatic settings applying logic
|
||||
settings = get_plugin_settings(plugin=cls,
|
||||
project_settings=project_settings,
|
||||
log=cls.log,
|
||||
category="maya")
|
||||
apply_plugin_settings_automatically(cls, settings, logger=cls.log)
|
||||
|
||||
# Disable plug-in if cbId workflow is disabled
|
||||
if not project_settings["maya"].get("use_cbid_workflow", True):
|
||||
cls.enabled = False
|
||||
return
|
||||
|
||||
def process(self, instance):
|
||||
"""Process all meshes"""
|
||||
if not self.is_active(instance.data):
|
||||
|
|
|
|||
|
|
@ -22,6 +22,13 @@ class ValidateArnoldSceneSourceCbid(pyblish.api.InstancePlugin,
|
|||
actions = [RepairAction]
|
||||
optional = False
|
||||
|
||||
@classmethod
|
||||
def apply_settings(cls, project_settings):
|
||||
# Disable plug-in if cbId workflow is disabled
|
||||
if not project_settings["maya"].get("use_cbid_workflow", True):
|
||||
cls.enabled = False
|
||||
return
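# Hedged sketch (not part of the diff): the settings-driven toggle above,
# exercised with a minimal, made-up project settings dict. With the cbId
# workflow disabled the validator switches itself off before publishing:
#
#   ValidateArnoldSceneSourceCbid.apply_settings(
#       {"maya": {"use_cbid_workflow": False}})
#   assert ValidateArnoldSceneSourceCbid.enabled is False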
|
||||
|
||||
@staticmethod
|
||||
def _get_nodes_by_name(nodes):
|
||||
nodes_by_name = {}
|
||||
|
|
|
|||
|
|
@ -27,6 +27,13 @@ class ValidateLookIdReferenceEdits(pyblish.api.InstancePlugin):
|
|||
actions = [ayon_core.hosts.maya.api.action.SelectInvalidAction,
|
||||
RepairAction]
|
||||
|
||||
@classmethod
|
||||
def apply_settings(cls, project_settings):
|
||||
# Disable plug-in if cbId workflow is disabled
|
||||
if not project_settings["maya"].get("use_cbid_workflow", True):
|
||||
cls.enabled = False
|
||||
return
|
||||
|
||||
def process(self, instance):
|
||||
invalid = self.get_invalid(instance)
|
||||
|
||||
|
|
|
|||
|
|
@ -16,7 +16,7 @@ class ValidateShadingEngine(pyblish.api.InstancePlugin,
|
|||
|
||||
Shading engines should be named "{surface_shader}SG"
|
||||
"""
|
||||
"""
|
||||
|
||||
order = ValidateContentsOrder
|
||||
families = ["look"]
|
||||
hosts = ["maya"]
|
||||
|
|
|
|||
|
|
@ -31,6 +31,13 @@ class ValidateNodeIDs(pyblish.api.InstancePlugin):
|
|||
actions = [ayon_core.hosts.maya.api.action.SelectInvalidAction,
|
||||
ayon_core.hosts.maya.api.action.GenerateUUIDsOnInvalidAction]
|
||||
|
||||
@classmethod
|
||||
def apply_settings(cls, project_settings):
|
||||
# Disable plug-in if cbId workflow is disabled
|
||||
if not project_settings["maya"].get("use_cbid_workflow", True):
|
||||
cls.enabled = False
|
||||
return
|
||||
|
||||
def process(self, instance):
|
||||
"""Process all meshes"""
|
||||
|
||||
|
|
|
|||
|
|
@ -26,6 +26,13 @@ class ValidateNodeIdsDeformedShape(pyblish.api.InstancePlugin):
|
|||
RepairAction
|
||||
]
|
||||
|
||||
@classmethod
|
||||
def apply_settings(cls, project_settings):
|
||||
# Disable plug-in if cbId workflow is disabled
|
||||
if not project_settings["maya"].get("use_cbid_workflow", True):
|
||||
cls.enabled = False
|
||||
return
|
||||
|
||||
def process(self, instance):
|
||||
"""Process all the nodes in the instance"""
|
||||
|
||||
|
|
|
|||
|
|
@ -26,6 +26,13 @@ class ValidateNodeIdsInDatabase(pyblish.api.InstancePlugin):
|
|||
actions = [ayon_core.hosts.maya.api.action.SelectInvalidAction,
|
||||
ayon_core.hosts.maya.api.action.GenerateUUIDsOnInvalidAction]
|
||||
|
||||
@classmethod
|
||||
def apply_settings(cls, project_settings):
|
||||
# Disable plug-in if cbId workflow is disabled
|
||||
if not project_settings["maya"].get("use_cbid_workflow", True):
|
||||
cls.enabled = False
|
||||
return
|
||||
|
||||
def process(self, instance):
|
||||
invalid = self.get_invalid(instance)
|
||||
if invalid:
|
||||
|
|
|
|||
|
|
@ -24,6 +24,13 @@ class ValidateNodeIDsRelated(pyblish.api.InstancePlugin,
|
|||
actions = [ayon_core.hosts.maya.api.action.SelectInvalidAction,
|
||||
ayon_core.hosts.maya.api.action.GenerateUUIDsOnInvalidAction]
|
||||
|
||||
@classmethod
|
||||
def apply_settings(cls, project_settings):
|
||||
# Disable plug-in if cbId workflow is disabled
|
||||
if not project_settings["maya"].get("use_cbid_workflow", True):
|
||||
cls.enabled = False
|
||||
return
|
||||
|
||||
def process(self, instance):
|
||||
"""Process all nodes in instance (including hierarchy)"""
|
||||
if not self.is_active(instance.data):
|
||||
|
|
|
|||
|
|
@ -26,6 +26,13 @@ class ValidateNodeIdsUnique(pyblish.api.InstancePlugin):
|
|||
actions = [ayon_core.hosts.maya.api.action.SelectInvalidAction,
|
||||
ayon_core.hosts.maya.api.action.GenerateUUIDsOnInvalidAction]
|
||||
|
||||
@classmethod
|
||||
def apply_settings(cls, project_settings):
|
||||
# Disable plug-in if cbId workflow is disabled
|
||||
if not project_settings["maya"].get("use_cbid_workflow", True):
|
||||
cls.enabled = False
|
||||
return
|
||||
|
||||
def process(self, instance):
|
||||
"""Process all meshes"""
|
||||
|
||||
|
|
|
|||
|
|
@ -8,7 +8,9 @@ from ayon_core.pipeline.publish import (
|
|||
RepairAction,
|
||||
ValidateContentsOrder,
|
||||
PublishXmlValidationError,
|
||||
OptionalPyblishPluginMixin
|
||||
OptionalPyblishPluginMixin,
|
||||
get_plugin_settings,
|
||||
apply_plugin_settings_automatically
|
||||
)
|
||||
|
||||
|
||||
|
|
@ -34,6 +36,20 @@ class ValidateRigOutSetNodeIds(pyblish.api.InstancePlugin,
|
|||
allow_history_only = False
|
||||
optional = False
|
||||
|
||||
@classmethod
|
||||
def apply_settings(cls, project_settings):
|
||||
# Preserve automatic settings applying logic
|
||||
settings = get_plugin_settings(plugin=cls,
|
||||
project_settings=project_settings,
|
||||
log=cls.log,
|
||||
category="maya")
|
||||
apply_plugin_settings_automatically(cls, settings, logger=cls.log)
|
||||
|
||||
# Disable plug-in if cbId workflow is disabled
|
||||
if not project_settings["maya"].get("use_cbid_workflow", True):
|
||||
cls.enabled = False
|
||||
return
|
||||
|
||||
def process(self, instance):
|
||||
"""Process all meshes"""
|
||||
if not self.is_active(instance.data):
|
||||
|
|
|
|||
|
|
@@ -32,6 +32,13 @@ class ValidateRigOutputIds(pyblish.api.InstancePlugin):
    actions = [RepairAction,
               ayon_core.hosts.maya.api.action.SelectInvalidAction]

    @classmethod
    def apply_settings(cls, project_settings):
        # Disable plug-in if cbId workflow is disabled
        if not project_settings["maya"].get("use_cbid_workflow", True):
            cls.enabled = False
            return

    def process(self, instance):
        invalid = self.get_invalid(instance, compute=True)
        if invalid:
@@ -29,7 +29,7 @@ class ValidateStepSize(pyblish.api.InstancePlugin,
    @classmethod
    def get_invalid(cls, instance):

        objset = instance.data['name']
        objset = instance.data['instance_node']
        step = instance.data.get("step", 1.0)

        if step < cls.MIN or step > cls.MAX:

@@ -47,4 +47,4 @@ class ValidateStepSize(pyblish.api.InstancePlugin,
        invalid = self.get_invalid(instance)
        if invalid:
            raise PublishValidationError(
                "Invalid instances found: {0}".format(invalid))
                "Instance found with invalid step size: {0}".format(invalid))
@@ -125,8 +125,9 @@ class AssetOutliner(QtWidgets.QWidget):
        folder_items = {}
        namespaces_by_folder_path = defaultdict(set)
        for item in items:
            folder_id = item["folder"]["id"]
            folder_path = item["folder"]["path"]
            folder_entity = item["folder_entity"]
            folder_id = folder_entity["id"]
            folder_path = folder_entity["path"]
            namespaces_by_folder_path[folder_path].add(item.get("namespace"))

            if folder_path in folder_items:
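The outliner change above groups loaded namespaces per folder path before building folder items. A small sketch of that grouping with hypothetical item dicts:

# Editor's sketch, not part of the commit.
from collections import defaultdict

items = [
    {"folder_entity": {"id": "1", "path": "/assets/hero"}, "namespace": "hero_01"},
    {"folder_entity": {"id": "1", "path": "/assets/hero"}, "namespace": "hero_02"},
]
namespaces_by_folder_path = defaultdict(set)
for item in items:
    folder_path = item["folder_entity"]["path"]
    namespaces_by_folder_path[folder_path].add(item.get("namespace"))

print(dict(namespaces_by_folder_path))
# {'/assets/hero': {'hero_01', 'hero_02'}}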
@@ -1,10 +1,12 @@
from .addon import (
    PHOTOSHOP_ADDON_ROOT,
    PhotoshopAddon,
    PHOTOSHOP_HOST_DIR,
    get_launch_script_path,
)


__all__ = (
    "PHOTOSHOP_ADDON_ROOT",
    "PhotoshopAddon",
    "PHOTOSHOP_HOST_DIR",
    "get_launch_script_path",
)
@@ -1,7 +1,7 @@
import os
from ayon_core.addon import AYONAddon, IHostAddon

PHOTOSHOP_HOST_DIR = os.path.dirname(os.path.abspath(__file__))
PHOTOSHOP_ADDON_ROOT = os.path.dirname(os.path.abspath(__file__))


class PhotoshopAddon(AYONAddon, IHostAddon):
@@ -20,3 +20,17 @@ class PhotoshopAddon(AYONAddon, IHostAddon):

    def get_workfile_extensions(self):
        return [".psd", ".psb"]

    def get_launch_hook_paths(self, app):
        if app.host_name != self.host_name:
            return []
        return [
            os.path.join(PHOTOSHOP_ADDON_ROOT, "hooks")
        ]


def get_launch_script_path():
    return os.path.join(
        PHOTOSHOP_ADDON_ROOT, "api", "launch_script.py"
    )
93  client/ayon_core/hosts/photoshop/api/launch_script.py  (new file)
@@ -0,0 +1,93 @@
"""Script wraps launch mechanism of Photoshop implementations.

Arguments passed to the script are passed to launch function in host
implementation. In all cases requires host app executable and may contain
workfile or others.
"""

import os
import sys

from ayon_core.hosts.photoshop.api.lib import main as host_main

# Get current file to locate start point of sys.argv
CURRENT_FILE = os.path.abspath(__file__)


def show_error_messagebox(title, message, detail_message=None):
    """Function will show message and process ends after closing it."""
    from qtpy import QtWidgets, QtCore
    from ayon_core import style

    app = QtWidgets.QApplication([])
    app.setStyleSheet(style.load_stylesheet())

    msgbox = QtWidgets.QMessageBox()
    msgbox.setWindowTitle(title)
    msgbox.setText(message)

    if detail_message:
        msgbox.setDetailedText(detail_message)

    msgbox.setWindowModality(QtCore.Qt.ApplicationModal)
    msgbox.show()

    sys.exit(app.exec_())


def on_invalid_args(script_not_found):
    """Show to user message box saying that something went wrong.

    Tell user that arguments to launch implementation are invalid with
    arguments details.

    Args:
        script_not_found (bool): Use different message based on this value.
    """

    title = "Invalid arguments"
    joined_args = ", ".join("\"{}\"".format(arg) for arg in sys.argv)
    if script_not_found:
        submsg = "Where couldn't find script path:\n\"{}\""
    else:
        submsg = "Expected Host executable after script path:\n\"{}\""

    message = "BUG: Got invalid arguments so can't launch Host application."
    detail_message = "Process was launched with arguments:\n{}\n\n{}".format(
        joined_args,
        submsg.format(CURRENT_FILE)
    )

    show_error_messagebox(title, message, detail_message)


def main(argv):
    # Modify current file path to find match in sys.argv which may be different
    # on windows (different letter cases and slashes).
    modified_current_file = CURRENT_FILE.replace("\\", "/").lower()

    # Create a copy of sys argv
    sys_args = list(argv)
    after_script_idx = None
    # Find script path in sys.argv to know index of argv where host
    # executable should be.
    for idx, item in enumerate(sys_args):
        if item.replace("\\", "/").lower() == modified_current_file:
            after_script_idx = idx + 1
            break

    # Validate that there is at least one argument after script path
    launch_args = None
    if after_script_idx is not None:
        launch_args = sys_args[after_script_idx:]

    if launch_args:
        # Launch host implementation
        host_main(*launch_args)
    else:
        # Show message box
        on_invalid_args(after_script_idx is None)


if __name__ == "__main__":
    main(sys.argv)
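The `main()` above locates the script's own path in `sys.argv` and treats everything after it as host launch arguments. A standalone sketch of that slicing with a hypothetical argument list:

# Editor's sketch, not part of the commit.
sample_argv = [
    "C:/ayon/launch_script.py",            # position matching CURRENT_FILE
    "C:/Adobe/Photoshop/photoshop.exe",    # host executable
    "C:/projects/shot010/work.psd",        # optional workfile
]
current_file = "c:/ayon/launch_script.py"
after_script_idx = None
for idx, item in enumerate(sample_argv):
    if item.replace("\\", "/").lower() == current_file:
        after_script_idx = idx + 1
        break

launch_args = sample_argv[after_script_idx:] if after_script_idx is not None else None
print(launch_args)
# ['C:/Adobe/Photoshop/photoshop.exe', 'C:/projects/shot010/work.psd']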
@@ -21,14 +21,14 @@ from ayon_core.host import (
)

from ayon_core.pipeline.load import any_outdated_containers
from ayon_core.hosts.photoshop import PHOTOSHOP_HOST_DIR
from ayon_core.hosts.photoshop import PHOTOSHOP_ADDON_ROOT
from ayon_core.tools.utils import get_ayon_qt_app

from . import lib

log = Logger.get_logger(__name__)

PLUGINS_DIR = os.path.join(PHOTOSHOP_HOST_DIR, "plugins")
PLUGINS_DIR = os.path.join(PHOTOSHOP_ADDON_ROOT, "plugins")
PUBLISH_PATH = os.path.join(PLUGINS_DIR, "publish")
LOAD_PATH = os.path.join(PLUGINS_DIR, "load")
CREATE_PATH = os.path.join(PLUGINS_DIR, "create")
91  client/ayon_core/hosts/photoshop/hooks/pre_launch_args.py  (new file)
@@ -0,0 +1,91 @@
import os
import platform
import subprocess

from ayon_core.lib import (
    get_ayon_launcher_args,
    is_using_ayon_console,
)
from ayon_core.lib.applications import (
    PreLaunchHook,
    LaunchTypes,
)
from ayon_core.hosts.photoshop import get_launch_script_path


def get_launch_kwargs(kwargs):
    """Explicit setting of kwargs for Popen for Photoshop.

    Expected behavior
    - ayon_console opens window with logs
    - ayon has stdout/stderr available for capturing

    Args:
        kwargs (Union[dict, None]): Current kwargs or None.

    """
    if kwargs is None:
        kwargs = {}

    if platform.system().lower() != "windows":
        return kwargs

    if not is_using_ayon_console():
        kwargs.update({
            "creationflags": subprocess.CREATE_NEW_CONSOLE
        })
    else:
        kwargs.update({
            "creationflags": subprocess.CREATE_NO_WINDOW,
            "stdout": subprocess.DEVNULL,
            "stderr": subprocess.DEVNULL
        })
    return kwargs


class PhotoshopPrelaunchHook(PreLaunchHook):
    """Launch arguments preparation.

    Hook add python executable and script path to Photoshop implementation
    before Photoshop executable and add last workfile path to launch arguments.

    Existence of last workfile is checked. If workfile does not exists tries
    to copy templated workfile from predefined path.
    """
    app_groups = {"photoshop"}

    order = 20
    launch_types = {LaunchTypes.local}

    def execute(self):
        # Pop executable
        executable_path = self.launch_context.launch_args.pop(0)

        # Pop rest of launch arguments - There should not be other arguments!
        remainders = []
        while self.launch_context.launch_args:
            remainders.append(self.launch_context.launch_args.pop(0))

        script_path = get_launch_script_path()

        new_launch_args = get_ayon_launcher_args(
            "run", script_path, executable_path
        )
        # Add workfile path if exists
        workfile_path = self.data["last_workfile_path"]
        if (
            self.data.get("start_last_workfile")
            and workfile_path
            and os.path.exists(workfile_path)
        ):
            new_launch_args.append(workfile_path)

        # Append as whole list as these arguments should not be separated
        self.launch_context.launch_args.append(new_launch_args)

        if remainders:
            self.launch_context.launch_args.extend(remainders)

        self.launch_context.kwargs = get_launch_kwargs(
            self.launch_context.kwargs
        )
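Taken together with `launch_script.py`, the hook replaces the plain Photoshop command with an AYON launcher command that wraps it. The exact argument list comes from `get_ayon_launcher_args`; the shape below is an assumption for illustration only:

# Editor's sketch, not part of the commit; all paths are placeholders.
launch_args = [
    [
        "C:/AYON/ayon_console.exe",                        # assumed launcher executable
        "run",
        "<ayon_core>/hosts/photoshop/api/launch_script.py",  # get_launch_script_path()
        "C:/Adobe/Photoshop/photoshop.exe",                # original host executable
        "C:/projects/shot010/work.psd",                    # last workfile, if it exists
    ]
]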
@@ -48,6 +48,7 @@ class AYONMenu(QtWidgets.QWidget):
            QtCore.Qt.Window
            | QtCore.Qt.CustomizeWindowHint
            | QtCore.Qt.WindowTitleHint
            | QtCore.Qt.WindowMinimizeButtonHint
            | QtCore.Qt.WindowCloseButtonHint
            | QtCore.Qt.WindowStaysOnTopHint
        )
@@ -43,7 +43,6 @@ class CollectSequenceFrameData(
            instance.data[key] = value
            self.log.debug(f"Collected Frame range data '{key}':{value} ")


    def get_frame_data_from_repre_sequence(self, instance):
        repres = instance.data.get("representations")
        folder_attributes = instance.data["folderEntity"]["attrib"]

@@ -56,6 +55,9 @@ class CollectSequenceFrameData(
            return

        files = first_repre["files"]
        if not isinstance(files, list):
            files = [files]

        collections, _ = clique.assemble(files)
        if not collections:
            # No sequences detected and we can't retrieve
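The collector relies on `clique.assemble` to detect frame sequences in the representation files. A small sketch of that call, assuming the `clique` package behaves as documented:

# Editor's sketch, not part of the commit.
import clique

files = ["render.1001.exr", "render.1002.exr", "render.1003.exr", "notes.txt"]
collections, remainder = clique.assemble(files)
for collection in collections:
    indexes = list(collection.indexes)
    print(collection.head, indexes[0], indexes[-1])  # render. 1001 1003
print(remainder)  # ['notes.txt']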
@@ -72,7 +72,7 @@ class AnimationAlembicLoader(plugin.Loader):
        root = unreal_pipeline.AYON_ASSET_DIR
        folder_name = context["folder"]["name"]
        folder_path = context["folder"]["path"]
        product_type = context["representation"]["context"]["family"]
        product_type = context["product"]["productType"]
        suffix = "_CON"
        if folder_name:
            asset_name = "{}_{}".format(folder_name, name)
@@ -659,7 +659,7 @@ class LayoutLoader(plugin.Loader):
            "loader": str(self.__class__.__name__),
            "representation": context["representation"]["id"],
            "parent": context["representation"]["versionId"],
            "family": context["representation"]["context"]["family"],
            "family": context["product"]["productType"],
            "loaded_assets": loaded_assets
        }
        imprint(
@@ -393,7 +393,7 @@ class ExistingLayoutLoader(plugin.Loader):

        folder_name = context["folder"]["name"]
        folder_path = context["folder"]["path"]
        product_type = context["representation"]["context"]["family"]
        product_type = context["product"]["productType"]
        asset_name = f"{folder_name}_{name}" if folder_name else name
        container_name = f"{folder_name}_{name}_CON"
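The three Unreal loader hunks above switch the product type lookup from the representation context to the product entity. A sketch of the two lookups with a hypothetical, trimmed load context:

# Editor's sketch, not part of the commit.
context = {
    "folder": {"name": "hero", "path": "/assets/hero"},
    "product": {"productType": "animation"},
    "representation": {"id": "1234", "context": {"family": "animation"}},
}

old_product_type = context["representation"]["context"]["family"]   # previous lookup
new_product_type = context["product"]["productType"]                # new lookup
assert old_product_type == new_product_type == "animation"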
@@ -152,6 +152,7 @@ from .path_tools import (

from .ayon_info import (
    is_running_from_build,
    is_using_ayon_console,
    is_staging_enabled,
    is_dev_mode_enabled,
    is_in_tests,

@@ -269,6 +270,7 @@ __all__ = [
    "Logger",

    "is_running_from_build",
    "is_using_ayon_console",
    "is_staging_enabled",
    "is_dev_mode_enabled",
    "is_in_tests",
@@ -1789,6 +1789,10 @@ def _prepare_last_workfile(data, workdir, addons_manager):

    from ayon_core.addon import AddonsManager
    from ayon_core.pipeline import HOST_WORKFILE_EXTENSIONS
    from ayon_core.pipeline.workfile import (
        should_use_last_workfile_on_launch,
        should_open_workfiles_tool_on_launch,
    )

    if not addons_manager:
        addons_manager = AddonsManager()

@@ -1811,7 +1815,7 @@ def _prepare_last_workfile(data, workdir, addons_manager):

    start_last_workfile = data.get("start_last_workfile")
    if start_last_workfile is None:
        start_last_workfile = should_start_last_workfile(
        start_last_workfile = should_use_last_workfile_on_launch(
            project_name, app.host_name, task_name, task_type
        )
    else:

@@ -1819,7 +1823,7 @@ def _prepare_last_workfile(data, workdir, addons_manager):

    data["start_last_workfile"] = start_last_workfile

    workfile_startup = should_workfile_tool_start(
    workfile_startup = should_open_workfiles_tool_on_launch(
        project_name, app.host_name, task_name, task_type
    )
    data["workfile_startup"] = workfile_startup
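The renamed helpers are now imported from `ayon_core.pipeline.workfile` instead of living in `lib/applications.py`. A minimal sketch of the updated call sites; project, host and task values are placeholders, and the calls read project settings from the AYON server:

# Editor's sketch, not part of the commit.
from ayon_core.pipeline.workfile import (
    should_use_last_workfile_on_launch,
    should_open_workfiles_tool_on_launch,
)

use_last = should_use_last_workfile_on_launch(
    "my_project", "maya", "modeling", "Modeling"
)
open_tool = should_open_workfiles_tool_on_launch(
    "my_project", "maya", "modeling", "Modeling"
)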
@@ -1887,142 +1891,3 @@ def _prepare_last_workfile(data, workdir, addons_manager):

    data["env"]["AYON_LAST_WORKFILE"] = last_workfile_path
    data["last_workfile_path"] = last_workfile_path


def should_start_last_workfile(
    project_name, host_name, task_name, task_type, default_output=False
):
    """Define if host should start last version workfile if possible.

    Default output is `False`. Can be overridden with environment variable
    `AYON_OPEN_LAST_WORKFILE`, valid values without case sensitivity are
    `"0", "1", "true", "false", "yes", "no"`.

    Args:
        project_name (str): Project name.
        host_name (str): Host name.
        task_name (str): Task name.
        task_type (str): Task type.
        default_output (Optional[bool]): Default output if no profile is
            found.

    Returns:
        bool: True if host should start workfile.

    """

    project_settings = get_project_settings(project_name)
    profiles = (
        project_settings
        ["core"]
        ["tools"]
        ["Workfiles"]
        ["last_workfile_on_startup"]
    )

    if not profiles:
        return default_output

    filter_data = {
        "tasks": task_name,
        "task_types": task_type,
        "hosts": host_name
    }
    matching_item = filter_profiles(profiles, filter_data)

    output = None
    if matching_item:
        output = matching_item.get("enabled")

    if output is None:
        return default_output
    return output


def should_workfile_tool_start(
    project_name, host_name, task_name, task_type, default_output=False
):
    """Define if host should start workfile tool at host launch.

    Default output is `False`. Can be overridden with environment variable
    `AYON_WORKFILE_TOOL_ON_START`, valid values without case sensitivity are
    `"0", "1", "true", "false", "yes", "no"`.

    Args:
        project_name (str): Project name.
        host_name (str): Host name.
        task_name (str): Task name.
        task_type (str): Task type.
        default_output (Optional[bool]): Default output if no profile is
            found.

    Returns:
        bool: True if host should start workfile.

    """

    project_settings = get_project_settings(project_name)
    profiles = (
        project_settings
        ["core"]
        ["tools"]
        ["Workfiles"]
        ["open_workfile_tool_on_startup"]
    )

    if not profiles:
        return default_output

    filter_data = {
        "tasks": task_name,
        "task_types": task_type,
        "hosts": host_name
    }
    matching_item = filter_profiles(profiles, filter_data)

    output = None
    if matching_item:
        output = matching_item.get("enabled")

    if output is None:
        return default_output
    return output


def get_non_python_host_kwargs(kwargs, allow_console=True):
    """Explicit setting of kwargs for Popen for AE/PS/Harmony.

    Expected behavior
    - ayon_console opens window with logs
    - ayon has stdout/stderr available for capturing

    Args:
        kwargs (dict) or None
        allow_console (bool): use False for inner Popen opening app itself or
            it will open additional console (at least for Harmony)
    """

    if kwargs is None:
        kwargs = {}

    if platform.system().lower() != "windows":
        return kwargs

    executable_path = os.environ.get("AYON_EXECUTABLE")

    executable_filename = ""
    if executable_path:
        executable_filename = os.path.basename(executable_path)

    is_gui_executable = "ayon_console" not in executable_filename
    if is_gui_executable:
        kwargs.update({
            "creationflags": subprocess.CREATE_NO_WINDOW,
            "stdout": subprocess.DEVNULL,
            "stderr": subprocess.DEVNULL
        })
    elif allow_console:
        kwargs.update({
            "creationflags": subprocess.CREATE_NEW_CONSOLE
        })
    return kwargs
@@ -10,6 +10,12 @@ from .local_settings import get_local_site_id


def get_ayon_launcher_version():
    """Get AYON launcher version.

    Returns:
        str: Version string.

    """
    version_filepath = os.path.join(os.environ["AYON_ROOT"], "version.py")
    if not os.path.exists(version_filepath):
        return None

@@ -24,8 +30,8 @@ def is_running_from_build():

    Returns:
        bool: True if running from build.
    """

    """
    executable_path = os.environ["AYON_EXECUTABLE"]
    executable_filename = os.path.basename(executable_path)
    if "python" in executable_filename.lower():

@@ -33,6 +39,32 @@ def is_running_from_build():
        return True


def is_using_ayon_console():
    """AYON launcher console executable is used.

    This function make sense only on Windows platform. For other platforms
    always returns True. True is also returned if process is running from
    code.

    AYON launcher on windows has 2 executable files. First 'ayon_console.exe'
    works as 'python.exe' executable, the second 'ayon.exe' works as
    'pythonw.exe' executable. The difference is way how stdout/stderr is
    handled (especially when calling subprocess).

    Returns:
        bool: True if console executable is used.

    """
    if (
        platform.system().lower() != "windows"
        or is_running_from_build()
    ):
        return True
    executable_path = os.environ["AYON_EXECUTABLE"]
    executable_filename = os.path.basename(executable_path)
    return "ayon_console" in executable_filename


def is_staging_enabled():
    return os.getenv("AYON_USE_STAGING") == "1"
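`is_using_ayon_console` is exported from `ayon_core.lib` (see the `__init__` hunk above) so hooks can pick output handling for subprocesses. A small sketch of such a decision; the kwargs shape mirrors the hooks in this commit and is only illustrative:

# Editor's sketch, not part of the commit.
import subprocess

from ayon_core.lib import is_using_ayon_console

kwargs = {}
if not is_using_ayon_console():
    # GUI 'ayon.exe': no console is attached, silence child output.
    kwargs.update({
        "stdout": subprocess.DEVNULL,
        "stderr": subprocess.DEVNULL,
    })
print(kwargs)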
@@ -13,6 +13,11 @@ from .path_resolving import (
    create_workdir_extra_folders,
)

from .utils import (
    should_use_last_workfile_on_launch,
    should_open_workfiles_tool_on_launch,
)

from .build_workfile import BuildWorkfile

@@ -30,5 +35,8 @@ __all__ = (

    "create_workdir_extra_folders",

    "should_use_last_workfile_on_launch",
    "should_open_workfiles_tool_on_launch",

    "BuildWorkfile",
)
121  client/ayon_core/pipeline/workfile/utils.py  (new file)
@@ -0,0 +1,121 @@
from ayon_core.lib import filter_profiles
from ayon_core.settings import get_project_settings


def should_use_last_workfile_on_launch(
    project_name,
    host_name,
    task_name,
    task_type,
    default_output=False,
    project_settings=None,
):
    """Define if host should start last version workfile if possible.

    Default output is `False`. Can be overridden with environment variable
    `AYON_OPEN_LAST_WORKFILE`, valid values without case sensitivity are
    `"0", "1", "true", "false", "yes", "no"`.

    Args:
        project_name (str): Name of project.
        host_name (str): Name of host which is launched. In avalon's
            application context it's value stored in app definition under
            key `"application_dir"`. Is not case sensitive.
        task_name (str): Name of task which is used for launching the host.
            Task name is not case sensitive.
        task_type (str): Task type.
        default_output (Optional[bool]): Default output value if no profile
            is found.
        project_settings (Optional[dict[str, Any]]): Project settings.

    Returns:
        bool: True if host should start workfile.

    """
    if project_settings is None:
        project_settings = get_project_settings(project_name)
    profiles = (
        project_settings
        ["core"]
        ["tools"]
        ["Workfiles"]
        ["last_workfile_on_startup"]
    )

    if not profiles:
        return default_output

    filter_data = {
        "tasks": task_name,
        "task_types": task_type,
        "hosts": host_name
    }
    matching_item = filter_profiles(profiles, filter_data)

    output = None
    if matching_item:
        output = matching_item.get("enabled")

    if output is None:
        return default_output
    return output


def should_open_workfiles_tool_on_launch(
    project_name,
    host_name,
    task_name,
    task_type,
    default_output=False,
    project_settings=None,
):
    """Define if host should start workfile tool at host launch.

    Default output is `False`. Can be overridden with environment variable
    `AYON_WORKFILE_TOOL_ON_START`, valid values without case sensitivity are
    `"0", "1", "true", "false", "yes", "no"`.

    Args:
        project_name (str): Name of project.
        host_name (str): Name of host which is launched. In avalon's
            application context it's value stored in app definition under
            key `"application_dir"`. Is not case sensitive.
        task_name (str): Name of task which is used for launching the host.
            Task name is not case sensitive.
        task_type (str): Task type.
        default_output (Optional[bool]): Default output value if no profile
            is found.
        project_settings (Optional[dict[str, Any]]): Project settings.

    Returns:
        bool: True if host should start workfile.

    """

    if project_settings is None:
        project_settings = get_project_settings(project_name)
    profiles = (
        project_settings
        ["core"]
        ["tools"]
        ["Workfiles"]
        ["open_workfile_tool_on_startup"]
    )

    if not profiles:
        return default_output

    filter_data = {
        "tasks": task_name,
        "task_types": task_type,
        "hosts": host_name
    }
    matching_item = filter_profiles(profiles, filter_data)

    output = None
    if matching_item:
        output = matching_item.get("enabled")

    if output is None:
        return default_output
    return output
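Both helpers reduce to a `filter_profiles` lookup over the Workfiles settings. A sketch of that resolution with a hand-written profile list; the exact settings shape is an assumption:

# Editor's sketch, not part of the commit.
from ayon_core.lib import filter_profiles

profiles = [
    {"hosts": ["maya"], "task_types": [], "tasks": [], "enabled": True},
    {"hosts": ["nuke"], "task_types": [], "tasks": [], "enabled": False},
]
filter_data = {"tasks": "modeling", "task_types": "Modeling", "hosts": "maya"}
matching_item = filter_profiles(profiles, filter_data)
enabled = matching_item.get("enabled") if matching_item else False
print(enabled)  # True for the maya profile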
@@ -257,6 +257,7 @@ class ExtractOIIOTranscode(publish.Extractor):
                    return

                new_repre["ext"] = output_extension
                new_repre["outputName"] = output_name

                renamed_files = []
                for file_name in files_to_convert:
BIN  client/ayon_core/resources/app_icons/zbrush.png  (new file, 280 KiB; binary file not shown)
@@ -6,6 +6,7 @@ from ayon_core.pipeline.actions import (
    discover_launcher_actions,
    LauncherAction,
)
from ayon_core.pipeline.workfile import should_use_last_workfile_on_launch


# class Action:

@@ -301,8 +302,6 @@ class ActionsModel:
        host_name,
        not_open_workfile_actions
    ):
        from ayon_core.lib.applications import should_start_last_workfile

        if identifier in not_open_workfile_actions:
            return not not_open_workfile_actions[identifier]


@@ -315,7 +314,7 @@ class ActionsModel:
        task_name = task_entity["name"]
        task_type = task_entity["taskType"]

        output = should_start_last_workfile(
        output = should_use_last_workfile_on_launch(
            project_name,
            host_name,
            task_name,
564  poetry.lock  (generated, new file)

@@ -0,0 +1,564 @@
# This file is automatically @generated by Poetry 1.8.2 and should not be changed by hand.
[[package]]
|
||||
name = "appdirs"
|
||||
version = "1.4.4"
|
||||
description = "A small Python module for determining appropriate platform-specific dirs, e.g. a \"user data dir\"."
|
||||
optional = false
|
||||
python-versions = "*"
|
||||
files = [
|
||||
{file = "appdirs-1.4.4-py2.py3-none-any.whl", hash = "sha256:a841dacd6b99318a741b166adb07e19ee71a274450e68237b4650ca1055ab128"},
|
||||
{file = "appdirs-1.4.4.tar.gz", hash = "sha256:7d5d0167b2b1ba821647616af46a749d1c653740dd0d2415100fe26e27afdf41"},
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "ayon-python-api"
|
||||
version = "1.0.1"
|
||||
description = "AYON Python API"
|
||||
optional = false
|
||||
python-versions = "*"
|
||||
files = [
|
||||
{file = "ayon-python-api-1.0.1.tar.gz", hash = "sha256:6a53af84903317e2097f3c6bba0094e90d905d6670fb9c7d3ad3aa9de6552bc1"},
|
||||
{file = "ayon_python_api-1.0.1-py3-none-any.whl", hash = "sha256:d4b649ac39c9003cdbd60f172c0d35f05d310fba3a0649b6d16300fe67f967d6"},
|
||||
]
|
||||
|
||||
[package.dependencies]
|
||||
appdirs = ">=1,<2"
|
||||
requests = ">=2.27.1"
|
||||
six = ">=1.15"
|
||||
Unidecode = ">=1.2.0"
|
||||
|
||||
[[package]]
|
||||
name = "certifi"
|
||||
version = "2024.2.2"
|
||||
description = "Python package for providing Mozilla's CA Bundle."
|
||||
optional = false
|
||||
python-versions = ">=3.6"
|
||||
files = [
|
||||
{file = "certifi-2024.2.2-py3-none-any.whl", hash = "sha256:dc383c07b76109f368f6106eee2b593b04a011ea4d55f652c6ca24a754d1cdd1"},
|
||||
{file = "certifi-2024.2.2.tar.gz", hash = "sha256:0569859f95fc761b18b45ef421b1290a0f65f147e92a1e5eb3e635f9a5e4e66f"},
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "cfgv"
|
||||
version = "3.4.0"
|
||||
description = "Validate configuration and produce human readable error messages."
|
||||
optional = false
|
||||
python-versions = ">=3.8"
|
||||
files = [
|
||||
{file = "cfgv-3.4.0-py2.py3-none-any.whl", hash = "sha256:b7265b1f29fd3316bfcd2b330d63d024f2bfd8bcb8b0272f8e19a504856c48f9"},
|
||||
{file = "cfgv-3.4.0.tar.gz", hash = "sha256:e52591d4c5f5dead8e0f673fb16db7949d2cfb3f7da4582893288f0ded8fe560"},
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "charset-normalizer"
|
||||
version = "3.3.2"
|
||||
description = "The Real First Universal Charset Detector. Open, modern and actively maintained alternative to Chardet."
|
||||
optional = false
|
||||
python-versions = ">=3.7.0"
|
||||
files = [
|
||||
{file = "charset-normalizer-3.3.2.tar.gz", hash = "sha256:f30c3cb33b24454a82faecaf01b19c18562b1e89558fb6c56de4d9118a032fd5"},
|
||||
{file = "charset_normalizer-3.3.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:25baf083bf6f6b341f4121c2f3c548875ee6f5339300e08be3f2b2ba1721cdd3"},
|
||||
{file = "charset_normalizer-3.3.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:06435b539f889b1f6f4ac1758871aae42dc3a8c0e24ac9e60c2384973ad73027"},
|
||||
{file = "charset_normalizer-3.3.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:9063e24fdb1e498ab71cb7419e24622516c4a04476b17a2dab57e8baa30d6e03"},
|
||||
{file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6897af51655e3691ff853668779c7bad41579facacf5fd7253b0133308cf000d"},
|
||||
{file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1d3193f4a680c64b4b6a9115943538edb896edc190f0b222e73761716519268e"},
|
||||
{file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:cd70574b12bb8a4d2aaa0094515df2463cb429d8536cfb6c7ce983246983e5a6"},
|
||||
{file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8465322196c8b4d7ab6d1e049e4c5cb460d0394da4a27d23cc242fbf0034b6b5"},
|
||||
{file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a9a8e9031d613fd2009c182b69c7b2c1ef8239a0efb1df3f7c8da66d5dd3d537"},
|
||||
{file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:beb58fe5cdb101e3a055192ac291b7a21e3b7ef4f67fa1d74e331a7f2124341c"},
|
||||
{file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:e06ed3eb3218bc64786f7db41917d4e686cc4856944f53d5bdf83a6884432e12"},
|
||||
{file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:2e81c7b9c8979ce92ed306c249d46894776a909505d8f5a4ba55b14206e3222f"},
|
||||
{file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:572c3763a264ba47b3cf708a44ce965d98555f618ca42c926a9c1616d8f34269"},
|
||||
{file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:fd1abc0d89e30cc4e02e4064dc67fcc51bd941eb395c502aac3ec19fab46b519"},
|
||||
{file = "charset_normalizer-3.3.2-cp310-cp310-win32.whl", hash = "sha256:3d47fa203a7bd9c5b6cee4736ee84ca03b8ef23193c0d1ca99b5089f72645c73"},
|
||||
{file = "charset_normalizer-3.3.2-cp310-cp310-win_amd64.whl", hash = "sha256:10955842570876604d404661fbccbc9c7e684caf432c09c715ec38fbae45ae09"},
|
||||
{file = "charset_normalizer-3.3.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:802fe99cca7457642125a8a88a084cef28ff0cf9407060f7b93dca5aa25480db"},
|
||||
{file = "charset_normalizer-3.3.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:573f6eac48f4769d667c4442081b1794f52919e7edada77495aaed9236d13a96"},
|
||||
{file = "charset_normalizer-3.3.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:549a3a73da901d5bc3ce8d24e0600d1fa85524c10287f6004fbab87672bf3e1e"},
|
||||
{file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f27273b60488abe721a075bcca6d7f3964f9f6f067c8c4c605743023d7d3944f"},
|
||||
{file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1ceae2f17a9c33cb48e3263960dc5fc8005351ee19db217e9b1bb15d28c02574"},
|
||||
{file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:65f6f63034100ead094b8744b3b97965785388f308a64cf8d7c34f2f2e5be0c4"},
|
||||
{file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:753f10e867343b4511128c6ed8c82f7bec3bd026875576dfd88483c5c73b2fd8"},
|
||||
{file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4a78b2b446bd7c934f5dcedc588903fb2f5eec172f3d29e52a9096a43722adfc"},
|
||||
{file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e537484df0d8f426ce2afb2d0f8e1c3d0b114b83f8850e5f2fbea0e797bd82ae"},
|
||||
{file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:eb6904c354526e758fda7167b33005998fb68c46fbc10e013ca97f21ca5c8887"},
|
||||
{file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:deb6be0ac38ece9ba87dea880e438f25ca3eddfac8b002a2ec3d9183a454e8ae"},
|
||||
{file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:4ab2fe47fae9e0f9dee8c04187ce5d09f48eabe611be8259444906793ab7cbce"},
|
||||
{file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:80402cd6ee291dcb72644d6eac93785fe2c8b9cb30893c1af5b8fdd753b9d40f"},
|
||||
{file = "charset_normalizer-3.3.2-cp311-cp311-win32.whl", hash = "sha256:7cd13a2e3ddeed6913a65e66e94b51d80a041145a026c27e6bb76c31a853c6ab"},
|
||||
{file = "charset_normalizer-3.3.2-cp311-cp311-win_amd64.whl", hash = "sha256:663946639d296df6a2bb2aa51b60a2454ca1cb29835324c640dafb5ff2131a77"},
|
||||
{file = "charset_normalizer-3.3.2-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:0b2b64d2bb6d3fb9112bafa732def486049e63de9618b5843bcdd081d8144cd8"},
|
||||
{file = "charset_normalizer-3.3.2-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:ddbb2551d7e0102e7252db79ba445cdab71b26640817ab1e3e3648dad515003b"},
|
||||
{file = "charset_normalizer-3.3.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:55086ee1064215781fff39a1af09518bc9255b50d6333f2e4c74ca09fac6a8f6"},
|
||||
{file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8f4a014bc36d3c57402e2977dada34f9c12300af536839dc38c0beab8878f38a"},
|
||||
{file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a10af20b82360ab00827f916a6058451b723b4e65030c5a18577c8b2de5b3389"},
|
||||
{file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8d756e44e94489e49571086ef83b2bb8ce311e730092d2c34ca8f7d925cb20aa"},
|
||||
{file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:90d558489962fd4918143277a773316e56c72da56ec7aa3dc3dbbe20fdfed15b"},
|
||||
{file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6ac7ffc7ad6d040517be39eb591cac5ff87416c2537df6ba3cba3bae290c0fed"},
|
||||
{file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:7ed9e526742851e8d5cc9e6cf41427dfc6068d4f5a3bb03659444b4cabf6bc26"},
|
||||
{file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:8bdb58ff7ba23002a4c5808d608e4e6c687175724f54a5dade5fa8c67b604e4d"},
|
||||
{file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_ppc64le.whl", hash = "sha256:6b3251890fff30ee142c44144871185dbe13b11bab478a88887a639655be1068"},
|
||||
{file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_s390x.whl", hash = "sha256:b4a23f61ce87adf89be746c8a8974fe1c823c891d8f86eb218bb957c924bb143"},
|
||||
{file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:efcb3f6676480691518c177e3b465bcddf57cea040302f9f4e6e191af91174d4"},
|
||||
{file = "charset_normalizer-3.3.2-cp312-cp312-win32.whl", hash = "sha256:d965bba47ddeec8cd560687584e88cf699fd28f192ceb452d1d7ee807c5597b7"},
|
||||
{file = "charset_normalizer-3.3.2-cp312-cp312-win_amd64.whl", hash = "sha256:96b02a3dc4381e5494fad39be677abcb5e6634bf7b4fa83a6dd3112607547001"},
|
||||
{file = "charset_normalizer-3.3.2-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:95f2a5796329323b8f0512e09dbb7a1860c46a39da62ecb2324f116fa8fdc85c"},
|
||||
{file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c002b4ffc0be611f0d9da932eb0f704fe2602a9a949d1f738e4c34c75b0863d5"},
|
||||
{file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a981a536974bbc7a512cf44ed14938cf01030a99e9b3a06dd59578882f06f985"},
|
||||
{file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3287761bc4ee9e33561a7e058c72ac0938c4f57fe49a09eae428fd88aafe7bb6"},
|
||||
{file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:42cb296636fcc8b0644486d15c12376cb9fa75443e00fb25de0b8602e64c1714"},
|
||||
{file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0a55554a2fa0d408816b3b5cedf0045f4b8e1a6065aec45849de2d6f3f8e9786"},
|
||||
{file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:c083af607d2515612056a31f0a8d9e0fcb5876b7bfc0abad3ecd275bc4ebc2d5"},
|
||||
{file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:87d1351268731db79e0f8e745d92493ee2841c974128ef629dc518b937d9194c"},
|
||||
{file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:bd8f7df7d12c2db9fab40bdd87a7c09b1530128315d047a086fa3ae3435cb3a8"},
|
||||
{file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:c180f51afb394e165eafe4ac2936a14bee3eb10debc9d9e4db8958fe36afe711"},
|
||||
{file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:8c622a5fe39a48f78944a87d4fb8a53ee07344641b0562c540d840748571b811"},
|
||||
{file = "charset_normalizer-3.3.2-cp37-cp37m-win32.whl", hash = "sha256:db364eca23f876da6f9e16c9da0df51aa4f104a972735574842618b8c6d999d4"},
|
||||
{file = "charset_normalizer-3.3.2-cp37-cp37m-win_amd64.whl", hash = "sha256:86216b5cee4b06df986d214f664305142d9c76df9b6512be2738aa72a2048f99"},
|
||||
{file = "charset_normalizer-3.3.2-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:6463effa3186ea09411d50efc7d85360b38d5f09b870c48e4600f63af490e56a"},
|
||||
{file = "charset_normalizer-3.3.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:6c4caeef8fa63d06bd437cd4bdcf3ffefe6738fb1b25951440d80dc7df8c03ac"},
|
||||
{file = "charset_normalizer-3.3.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:37e55c8e51c236f95b033f6fb391d7d7970ba5fe7ff453dad675e88cf303377a"},
|
||||
{file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fb69256e180cb6c8a894fee62b3afebae785babc1ee98b81cdf68bbca1987f33"},
|
||||
{file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ae5f4161f18c61806f411a13b0310bea87f987c7d2ecdbdaad0e94eb2e404238"},
|
||||
{file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b2b0a0c0517616b6869869f8c581d4eb2dd83a4d79e0ebcb7d373ef9956aeb0a"},
|
||||
{file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:45485e01ff4d3630ec0d9617310448a8702f70e9c01906b0d0118bdf9d124cf2"},
|
||||
{file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:eb00ed941194665c332bf8e078baf037d6c35d7c4f3102ea2d4f16ca94a26dc8"},
|
||||
{file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:2127566c664442652f024c837091890cb1942c30937add288223dc895793f898"},
|
||||
{file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:a50aebfa173e157099939b17f18600f72f84eed3049e743b68ad15bd69b6bf99"},
|
||||
{file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:4d0d1650369165a14e14e1e47b372cfcb31d6ab44e6e33cb2d4e57265290044d"},
|
||||
{file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:923c0c831b7cfcb071580d3f46c4baf50f174be571576556269530f4bbd79d04"},
|
||||
{file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:06a81e93cd441c56a9b65d8e1d043daeb97a3d0856d177d5c90ba85acb3db087"},
|
||||
{file = "charset_normalizer-3.3.2-cp38-cp38-win32.whl", hash = "sha256:6ef1d82a3af9d3eecdba2321dc1b3c238245d890843e040e41e470ffa64c3e25"},
|
||||
{file = "charset_normalizer-3.3.2-cp38-cp38-win_amd64.whl", hash = "sha256:eb8821e09e916165e160797a6c17edda0679379a4be5c716c260e836e122f54b"},
|
||||
{file = "charset_normalizer-3.3.2-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:c235ebd9baae02f1b77bcea61bce332cb4331dc3617d254df3323aa01ab47bd4"},
|
||||
{file = "charset_normalizer-3.3.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:5b4c145409bef602a690e7cfad0a15a55c13320ff7a3ad7ca59c13bb8ba4d45d"},
|
||||
{file = "charset_normalizer-3.3.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:68d1f8a9e9e37c1223b656399be5d6b448dea850bed7d0f87a8311f1ff3dabb0"},
|
||||
{file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:22afcb9f253dac0696b5a4be4a1c0f8762f8239e21b99680099abd9b2b1b2269"},
|
||||
{file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e27ad930a842b4c5eb8ac0016b0a54f5aebbe679340c26101df33424142c143c"},
|
||||
{file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1f79682fbe303db92bc2b1136016a38a42e835d932bab5b3b1bfcfbf0640e519"},
|
||||
{file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b261ccdec7821281dade748d088bb6e9b69e6d15b30652b74cbbac25e280b796"},
|
||||
{file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:122c7fa62b130ed55f8f285bfd56d5f4b4a5b503609d181f9ad85e55c89f4185"},
|
||||
{file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:d0eccceffcb53201b5bfebb52600a5fb483a20b61da9dbc885f8b103cbe7598c"},
|
||||
{file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:9f96df6923e21816da7e0ad3fd47dd8f94b2a5ce594e00677c0013018b813458"},
|
||||
{file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:7f04c839ed0b6b98b1a7501a002144b76c18fb1c1850c8b98d458ac269e26ed2"},
|
||||
{file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:34d1c8da1e78d2e001f363791c98a272bb734000fcef47a491c1e3b0505657a8"},
|
||||
{file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:ff8fa367d09b717b2a17a052544193ad76cd49979c805768879cb63d9ca50561"},
|
||||
{file = "charset_normalizer-3.3.2-cp39-cp39-win32.whl", hash = "sha256:aed38f6e4fb3f5d6bf81bfa990a07806be9d83cf7bacef998ab1a9bd660a581f"},
|
||||
{file = "charset_normalizer-3.3.2-cp39-cp39-win_amd64.whl", hash = "sha256:b01b88d45a6fcb69667cd6d2f7a9aeb4bf53760d7fc536bf679ec94fe9f3ff3d"},
|
||||
{file = "charset_normalizer-3.3.2-py3-none-any.whl", hash = "sha256:3e4d1f6587322d2788836a99c69062fbb091331ec940e02d12d179c1d53e25fc"},
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "codespell"
|
||||
version = "2.2.6"
|
||||
description = "Codespell"
|
||||
optional = false
|
||||
python-versions = ">=3.8"
|
||||
files = [
|
||||
{file = "codespell-2.2.6-py3-none-any.whl", hash = "sha256:9ee9a3e5df0990604013ac2a9f22fa8e57669c827124a2e961fe8a1da4cacc07"},
|
||||
{file = "codespell-2.2.6.tar.gz", hash = "sha256:a8c65d8eb3faa03deabab6b3bbe798bea72e1799c7e9e955d57eca4096abcff9"},
|
||||
]
|
||||
|
||||
[package.extras]
|
||||
dev = ["Pygments", "build", "chardet", "pre-commit", "pytest", "pytest-cov", "pytest-dependency", "ruff", "tomli", "twine"]
|
||||
hard-encoding-detection = ["chardet"]
|
||||
toml = ["tomli"]
|
||||
types = ["chardet (>=5.1.0)", "mypy", "pytest", "pytest-cov", "pytest-dependency"]
|
||||
|
||||
[[package]]
|
||||
name = "colorama"
|
||||
version = "0.4.6"
|
||||
description = "Cross-platform colored terminal text."
|
||||
optional = false
|
||||
python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,!=3.6.*,>=2.7"
|
||||
files = [
|
||||
{file = "colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6"},
|
||||
{file = "colorama-0.4.6.tar.gz", hash = "sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44"},
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "distlib"
|
||||
version = "0.3.8"
|
||||
description = "Distribution utilities"
|
||||
optional = false
|
||||
python-versions = "*"
|
||||
files = [
|
||||
{file = "distlib-0.3.8-py2.py3-none-any.whl", hash = "sha256:034db59a0b96f8ca18035f36290806a9a6e6bd9d1ff91e45a7f172eb17e51784"},
|
||||
{file = "distlib-0.3.8.tar.gz", hash = "sha256:1530ea13e350031b6312d8580ddb6b27a104275a31106523b8f123787f494f64"},
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "exceptiongroup"
|
||||
version = "1.2.0"
|
||||
description = "Backport of PEP 654 (exception groups)"
|
||||
optional = false
|
||||
python-versions = ">=3.7"
|
||||
files = [
|
||||
{file = "exceptiongroup-1.2.0-py3-none-any.whl", hash = "sha256:4bfd3996ac73b41e9b9628b04e079f193850720ea5945fc96a08633c66912f14"},
|
||||
{file = "exceptiongroup-1.2.0.tar.gz", hash = "sha256:91f5c769735f051a4290d52edd0858999b57e5876e9f85937691bd4c9fa3ed68"},
|
||||
]
|
||||
|
||||
[package.extras]
|
||||
test = ["pytest (>=6)"]
|
||||
|
||||
[[package]]
|
||||
name = "filelock"
|
||||
version = "3.13.1"
|
||||
description = "A platform independent file lock."
|
||||
optional = false
|
||||
python-versions = ">=3.8"
|
||||
files = [
|
||||
{file = "filelock-3.13.1-py3-none-any.whl", hash = "sha256:57dbda9b35157b05fb3e58ee91448612eb674172fab98ee235ccb0b5bee19a1c"},
|
||||
{file = "filelock-3.13.1.tar.gz", hash = "sha256:521f5f56c50f8426f5e03ad3b281b490a87ef15bc6c526f168290f0c7148d44e"},
|
||||
]
|
||||
|
||||
[package.extras]
|
||||
docs = ["furo (>=2023.9.10)", "sphinx (>=7.2.6)", "sphinx-autodoc-typehints (>=1.24)"]
|
||||
testing = ["covdefaults (>=2.3)", "coverage (>=7.3.2)", "diff-cover (>=8)", "pytest (>=7.4.3)", "pytest-cov (>=4.1)", "pytest-mock (>=3.12)", "pytest-timeout (>=2.2)"]
|
||||
typing = ["typing-extensions (>=4.8)"]
|
||||
|
||||
[[package]]
|
||||
name = "identify"
|
||||
version = "2.5.35"
|
||||
description = "File identification library for Python"
|
||||
optional = false
|
||||
python-versions = ">=3.8"
|
||||
files = [
|
||||
{file = "identify-2.5.35-py2.py3-none-any.whl", hash = "sha256:c4de0081837b211594f8e877a6b4fad7ca32bbfc1a9307fdd61c28bfe923f13e"},
|
||||
{file = "identify-2.5.35.tar.gz", hash = "sha256:10a7ca245cfcd756a554a7288159f72ff105ad233c7c4b9c6f0f4d108f5f6791"},
|
||||
]
|
||||
|
||||
[package.extras]
|
||||
license = ["ukkonen"]
|
||||
|
||||
[[package]]
|
||||
name = "idna"
|
||||
version = "3.6"
|
||||
description = "Internationalized Domain Names in Applications (IDNA)"
|
||||
optional = false
|
||||
python-versions = ">=3.5"
|
||||
files = [
|
||||
{file = "idna-3.6-py3-none-any.whl", hash = "sha256:c05567e9c24a6b9faaa835c4821bad0590fbb9d5779e7caa6e1cc4978e7eb24f"},
|
||||
{file = "idna-3.6.tar.gz", hash = "sha256:9ecdbbd083b06798ae1e86adcbfe8ab1479cf864e4ee30fe4e46a003d12491ca"},
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "iniconfig"
|
||||
version = "2.0.0"
|
||||
description = "brain-dead simple config-ini parsing"
|
||||
optional = false
|
||||
python-versions = ">=3.7"
|
||||
files = [
|
||||
{file = "iniconfig-2.0.0-py3-none-any.whl", hash = "sha256:b6a85871a79d2e3b22d2d1b94ac2824226a63c6b741c88f7ae975f18b6778374"},
|
||||
{file = "iniconfig-2.0.0.tar.gz", hash = "sha256:2d91e135bf72d31a410b17c16da610a82cb55f6b0477d1a902134b24a455b8b3"},
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "nodeenv"
|
||||
version = "1.8.0"
|
||||
description = "Node.js virtual environment builder"
|
||||
optional = false
|
||||
python-versions = ">=2.7,!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,!=3.6.*"
|
||||
files = [
|
||||
{file = "nodeenv-1.8.0-py2.py3-none-any.whl", hash = "sha256:df865724bb3c3adc86b3876fa209771517b0cfe596beff01a92700e0e8be4cec"},
|
||||
{file = "nodeenv-1.8.0.tar.gz", hash = "sha256:d51e0c37e64fbf47d017feac3145cdbb58836d7eee8c6f6d3b6880c5456227d2"},
|
||||
]
|
||||
|
||||
[package.dependencies]
|
||||
setuptools = "*"
|
||||
|
||||
[[package]]
|
||||
name = "packaging"
|
||||
version = "24.0"
|
||||
description = "Core utilities for Python packages"
|
||||
optional = false
|
||||
python-versions = ">=3.7"
|
||||
files = [
|
||||
{file = "packaging-24.0-py3-none-any.whl", hash = "sha256:2ddfb553fdf02fb784c234c7ba6ccc288296ceabec964ad2eae3777778130bc5"},
|
||||
{file = "packaging-24.0.tar.gz", hash = "sha256:eb82c5e3e56209074766e6885bb04b8c38a0c015d0a30036ebe7ece34c9989e9"},
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "platformdirs"
|
||||
version = "4.2.0"
|
||||
description = "A small Python package for determining appropriate platform-specific dirs, e.g. a \"user data dir\"."
|
||||
optional = false
|
||||
python-versions = ">=3.8"
|
||||
files = [
|
||||
{file = "platformdirs-4.2.0-py3-none-any.whl", hash = "sha256:0614df2a2f37e1a662acbd8e2b25b92ccf8632929bc6d43467e17fe89c75e068"},
|
||||
{file = "platformdirs-4.2.0.tar.gz", hash = "sha256:ef0cc731df711022c174543cb70a9b5bd22e5a9337c8624ef2c2ceb8ddad8768"},
|
||||
]
|
||||
|
||||
[package.extras]
|
||||
docs = ["furo (>=2023.9.10)", "proselint (>=0.13)", "sphinx (>=7.2.6)", "sphinx-autodoc-typehints (>=1.25.2)"]
|
||||
test = ["appdirs (==1.4.4)", "covdefaults (>=2.3)", "pytest (>=7.4.3)", "pytest-cov (>=4.1)", "pytest-mock (>=3.12)"]
|
||||
|
||||
[[package]]
|
||||
name = "pluggy"
|
||||
version = "1.4.0"
|
||||
description = "plugin and hook calling mechanisms for python"
|
||||
optional = false
|
||||
python-versions = ">=3.8"
|
||||
files = [
|
||||
{file = "pluggy-1.4.0-py3-none-any.whl", hash = "sha256:7db9f7b503d67d1c5b95f59773ebb58a8c1c288129a88665838012cfb07b8981"},
|
||||
{file = "pluggy-1.4.0.tar.gz", hash = "sha256:8c85c2876142a764e5b7548e7d9a0e0ddb46f5185161049a79b7e974454223be"},
|
||||
]
|
||||
|
||||
[package.extras]
|
||||
dev = ["pre-commit", "tox"]
|
||||
testing = ["pytest", "pytest-benchmark"]
|
||||
|
||||
[[package]]
|
||||
name = "pre-commit"
|
||||
version = "3.6.2"
|
||||
description = "A framework for managing and maintaining multi-language pre-commit hooks."
|
||||
optional = false
|
||||
python-versions = ">=3.9"
|
||||
files = [
|
||||
{file = "pre_commit-3.6.2-py2.py3-none-any.whl", hash = "sha256:ba637c2d7a670c10daedc059f5c49b5bd0aadbccfcd7ec15592cf9665117532c"},
|
||||
{file = "pre_commit-3.6.2.tar.gz", hash = "sha256:c3ef34f463045c88658c5b99f38c1e297abdcc0ff13f98d3370055fbbfabc67e"},
|
||||
]
|
||||
|
||||
[package.dependencies]
|
||||
cfgv = ">=2.0.0"
|
||||
identify = ">=1.0.0"
|
||||
nodeenv = ">=0.11.1"
|
||||
pyyaml = ">=5.1"
|
||||
virtualenv = ">=20.10.0"
|
||||
|
||||
[[package]]
|
||||
name = "pytest"
|
||||
version = "8.1.1"
|
||||
description = "pytest: simple powerful testing with Python"
|
||||
optional = false
|
||||
python-versions = ">=3.8"
|
||||
files = [
|
||||
{file = "pytest-8.1.1-py3-none-any.whl", hash = "sha256:2a8386cfc11fa9d2c50ee7b2a57e7d898ef90470a7a34c4b949ff59662bb78b7"},
|
||||
{file = "pytest-8.1.1.tar.gz", hash = "sha256:ac978141a75948948817d360297b7aae0fcb9d6ff6bc9ec6d514b85d5a65c044"},
|
||||
]
|
||||
|
||||
[package.dependencies]
|
||||
colorama = {version = "*", markers = "sys_platform == \"win32\""}
|
||||
exceptiongroup = {version = ">=1.0.0rc8", markers = "python_version < \"3.11\""}
|
||||
iniconfig = "*"
|
||||
packaging = "*"
|
||||
pluggy = ">=1.4,<2.0"
|
||||
tomli = {version = ">=1", markers = "python_version < \"3.11\""}
|
||||
|
||||
[package.extras]
|
||||
testing = ["argcomplete", "attrs (>=19.2)", "hypothesis (>=3.56)", "mock", "pygments (>=2.7.2)", "requests", "setuptools", "xmlschema"]
|
||||
|
||||
[[package]]
|
||||
name = "pytest-print"
|
||||
version = "1.0.0"
|
||||
description = "pytest-print adds the printer fixture you can use to print messages to the user (directly to the pytest runner, not stdout)"
|
||||
optional = false
|
||||
python-versions = ">=3.8"
|
||||
files = [
|
||||
{file = "pytest_print-1.0.0-py3-none-any.whl", hash = "sha256:23484f42b906b87e31abd564761efffeb0348a6f83109fb857ee6e8e5df42b69"},
|
||||
{file = "pytest_print-1.0.0.tar.gz", hash = "sha256:1fcde9945fba462227a8959271369b10bb7a193be8452162707e63cd60875ca0"},
|
||||
]
|
||||
|
||||
[package.dependencies]
|
||||
pytest = ">=7.4"
|
||||
|
||||
[package.extras]
|
||||
test = ["covdefaults (>=2.3)", "coverage (>=7.3)", "pytest-mock (>=3.11.1)"]
|
||||
|
||||
[[package]]
|
||||
name = "pyyaml"
|
||||
version = "6.0.1"
|
||||
description = "YAML parser and emitter for Python"
|
||||
optional = false
|
||||
python-versions = ">=3.6"
|
||||
files = [
|
||||
{file = "PyYAML-6.0.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:d858aa552c999bc8a8d57426ed01e40bef403cd8ccdd0fc5f6f04a00414cac2a"},
|
||||
{file = "PyYAML-6.0.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:fd66fc5d0da6d9815ba2cebeb4205f95818ff4b79c3ebe268e75d961704af52f"},
|
||||
{file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:69b023b2b4daa7548bcfbd4aa3da05b3a74b772db9e23b982788168117739938"},
|
||||
{file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:81e0b275a9ecc9c0c0c07b4b90ba548307583c125f54d5b6946cfee6360c733d"},
|
||||
{file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ba336e390cd8e4d1739f42dfe9bb83a3cc2e80f567d8805e11b46f4a943f5515"},
|
||||
{file = "PyYAML-6.0.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:326c013efe8048858a6d312ddd31d56e468118ad4cdeda36c719bf5bb6192290"},
|
||||
{file = "PyYAML-6.0.1-cp310-cp310-win32.whl", hash = "sha256:bd4af7373a854424dabd882decdc5579653d7868b8fb26dc7d0e99f823aa5924"},
|
||||
{file = "PyYAML-6.0.1-cp310-cp310-win_amd64.whl", hash = "sha256:fd1592b3fdf65fff2ad0004b5e363300ef59ced41c2e6b3a99d4089fa8c5435d"},
|
||||
{file = "PyYAML-6.0.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:6965a7bc3cf88e5a1c3bd2e0b5c22f8d677dc88a455344035f03399034eb3007"},
|
||||
{file = "PyYAML-6.0.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:f003ed9ad21d6a4713f0a9b5a7a0a79e08dd0f221aff4525a2be4c346ee60aab"},
|
||||
{file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:42f8152b8dbc4fe7d96729ec2b99c7097d656dc1213a3229ca5383f973a5ed6d"},
|
||||
{file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:062582fca9fabdd2c8b54a3ef1c978d786e0f6b3a1510e0ac93ef59e0ddae2bc"},
|
||||
{file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d2b04aac4d386b172d5b9692e2d2da8de7bfb6c387fa4f801fbf6fb2e6ba4673"},
|
||||
{file = "PyYAML-6.0.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:e7d73685e87afe9f3b36c799222440d6cf362062f78be1013661b00c5c6f678b"},
|
||||
{file = "PyYAML-6.0.1-cp311-cp311-win32.whl", hash = "sha256:1635fd110e8d85d55237ab316b5b011de701ea0f29d07611174a1b42f1444741"},
|
||||
{file = "PyYAML-6.0.1-cp311-cp311-win_amd64.whl", hash = "sha256:bf07ee2fef7014951eeb99f56f39c9bb4af143d8aa3c21b1677805985307da34"},
|
||||
{file = "PyYAML-6.0.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:855fb52b0dc35af121542a76b9a84f8d1cd886ea97c84703eaa6d88e37a2ad28"},
|
||||
{file = "PyYAML-6.0.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:40df9b996c2b73138957fe23a16a4f0ba614f4c0efce1e9406a184b6d07fa3a9"},
|
||||
{file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6c22bec3fbe2524cde73d7ada88f6566758a8f7227bfbf93a408a9d86bcc12a0"},
|
||||
{file = "PyYAML-6.0.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8d4e9c88387b0f5c7d5f281e55304de64cf7f9c0021a3525bd3b1c542da3b0e4"},
|
||||
{file = "PyYAML-6.0.1-cp312-cp312-win32.whl", hash = "sha256:d483d2cdf104e7c9fa60c544d92981f12ad66a457afae824d146093b8c294c54"},
|
||||
{file = "PyYAML-6.0.1-cp312-cp312-win_amd64.whl", hash = "sha256:0d3304d8c0adc42be59c5f8a4d9e3d7379e6955ad754aa9d6ab7a398b59dd1df"},
|
||||
{file = "PyYAML-6.0.1-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:50550eb667afee136e9a77d6dc71ae76a44df8b3e51e41b77f6de2932bfe0f47"},
|
||||
{file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1fe35611261b29bd1de0070f0b2f47cb6ff71fa6595c077e42bd0c419fa27b98"},
|
||||
{file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:704219a11b772aea0d8ecd7058d0082713c3562b4e271b849ad7dc4a5c90c13c"},
|
||||
{file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:afd7e57eddb1a54f0f1a974bc4391af8bcce0b444685d936840f125cf046d5bd"},
|
||||
{file = "PyYAML-6.0.1-cp36-cp36m-win32.whl", hash = "sha256:fca0e3a251908a499833aa292323f32437106001d436eca0e6e7833256674585"},
|
||||
{file = "PyYAML-6.0.1-cp36-cp36m-win_amd64.whl", hash = "sha256:f22ac1c3cac4dbc50079e965eba2c1058622631e526bd9afd45fedd49ba781fa"},
|
||||
{file = "PyYAML-6.0.1-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:b1275ad35a5d18c62a7220633c913e1b42d44b46ee12554e5fd39c70a243d6a3"},
|
||||
{file = "PyYAML-6.0.1-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:18aeb1bf9a78867dc38b259769503436b7c72f7a1f1f4c93ff9a17de54319b27"},
|
||||
{file = "PyYAML-6.0.1-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:596106435fa6ad000c2991a98fa58eeb8656ef2325d7e158344fb33864ed87e3"},
|
||||
{file = "PyYAML-6.0.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:baa90d3f661d43131ca170712d903e6295d1f7a0f595074f151c0aed377c9b9c"},
|
||||
{file = "PyYAML-6.0.1-cp37-cp37m-win32.whl", hash = "sha256:9046c58c4395dff28dd494285c82ba00b546adfc7ef001486fbf0324bc174fba"},
|
||||
{file = "PyYAML-6.0.1-cp37-cp37m-win_amd64.whl", hash = "sha256:4fb147e7a67ef577a588a0e2c17b6db51dda102c71de36f8549b6816a96e1867"},
|
||||
{file = "PyYAML-6.0.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:1d4c7e777c441b20e32f52bd377e0c409713e8bb1386e1099c2415f26e479595"},
|
||||
{file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a0cd17c15d3bb3fa06978b4e8958dcdc6e0174ccea823003a106c7d4d7899ac5"},
|
||||
{file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:28c119d996beec18c05208a8bd78cbe4007878c6dd15091efb73a30e90539696"},
|
||||
{file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7e07cbde391ba96ab58e532ff4803f79c4129397514e1413a7dc761ccd755735"},
|
||||
{file = "PyYAML-6.0.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:49a183be227561de579b4a36efbb21b3eab9651dd81b1858589f796549873dd6"},
|
||||
{file = "PyYAML-6.0.1-cp38-cp38-win32.whl", hash = "sha256:184c5108a2aca3c5b3d3bf9395d50893a7ab82a38004c8f61c258d4428e80206"},
|
||||
{file = "PyYAML-6.0.1-cp38-cp38-win_amd64.whl", hash = "sha256:1e2722cc9fbb45d9b87631ac70924c11d3a401b2d7f410cc0e3bbf249f2dca62"},
|
||||
{file = "PyYAML-6.0.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:9eb6caa9a297fc2c2fb8862bc5370d0303ddba53ba97e71f08023b6cd73d16a8"},
|
||||
{file = "PyYAML-6.0.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:c8098ddcc2a85b61647b2590f825f3db38891662cfc2fc776415143f599bb859"},
|
||||
{file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5773183b6446b2c99bb77e77595dd486303b4faab2b086e7b17bc6bef28865f6"},
|
||||
{file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b786eecbdf8499b9ca1d697215862083bd6d2a99965554781d0d8d1ad31e13a0"},
|
||||
{file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bc1bf2925a1ecd43da378f4db9e4f799775d6367bdb94671027b73b393a7c42c"},
|
||||
{file = "PyYAML-6.0.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:04ac92ad1925b2cff1db0cfebffb6ffc43457495c9b3c39d3fcae417d7125dc5"},
|
||||
{file = "PyYAML-6.0.1-cp39-cp39-win32.whl", hash = "sha256:faca3bdcf85b2fc05d06ff3fbc1f83e1391b3e724afa3feba7d13eeab355484c"},
|
||||
{file = "PyYAML-6.0.1-cp39-cp39-win_amd64.whl", hash = "sha256:510c9deebc5c0225e8c96813043e62b680ba2f9c50a08d3724c7f28a747d1486"},
|
||||
{file = "PyYAML-6.0.1.tar.gz", hash = "sha256:bfdf460b1736c775f2ba9f6a92bca30bc2095067b8a9d77876d1fad6cc3b4a43"},
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "requests"
|
||||
version = "2.31.0"
|
||||
description = "Python HTTP for Humans."
|
||||
optional = false
|
||||
python-versions = ">=3.7"
|
||||
files = [
|
||||
{file = "requests-2.31.0-py3-none-any.whl", hash = "sha256:58cd2187c01e70e6e26505bca751777aa9f2ee0b7f4300988b709f44e013003f"},
|
||||
{file = "requests-2.31.0.tar.gz", hash = "sha256:942c5a758f98d790eaed1a29cb6eefc7ffb0d1cf7af05c3d2791656dbd6ad1e1"},
|
||||
]
|
||||
|
||||
[package.dependencies]
|
||||
certifi = ">=2017.4.17"
|
||||
charset-normalizer = ">=2,<4"
|
||||
idna = ">=2.5,<4"
|
||||
urllib3 = ">=1.21.1,<3"
|
||||
|
||||
[package.extras]
|
||||
socks = ["PySocks (>=1.5.6,!=1.5.7)"]
|
||||
use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"]
|
||||
|
||||
[[package]]
|
||||
name = "ruff"
|
||||
version = "0.3.3"
|
||||
description = "An extremely fast Python linter and code formatter, written in Rust."
|
||||
optional = false
|
||||
python-versions = ">=3.7"
|
||||
files = [
|
||||
{file = "ruff-0.3.3-py3-none-macosx_10_12_x86_64.macosx_11_0_arm64.macosx_10_12_universal2.whl", hash = "sha256:973a0e388b7bc2e9148c7f9be8b8c6ae7471b9be37e1cc732f8f44a6f6d7720d"},
|
||||
{file = "ruff-0.3.3-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:cfa60d23269d6e2031129b053fdb4e5a7b0637fc6c9c0586737b962b2f834493"},
|
||||
{file = "ruff-0.3.3-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1eca7ff7a47043cf6ce5c7f45f603b09121a7cc047447744b029d1b719278eb5"},
|
||||
{file = "ruff-0.3.3-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:e7d3f6762217c1da954de24b4a1a70515630d29f71e268ec5000afe81377642d"},
|
||||
{file = "ruff-0.3.3-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b24c19e8598916d9c6f5a5437671f55ee93c212a2c4c569605dc3842b6820386"},
|
||||
{file = "ruff-0.3.3-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:5a6cbf216b69c7090f0fe4669501a27326c34e119068c1494f35aaf4cc683778"},
|
||||
{file = "ruff-0.3.3-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:352e95ead6964974b234e16ba8a66dad102ec7bf8ac064a23f95371d8b198aab"},
|
||||
{file = "ruff-0.3.3-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8d6ab88c81c4040a817aa432484e838aaddf8bfd7ca70e4e615482757acb64f8"},
|
||||
{file = "ruff-0.3.3-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:79bca3a03a759cc773fca69e0bdeac8abd1c13c31b798d5bb3c9da4a03144a9f"},
|
||||
{file = "ruff-0.3.3-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:2700a804d5336bcffe063fd789ca2c7b02b552d2e323a336700abb8ae9e6a3f8"},
|
||||
{file = "ruff-0.3.3-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:fd66469f1a18fdb9d32e22b79f486223052ddf057dc56dea0caaf1a47bdfaf4e"},
|
||||
{file = "ruff-0.3.3-py3-none-musllinux_1_2_i686.whl", hash = "sha256:45817af234605525cdf6317005923bf532514e1ea3d9270acf61ca2440691376"},
|
||||
{file = "ruff-0.3.3-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:0da458989ce0159555ef224d5b7c24d3d2e4bf4c300b85467b08c3261c6bc6a8"},
|
||||
{file = "ruff-0.3.3-py3-none-win32.whl", hash = "sha256:f2831ec6a580a97f1ea82ea1eda0401c3cdf512cf2045fa3c85e8ef109e87de0"},
|
||||
{file = "ruff-0.3.3-py3-none-win_amd64.whl", hash = "sha256:be90bcae57c24d9f9d023b12d627e958eb55f595428bafcb7fec0791ad25ddfc"},
|
||||
{file = "ruff-0.3.3-py3-none-win_arm64.whl", hash = "sha256:0171aab5fecdc54383993389710a3d1227f2da124d76a2784a7098e818f92d61"},
|
||||
{file = "ruff-0.3.3.tar.gz", hash = "sha256:38671be06f57a2f8aba957d9f701ea889aa5736be806f18c0cd03d6ff0cbca8d"},
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "setuptools"
|
||||
version = "69.2.0"
|
||||
description = "Easily download, build, install, upgrade, and uninstall Python packages"
|
||||
optional = false
|
||||
python-versions = ">=3.8"
|
||||
files = [
|
||||
{file = "setuptools-69.2.0-py3-none-any.whl", hash = "sha256:c21c49fb1042386df081cb5d86759792ab89efca84cf114889191cd09aacc80c"},
|
||||
{file = "setuptools-69.2.0.tar.gz", hash = "sha256:0ff4183f8f42cd8fa3acea16c45205521a4ef28f73c6391d8a25e92893134f2e"},
|
||||
]
|
||||
|
||||
[package.extras]
|
||||
docs = ["furo", "jaraco.packaging (>=9.3)", "jaraco.tidelift (>=1.4)", "pygments-github-lexers (==0.0.5)", "rst.linker (>=1.9)", "sphinx (<7.2.5)", "sphinx (>=3.5)", "sphinx-favicon", "sphinx-inline-tabs", "sphinx-lint", "sphinx-notfound-page (>=1,<2)", "sphinx-reredirects", "sphinxcontrib-towncrier"]
|
||||
testing = ["build[virtualenv]", "filelock (>=3.4.0)", "importlib-metadata", "ini2toml[lite] (>=0.9)", "jaraco.develop (>=7.21)", "jaraco.envs (>=2.2)", "jaraco.path (>=3.2.0)", "mypy (==1.9)", "packaging (>=23.2)", "pip (>=19.1)", "pytest (>=6)", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=2.2)", "pytest-home (>=0.5)", "pytest-mypy (>=0.9.1)", "pytest-perf", "pytest-ruff (>=0.2.1)", "pytest-timeout", "pytest-xdist (>=3)", "tomli", "tomli-w (>=1.0.0)", "virtualenv (>=13.0.0)", "wheel"]
|
||||
testing-integration = ["build[virtualenv] (>=1.0.3)", "filelock (>=3.4.0)", "jaraco.envs (>=2.2)", "jaraco.path (>=3.2.0)", "packaging (>=23.2)", "pytest", "pytest-enabler", "pytest-xdist", "tomli", "virtualenv (>=13.0.0)", "wheel"]
|
||||
|
||||
[[package]]
|
||||
name = "six"
|
||||
version = "1.16.0"
|
||||
description = "Python 2 and 3 compatibility utilities"
|
||||
optional = false
|
||||
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*"
|
||||
files = [
|
||||
{file = "six-1.16.0-py2.py3-none-any.whl", hash = "sha256:8abb2f1d86890a2dfb989f9a77cfcfd3e47c2a354b01111771326f8aa26e0254"},
|
||||
{file = "six-1.16.0.tar.gz", hash = "sha256:1e61c37477a1626458e36f7b1d82aa5c9b094fa4802892072e49de9c60c4c926"},
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "tomli"
|
||||
version = "2.0.1"
|
||||
description = "A lil' TOML parser"
|
||||
optional = false
|
||||
python-versions = ">=3.7"
|
||||
files = [
|
||||
{file = "tomli-2.0.1-py3-none-any.whl", hash = "sha256:939de3e7a6161af0c887ef91b7d41a53e7c5a1ca976325f429cb46ea9bc30ecc"},
|
||||
{file = "tomli-2.0.1.tar.gz", hash = "sha256:de526c12914f0c550d15924c62d72abc48d6fe7364aa87328337a31007fe8a4f"},
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "unidecode"
|
||||
version = "1.3.8"
|
||||
description = "ASCII transliterations of Unicode text"
|
||||
optional = false
|
||||
python-versions = ">=3.5"
|
||||
files = [
|
||||
{file = "Unidecode-1.3.8-py3-none-any.whl", hash = "sha256:d130a61ce6696f8148a3bd8fe779c99adeb4b870584eeb9526584e9aa091fd39"},
|
||||
{file = "Unidecode-1.3.8.tar.gz", hash = "sha256:cfdb349d46ed3873ece4586b96aa75258726e2fa8ec21d6f00a591d98806c2f4"},
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "urllib3"
|
||||
version = "2.2.1"
|
||||
description = "HTTP library with thread-safe connection pooling, file post, and more."
|
||||
optional = false
|
||||
python-versions = ">=3.8"
|
||||
files = [
|
||||
{file = "urllib3-2.2.1-py3-none-any.whl", hash = "sha256:450b20ec296a467077128bff42b73080516e71b56ff59a60a02bef2232c4fa9d"},
|
||||
{file = "urllib3-2.2.1.tar.gz", hash = "sha256:d0570876c61ab9e520d776c38acbbb5b05a776d3f9ff98a5c8fd5162a444cf19"},
|
||||
]
|
||||
|
||||
[package.extras]
|
||||
brotli = ["brotli (>=1.0.9)", "brotlicffi (>=0.8.0)"]
|
||||
h2 = ["h2 (>=4,<5)"]
|
||||
socks = ["pysocks (>=1.5.6,!=1.5.7,<2.0)"]
|
||||
zstd = ["zstandard (>=0.18.0)"]
|
||||
|
||||
[[package]]
|
||||
name = "virtualenv"
|
||||
version = "20.25.1"
|
||||
description = "Virtual Python Environment builder"
|
||||
optional = false
|
||||
python-versions = ">=3.7"
|
||||
files = [
|
||||
{file = "virtualenv-20.25.1-py3-none-any.whl", hash = "sha256:961c026ac520bac5f69acb8ea063e8a4f071bcc9457b9c1f28f6b085c511583a"},
|
||||
{file = "virtualenv-20.25.1.tar.gz", hash = "sha256:e08e13ecdca7a0bd53798f356d5831434afa5b07b93f0abdf0797b7a06ffe197"},
|
||||
]
|
||||
|
||||
[package.dependencies]
|
||||
distlib = ">=0.3.7,<1"
|
||||
filelock = ">=3.12.2,<4"
|
||||
platformdirs = ">=3.9.1,<5"
|
||||
|
||||
[package.extras]
|
||||
docs = ["furo (>=2023.7.26)", "proselint (>=0.13)", "sphinx (>=7.1.2)", "sphinx-argparse (>=0.4)", "sphinxcontrib-towncrier (>=0.2.1a0)", "towncrier (>=23.6)"]
|
||||
test = ["covdefaults (>=2.3)", "coverage (>=7.2.7)", "coverage-enable-subprocess (>=1)", "flaky (>=3.7)", "packaging (>=23.1)", "pytest (>=7.4)", "pytest-env (>=0.8.2)", "pytest-freezer (>=0.4.8)", "pytest-mock (>=3.11.1)", "pytest-randomly (>=3.12)", "pytest-timeout (>=2.1)", "setuptools (>=68)", "time-machine (>=2.10)"]
|
||||
|
||||
[metadata]
|
||||
lock-version = "2.0"
|
||||
python-versions = ">=3.9.1,<3.10"
|
||||
content-hash = "1bb724694792fbc2b3c05e3355e6c25305d9f4034eb7b1b4b1791ee95427f8d2"
|
||||
3
poetry.toml
Normal file
@ -0,0 +1,3 @@
[virtualenvs]
in-project = true
create = true
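The new `poetry.toml` keeps the virtual environment inside the repository. A minimal sketch of checking that effect locally, assuming Poetry is installed and run from the repository root (the same keys are also written by `tools/manage.ps1` via `poetry config ... --local`):

```bash
# Sketch only: with [virtualenvs] in-project = true, Poetry creates the venv
# as <repo>/.venv instead of using its global cache directory.
poetry config --list | grep virtualenvs   # shows in-project = true, create = true
poetry install --no-root                  # materializes ./.venv next to poetry.toml
```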
102
pyproject.toml
Normal file
@ -0,0 +1,102 @@
# WARNING: This file is used only for development done on this addon.
# Be aware that dependencies used here might not match the ones used by
# the specific addon bundle set up on the AYON server. This file should
# be used only for local development and CI/CD purposes.

[tool.poetry]
name = "ayon-core"
version = "0.3.0"
description = ""
authors = ["Ynput Team <team@ynput.io>"]
readme = "README.md"

[tool.poetry.dependencies]
python = ">=3.9.1,<3.10"


[tool.poetry.dev-dependencies]
# test dependencies
pytest = "^8.0"
pytest-print = "^1.0"
ayon-python-api = "^1.0"
# linting dependencies
ruff = "^0.3.3"
pre-commit = "^3.6.2"
codespell = "^2.2.6"


[tool.ruff]
# Exclude a variety of commonly ignored directories.
exclude = [
    ".bzr",
    ".direnv",
    ".eggs",
    ".git",
    ".git-rewrite",
    ".hg",
    ".ipynb_checkpoints",
    ".mypy_cache",
    ".nox",
    ".pants.d",
    ".pyenv",
    ".pytest_cache",
    ".pytype",
    ".ruff_cache",
    ".svn",
    ".tox",
    ".venv",
    ".vscode",
    "__pypackages__",
    "_build",
    "buck-out",
    "build",
    "dist",
    "node_modules",
    "site-packages",
    "venv",
    "vendor",
    "generated",
]

# Same as Black.
line-length = 79
indent-width = 4

# Assume Python 3.9
target-version = "py39"

[tool.ruff.lint]
# Enable Pyflakes (`F`) and a subset of the pycodestyle (`E`) codes by default.
select = ["E4", "E7", "E9", "F"]
ignore = []

# Allow fix for all enabled rules (when `--fix`) is provided.
fixable = ["ALL"]
unfixable = []

# Allow unused variables when underscore-prefixed.
dummy-variable-rgx = "^(_+|(_+[a-zA-Z0-9_]*[a-zA-Z0-9]+?))$"

[tool.ruff.format]
# Like Black, use double quotes for strings.
quote-style = "double"

# Like Black, indent with spaces, rather than tabs.
indent-style = "space"

# Like Black, respect magic trailing commas.
skip-magic-trailing-comma = false

# Like Black, automatically detect the appropriate line ending.
line-ending = "auto"

[tool.codespell]
# Ignore words that are not in the dictionary.
ignore-words-list = "ayon,ynput"
skip = "./.*,./package/*,*/vendor/*,*/unreal/integration/*,*/aftereffects/api/extension/js/libs/*"
count = true
quiet-level = 3

[build-system]
requires = ["poetry-core"]
build-backend = "poetry.core.masonry.api"
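The `[tool.ruff]`, `[tool.ruff.lint]` and `[tool.codespell]` tables above are what the pre-commit hooks and the PR linting workflow pick up. A minimal sketch of running the same checks by hand, assuming the Poetry environment from `poetry.lock` is already created (these are the commands the new `tools/manage.*` helpers wrap):

```bash
# Lint with Ruff using the [tool.ruff*] configuration from pyproject.toml
poetry run ruff check
poetry run ruff check --fix   # apply autofixes for the "fixable" rules

# Spell-check with the [tool.codespell] configuration
poetry run codespell
```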
@ -1225,6 +1225,32 @@
}
]
},
"zbrush": {
"enabled": true,
"label": "Zbrush",
"icon": "{}/app_icons/zbrush.png",
"host_name": "zbrush",
"environment": "{\n \"ZBRUSH_PLUGIN_PATH\": [\n \"{ZBRUSH_PLUGIN_PATH}\",\n \"{OPENPYPE_STUDIO_PLUGINS}/zbrush/api/zscripts\"\n ]\n}",
"variants": [
{
"name": "2024",
"use_python_2": false,
"executables": {
"windows": [
"C:\\Program Files\\Maxon ZBrush 2024\\ZBrush.exe"
],
"darwin": [],
"linux": []
},
"arguments": {
"windows": [],
"darwin": [],
"linux": []
},
"environment": "{}"
}
]
},
"additional_apps": []
}
}
@ -188,6 +188,8 @@ class ApplicationsSettings(BaseSettingsModel):
        default_factory=AppGroupWithPython, title="Wrap")
    openrv: AppGroup = SettingsField(
        default_factory=AppGroupWithPython, title="OpenRV")
    zbrush: AppGroup = SettingsField(
        default_factory=AppGroupWithPython, title="Zbrush")
    additional_apps: list[AdditionalAppGroup] = SettingsField(
        default_factory=list, title="Additional Applications")
@ -1 +1 @@
__version__ = "0.1.7"
__version__ = "0.1.8"
@ -30,6 +30,15 @@ class ExtMappingItemModel(BaseSettingsModel):
class MayaSettings(BaseSettingsModel):
    """Maya Project Settings."""

    use_cbid_workflow: bool = SettingsField(
        True, title="Use cbId workflow",
        description=(
            "When enabled, a per node `cbId` identifier will be created and "
            "validated for many product types. This is then used for look "
            "publishing and many others. By disabling this, the `cbId` "
            "attribute will still be created on scene save but it will not "
            "be validated."))

    open_workfile_post_initialization: bool = SettingsField(
        True, title="Open Workfile Post Initialization")
    explicit_plugins_loading: ExplicitPluginsLoadingModel = SettingsField(

@ -88,6 +97,7 @@ DEFAULT_MEL_WORKSPACE_SETTINGS = "\n".join((
))

DEFAULT_MAYA_SETTING = {
    "use_cbid_workflow": True,
    "open_workfile_post_initialization": True,
    "explicit_plugins_loading": DEFAULT_EXPLITCIT_PLUGINS_LOADING_SETTINGS,
    "imageio": DEFAULT_IMAGEIO_SETTINGS,
@ -1,3 +1,3 @@
# -*- coding: utf-8 -*-
"""Package declaring addon version."""
__version__ = "0.1.10"
__version__ = "0.1.11"
283
tools/manage.ps1
Executable file
@ -0,0 +1,283 @@
<#
.SYNOPSIS
Helper script to run various tasks on ayon-core addon repository.

.DESCRIPTION
This script will detect Python installation, and build OpenPype to `build`
directory using existing virtual environment created by Poetry (or
by running `/tools/create_venv.ps1`). It will then shuffle dependencies in
build folder to optimize for different Python versions (2/3) in Python host.

.EXAMPLE

PS> .\tools\manage.ps1

.EXAMPLE

To create virtual environment using Poetry:
PS> .\tools\manage.ps1 create-env

.EXAMPLE

To run Ruff check:
PS> .\tools\manage.ps1 ruff-check

.LINK
https://github.com/ynput/ayon-core

#>

# Settings and gitmodule init
$CurrentDir = Get-Location
$ScriptDir = Split-Path -Path $MyInvocation.MyCommand.Definition -Parent
$RepoRoot = (Get-Item $ScriptDir).parent.FullName
& git submodule update --init --recursive
$env:PSModulePath = $env:PSModulePath + ";$($openpype_root)\tools\modules\powershell"

$FunctionName=$ARGS[0]
$Arguments=@()
if ($ARGS.Length -gt 1) {
    $Arguments = $ARGS[1..($ARGS.Length - 1)]
}

function Exit-WithCode($exitcode) {
    # Only exit this host process if it's a child of another PowerShell parent process...
    $parentPID = (Get-CimInstance -ClassName Win32_Process -Filter "ProcessId=$PID" | Select-Object -Property ParentProcessId).ParentProcessId
    $parentProcName = (Get-CimInstance -ClassName Win32_Process -Filter "ProcessId=$parentPID" | Select-Object -Property Name).Name
    if ('powershell.exe' -eq $parentProcName) { $host.SetShouldExit($exitcode) }

    exit $exitcode
}

function Test-CommandExists {
    param (
        [Parameter(Mandatory=$true)]
        [string]$command
    )

    $commandExists = $null -ne (Get-Command $command -ErrorAction SilentlyContinue)
    return $commandExists
}

function Write-Info {
    <#
    .SYNOPSIS
    Write-Info function to write information messages.

    It uses Write-Color if that is available, otherwise falls back to Write-Host.

    #>
    [CmdletBinding()]
    param (
        [alias ('T')] [String[]]$Text,
        [alias ('C', 'ForegroundColor', 'FGC')] [ConsoleColor[]]$Color = [ConsoleColor]::White,
        [alias ('B', 'BGC')] [ConsoleColor[]]$BackGroundColor = $null,
        [alias ('Indent')][int] $StartTab = 0,
        [int] $LinesBefore = 0,
        [int] $LinesAfter = 0,
        [int] $StartSpaces = 0,
        [alias ('L')] [string] $LogFile = '',
        [Alias('DateFormat', 'TimeFormat')][string] $DateTimeFormat = 'yyyy-MM-dd HH:mm:ss',
        [alias ('LogTimeStamp')][bool] $LogTime = $true,
        [int] $LogRetry = 2,
        [ValidateSet('unknown', 'string', 'unicode', 'bigendianunicode', 'utf8', 'utf7', 'utf32', 'ascii', 'default', 'oem')][string]$Encoding = 'Unicode',
        [switch] $ShowTime,
        [switch] $NoNewLine
    )
    if (Test-CommandExists "Write-Color") {
        Write-Color -Text $Text -Color $Color -BackGroundColor $BackGroundColor -StartTab $StartTab -LinesBefore $LinesBefore -LinesAfter $LinesAfter -StartSpaces $StartSpaces -LogFile $LogFile -DateTimeFormat $DateTimeFormat -LogTime $LogTime -LogRetry $LogRetry -Encoding $Encoding -ShowTime $ShowTime -NoNewLine $NoNewLine
    } else {
        $message = $Text -join ' '
        if ($NoNewLine)
        {
            Write-Host $message -NoNewline
        }
        else
        {
            Write-Host $message
        }
    }
}

$art = @"

▄██▄
▄███▄ ▀██▄ ▀██▀ ▄██▀ ▄██▀▀▀██▄ ▀███▄ █▄
▄▄ ▀██▄ ▀██▄ ▄██▀ ██▀ ▀██▄ ▄ ▀██▄ ███
▄██▀ ██▄ ▀ ▄▄ ▀ ██ ▄██ ███ ▀██▄ ███
▄██▀ ▀██▄ ██ ▀██▄ ▄██▀ ███ ▀██ ▀█▀
▄██▀ ▀██▄ ▀█ ▀██▄▄▄▄██▀ █▀ ▀██▄

· · - =[ by YNPUT ]:[ http://ayon.ynput.io ]= - · ·

"@

function Write-AsciiArt() {
    Write-Host $art -ForegroundColor DarkGreen
}

function Show-PSWarning() {
    if ($PSVersionTable.PSVersion.Major -lt 7) {
        Write-Info -Text "!!! ", "You are using old version of PowerShell - ", "$($PSVersionTable.PSVersion.Major).$($PSVersionTable.PSVersion.Minor)" -Color Red, Yellow, White
        Write-Info -Text " Please update to at least 7.0 - ", "https://github.com/PowerShell/PowerShell/releases" -Color Yellow, White
        Exit-WithCode 1
    }
}

function Install-Poetry() {
    Write-Info -Text ">>> ", "Installing Poetry ... " -Color Green, Gray
    $python = "python"
    if (Get-Command "pyenv" -ErrorAction SilentlyContinue) {
        if (-not (Test-Path -PathType Leaf -Path "$($RepoRoot)\.python-version")) {
            $result = & pyenv global
            if ($result -eq "no global version configured") {
                Write-Info "!!! Using pyenv but having no local or global version of Python set." -Color Red, Yellow
                Exit-WithCode 1
            }
        }
        $python = & pyenv which python

    }

    $env:POETRY_HOME="$RepoRoot\.poetry"
    (Invoke-WebRequest -Uri https://install.python-poetry.org/ -UseBasicParsing).Content | & $($python) -
}

function Set-Cwd() {
    Set-Location -Path $RepoRoot
}

function Restore-Cwd() {
    $tmp_current_dir = Get-Location
    if ("$tmp_current_dir" -ne "$CurrentDir") {
        Write-Info -Text ">>> ", "Restoring current directory" -Color Green, Gray
        Set-Location -Path $CurrentDir
    }
}

function Initialize-Environment {
    Write-Info -Text ">>> ", "Reading Poetry ... " -Color Green, Gray -NoNewline
    if (-not(Test-Path -PathType Container -Path "$( $env:POETRY_HOME )\bin"))
    {
        Write-Info -Text "NOT FOUND" -Color Yellow
        Install-Poetry
        Write-Info -Text "INSTALLED" -Color Cyan
    }
    else
    {
        Write-Info -Text "OK" -Color Green
    }

    if (-not(Test-Path -PathType Leaf -Path "$( $repo_root )\poetry.lock"))
    {
        Write-Info -Text ">>> ", "Installing virtual environment and creating lock." -Color Green, Gray
    }
    else
    {
        Write-Info -Text ">>> ", "Installing virtual environment from lock." -Color Green, Gray
    }
    $startTime = [int][double]::Parse((Get-Date -UFormat %s))
    & "$env:POETRY_HOME\bin\poetry" config virtualenvs.in-project true --local
    & "$env:POETRY_HOME\bin\poetry" config virtualenvs.create true --local
    & "$env:POETRY_HOME\bin\poetry" install --no-root $poetry_verbosity --ansi
    if ($LASTEXITCODE -ne 0)
    {
        Write-Info -Text "!!! ", "Poetry command failed." -Color Red, Yellow
        Restore-Cwd
        Exit-WithCode 1
    }
    if (Test-Path -PathType Container -Path "$( $repo_root )\.git")
    {
        Write-Info -Text ">>> ", "Installing pre-commit hooks ..." -Color Green, White
        & "$env:POETRY_HOME\bin\poetry" run pre-commit install
        if ($LASTEXITCODE -ne 0)
        {
            Write-Info -Text "!!! ", "Installation of pre-commit hooks failed." -Color Red, Yellow
        }
    }
    $endTime = [int][double]::Parse((Get-Date -UFormat %s))
    Restore-Cwd
    try
    {
        if (Test-CommandExists "New-BurntToastNotification")
        {
            $app_logo = "$repo_root\tools\icons\ayon.ico"
            New-BurntToastNotification -AppLogo "$app_logo" -Text "AYON", "Virtual environment created.", "All done in $( $endTime - $startTime ) secs."
        }
    }
    catch {}
    Write-Info -Text ">>> ", "Virtual environment created." -Color Green, White
}

function Invoke-Ruff {
    param (
        [switch] $Fix
    )
    $Poetry = "$RepoRoot\.poetry\bin\poetry.exe"
    $RuffArgs = @( "run", "ruff", "check" )
    if ($Fix) {
        $RuffArgs += "--fix"
    }
    & $Poetry $RuffArgs
}

function Invoke-Codespell {
    param (
        [switch] $Fix
    )
    $Poetry = "$RepoRoot\.poetry\bin\poetry.exe"
    $CodespellArgs = @( "run", "codespell" )
    if ($Fix) {
        $CodespellArgs += "--fix"
    }
    & $Poetry $CodespellArgs
}

function Write-Help {
    <#
    .SYNOPSIS
    Write-Help function to write help messages.
    #>
    Write-Host ""
    Write-Host "AYON Addon management script"
    Write-Host ""
    Write-Info -Text "Usage: ", "./manage.ps1 ", "[command]" -Color Gray, White, Cyan
    Write-Host ""
    Write-Host "Commands:"
    Write-Info -Text " create-env ", "Install Poetry and update venv by lock file" -Color White, Cyan
    Write-Info -Text " ruff-check ", "Run Ruff check for the repository" -Color White, Cyan
    Write-Info -Text " ruff-fix ", "Run Ruff fix for the repository" -Color White, Cyan
    Write-Info -Text " codespell ", "Run codespell check for the repository" -Color White, Cyan
    Write-Host ""
}

function Resolve-Function {
    if ($null -eq $FunctionName) {
        Write-Help
        return
    }
    $FunctionName = $FunctionName.ToLower() -replace "\W"
    if ($FunctionName -eq "createenv") {
        Set-Cwd
        Initialize-Environment
    } elseif ($FunctionName -eq "ruffcheck") {
        Set-Cwd
        Invoke-Ruff
    } elseif ($FunctionName -eq "rufffix") {
        Set-Cwd
        Invoke-Ruff -Fix
    } elseif ($FunctionName -eq "codespell") {
        Set-Cwd
        Invoke-CodeSpell
    } else {
        Write-Host "Unknown function ""$FunctionName"""
        Write-Help
    }
}

# -----------------------------------------------------

Show-PSWarning
Write-AsciiArt

Resolve-Function
222
tools/manage.sh
Executable file
@ -0,0 +1,222 @@
# Colors for terminal

RST='\033[0m' # Text Reset

# Regular Colors
Black='\033[0;30m' # Black
Red='\033[0;31m' # Red
Green='\033[0;32m' # Green
Yellow='\033[0;33m' # Yellow
Blue='\033[0;34m' # Blue
Purple='\033[0;35m' # Purple
Cyan='\033[0;36m' # Cyan
White='\033[0;37m' # White

# Bold
BBlack='\033[1;30m' # Black
BRed='\033[1;31m' # Red
BGreen='\033[1;32m' # Green
BYellow='\033[1;33m' # Yellow
BBlue='\033[1;34m' # Blue
BPurple='\033[1;35m' # Purple
BCyan='\033[1;36m' # Cyan
BWhite='\033[1;37m' # White

# Bold High Intensity
BIBlack='\033[1;90m' # Black
BIRed='\033[1;91m' # Red
BIGreen='\033[1;92m' # Green
BIYellow='\033[1;93m' # Yellow
BIBlue='\033[1;94m' # Blue
BIPurple='\033[1;95m' # Purple
BICyan='\033[1;96m' # Cyan
BIWhite='\033[1;97m' # White


##############################################################################
# Detect required version of python
# Globals:
#   colors
#   PYTHON
# Arguments:
#   None
# Returns:
#   None
###############################################################################
detect_python () {
  echo -e "${BIGreen}>>>${RST} Using python \c"
  command -v python >/dev/null 2>&1 || { echo -e "${BIRed}- NOT FOUND${RST} ${BIYellow}You need Python 3.9 installed to continue.${RST}"; return 1; }
  local version_command="import sys;print('{0}.{1}'.format(sys.version_info[0], sys.version_info[1]))"
  local python_version="$(python <<< ${version_command})"
  oIFS="$IFS"
  IFS=.
  set -- $python_version
  IFS="$oIFS"
  if [ "$1" -ge "3" ] && [ "$2" -ge "9" ] ; then
    if [ "$2" -gt "9" ] ; then
      echo -e "${BIWhite}[${RST} ${BIRed}$1.$2 ${BIWhite}]${RST} - ${BIRed}FAILED${RST} ${BIYellow}Version is new and unsupported, use${RST} ${BIPurple}3.9.x${RST}"; return 1;
    else
      echo -e "${BIWhite}[${RST} ${BIGreen}$1.$2${RST} ${BIWhite}]${RST}"
    fi
  else
    command -v python >/dev/null 2>&1 || { echo -e "${BIRed}$1.$2$ - ${BIRed}FAILED${RST} ${BIYellow}Version is old and unsupported${RST}"; return 1; }
  fi
}

install_poetry () {
  echo -e "${BIGreen}>>>${RST} Installing Poetry ..."
  export POETRY_HOME="$repo_root/.poetry"
  command -v curl >/dev/null 2>&1 || { echo -e "${BIRed}!!!${RST}${BIYellow} Missing ${RST}${BIBlue}curl${BIYellow} command.${RST}"; return 1; }
  curl -sSL https://install.python-poetry.org/ | python -
}

##############################################################################
# Return absolute path
# Globals:
#   None
# Arguments:
#   Path to resolve
# Returns:
#   None
###############################################################################
realpath () {
  echo $(cd $(dirname "$1"); pwd)/$(basename "$1")
}

##############################################################################
# Create Virtual Environment
# Globals:
#   repo_root
#   POETRY_HOME
#   poetry_verbosity
# Arguments:
#   Path to resolve
# Returns:
#   None
###############################################################################
create_env () {
  # Directories
  pushd "$repo_root" > /dev/null || return > /dev/null

  echo -e "${BIGreen}>>>${RST} Reading Poetry ... \c"
  if [ -f "$POETRY_HOME/bin/poetry" ]; then
    echo -e "${BIGreen}OK${RST}"
  else
    echo -e "${BIYellow}NOT FOUND${RST}"
    install_poetry || { echo -e "${BIRed}!!!${RST} Poetry installation failed"; return 1; }
  fi

  if [ -f "$repo_root/poetry.lock" ]; then
    echo -e "${BIGreen}>>>${RST} Updating dependencies ..."
  else
    echo -e "${BIGreen}>>>${RST} Installing dependencies ..."
  fi

  "$POETRY_HOME/bin/poetry" install --no-root $poetry_verbosity || { echo -e "${BIRed}!!!${RST} Poetry environment installation failed"; return 1; }
  if [ $? -ne 0 ] ; then
    echo -e "${BIRed}!!!${RST} Virtual environment creation failed."
    return 1
  fi

  echo -e "${BIGreen}>>>${RST} Cleaning cache files ..."
  clean_pyc

  "$POETRY_HOME/bin/poetry" run python -m pip install --disable-pip-version-check --force-reinstall pip

  if [ -d "$repo_root/.git" ]; then
    echo -e "${BIGreen}>>>${RST} Installing pre-commit hooks ..."
    "$POETRY_HOME/bin/poetry" run pre-commit install
  fi
}

print_art() {
  echo -e "${BGreen}"
  cat <<-EOF

▄██▄
▄███▄ ▀██▄ ▀██▀ ▄██▀ ▄██▀▀▀██▄ ▀███▄ █▄
▄▄ ▀██▄ ▀██▄ ▄██▀ ██▀ ▀██▄ ▄ ▀██▄ ███
▄██▀ ██▄ ▀ ▄▄ ▀ ██ ▄██ ███ ▀██▄ ███
▄██▀ ▀██▄ ██ ▀██▄ ▄██▀ ███ ▀██ ▀█▀
▄██▀ ▀██▄ ▀█ ▀██▄▄▄▄██▀ █▀ ▀██▄

· · - =[ by YNPUT ]:[ http://ayon.ynput.io ]= - · ·

EOF
  echo -e "${RST}"
}

default_help() {
  print_art
  echo -e "${BWhite}AYON Addon management script${RST}"
  echo ""
  echo -e "Usage: ${BWhite}./manage.sh${RST} ${BICyan}[command]${RST}"
  echo ""
  echo -e "${BWhite}Commands:${RST}"
  echo -e " ${BWhite}create-env${RST} ${BCyan}Install Poetry and update venv by lock file${RST}"
  echo -e " ${BWhite}ruff-check${RST} ${BCyan}Run Ruff check for the repository${RST}"
  echo -e " ${BWhite}ruff-fix${RST} ${BCyan}Run Ruff fix for the repository${RST}"
  echo -e " ${BWhite}codespell${RST} ${BCyan}Run codespell check for the repository${RST}"
  echo ""
}

run_ruff () {
  echo -e "${BIGreen}>>>${RST} Running Ruff check ..."
  "$POETRY_HOME/bin/poetry" run ruff check
}

run_ruff_check () {
  echo -e "${BIGreen}>>>${RST} Running Ruff fix ..."
  "$POETRY_HOME/bin/poetry" run ruff check --fix
}

run_codespell () {
  echo -e "${BIGreen}>>>${RST} Running codespell check ..."
  "$POETRY_HOME/bin/poetry" run codespell
}

main () {
  detect_python || return 1

  # Directories
  repo_root=$(realpath $(dirname $(dirname "${BASH_SOURCE[0]}")))

  if [[ -z $POETRY_HOME ]]; then
    export POETRY_HOME="$repo_root/.poetry"
  fi

  pushd "$repo_root" > /dev/null || return > /dev/null

  # Use first argument, lower and keep only characters
  function_name="$(echo "$1" | tr '[:upper:]' '[:lower:]' | sed 's/[^a-z]*//g')"

  case $function_name in
    "createenv")
      create_env || return_code=$?
      exit $return_code
      ;;
    "ruffcheck")
      run_ruff || return_code=$?
      exit $return_code
      ;;
    "rufffix")
      run_ruff_check || return_code=$?
      exit $return_code
      ;;
    "codespell")
      run_codespell || return_code=$?
      exit $return_code
      ;;
  esac

  if [ "$function_name" != "" ]; then
    echo -e "${BIRed}!!!${RST} Unknown function name: $function_name"
  fi

  default_help
  exit $return_code
}

return_code=0
main "$@" || return_code=$?
exit $return_code
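For completeness, a short usage sketch of the new shell helper from the repository root (the same commands `default_help` lists; the PowerShell counterpart `tools/manage.ps1` mirrors them on Windows):

```bash
./tools/manage.sh create-env   # install Poetry into .poetry/ and build the in-project venv
./tools/manage.sh ruff-check   # Ruff lint
./tools/manage.sh ruff-fix     # Ruff lint with --fix
./tools/manage.sh codespell    # spelling check
```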