mirror of https://github.com/ynput/ayon-core.git
synced 2025-12-24 21:04:40 +01:00

Merge branch 'develop' into main

This commit is contained in commit f4195e568e
112 changed files with 6967 additions and 1278 deletions

CHANGELOG.md (80 changes)
@@ -1,5 +1,37 @@
# Changelog

## [2.18.0](https://github.com/pypeclub/openpype/tree/2.18.0) (2021-05-18)

[Full Changelog](https://github.com/pypeclub/openpype/compare/2.17.3...2.18.0)

**Enhancements:**

- Use SubsetLoader and multiple contexts for delete_old_versions [\#1484](https://github.com/pypeclub/OpenPype/pull/1484)
- TVPaint: Increment workfile version on successful publish. [\#1489](https://github.com/pypeclub/OpenPype/pull/1489)
- Maya: Use of multiple deadline servers [\#1483](https://github.com/pypeclub/OpenPype/pull/1483)

**Fixed bugs:**

- Use instance frame start instead of timeline. [\#1486](https://github.com/pypeclub/OpenPype/pull/1486)
- Maya: Redshift - set proper start frame on proxy [\#1480](https://github.com/pypeclub/OpenPype/pull/1480)
- Maya: wrong collection of playblasted frames [\#1517](https://github.com/pypeclub/OpenPype/pull/1517)
- Existing subsets hints in creator [\#1502](https://github.com/pypeclub/OpenPype/pull/1502)

## [2.17.3](https://github.com/pypeclub/openpype/tree/2.17.3) (2021-05-06)

[Full Changelog](https://github.com/pypeclub/openpype/compare/CI/3.0.0-rc.3...2.17.3)

**Fixed bugs:**

- Nuke: workfile version synced to db version always [\#1479](https://github.com/pypeclub/OpenPype/pull/1479)

## [2.17.2](https://github.com/pypeclub/openpype/tree/2.17.2) (2021-05-04)

[Full Changelog](https://github.com/pypeclub/openpype/compare/CI/3.0.0-rc.1...2.17.2)

**Enhancements:**

- Forward/Backward compatible apps and tools with OpenPype 3 [\#1463](https://github.com/pypeclub/OpenPype/pull/1463)

## [2.17.1](https://github.com/pypeclub/openpype/tree/2.17.1) (2021-04-30)
@@ -7,28 +39,30 @@

**Enhancements:**

- Nuke: deadline submission with gpu [\#1414](https://github.com/pypeclub/OpenPype/pull/1414)
- TVPaint frame range definition [\#1424](https://github.com/pypeclub/OpenPype/pull/1424)
- PS - group all published instances [\#1415](https://github.com/pypeclub/OpenPype/pull/1415)
- Add task name to context pop up. [\#1383](https://github.com/pypeclub/OpenPype/pull/1383)
- AE add duration validation [\#1363](https://github.com/pypeclub/OpenPype/pull/1363)
- Maya: Support for Redshift proxies [\#1360](https://github.com/pypeclub/OpenPype/pull/1360)
- Enhance review letterbox feature. [\#1371](https://github.com/pypeclub/OpenPype/pull/1371)

**Fixed bugs:**

- Nuke: fixing undo for loaded mov and sequence [\#1433](https://github.com/pypeclub/OpenPype/pull/1433)
- AE - validation for duration was 1 frame shorter [\#1426](https://github.com/pypeclub/OpenPype/pull/1426)
- Houdini menu filename [\#1417](https://github.com/pypeclub/OpenPype/pull/1417)
- Maya: Vray - problem getting all file nodes for look publishing [\#1399](https://github.com/pypeclub/OpenPype/pull/1399)

**Merged pull requests:**

- Maya: Vray - problem getting all file nodes for look publishing [\#1399](https://github.com/pypeclub/OpenPype/pull/1399)
- Maya: Support for Redshift proxies [\#1360](https://github.com/pypeclub/OpenPype/pull/1360)

## [2.17.0](https://github.com/pypeclub/openpype/tree/2.17.0) (2021-04-20)

[Full Changelog](https://github.com/pypeclub/openpype/compare/CI/3.0.0-beta.2...2.17.0)

**Enhancements:**

- Forward compatible ftrack group [\#1243](https://github.com/pypeclub/OpenPype/pull/1243)
- Settings in mongo as dict [\#1221](https://github.com/pypeclub/OpenPype/pull/1221)
- Maya: Make tx option configurable with presets [\#1328](https://github.com/pypeclub/OpenPype/pull/1328)
- TVPaint asset name validation [\#1302](https://github.com/pypeclub/OpenPype/pull/1302)
- TV Paint: Set initial project settings. [\#1299](https://github.com/pypeclub/OpenPype/pull/1299)
@@ -56,35 +90,6 @@
- Nuke: reverse search to make it more versatile [\#1178](https://github.com/pypeclub/OpenPype/pull/1178)

## [2.16.1](https://github.com/pypeclub/pype/tree/2.16.1) (2021-04-13)

[Full Changelog](https://github.com/pypeclub/pype/compare/2.16.0...2.16.1)

**Enhancements:**

- Nuke: comp renders mix up [\#1301](https://github.com/pypeclub/pype/pull/1301)
- Validate project settings [\#1297](https://github.com/pypeclub/pype/pull/1297)
- After Effects: added SubsetManager [\#1234](https://github.com/pypeclub/pype/pull/1234)

**Fixed bugs:**

- Ftrack custom attributes in bulks [\#1312](https://github.com/pypeclub/pype/pull/1312)
- Ftrack optional pypclub role [\#1303](https://github.com/pypeclub/pype/pull/1303)
- AE remove orphaned instance from workfile - fix self.stub [\#1282](https://github.com/pypeclub/pype/pull/1282)
- Avalon schema names [\#1242](https://github.com/pypeclub/pype/pull/1242)
- Handle duplication of Task name [\#1226](https://github.com/pypeclub/pype/pull/1226)
- Modified path of plugin loads for Harmony and TVPaint [\#1217](https://github.com/pypeclub/pype/pull/1217)
- Regex checks in profiles filtering [\#1214](https://github.com/pypeclub/pype/pull/1214)
- Bulk mov strict task [\#1204](https://github.com/pypeclub/pype/pull/1204)
- Update custom ftrack session attributes [\#1202](https://github.com/pypeclub/pype/pull/1202)
- Nuke: write node colorspace ignore `default\(\)` label [\#1199](https://github.com/pypeclub/pype/pull/1199)
- Nuke: reverse search to make it more versatile [\#1178](https://github.com/pypeclub/pype/pull/1178)

**Merged pull requests:**

- Forward compatible ftrack group [\#1243](https://github.com/pypeclub/pype/pull/1243)
- Error message in pyblish UI [\#1206](https://github.com/pypeclub/pype/pull/1206)
- Nuke: deadline submission with search replaced env values from preset [\#1194](https://github.com/pypeclub/pype/pull/1194)

## [2.16.0](https://github.com/pypeclub/pype/tree/2.16.0) (2021-03-22)
@@ -1145,4 +1150,7 @@ A large cleanup release. Most of the changes are under the hood.

\* *This Changelog was automatically generated by [github_changelog_generator](https://github.com/github-changelog-generator/github-changelog-generator)*
@@ -1,6 +1,6 @@
 # -*- coding: utf-8 -*-
 """Bootstrap OpenPype repositories."""
-import functools
+from __future__ import annotations
 import logging as log
 import os
 import re
@@ -9,10 +9,12 @@ import sys
 import tempfile
 from pathlib import Path
 from typing import Union, Callable, List, Tuple
+
 from zipfile import ZipFile, BadZipFile
 
 from appdirs import user_data_dir
 from speedcopy import copyfile
+import semver
 
 from .user_settings import (
     OpenPypeSecureRegistry,
@@ -26,159 +28,138 @@ LOG_WARNING = 1
 LOG_ERROR = 3
 
 
-@functools.total_ordering
-class OpenPypeVersion:
+class OpenPypeVersion(semver.VersionInfo):
     """Class for storing information about OpenPype version.
 
     Attributes:
-        major (int): [1].2.3-client-variant
-        minor (int): 1.[2].3-client-variant
-        subversion (int): 1.2.[3]-client-variant
-        client (str): 1.2.3-[client]-variant
-        variant (str): 1.2.3-client-[variant]
         staging (bool): True if it is staging version
         path (str): path to OpenPype
 
     """
-    major = 0
-    minor = 0
-    subversion = 0
-    variant = ""
-    client = None
     staging = False
     path = None
+    _VERSION_REGEX = re.compile(r"(?P<major>0|[1-9]\d*)\.(?P<minor>0|[1-9]\d*)\.(?P<patch>0|[1-9]\d*)(?:-(?P<prerelease>(?:0|[1-9]\d*|\d*[a-zA-Z-][0-9a-zA-Z-]*)(?:\.(?:0|[1-9]\d*|\d*[a-zA-Z-][0-9a-zA-Z-]*))*))?(?:\+(?P<buildmetadata>[0-9a-zA-Z-]+(?:\.[0-9a-zA-Z-]+)*))?$")  # noqa: E501
 
-    _version_regex = re.compile(
-        r"(?P<major>\d+)\.(?P<minor>\d+)\.(?P<sub>\d+)(-(?P<var1>staging)|-(?P<client>.+)(-(?P<var2>staging)))?")  # noqa: E501
-
-    @property
-    def version(self):
-        """return formatted version string."""
-        return self._compose_version()
-
-    @version.setter
-    def version(self, val):
-        decomposed = self._decompose_version(val)
-        self.major = decomposed[0]
-        self.minor = decomposed[1]
-        self.subversion = decomposed[2]
-        self.variant = decomposed[3]
-        self.client = decomposed[4]
-
-    def __init__(self, major: int = None, minor: int = None,
-                 subversion: int = None, version: str = None,
-                 variant: str = "", client: str = None,
-                 path: Path = None):
-        self.path = path
-
-        if (
-                major is None or minor is None or subversion is None
-        ) and version is None:
-            raise ValueError("Need version specified in some way.")
-        if version:
-            values = self._decompose_version(version)
-            self.major = values[0]
-            self.minor = values[1]
-            self.subversion = values[2]
-            self.variant = values[3]
-            self.client = values[4]
-        else:
-            self.major = major
-            self.minor = minor
-            self.subversion = subversion
-            # variant is set only if it is "staging", otherwise "production" is
-            # implied and no need to mention it in version string.
-            if variant == "staging":
-                self.variant = variant
-            self.client = client
-
-    def _compose_version(self):
-        version = "{}.{}.{}".format(self.major, self.minor, self.subversion)
-
-        if self.client:
-            version = "{}-{}".format(version, self.client)
-
-        if self.variant == "staging":
-            version = "{}-{}".format(version, self.variant)
-
-        return version
-
-    @classmethod
-    def _decompose_version(cls, version_string: str) -> tuple:
-        m = re.search(cls._version_regex, version_string)
-        if not m:
-            raise ValueError(
-                "Cannot parse version string: {}".format(version_string))
-
-        variant = None
-        if m.group("var1") == "staging" or m.group("var2") == "staging":
-            variant = "staging"
-
-        client = m.group("client")
-
-        return (int(m.group("major")), int(m.group("minor")),
-                int(m.group("sub")), variant, client)
+    def __init__(self, *args, **kwargs):
+        """Create OpenPype version.
+
+        .. deprecated:: 3.0.0-rc.2
+            `client` and `variant` are removed.
+
+        Args:
+            major (int): version when you make incompatible API changes.
+            minor (int): version when you add functionality in a
+                backwards-compatible manner.
+            patch (int): version when you make backwards-compatible bug fixes.
+            prerelease (str): an optional prerelease string
+            build (str): an optional build string
+            version (str): if set, it will be parsed and will override
+                parameters like `major`, `minor` and so on.
+            staging (bool): set to True if version is staging.
+            path (Path): path to version location.
+
+        """
+        self.path = None
+        self.staging = False
+
+        if "version" in kwargs.keys():
+            if not kwargs.get("version"):
+                raise ValueError("Invalid version specified")
+            v = OpenPypeVersion.parse(kwargs.get("version"))
+            kwargs["major"] = v.major
+            kwargs["minor"] = v.minor
+            kwargs["patch"] = v.patch
+            kwargs["prerelease"] = v.prerelease
+            kwargs["build"] = v.build
+            kwargs.pop("version")
+
+        if kwargs.get("path"):
+            if isinstance(kwargs.get("path"), str):
+                self.path = Path(kwargs.get("path"))
+            elif isinstance(kwargs.get("path"), Path):
+                self.path = kwargs.get("path")
+            else:
+                raise TypeError("Path must be str or Path")
+            kwargs.pop("path")
+
+        if "path" in kwargs.keys():
+            kwargs.pop("path")
+
+        if kwargs.get("staging"):
+            self.staging = kwargs.get("staging", False)
+            kwargs.pop("staging")
+
+        if "staging" in kwargs.keys():
+            kwargs.pop("staging")
+
+        if self.staging:
+            if kwargs.get("build"):
+                if "staging" not in kwargs.get("build"):
+                    kwargs["build"] = "{}-staging".format(kwargs.get("build"))
+            else:
+                kwargs["build"] = "staging"
+
+        if kwargs.get("build") and "staging" in kwargs.get("build", ""):
+            self.staging = True
+
+        super().__init__(*args, **kwargs)
 
     def __eq__(self, other):
-        if not isinstance(other, self.__class__):
-            return False
-        return self.version == other.version
-
-    def __str__(self):
-        return self.version
+        result = super().__eq__(other)
+        return bool(result and self.staging == other.staging)
 
     def __repr__(self):
-        return "{}, {}: {}".format(
-            self.__class__.__name__, self.version, self.path)
-
-    def __hash__(self):
-        return hash(self.version)
-
-    def __lt__(self, other):
-        if (self.major, self.minor, self.subversion) < \
-                (other.major, other.minor, other.subversion):
-            return True
-
-        # 1.2.3-staging < 1.2.3-client-staging
-        if self.get_main_version() == other.get_main_version() and \
-                not self.client and self.variant and \
-                other.client and other.variant:
-            return True
-
-        # 1.2.3 < 1.2.3-staging
-        if self.get_main_version() == other.get_main_version() and \
-                not self.client and self.variant and \
-                not other.client and not other.variant:
-            return True
-
-        # 1.2.3 < 1.2.3-client
-        if self.get_main_version() == other.get_main_version() and \
-                not self.client and not self.variant and \
-                other.client and not other.variant:
-            return True
-
-        # 1.2.3 < 1.2.3-client-staging
-        if self.get_main_version() == other.get_main_version() and \
-                not self.client and not self.variant and other.client:
-            return True
-
-        # 1.2.3-client-staging < 1.2.3-client
-        if self.get_main_version() == other.get_main_version() and \
-                self.client and self.variant and \
-                other.client and not other.variant:
-            return True
+        return "<{}: {} - path={}>".format(
+            self.__class__.__name__, str(self), self.path)
 
+    def __lt__(self, other: OpenPypeVersion):
+        result = super().__lt__(other)
         # prefer path over no path
-        if self.version == other.version and \
-                not self.path and other.path:
+        if self == other and not self.path and other.path:
             return True
 
         # prefer path with dir over path with file
-        return self.version == other.version and self.path and \
-            other.path and self.path.is_file() and \
-            other.path.is_dir()
+        if self == other and self.path and other.path and \
+                other.path.is_dir() and self.path.is_file():
+            return True
+
+        if self.finalize_version() == other.finalize_version() and \
+                self.prerelease == other.prerelease and \
+                self.is_staging() and not other.is_staging():
+            return True
+
+        return result
+
+    def set_staging(self) -> OpenPypeVersion:
+        """Set version as staging and return it.
+
+        This will preserve current one.
+
+        Returns:
+            OpenPypeVersion: Set as staging.
+
+        """
+        if self.staging:
+            return self
+        return self.replace(parts={"build": f"{self.build}-staging"})
+
+    def set_production(self) -> OpenPypeVersion:
+        """Set version as production and return it.
+
+        This will preserve current one.
+
+        Returns:
+            OpenPypeVersion: Set as production.
+
+        """
+        if not self.staging:
+            return self
+        return self.replace(
+            parts={"build": self.build.replace("-staging", "")})
 
     def is_staging(self) -> bool:
         """Test if current version is staging one."""
-        return self.variant == "staging"
+        return self.staging
 
     def get_main_version(self) -> str:
        """Return main version component.
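The reworked `__lt__` above delegates numeric ordering to `semver` and then breaks ties so that a staging build sorts below its production counterpart. A stdlib-only sketch of that tie-breaking idea, using a hypothetical simplified `MiniVersion` type (the real class subclasses `semver.VersionInfo` and also prefers versions that have a path on disk):

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class MiniVersion:
    """Toy stand-in: a (major, minor, patch) core plus a staging flag."""
    major: int
    minor: int
    patch: int
    staging: bool = False

    def __lt__(self, other):
        a = (self.major, self.minor, self.patch)
        b = (other.major, other.minor, other.patch)
        if a != b:
            # different numeric core: plain tuple ordering decides
            return a < b
        # same numeric core: staging sorts below production,
        # mirroring the is_staging() tie-break above
        return self.staging and not other.staging
```

With this ordering, `max()` over a mixed list naturally picks the production build of the highest version.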
@@ -186,11 +167,13 @@ class OpenPypeVersion:
         This returns x.x.x part of version from possibly more complex one
         like x.x.x-foo-bar.
 
+        .. deprecated:: 3.0.0-rc.2
+            use `finalize_version()` instead.
+
         Returns:
             str: main version component
 
         """
-        return "{}.{}.{}".format(self.major, self.minor, self.subversion)
+        return str(self.finalize_version())
 
     @staticmethod
     def version_in_str(string: str) -> Tuple:
@@ -203,15 +186,22 @@ class OpenPypeVersion:
         tuple: True/False and OpenPypeVersion if found.
 
         """
-        try:
-            result = OpenPypeVersion._decompose_version(string)
-        except ValueError:
+        m = re.search(OpenPypeVersion._VERSION_REGEX, string)
+        if not m:
             return False, None
-        return True, OpenPypeVersion(major=result[0],
-                                     minor=result[1],
-                                     subversion=result[2],
-                                     variant=result[3],
-                                     client=result[4])
+        version = OpenPypeVersion.parse(string[m.start():m.end()])
+        return True, version
+
+    @classmethod
+    def parse(cls, version):
+        """Extend parse to handle staging variant."""
+        v = super().parse(version)
+        openpype_version = cls(major=v.major, minor=v.minor,
+                               patch=v.patch, prerelease=v.prerelease,
+                               build=v.build)
+        if v.build and "staging" in v.build:
+            openpype_version.staging = True
+        return openpype_version
 
 
 class BootstrapRepos:
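The `version_in_str` helper above scans an arbitrary string (a file name, a directory name) for a semver-shaped substring and then parses it. A stdlib-only sketch of the same idea, with a simplified pattern and hypothetical names (the real implementation delegates the parsing to the `semver` package and its `_VERSION_REGEX`):

```python
import re

# Simplified pattern in the spirit of _VERSION_REGEX above; the search is
# unanchored so the version may sit anywhere inside a longer string.
SEMVER_RE = re.compile(
    r"(?P<major>0|[1-9]\d*)\.(?P<minor>0|[1-9]\d*)\.(?P<patch>0|[1-9]\d*)"
    r"(?:-(?P<prerelease>[0-9A-Za-z.-]+))?"
    r"(?:\+(?P<build>[0-9A-Za-z.-]+))?"
)


def version_in_str(string):
    """Return (True, parts-dict) if a semver-like token is found."""
    m = SEMVER_RE.search(string)
    if not m:
        return False, None
    parts = m.groupdict()
    for key in ("major", "minor", "patch"):
        parts[key] = int(parts[key])
    # staging is signalled via the build metadata, as in the class above
    parts["staging"] = bool(parts["build"] and "staging" in parts["build"])
    return True, parts
```

This mirrors how a zip named `openpype-v3.0.0-rc.2+staging.zip` would be recognised as a staging prerelease.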
@@ -269,7 +259,7 @@ class BootstrapRepos:
         """Get path for specific version in list of OpenPype versions.
 
         Args:
-            version (str): Version string to look for (1.2.4-staging)
+            version (str): Version string to look for (1.2.4+staging)
             version_list (list of OpenPypeVersion): list of version to search.
 
         Returns:
@@ -632,7 +622,7 @@ class BootstrapRepos:
                     " not implemented yet."))
 
         dir_to_search = self.data_dir
+        user_versions = self.get_openpype_versions(self.data_dir, staging)
         # if we have openpype_path specified, search only there.
         if openpype_path:
             dir_to_search = openpype_path
@@ -652,6 +642,7 @@ class BootstrapRepos:
             pass
 
         openpype_versions = self.get_openpype_versions(dir_to_search, staging)
+        openpype_versions += user_versions
 
         # remove zip file version if needed.
         if not include_zips:
@@ -764,12 +755,13 @@ class BootstrapRepos:
 
         destination = self.data_dir / version.path.stem
         if destination.exists():
+            assert destination.is_dir()
             try:
-                destination.unlink()
-            except OSError:
+                shutil.rmtree(destination)
+            except OSError as e:
                 msg = f"!!! Cannot remove already existing {destination}"
                 self._print(msg, LOG_ERROR, exc_info=True)
-                return None
+                raise e
 
         destination.mkdir(parents=True)
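The hunk above swaps `Path.unlink()` (which only removes files) for `shutil.rmtree()` so an existing version directory is removed recursively, and re-raises instead of silently returning `None`. A minimal sketch of that pattern, with a hypothetical helper name:

```python
import shutil
from pathlib import Path


def remove_existing(destination: Path) -> None:
    """Remove a pre-existing extraction target, directory or file."""
    if not destination.exists():
        return
    if destination.is_dir():
        # Path.unlink() raises on directories; rmtree removes recursively
        shutil.rmtree(destination)
    else:
        destination.unlink()
```

Letting the `OSError` propagate (as the diff now does with `raise e`) means the caller learns that the old version could not be cleared instead of getting a half-initialized destination.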
@@ -821,7 +813,6 @@ class BootstrapRepos:
             OpenPypeVersionIOError: If copying or zipping fail.
 
         """
-
         if self.is_inside_user_data(openpype_version.path) and not openpype_version.path.is_file():  # noqa
             raise OpenPypeVersionExists(
                 "OpenPype already inside user data dir")
@@ -868,26 +859,20 @@ class BootstrapRepos:
             # set zip as version source
             openpype_version.path = temp_zip
 
             if self.is_inside_user_data(openpype_version.path):
                 raise OpenPypeVersionInvalid(
                     "Version is in user data dir.")
+            openpype_version.path = self._copy_zip(
+                openpype_version.path, destination)
 
         elif openpype_version.path.is_file():
             # check if file is zip (by extension)
             if openpype_version.path.suffix.lower() != ".zip":
                 raise OpenPypeVersionInvalid("Invalid file format")
 
             if not self.is_inside_user_data(openpype_version.path):
-                try:
-                    # copy file to destination
-                    self._print("Copying zip to destination ...")
-                    _destination_zip = destination.parent / openpype_version.path.name  # noqa: E501
-                    copyfile(
-                        openpype_version.path.as_posix(),
-                        _destination_zip.as_posix())
-                except OSError as e:
-                    self._print(
-                        "cannot copy version to user data directory", LOG_ERROR,
-                        exc_info=True)
-                    raise OpenPypeVersionIOError((
-                        f"can't copy version {openpype_version.path.as_posix()} "
-                        f"to destination {destination.parent.as_posix()}")) from e
+                openpype_version.path = self._copy_zip(
+                    openpype_version.path, destination)
 
         # extract zip there
         self._print("extracting zip to destination ...")
@@ -896,6 +881,23 @@ class BootstrapRepos:
 
         return destination
 
+    def _copy_zip(self, source: Path, destination: Path) -> Path:
+        try:
+            # copy file to destination
+            self._print("Copying zip to destination ...")
+            _destination_zip = destination.parent / source.name  # noqa: E501
+            copyfile(
+                source.as_posix(),
+                _destination_zip.as_posix())
+        except OSError as e:
+            self._print(
+                "cannot copy version to user data directory", LOG_ERROR,
+                exc_info=True)
+            raise OpenPypeVersionIOError((
+                f"can't copy version {source.as_posix()} "
+                f"to destination {destination.parent.as_posix()}")) from e
+        return _destination_zip
+
     def _is_openpype_in_dir(self,
                             dir_item: Path,
                             detected_version: OpenPypeVersion) -> bool:
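Factoring the copy into `_copy_zip` removes the try/except block that was previously duplicated at each call site. A hedged stdlib sketch of the same contract, using `shutil.copyfile` in place of `speedcopy` and hypothetical names for the function and the error type:

```python
import shutil
from pathlib import Path


class VersionCopyError(OSError):
    """Hypothetical stand-in for OpenPypeVersionIOError."""


def copy_zip(source: Path, destination_dir: Path) -> Path:
    """Copy a version zip into destination_dir and return the new path."""
    target = destination_dir / source.name
    try:
        shutil.copyfile(source, target)
    except OSError as e:
        # wrap the low-level failure in a domain-specific error,
        # preserving the original exception via `from e`
        raise VersionCopyError(
            f"can't copy version {source} to destination {destination_dir}"
        ) from e
    return target
```

Returning the new path lets callers rebind `openpype_version.path` in one line, exactly as the diff does.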
@@ -67,6 +67,15 @@ def patched_discover(superclass):
 @import_wrapper
 def install():
     """Install Pype to Avalon."""
+    from pyblish.lib import MessageHandler
+
+    def modified_emit(obj, record):
+        """Method replacing `emit` in Pyblish's MessageHandler."""
+        record.msg = record.getMessage()
+        obj.records.append(record)
+
+    MessageHandler.emit = modified_emit
+
     log.info("Registering global plug-ins..")
     pyblish.register_plugin_path(PUBLISH_PATH)
     pyblish.register_discovery_filter(filter_pyblish_plugins)
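The `install()` change monkey-patches pyblish's `MessageHandler.emit` so each record's `msg` is pre-formatted (arguments baked in) before the record is stored. The same pattern against stdlib `logging` types, with a hypothetical `RecordCollector` standing in for `pyblish.lib.MessageHandler`; clearing `record.args` is a defensive addition not present in the diff:

```python
import logging


class RecordCollector(logging.Handler):
    """Stand-in for pyblish's MessageHandler: stores emitted records."""

    def __init__(self):
        super().__init__()
        self.records = []

    def emit(self, record):
        self.records.append(record)


def modified_emit(obj, record):
    """Replacement emit that pre-formats the message, as in the diff."""
    record.msg = record.getMessage()
    record.args = None  # args are already applied; avoid re-formatting
    obj.records.append(record)


# Monkey-patch the class attribute, exactly like MessageHandler.emit above
RecordCollector.emit = modified_emit

logger = logging.getLogger("demo")
logger.setLevel(logging.INFO)
handler = RecordCollector()
logger.addHandler(handler)
logger.info("loaded %d plugins", 3)
```

Patching at the class level means every handler instance, including ones pyblish creates internally, picks up the new behaviour.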
@@ -224,6 +224,11 @@ def launch(app, project, asset, task,
     PypeCommands().run_application(app, project, asset, task, tools, arguments)
 
 
+@main.command(context_settings={"ignore_unknown_options": True})
+def projectmanager():
+    PypeCommands().launch_project_manager()
+
+
 @main.command(
     context_settings=dict(
         ignore_unknown_options=True,
@@ -0,0 +1,9 @@
+def add_implementation_envs(env, _app):
+    """Modify environments to contain all required for implementation."""
+    defaults = {
+        "OPENPYPE_LOG_NO_COLORS": "True",
+        "WEBSOCKET_URL": "ws://localhost:8097/ws/"
+    }
+    for key, value in defaults.items():
+        if not env.get(key):
+            env[key] = value
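The new After Effects hook only fills keys that are missing or empty; an explicit user override survives. The same behaviour, exercised against a plain dict (the real hook mutates the launch environment the same way):

```python
def add_implementation_envs(env, _app):
    """Fill in required defaults without clobbering user overrides."""
    defaults = {
        "OPENPYPE_LOG_NO_COLORS": "True",
        "WEBSOCKET_URL": "ws://localhost:8097/ws/",
    }
    for key, value in defaults.items():
        # `not env.get(key)` treats both a missing key and an empty
        # string as unset, unlike dict.setdefault
        if not env.get(key):
            env[key] = value
```

This is slightly stronger than `env.setdefault(key, value)`, which would keep an empty string in place.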
@@ -112,38 +112,4 @@ def get_asset_settings():
         "duration": duration
     }
 
-    try:
-        # temporary, in pype3 replace with api.get_current_project_settings
-        skip_resolution_check = (
-            api.get_current_project_settings()
-            ["plugins"]
-            ["aftereffects"]
-            ["publish"]
-            ["ValidateSceneSettings"]
-            ["skip_resolution_check"]
-        )
-        skip_timelines_check = (
-            api.get_current_project_settings()
-            ["plugins"]
-            ["aftereffects"]
-            ["publish"]
-            ["ValidateSceneSettings"]
-            ["skip_timelines_check"]
-        )
-    except KeyError:
-        skip_resolution_check = ['*']
-        skip_timelines_check = ['*']
-
-    if os.getenv('AVALON_TASK') in skip_resolution_check or \
-            '*' in skip_timelines_check:
-        scene_data.pop("resolutionWidth")
-        scene_data.pop("resolutionHeight")
-
-    if entity_type in skip_timelines_check or '*' in skip_timelines_check:
-        scene_data.pop('fps', None)
-        scene_data.pop('frameStart', None)
-        scene_data.pop('frameEnd', None)
-        scene_data.pop('handleStart', None)
-        scene_data.pop('handleEnd', None)
-
     return scene_data
@@ -1,6 +1,7 @@
 # -*- coding: utf-8 -*-
 """Validate scene settings."""
 import os
+import re
 
 import pyblish.api
@@ -56,13 +57,26 @@ class ValidateSceneSettings(pyblish.api.InstancePlugin):
     hosts = ["aftereffects"]
     optional = True
 
-    skip_timelines_check = ["*"]  # * >> skip for all
-    skip_resolution_check = ["*"]
+    skip_timelines_check = [".*"]  # .* >> skip for all
+    skip_resolution_check = [".*"]
 
     def process(self, instance):
         """Plugin entry point."""
         expected_settings = api.get_asset_settings()
-        self.log.info("expected_settings::{}".format(expected_settings))
+        self.log.info("config from DB::{}".format(expected_settings))
 
+        if any(re.search(pattern, os.getenv('AVALON_TASK'))
+                for pattern in self.skip_resolution_check):
+            expected_settings.pop("resolutionWidth")
+            expected_settings.pop("resolutionHeight")
+
+        if any(re.search(pattern, os.getenv('AVALON_TASK'))
+                for pattern in self.skip_timelines_check):
+            expected_settings.pop('fps', None)
+            expected_settings.pop('frameStart', None)
+            expected_settings.pop('frameEnd', None)
+            expected_settings.pop('handleStart', None)
+            expected_settings.pop('handleEnd', None)
+
         # handle case where ftrack uses only two decimal places
         # 23.976023976023978 vs. 23.98
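The switch from literal `"*"` entries to regex defaults (`".*"`) means each skip list is now matched with `re.search` against the current task name, so studios can target specific tasks. The core filter, isolated with hypothetical task names:

```python
import re


def should_skip(task_name, patterns):
    """True if any configured regex matches the current task name."""
    return any(re.search(pattern, task_name) for pattern in patterns)
```

With the default `[".*"]` every task matches, reproducing the old skip-everything behaviour, while a list like `["^anim"]` skips only animation tasks.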
@@ -76,6 +90,8 @@ class ValidateSceneSettings(pyblish.api.InstancePlugin):
         duration = instance.data.get("frameEndHandle") - \
             instance.data.get("frameStartHandle") + 1
 
+        self.log.debug("filtered config::{}".format(expected_settings))
+
         current_settings = {
             "fps": fps,
             "frameStartHandle": instance.data.get("frameStartHandle"),
@@ -0,0 +1,41 @@
+import os
+
+
+def add_implementation_envs(env, _app):
+    """Modify environments to contain all required for implementation."""
+    # Prepare path to implementation script
+    implementation_user_script_path = os.path.join(
+        os.environ["OPENPYPE_REPOS_ROOT"],
+        "repos",
+        "avalon-core",
+        "setup",
+        "blender"
+    )
+
+    # Add blender implementation script path to PYTHONPATH
+    python_path = env.get("PYTHONPATH") or ""
+    python_path_parts = [
+        path
+        for path in python_path.split(os.pathsep)
+        if path
+    ]
+    python_path_parts.insert(0, implementation_user_script_path)
+    env["PYTHONPATH"] = os.pathsep.join(python_path_parts)
+
+    # Modify Blender user scripts path
+    blender_user_scripts = env.get("BLENDER_USER_SCRIPTS") or ""
+    previous_user_scripts = []
+    for path in blender_user_scripts.split(os.pathsep):
+        if path and os.path.exists(path):
+            path = os.path.normpath(path)
+            if path != implementation_user_script_path:
+                previous_user_scripts.append(path)
+
+    env["OPENPYPE_BLENDER_USER_SCRIPTS"] = os.pathsep.join(
+        previous_user_scripts
+    )
+    env["BLENDER_USER_SCRIPTS"] = implementation_user_script_path
+
+    # Define Qt binding if not defined
+    if not env.get("QT_PREFERRED_BINDING"):
+        env["QT_PREFERRED_BINDING"] = "PySide2"
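The Blender hook above prepends its scripts directory to `PYTHONPATH`, dropping empty segments so a leading or doubled `os.pathsep` never produces a bogus `""` entry. The core manipulation, isolated in a hypothetical helper (directory names are illustrative):

```python
import os


def prepend_path(env, key, new_entry):
    """Prepend new_entry to an os.pathsep-separated env variable."""
    current = env.get(key) or ""
    # drop empty segments left by leading/trailing/doubled separators
    parts = [p for p in current.split(os.pathsep) if p]
    parts.insert(0, new_entry)
    env[key] = os.pathsep.join(parts)
```

Prepending (rather than appending) is what guarantees the bundled Avalon integration shadows any same-named module already on the user's path.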
@@ -20,21 +20,9 @@ class CreateLayout(openpype.hosts.blender.api.plugin.Creator):
         asset = self.data["asset"]
         subset = self.data["subset"]
         name = openpype.hosts.blender.api.plugin.asset_name(asset, subset)
-        collection = bpy.data.collections.new(name=name)
-        bpy.context.scene.collection.children.link(collection)
+        collection = bpy.context.collection
+        collection.name = name
         self.data['task'] = api.Session.get('AVALON_TASK')
         lib.imprint(collection, self.data)
 
-        # Add the rig object and all the children meshes to
-        # a set and link them all at the end to avoid duplicates.
-        # Blender crashes if trying to link an object that is already linked.
-        # This links automatically the children meshes if they were not
-        # selected, and doesn't link them twice if they, instead,
-        # were manually selected by the user.
-        objects_to_link = set()
-
         if (self.options or {}).get("useSelection"):
             for obj in lib.get_selection():
+                collection.children.link(obj.users_collection[0])
 
         return collection
@@ -367,13 +367,13 @@ class UnrealLayoutLoader(plugin.AssetLoader):
             # Y axis mirrored
             obj.location = (
                 location.get('x'),
-                -location.get('y'),
+                location.get('y'),
                 location.get('z')
             )
             obj.rotation_euler = (
-                rotation.get('x') + math.pi / 2,
-                -rotation.get('y'),
-                -rotation.get('z')
+                rotation.get('x'),
+                rotation.get('y'),
+                rotation.get('z')
             )
             obj.scale = (
                 scale.get('x'),
@@ -108,19 +108,21 @@ class BlendModelLoader(plugin.AssetLoader):
             self.__class__.__name__,
         )
 
-        container_metadata = container.get(
-            blender.pipeline.AVALON_PROPERTY)
+        metadata = container.get(blender.pipeline.AVALON_PROPERTY)
 
-        container_metadata["libpath"] = libpath
-        container_metadata["lib_container"] = lib_container
+        metadata["libpath"] = libpath
+        metadata["lib_container"] = lib_container
 
         obj_container = self._process(
             libpath, lib_container, container_name, None)
 
-        container_metadata["obj_container"] = obj_container
+        metadata["obj_container"] = obj_container
 
         # Save the list of objects in the metadata container
-        container_metadata["objects"] = obj_container.all_objects
+        metadata["objects"] = obj_container.all_objects
 
+        metadata["parent"] = str(context["representation"]["parent"])
+        metadata["family"] = context["representation"]["context"]["family"]
+
         nodes = list(container.objects)
         nodes.append(container)
@@ -155,18 +155,20 @@ class BlendRigLoader(plugin.AssetLoader):
             self.__class__.__name__,
         )
 
-        container_metadata = container.get(
-            blender.pipeline.AVALON_PROPERTY)
+        metadata = container.get(blender.pipeline.AVALON_PROPERTY)
 
-        container_metadata["libpath"] = libpath
-        container_metadata["lib_container"] = lib_container
+        metadata["libpath"] = libpath
+        metadata["lib_container"] = lib_container
 
         obj_container = self._process(
             libpath, lib_container, collection_name, None, None)
 
-        container_metadata["obj_container"] = obj_container
+        metadata["obj_container"] = obj_container
         # Save the list of objects in the metadata container
-        container_metadata["objects"] = obj_container.all_objects
+        metadata["objects"] = obj_container.all_objects
 
+        metadata["parent"] = str(context["representation"]["parent"])
+        metadata["family"] = context["representation"]["context"]["family"]
+
         nodes = list(container.objects)
         nodes.append(container)
92
openpype/hosts/blender/plugins/publish/extract_layout.py
Normal file
@@ -0,0 +1,92 @@
import os
import json

import bpy

from avalon import blender, io
import openpype.api


class ExtractLayout(openpype.api.Extractor):
    """Extract a layout."""

    label = "Extract Layout"
    hosts = ["blender"]
    families = ["layout"]
    optional = True

    def process(self, instance):
        # Define extract output file path
        stagingdir = self.staging_dir(instance)

        # Perform extraction
        self.log.info("Performing extraction..")

        json_data = []

        for collection in instance:
            for asset in collection.children:
                collection = bpy.data.collections[asset.name]
                container = bpy.data.collections[asset.name + '_CON']
                metadata = container.get(blender.pipeline.AVALON_PROPERTY)

                parent = metadata["parent"]
                family = metadata["family"]

                self.log.debug("Parent: {}".format(parent))
                blend = io.find_one(
                    {
                        "type": "representation",
                        "parent": io.ObjectId(parent),
                        "name": "blend"
                    },
                    projection={"_id": True})
                blend_id = blend["_id"]

                json_element = {}
                json_element["reference"] = str(blend_id)
                json_element["family"] = family
                json_element["instance_name"] = asset.name
                json_element["asset_name"] = metadata["lib_container"]
                json_element["file_path"] = metadata["libpath"]

                obj = collection.objects[0]

                json_element["transform"] = {
                    "translation": {
                        "x": obj.location.x,
                        "y": obj.location.y,
                        "z": obj.location.z
                    },
                    "rotation": {
                        "x": obj.rotation_euler.x,
                        "y": obj.rotation_euler.y,
                        "z": obj.rotation_euler.z,
                    },
                    "scale": {
                        "x": obj.scale.x,
                        "y": obj.scale.y,
                        "z": obj.scale.z
                    }
                }
                json_data.append(json_element)

        json_filename = "{}.json".format(instance.name)
        json_path = os.path.join(stagingdir, json_filename)

        with open(json_path, "w+") as file:
            json.dump(json_data, fp=file, indent=2)

        if "representations" not in instance.data:
            instance.data["representations"] = []

        representation = {
            'name': 'json',
            'ext': 'json',
            'files': json_filename,
            "stagingDir": stagingdir,
        }
        instance.data["representations"].append(representation)

        self.log.info("Extracted instance '%s' to: %s",
                      instance.name, representation)
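For context, the JSON payload written by `ExtractLayout` above pairs each referenced representation with a plain per-axis transform dictionary. Here is a minimal sketch of that shape with made-up values; the helper name is illustrative, not OpenPype API:

```python
import json


def make_transform(location, rotation, scale):
    # Mirror the extractor's layout: each vector becomes an {x, y, z} dict.
    axes = ("x", "y", "z")
    return {
        "translation": dict(zip(axes, location)),
        "rotation": dict(zip(axes, rotation)),
        "scale": dict(zip(axes, scale)),
    }


# Illustrative element; real entries also carry "reference" and "file_path".
element = {
    "family": "rig",
    "instance_name": "character_01",
    "transform": make_transform((1.0, 2.0, 3.0), (0.0, 0.0, 0.0), (1.0, 1.0, 1.0)),
}
payload = json.dumps([element], indent=2)
```

Loaders on the Unreal side can then rebuild each instance from the reference id plus this transform.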
@@ -0,0 +1,10 @@
import os


def add_implementation_envs(env, _app):
    """Modify environments to contain all required for implementation."""
    openharmony_path = os.path.join(
        os.environ["OPENPYPE_REPOS_ROOT"], "pype", "vendor", "OpenHarmony"
    )
    # TODO check if is already set? What to do if is already set?
    env["LIB_OPENHARMONY_PATH"] = openharmony_path
@@ -3,6 +3,7 @@
 import os
 from pathlib import Path
 import logging
+import re

 from openpype import lib
 from openpype.api import (get_current_project_settings)
@@ -63,26 +64,9 @@ def get_asset_settings():
         "handleStart": handle_start,
         "handleEnd": handle_end,
         "resolutionWidth": resolution_width,
-        "resolutionHeight": resolution_height
+        "resolutionHeight": resolution_height,
+        "entityType": entity_type
     }
-    settings = get_current_project_settings()
-
-    try:
-        skip_resolution_check = \
-            settings["harmony"]["general"]["skip_resolution_check"]
-        skip_timelines_check = \
-            settings["harmony"]["general"]["skip_timelines_check"]
-    except KeyError:
-        skip_resolution_check = []
-        skip_timelines_check = []
-
-    if os.getenv('AVALON_TASK') in skip_resolution_check:
-        scene_data.pop("resolutionWidth")
-        scene_data.pop("resolutionHeight")
-
-    if entity_type in skip_timelines_check:
-        scene_data.pop('frameStart', None)
-        scene_data.pop('frameEnd', None)
-
     return scene_data
@@ -2,6 +2,7 @@
 """Validate scene settings."""
 import os
 import json
+import re

 import pyblish.api
@@ -41,22 +42,42 @@ class ValidateSceneSettings(pyblish.api.InstancePlugin):
     families = ["workfile"]
     hosts = ["harmony"]
     actions = [ValidateSceneSettingsRepair]
     optional = True

-    frame_check_filter = ["_ch_", "_pr_", "_intd_", "_extd_"]
-    # used for skipping resolution validation for render tasks
-    render_check_filter = ["render", "Render"]
+    # skip frameEnd check if asset contains any of:
+    frame_check_filter = ["_ch_", "_pr_", "_intd_", "_extd_"]  # regex
+
+    # skip resolution check if Task name matches any of regex patterns
+    skip_resolution_check = ["render", "Render"]  # regex
+
+    # skip frameStart, frameEnd check if Task name matches any of regex patt.
+    skip_timelines_check = []  # regex

     def process(self, instance):
         """Plugin entry point."""
         expected_settings = openpype.hosts.harmony.api.get_asset_settings()
-        self.log.info(expected_settings)
+        self.log.info("scene settings from DB:".format(expected_settings))
+
+        expected_settings = _update_frames(dict.copy(expected_settings))
+        expected_settings["frameEndHandle"] = expected_settings["frameEnd"] +\
+            expected_settings["handleEnd"]

-        if any(string in instance.context.data['anatomyData']['asset']
-               for string in self.frame_check_filter):
+        if (any(re.search(pattern, os.getenv('AVALON_TASK'))
+                for pattern in self.skip_resolution_check)):
+            expected_settings.pop("resolutionWidth")
+            expected_settings.pop("resolutionHeight")
+
+        entity_type = expected_settings.get("entityType")
+        if (any(re.search(pattern, entity_type)
+                for pattern in self.skip_timelines_check)):
+            expected_settings.pop('frameStart', None)
+            expected_settings.pop('frameEnd', None)
+
+        expected_settings.pop("entityType")  # not useful after the check
+
+        asset_name = instance.context.data['anatomyData']['asset']
+        if any(re.search(pattern, asset_name)
+               for pattern in self.frame_check_filter):
+            expected_settings.pop("frameEnd")

         # handle case where ftrack uses only two decimal places
@@ -66,13 +87,7 @@ class ValidateSceneSettings(pyblish.api.InstancePlugin):
         fps = float(
             "{:.2f}".format(instance.context.data.get("frameRate")))

-        if any(string in instance.context.data['anatomyData']['task']
-               for string in self.render_check_filter):
-            self.log.debug("Render task detected, resolution check skipped")
-            expected_settings.pop("resolutionWidth")
-            expected_settings.pop("resolutionHeight")
-
-        self.log.debug(expected_settings)
+        self.log.debug("filtered settings: {}".format(expected_settings))

         current_settings = {
             "fps": fps,
@@ -84,7 +99,7 @@ class ValidateSceneSettings(pyblish.api.InstancePlugin):
             "resolutionWidth": instance.context.data.get("resolutionWidth"),
             "resolutionHeight": instance.context.data.get("resolutionHeight"),
         }
-        self.log.debug("curr:: {}".format(current_settings))
+        self.log.debug("current scene settings {}".format(current_settings))

         invalid_settings = []
         for key, value in expected_settings.items():
@@ -0,0 +1,40 @@
import os
import platform


def add_implementation_envs(env, _app):
    # Add requirements to HIERO_PLUGIN_PATH
    pype_root = os.environ["OPENPYPE_REPOS_ROOT"]
    new_hiero_paths = [
        os.path.join(pype_root, "openpype", "hosts", "hiero", "startup")
    ]
    old_hiero_path = env.get("HIERO_PLUGIN_PATH") or ""
    for path in old_hiero_path.split(os.pathsep):
        if not path or not os.path.exists(path):
            continue

        norm_path = os.path.normpath(path)
        if norm_path not in new_hiero_paths:
            new_hiero_paths.append(norm_path)

    env["HIERO_PLUGIN_PATH"] = os.pathsep.join(new_hiero_paths)

    # Try to add QuickTime to PATH
    quick_time_path = "C:/Program Files (x86)/QuickTime/QTSystem"
    if platform.system() == "windows" and os.path.exists(quick_time_path):
        path_value = env.get("PATH") or ""
        path_paths = [
            path
            for path in path_value.split(os.pathsep)
            if path
        ]
        path_paths.append(quick_time_path)
        env["PATH"] = os.pathsep.join(path_paths)

    # Set default values if are not already set via settings
    defaults = {
        "LOGLEVEL": "DEBUG"
    }
    for key, value in defaults.items():
        if not env.get(key):
            env[key] = value
@@ -0,0 +1,38 @@
import os


def add_implementation_envs(env, _app):
    # Add requirements to HOUDINI_PATH and HOUDINI_MENU_PATH
    pype_root = os.environ["OPENPYPE_REPOS_ROOT"]

    startup_path = os.path.join(
        pype_root, "openpype", "hosts", "houdini", "startup"
    )
    new_houdini_path = [startup_path]
    new_houdini_menu_path = [startup_path]

    old_houdini_path = env.get("HOUDINI_PATH") or ""
    old_houdini_menu_path = env.get("HOUDINI_MENU_PATH") or ""

    for path in old_houdini_path.split(os.pathsep):
        if not path or not os.path.exists(path):
            continue

        norm_path = os.path.normpath(path)
        if norm_path not in new_houdini_path:
            new_houdini_path.append(norm_path)

    for path in old_houdini_menu_path.split(os.pathsep):
        if not path or not os.path.exists(path):
            continue

        norm_path = os.path.normpath(path)
        if norm_path not in new_houdini_menu_path:
            new_houdini_menu_path.append(norm_path)

    # Add ampersand for unknown reason (Maybe is needed in Houdini?)
    new_houdini_path.append("&")
    new_houdini_menu_path.append("&")

    env["HOUDINI_PATH"] = os.pathsep.join(new_houdini_path)
    env["HOUDINI_MENU_PATH"] = os.pathsep.join(new_houdini_menu_path)
@@ -0,0 +1,29 @@
import os


def add_implementation_envs(env, _app):
    # Add requirements to PYTHONPATH
    pype_root = os.environ["OPENPYPE_REPOS_ROOT"]
    new_python_paths = [
        os.path.join(pype_root, "openpype", "hosts", "maya", "startup"),
        os.path.join(pype_root, "repos", "avalon-core", "setup", "maya"),
        os.path.join(pype_root, "tools", "mayalookassigner")
    ]
    old_python_path = env.get("PYTHONPATH") or ""
    for path in old_python_path.split(os.pathsep):
        if not path or not os.path.exists(path):
            continue

        norm_path = os.path.normpath(path)
        if norm_path not in new_python_paths:
            new_python_paths.append(norm_path)

    env["PYTHONPATH"] = os.pathsep.join(new_python_paths)

    # Set default values if are not already set via settings
    defaults = {
        "OPENPYPE_LOG_NO_COLORS": "Yes"
    }
    for key, value in defaults.items():
        if not env.get(key):
            env[key] = value
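The Hiero, Houdini and Maya hooks above (and the similar Nuke one further down) all follow the same pattern: put the host's startup directories first, then re-append any pre-existing entries after normalizing and de-duplicating them. A minimal sketch of that shared pattern; the function name is illustrative, and the on-disk existence check used by the real hooks is dropped so the sketch runs anywhere:

```python
import os


def prepend_paths(env, key, new_paths):
    """Prepend new_paths to env[key], de-duplicating normalized entries.

    `env` is a plain dict standing in for the launch environment; unlike
    the real hooks, entries are kept even if they do not exist on disk.
    """
    paths = list(new_paths)
    for path in (env.get(key) or "").split(os.pathsep):
        if not path:
            continue
        norm_path = os.path.normpath(path)
        if norm_path not in paths:
            paths.append(norm_path)
    env[key] = os.pathsep.join(paths)


env = {"PYTHONPATH": os.pathsep.join(["/existing/site", "/startup"])}
prepend_paths(env, "PYTHONPATH", ["/startup"])
```

The normalization step is what keeps `/startup` from being appended twice when it was already present in the old value.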
@@ -184,7 +184,7 @@ class AExpectedFiles:
             (str): sanitized camera name

         Example:
-            >>> sanizite_camera_name('test:camera_01')
+            >>> AExpectedFiles.sanizite_camera_name('test:camera_01')
             test_camera_01

         """
@@ -230,7 +230,7 @@ class AExpectedFiles:
         if self.layer.startswith("rs_"):
             layer_name = self.layer[3:]

-        scene_data = {
+        return {
             "frameStart": int(self.get_render_attribute("startFrame")),
             "frameEnd": int(self.get_render_attribute("endFrame")),
             "frameStep": int(self.get_render_attribute("byFrameStep")),
@@ -245,7 +245,6 @@ class AExpectedFiles:
             "filePrefix": file_prefix,
             "enabledAOVs": self.get_aovs(),
         }
-        return scene_data

     def _generate_single_file_sequence(
             self, layer_data, force_aov_name=None):
@@ -685,8 +684,6 @@ class ExpectedFilesRedshift(AExpectedFiles):
     """Expected files for Redshift renderer.

     Attributes:
-        ext_mapping (list): Mapping redshift extension dropdown values
-            to strings.

         unmerged_aovs (list): Name of aovs that are not merged into resulting
             exr and we need them specified in expectedFiles output.
@@ -695,8 +692,6 @@ class ExpectedFilesRedshift(AExpectedFiles):

     unmerged_aovs = ["Cryptomatte"]

-    ext_mapping = ["iff", "exr", "tif", "png", "tga", "jpg"]
-
     def __init__(self, layer, render_instance):
         """Construtor."""
         super(ExpectedFilesRedshift, self).__init__(layer, render_instance)
@@ -785,12 +780,10 @@ class ExpectedFilesRedshift(AExpectedFiles):
             # anyway.
             return enabled_aovs

-        default_ext = self.ext_mapping[
-            cmds.getAttr("redshiftOptions.imageFormat")
-        ]
+        default_ext = cmds.getAttr(
+            "redshiftOptions.imageFormat", asString=True)
         rs_aovs = cmds.ls(type="RedshiftAOV", referencedNodes=False)

         # todo: find out how to detect multichannel exr for redshift
         for aov in rs_aovs:
             enabled = self.maya_is_true(cmds.getAttr("{}.enabled".format(aov)))
             for override in self.get_layer_overrides(
@@ -12,7 +12,7 @@ from openpype.hosts.maya.api import (
     lib,
     plugin
 )
-from openpype.api import get_system_settings
+from openpype.api import (get_system_settings, get_asset)


 class CreateRender(plugin.Creator):
@@ -104,7 +104,7 @@ class CreateRender(plugin.Creator):
             # namespace is not empty, so we leave it untouched
             pass

-        while(cmds.namespace(exists=namespace_name)):
+        while cmds.namespace(exists=namespace_name):
             namespace_name = "_{}{}".format(str(instance), index)
             index += 1
@@ -125,7 +125,7 @@ class CreateRender(plugin.Creator):
         cmds.sets(sets, forceElement=instance)

         # if no render layers are present, create default one with
-        # asterix selector
+        # asterisk selector
         if not layers:
             render_layer = self._rs.createRenderLayer('Main')
             collection = render_layer.createCollection("defaultCollection")
@@ -137,9 +137,7 @@ class CreateRender(plugin.Creator):
         if renderer.startswith('renderman'):
             renderer = 'renderman'

-        cmds.setAttr(self._image_prefix_nodes[renderer],
-                     self._image_prefixes[renderer],
-                     type="string")
+        self._set_default_renderer_settings(renderer)

     def _create_render_settings(self):
         # get pools
@@ -318,3 +316,86 @@ class CreateRender(plugin.Creator):
                 False if os.getenv("OPENPYPE_DONT_VERIFY_SSL", True) else True
             )  # noqa
         return requests.get(*args, **kwargs)

    def _set_default_renderer_settings(self, renderer):
        """Set basic settings based on renderer.

        Args:
            renderer (str): Renderer name.

        """
        cmds.setAttr(self._image_prefix_nodes[renderer],
                     self._image_prefixes[renderer],
                     type="string")

        asset = get_asset()

        if renderer == "arnold":
            # set format to exr
            cmds.setAttr(
                "defaultArnoldDriver.ai_translator", "exr", type="string")
            # enable animation
            cmds.setAttr("defaultRenderGlobals.outFormatControl", 0)
            cmds.setAttr("defaultRenderGlobals.animation", 1)
            cmds.setAttr("defaultRenderGlobals.putFrameBeforeExt", 1)
            cmds.setAttr("defaultRenderGlobals.extensionPadding", 4)

            # resolution
            cmds.setAttr(
                "defaultResolution.width",
                asset["data"].get("resolutionWidth"))
            cmds.setAttr(
                "defaultResolution.height",
                asset["data"].get("resolutionHeight"))

        if renderer == "vray":
            vray_settings = cmds.ls(type="VRaySettingsNode")
            if not vray_settings:
                node = cmds.createNode("VRaySettingsNode")
            else:
                node = vray_settings[0]

            # set underscore as element separator instead of default `.`
            cmds.setAttr(
                "{}.fileNameRenderElementSeparator".format(
                    node),
                "_"
            )
            # set format to exr
            cmds.setAttr(
                "{}.imageFormatStr".format(node), 5)

            # animType
            cmds.setAttr(
                "{}.animType".format(node), 1)

            # resolution
            cmds.setAttr(
                "{}.width".format(node),
                asset["data"].get("resolutionWidth"))
            cmds.setAttr(
                "{}.height".format(node),
                asset["data"].get("resolutionHeight"))

        if renderer == "redshift":
            redshift_settings = cmds.ls(type="RedshiftOptions")
            if not redshift_settings:
                node = cmds.createNode("RedshiftOptions")
            else:
                node = redshift_settings[0]

            # set exr
            cmds.setAttr("{}.imageFormat".format(node), 1)
            # resolution
            cmds.setAttr(
                "defaultResolution.width",
                asset["data"].get("resolutionWidth"))
            cmds.setAttr(
                "defaultResolution.height",
                asset["data"].get("resolutionHeight"))

            # enable animation
            cmds.setAttr("defaultRenderGlobals.outFormatControl", 0)
            cmds.setAttr("defaultRenderGlobals.animation", 1)
            cmds.setAttr("defaultRenderGlobals.putFrameBeforeExt", 1)
            cmds.setAttr("defaultRenderGlobals.extensionPadding", 4)
@@ -96,19 +96,25 @@ class ExtractPlayblast(openpype.api.Extractor):
         # Remove panel key since it's internal value to capture_gui
         preset.pop("panel", None)

         self.log.info('using viewport preset: {}'.format(preset))

         path = capture.capture(**preset)
-        playblast = self._fix_playblast_output_path(path)

-        self.log.info("file list {}".format(playblast))
+        self.log.debug("playblast path {}".format(path))

-        collected_frames = os.listdir(stagingdir)
-        collections, remainder = clique.assemble(collected_frames)
-        input_path = os.path.join(
-            stagingdir, collections[0].format('{head}{padding}{tail}'))
-        self.log.info("input {}".format(input_path))
+        collected_files = os.listdir(stagingdir)
+        collections, remainder = clique.assemble(collected_files)
+
+        self.log.debug("filename {}".format(filename))
+        frame_collection = None
+        for collection in collections:
+            filebase = collection.format('{head}').rstrip(".")
+            self.log.debug("collection head {}".format(filebase))
+            if filebase in filename:
+                frame_collection = collection
+                self.log.info(
+                    "we found collection of interest {}".format(
+                        str(frame_collection)))

         if "representations" not in instance.data:
             instance.data["representations"] = []
@@ -119,12 +125,11 @@ class ExtractPlayblast(openpype.api.Extractor):

         # Add camera node name to representation data
         camera_node_name = pm.ls(camera)[0].getTransform().name()

-
         representation = {
             'name': 'png',
             'ext': 'png',
-            'files': collected_frames,
+            'files': list(frame_collection),
             "stagingDir": stagingdir,
             "frameStart": start,
             "frameEnd": end,
@@ -135,44 +140,6 @@ class ExtractPlayblast(openpype.api.Extractor):
         }
         instance.data["representations"].append(representation)

-    def _fix_playblast_output_path(self, filepath):
-        """Workaround a bug in maya.cmds.playblast to return correct filepath.
-
-        When the `viewer` argument is set to False and maya.cmds.playblast
-        does not automatically open the playblasted file the returned
-        filepath does not have the file's extension added correctly.
-
-        To workaround this we just glob.glob() for any file extensions and
-        assume the latest modified file is the correct file and return it.
-        """
-        # Catch cancelled playblast
-        if filepath is None:
-            self.log.warning("Playblast did not result in output path. "
-                             "Playblast is probably interrupted.")
-            return None
-
-        # Fix: playblast not returning correct filename (with extension)
-        # Lets assume the most recently modified file is the correct one.
-        if not os.path.exists(filepath):
-            directory = os.path.dirname(filepath)
-            filename = os.path.basename(filepath)
-            # check if the filepath is has frame based filename
-            # example : capture.####.png
-            parts = filename.split(".")
-            if len(parts) == 3:
-                query = os.path.join(directory, "{}.*.{}".format(parts[0],
-                                                                 parts[-1]))
-                files = glob.glob(query)
-            else:
-                files = glob.glob("{}.*".format(filepath))
-
-            if not files:
-                raise RuntimeError("Couldn't find playblast from: "
-                                   "{0}".format(filepath))
-            filepath = max(files, key=os.path.getmtime)
-
-        return filepath
-

 @contextlib.contextmanager
 def maintained_time():
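The rewritten collection step above matches the frame collections assembled from the staging directory against the playblast base name instead of blindly taking the first collection. A rough standard-library stand-in for what `clique.assemble` plus the head comparison accomplish; the regex and function name here are illustrative, not OpenPype or clique API:

```python
import re
from collections import defaultdict


def find_frame_collection(files, filename_base):
    """Group frame-numbered files by (head, tail) and return the sorted
    group whose head matches ``filename_base``, or None.

    Pure-Python sketch of the extractor's clique-based matching.
    """
    groups = defaultdict(list)
    for name in files:
        match = re.match(r"^(.*?)(\d+)(\.\w+)$", name)
        if not match:
            continue  # not part of a frame sequence
        head, _frame, tail = match.groups()
        groups[(head, tail)].append(name)

    for (head, _tail), members in groups.items():
        if head.rstrip(".") == filename_base:
            return sorted(members)
    return None


files = ["capture.0002.png", "capture.0001.png", "thumb.png"]
result = find_frame_collection(files, "capture")
```

This avoids picking up stray files (like a thumbnail) that happen to sit in the same staging directory.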
@@ -1,8 +1,9 @@
-import os
+# -*- coding: utf-8 -*-
+"""Maya validator for render settings."""
 import re
 from collections import OrderedDict

 from maya import cmds, mel
 import pymel.core as pm

 import pyblish.api
 import openpype.api
@@ -60,6 +61,8 @@ class ValidateRenderSettings(pyblish.api.InstancePlugin):
         'renderman': '<layer>_<aov>.<f4>.<ext>'
     }

+    redshift_AOV_prefix = "<BeautyPath>/<BeautyFile>_<RenderPass>"
+
     # WARNING: There is bug? in renderman, translating <scene> token
     # to something left behind mayas default image prefix. So instead
     # `SceneName_v01` it translates to:
@@ -120,25 +123,59 @@ class ValidateRenderSettings(pyblish.api.InstancePlugin):
                               "doesn't have: '<renderlayer>' or "
                               "'<layer>' token".format(prefix))

-        if len(cameras) > 1:
-            if not re.search(cls.R_CAMERA_TOKEN, prefix):
-                invalid = True
-                cls.log.error("Wrong image prefix [ {} ] - "
-                              "doesn't have: '<camera>' token".format(prefix))
+        if len(cameras) > 1 and not re.search(cls.R_CAMERA_TOKEN, prefix):
+            invalid = True
+            cls.log.error("Wrong image prefix [ {} ] - "
+                          "doesn't have: '<camera>' token".format(prefix))

+        # renderer specific checks
         if renderer == "vray":
-            # no vray checks implemented yet
-            pass
-        elif renderer == "redshift":
+            vray_settings = cmds.ls(type="VRaySettingsNode")
+            if not vray_settings:
+                node = cmds.createNode("VRaySettingsNode")
+            else:
+                node = vray_settings[0]
+
+            if cmds.getAttr(
+                    "{}.fileNameRenderElementSeparator".format(node)) != "_":
+                invalid = False
+                cls.log.error("AOV separator is not set correctly.")
+
+        if renderer == "redshift":
             if re.search(cls.R_AOV_TOKEN, prefix):
                 invalid = True
-                cls.log.error("Do not use AOV token [ {} ] - "
-                              "Redshift automatically append AOV name and "
-                              "it doesn't make much sense with "
-                              "Multipart EXR".format(prefix))
+                cls.log.error(("Do not use AOV token [ {} ] - "
+                               "Redshift is using image prefixes per AOV so "
+                               "it doesn't make much sense using it in global"
+                               "image prefix").format(prefix))
+            # get redshift AOVs
+            rs_aovs = cmds.ls(type="RedshiftAOV", referencedNodes=False)
+            for aov in rs_aovs:
+                aov_prefix = cmds.getAttr("{}.filePrefix".format(aov))
+                # check their image prefix
+                if aov_prefix != cls.redshift_AOV_prefix:
+                    cls.log.error(("AOV ({}) image prefix is not set "
+                                   "correctly {} != {}").format(
+                        cmds.getAttr("{}.name".format(aov)),
+                        cmds.getAttr("{}.filePrefix".format(aov)),
+                        aov_prefix
+                    ))
+                    invalid = True
+                # get aov format
+                aov_ext = cmds.getAttr(
+                    "{}.fileFormat".format(aov), asString=True)

-        elif renderer == "renderman":
+                default_ext = cmds.getAttr(
+                    "redshiftOptions.imageFormat", asString=True)
+
+                if default_ext != aov_ext:
+                    cls.log.error(("AOV file format is not the same "
+                                   "as the one set globally "
+                                   "{} != {}").format(default_ext,
+                                                      aov_ext))
+                    invalid = True
+
+        if renderer == "renderman":
             file_prefix = cmds.getAttr("rmanGlobals.imageFileFormat")
             dir_prefix = cmds.getAttr("rmanGlobals.imageOutputDir")
@@ -151,7 +188,7 @@ class ValidateRenderSettings(pyblish.api.InstancePlugin):
                 cls.log.error("Wrong directory prefix [ {} ]".format(
                     dir_prefix))

-        else:
+        if renderer == "arnold":
             multipart = cmds.getAttr("defaultArnoldDriver.mergeAOVs")
             if multipart:
                 if re.search(cls.R_AOV_TOKEN, prefix):
@@ -177,6 +214,43 @@ class ValidateRenderSettings(pyblish.api.InstancePlugin):
             cls.log.error("Expecting padding of {} ( {} )".format(
                 cls.DEFAULT_PADDING, "0" * cls.DEFAULT_PADDING))

+        # load validation definitions from settings
+        validation_settings = (
+            instance.context.data["project_settings"]["maya"]["publish"]["ValidateRenderSettings"].get(  # noqa: E501
+                "{}_render_attributes".format(renderer))
+        )
+
+        # go through definitions and test if such node.attribute exists.
+        # if so, compare its value from the one required.
+        for attr, value in OrderedDict(validation_settings).items():
+            # first get node of that type
+            cls.log.debug("{}: {}".format(attr, value))
+            node_type = attr.split(".")[0]
+            attribute_name = ".".join(attr.split(".")[1:])
+            nodes = cmds.ls(type=node_type)
+
+            if not isinstance(nodes, list):
+                cls.log.warning("No nodes of '{}' found.".format(node_type))
+                continue
+
+            for node in nodes:
+                try:
+                    render_value = cmds.getAttr(
+                        "{}.{}".format(node, attribute_name))
+                except RuntimeError:
+                    invalid = True
+                    cls.log.error(
+                        "Cannot get value of {}.{}".format(
+                            node, attribute_name))
+                else:
+                    if value != render_value:
+                        invalid = True
+                        cls.log.error(
+                            ("Invalid value {} set on {}.{}. "
+                             "Expecting {}").format(
+                                render_value, node, attribute_name, value)
+                        )
+
         return invalid

     @classmethod
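The settings-driven loop added above compares each `node_type.attribute` key from project settings against the value found on every matching node in the scene. A simplified stand-in that swaps Maya's `cmds` calls for a plain nested dict, to show the shape of the check; all names here are illustrative:

```python
def validate_attributes(validation_settings, scene):
    """Return (node, attribute) pairs whose value differs from the
    expected one.

    `scene` maps node type -> node name -> attribute dict, standing in
    for cmds.ls / cmds.getAttr lookups in the real validator.
    """
    invalid = []
    for attr, expected in validation_settings.items():
        node_type, _, attribute_name = attr.partition(".")
        for node, attrs in scene.get(node_type, {}).items():
            if attrs.get(attribute_name) != expected:
                invalid.append((node, attribute_name))
    return invalid


scene = {"RedshiftOptions": {"rsOptions1": {"imageFormat": 1}}}
checks = {"RedshiftOptions.imageFormat": 2}
errors = validate_attributes(checks, scene)
```

Driving the checks from settings this way lets studios add per-renderer attribute requirements without touching the plugin code.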
@@ -210,3 +284,29 @@ class ValidateRenderSettings(pyblish.api.InstancePlugin):
         cmds.setAttr("rmanGlobals.imageOutputDir",
                      cls.RendermanDirPrefix,
                      type="string")

+        if renderer == "vray":
+            vray_settings = cmds.ls(type="VRaySettingsNode")
+            if not vray_settings:
+                node = cmds.createNode("VRaySettingsNode")
+            else:
+                node = vray_settings[0]
+
+            cmds.setAttr(
+                "{}.fileNameRenderElementSeparator".format(
+                    node),
+                "_"
+            )
+
+        if renderer == "redshift":
+            # get redshift AOVs
+            rs_aovs = cmds.ls(type="RedshiftAOV", referencedNodes=False)
+            for aov in rs_aovs:
+                # fix AOV prefixes
+                cmds.setAttr(
+                    "{}.filePrefix".format(aov), cls.redshift_AOV_prefix)
+                # fix AOV file format
+                default_ext = cmds.getAttr(
+                    "redshiftOptions.imageFormat", asString=True)
+                cmds.setAttr(
+                    "{}.fileFormat".format(aov), default_ext)
@@ -0,0 +1,43 @@
import os
import platform


def add_implementation_envs(env, _app):
    # Add requirements to NUKE_PATH
    pype_root = os.environ["OPENPYPE_REPOS_ROOT"]
    new_nuke_paths = [
        os.path.join(pype_root, "openpype", "hosts", "nuke", "startup"),
        os.path.join(
            pype_root, "repos", "avalon-core", "setup", "nuke", "nuke_path"
        )
    ]
    old_nuke_path = env.get("NUKE_PATH") or ""
    for path in old_nuke_path.split(os.pathsep):
        if not path or not os.path.exists(path):
            continue

        norm_path = os.path.normpath(path)
        if norm_path not in new_nuke_paths:
            new_nuke_paths.append(norm_path)

    env["NUKE_PATH"] = os.pathsep.join(new_nuke_paths)

    # Try to add QuickTime to PATH
    quick_time_path = "C:/Program Files (x86)/QuickTime/QTSystem"
    if platform.system() == "windows" and os.path.exists(quick_time_path):
        path_value = env.get("PATH") or ""
        path_paths = [
            path
            for path in path_value.split(os.pathsep)
            if path
        ]
        path_paths.append(quick_time_path)
        env["PATH"] = os.pathsep.join(path_paths)

    # Set default values if are not already set via settings
    defaults = {
        "LOGLEVEL": "DEBUG"
    }
    for key, value in defaults.items():
        if not env.get(key):
            env[key] = value
@@ -1,6 +1,5 @@
 import nuke
-import contextlib

 from avalon.vendor import qargparse
 from avalon import api, io
 from openpype.api import get_current_project_settings
 from openpype.hosts.nuke.api.lib import (
@@ -8,41 +7,6 @@ from openpype.hosts.nuke.api.lib import (
 )


-@contextlib.contextmanager
-def preserve_trim(node):
-    """Preserve the relative trim of the Loader tool.
-
-    This tries to preserve the loader's trim (trim in and trim out) after
-    the context by reapplying the "amount" it trims on the clip's length at
-    start and end.
-
-    """
-    # working script frame range
-    script_start = nuke.root()["first_frame"].value()
-
-    start_at_frame = None
-    offset_frame = None
-    if node['frame_mode'].value() == "start at":
-        start_at_frame = node['frame'].value()
-    if node['frame_mode'].value() == "offset":
-        offset_frame = node['frame'].value()
-
-    try:
-        yield
-    finally:
-        if start_at_frame:
-            node['frame_mode'].setValue("start at")
-            node['frame'].setValue(str(script_start))
-            print("start frame of Read was set to"
-                  "{}".format(script_start))
-
-        if offset_frame:
-            node['frame_mode'].setValue("offset")
-            node['frame'].setValue(str((script_start + offset_frame)))
-            print("start frame of Read was set to"
-                  "{}".format(script_start))
-
-
 def add_review_presets_config():
     returning = {
         "families": list(),
@@ -79,14 +43,30 @@ class LoadMov(api.Loader):

     script_start = nuke.root()["first_frame"].value()

+    # options gui
+    defaults = {
+        "start_at_workfile": True
+    }
+
+    options = [
+        qargparse.Boolean(
+            "start_at_workfile",
+            help="Load at workfile start frame",
+            default=True
+        )
+    ]
+
     node_name_template = "{class_name}_{ext}"

-    def load(self, context, name, namespace, data):
+    def load(self, context, name, namespace, options):
         from avalon.nuke import (
             containerise,
             viewer_update_and_undo_stop
         )

+        start_at_workfile = options.get(
+            "start_at_workfile", self.defaults["start_at_workfile"])
+
         version = context['version']
         version_data = version.get("data", {})
         repr_id = context["representation"]["_id"]
```diff
@@ -149,10 +129,14 @@ class LoadMov(api.Loader):
             read_node["first"].setValue(first)
             read_node["origlast"].setValue(last)
             read_node["last"].setValue(last)

-            # start at script start
             read_node['frame_mode'].setValue("start at")
-            read_node['frame'].setValue(str(self.script_start))
+            if start_at_workfile:
+                # start at workfile start
+                read_node['frame'].setValue(str(self.script_start))
+            else:
+                # start at version frame start
+                read_node['frame'].setValue(str(orig_first - handle_start))

             if colorspace:
                 read_node["colorspace"].setValue(str(colorspace))
```
```diff
@@ -266,29 +250,29 @@ class LoadMov(api.Loader):
         # create handles offset (only to last, because of mov)
         last += handle_start + handle_end

-        # Update the loader's path whilst preserving some values
-        with preserve_trim(read_node):
-            read_node["file"].setValue(file)
-            self.log.info("__ node['file']: {}".format(
-                read_node["file"].value()))
+        read_node["file"].setValue(file)

-            # Set the global in to the start frame of the sequence
-            read_node["origfirst"].setValue(first)
-            read_node["first"].setValue(first)
-            read_node["origlast"].setValue(last)
-            read_node["last"].setValue(last)
+        # Set the global in to the start frame of the sequence
+        read_node["origfirst"].setValue(first)
+        read_node["first"].setValue(first)
+        read_node["origlast"].setValue(last)
+        read_node["last"].setValue(last)
+        read_node['frame_mode'].setValue("start at")

-            # start at script start
-            read_node['frame_mode'].setValue("start at")
+        if int(self.script_start) == int(read_node['frame'].value()):
+            # start at workfile start
+            read_node['frame'].setValue(str(self.script_start))
+        else:
+            # start at version frame start
+            read_node['frame'].setValue(str(orig_first - handle_start))

-            if colorspace:
-                read_node["colorspace"].setValue(str(colorspace))
+        if colorspace:
+            read_node["colorspace"].setValue(str(colorspace))

-            preset_clrsp = get_imageio_input_colorspace(file)
+        preset_clrsp = get_imageio_input_colorspace(file)

-            if preset_clrsp is not None:
-                read_node["colorspace"].setValue(preset_clrsp)
+        if preset_clrsp is not None:
+            read_node["colorspace"].setValue(preset_clrsp)

         updated_dict = {}
         updated_dict.update({
```
```diff
@@ -321,8 +305,8 @@ class LoadMov(api.Loader):

         from avalon.nuke import viewer_update_and_undo_stop

-        node = nuke.toNode(container['objectName'])
-        assert node.Class() == "Read", "Must be Read"
+        read_node = nuke.toNode(container['objectName'])
+        assert read_node.Class() == "Read", "Must be Read"

         with viewer_update_and_undo_stop():
-            nuke.delete(node)
+            nuke.delete(read_node)
```
```diff
@@ -1,74 +1,11 @@
 import nuke
-import contextlib

 from avalon.vendor import qargparse
 from avalon import api, io
 from openpype.hosts.nuke.api.lib import (
     get_imageio_input_colorspace
 )


-@contextlib.contextmanager
-def preserve_trim(node):
-    """Preserve the relative trim of the Loader tool.
-
-    This tries to preserve the loader's trim (trim in and trim out) after
-    the context by reapplying the "amount" it trims on the clip's length at
-    start and end.
-
-    """
-    # working script frame range
-    script_start = nuke.root()["first_frame"].value()
-
-    start_at_frame = None
-    offset_frame = None
-    if node['frame_mode'].value() == "start at":
-        start_at_frame = node['frame'].value()
-    if node['frame_mode'].value() == "offset":
-        offset_frame = node['frame'].value()
-
-    try:
-        yield
-    finally:
-        if start_at_frame:
-            node['frame_mode'].setValue("start at")
-            node['frame'].setValue(str(script_start))
-            print("start frame of Read was set to"
-                  "{}".format(script_start))
-
-        if offset_frame:
-            node['frame_mode'].setValue("offset")
-            node['frame'].setValue(str((script_start + offset_frame)))
-            print("start frame of Read was set to"
-                  "{}".format(script_start))
-
-
-def loader_shift(node, frame, relative=False):
-    """Shift global in time by i preserving duration
-
-    This moves the loader by i frames preserving global duration. When relative
-    is False it will shift the global in to the start frame.
-
-    Args:
-        loader (tool): The fusion loader tool.
-        frame (int): The amount of frames to move.
-        relative (bool): When True the shift is relative, else the shift will
-            change the global in to frame.
-
-    Returns:
-        int: The resulting relative frame change (how much it moved)
-
-    """
-    # working script frame range
-    script_start = nuke.root()["first_frame"].value()
-
-    if relative:
-        node['frame_mode'].setValue("start at")
-        node['frame'].setValue(str(script_start))
-    else:
-        node['frame_mode'].setValue("start at")
-        node['frame'].setValue(str(frame))


 class LoadSequence(api.Loader):
     """Load image sequence into Nuke"""
```
```diff
@@ -80,14 +17,32 @@ class LoadSequence(api.Loader):
     icon = "file-video-o"
     color = "white"

+    script_start = nuke.root()["first_frame"].value()
+
+    # option gui
+    defaults = {
+        "start_at_workfile": True
+    }
+
+    options = [
+        qargparse.Boolean(
+            "start_at_workfile",
+            help="Load at workfile start frame",
+            default=True
+        )
+    ]
+
     node_name_template = "{class_name}_{ext}"

-    def load(self, context, name, namespace, data):
+    def load(self, context, name, namespace, options):
         from avalon.nuke import (
             containerise,
             viewer_update_and_undo_stop
         )

+        start_at_workfile = options.get(
+            "start_at_workfile", self.defaults["start_at_workfile"])
+
         version = context['version']
         version_data = version.get("data", {})
         repr_id = context["representation"]["_id"]
```
```diff
@@ -139,32 +94,31 @@ class LoadSequence(api.Loader):
         read_name = self.node_name_template.format(**name_data)

-        # Create the Loader with the filename path set
-
         # TODO: it might be universal read to img/geo/camera
-        r = nuke.createNode(
+        read_node = nuke.createNode(
             "Read",
             "name {}".format(read_name))

         # to avoid multiple undo steps for rest of process
         # we will switch off undo-ing
         with viewer_update_and_undo_stop():
-            r["file"].setValue(file)
+            read_node["file"].setValue(file)

             # Set colorspace defined in version data
             colorspace = context["version"]["data"].get("colorspace")
             if colorspace:
-                r["colorspace"].setValue(str(colorspace))
+                read_node["colorspace"].setValue(str(colorspace))

             preset_clrsp = get_imageio_input_colorspace(file)

             if preset_clrsp is not None:
-                r["colorspace"].setValue(preset_clrsp)
+                read_node["colorspace"].setValue(preset_clrsp)

-            loader_shift(r, first, relative=True)
-            r["origfirst"].setValue(int(first))
-            r["first"].setValue(int(first))
-            r["origlast"].setValue(int(last))
-            r["last"].setValue(int(last))
+            # set start frame depending on workfile or version
+            self.loader_shift(read_node, start_at_workfile)
+            read_node["origfirst"].setValue(int(first))
+            read_node["first"].setValue(int(first))
+            read_node["origlast"].setValue(int(last))
+            read_node["last"].setValue(int(last))

             # add additional metadata from the version to imprint Avalon knob
             add_keys = ["frameStart", "frameEnd",
```
```diff
@@ -181,48 +135,20 @@ class LoadSequence(api.Loader):

             data_imprint.update({"objectName": read_name})

-            r["tile_color"].setValue(int("0x4ecd25ff", 16))
+            read_node["tile_color"].setValue(int("0x4ecd25ff", 16))

             if version_data.get("retime", None):
                 speed = version_data.get("speed", 1)
                 time_warp_nodes = version_data.get("timewarps", [])
-                self.make_retimes(r, speed, time_warp_nodes)
+                self.make_retimes(read_node, speed, time_warp_nodes)

-            return containerise(r,
+            return containerise(read_node,
                                 name=name,
                                 namespace=namespace,
                                 context=context,
                                 loader=self.__class__.__name__,
                                 data=data_imprint)

-    def make_retimes(self, node, speed, time_warp_nodes):
-        ''' Create all retime and timewarping nodes with coppied animation '''
-        if speed != 1:
-            rtn = nuke.createNode(
-                "Retime",
-                "speed {}".format(speed))
-            rtn["before"].setValue("continue")
-            rtn["after"].setValue("continue")
-            rtn["input.first_lock"].setValue(True)
-            rtn["input.first"].setValue(
-                self.handle_start + self.first_frame
-            )
-
-        if time_warp_nodes != []:
-            for timewarp in time_warp_nodes:
-                twn = nuke.createNode(timewarp["Class"],
-                                      "name {}".format(timewarp["name"]))
-                if isinstance(timewarp["lookup"], list):
-                    # if array for animation
-                    twn["lookup"].setAnimated()
-                    for i, value in enumerate(timewarp["lookup"]):
-                        twn["lookup"].setValueAt(
-                            (self.first_frame + i) + value,
-                            (self.first_frame + i))
-                else:
-                    # if static value `int`
-                    twn["lookup"].setValue(timewarp["lookup"])

     def switch(self, container, representation):
         self.update(container, representation)
```
```diff
@@ -239,9 +165,9 @@ class LoadSequence(api.Loader):
             update_container
         )

-        node = nuke.toNode(container['objectName'])
+        read_node = nuke.toNode(container['objectName'])

-        assert node.Class() == "Read", "Must be Read"
+        assert read_node.Class() == "Read", "Must be Read"

         repr_cont = representation["context"]
```
```diff
@@ -288,23 +214,23 @@ class LoadSequence(api.Loader):
             self.log.warning(
                 "Missing start frame for updated version"
                 "assuming starts at frame 0 for: "
-                "{} ({})".format(node['name'].value(), representation))
+                "{} ({})".format(read_node['name'].value(), representation))
             first = 0

         first -= self.handle_start
         last += self.handle_end

-        # Update the loader's path whilst preserving some values
-        with preserve_trim(node):
-            node["file"].setValue(file)
-            self.log.info("__ node['file']: {}".format(node["file"].value()))
+        read_node["file"].setValue(file)

-        # Set the global in to the start frame of the sequence
-        loader_shift(node, first, relative=True)
-        node["origfirst"].setValue(int(first))
-        node["first"].setValue(int(first))
-        node["origlast"].setValue(int(last))
-        node["last"].setValue(int(last))
+        # set start frame depending on workfile or version
+        self.loader_shift(
+            read_node,
+            bool("start at" in read_node['frame_mode'].value()))
+
+        read_node["origfirst"].setValue(int(first))
+        read_node["first"].setValue(int(first))
+        read_node["origlast"].setValue(int(last))
+        read_node["last"].setValue(int(last))

         updated_dict = {}
         updated_dict.update({
```
```diff
@@ -321,20 +247,20 @@ class LoadSequence(api.Loader):
             "outputDir": version_data.get("outputDir"),
         })

-        # change color of node
+        # change color of read_node
         if version.get("name") not in [max_version]:
-            node["tile_color"].setValue(int("0xd84f20ff", 16))
+            read_node["tile_color"].setValue(int("0xd84f20ff", 16))
         else:
-            node["tile_color"].setValue(int("0x4ecd25ff", 16))
+            read_node["tile_color"].setValue(int("0x4ecd25ff", 16))

         if version_data.get("retime", None):
             speed = version_data.get("speed", 1)
             time_warp_nodes = version_data.get("timewarps", [])
-            self.make_retimes(node, speed, time_warp_nodes)
+            self.make_retimes(read_node, speed, time_warp_nodes)

         # Update the imprinted representation
         update_container(
-            node,
+            read_node,
             updated_dict
         )
         self.log.info("updated to version: {}".format(version.get("name")))
```
```diff
@@ -343,8 +269,48 @@ class LoadSequence(api.Loader):

         from avalon.nuke import viewer_update_and_undo_stop

-        node = nuke.toNode(container['objectName'])
-        assert node.Class() == "Read", "Must be Read"
+        read_node = nuke.toNode(container['objectName'])
+        assert read_node.Class() == "Read", "Must be Read"

         with viewer_update_and_undo_stop():
-            nuke.delete(node)
+            nuke.delete(read_node)
+
+    def make_retimes(self, speed, time_warp_nodes):
+        ''' Create all retime and timewarping nodes with copied animation '''
+        if speed != 1:
+            rtn = nuke.createNode(
+                "Retime",
+                "speed {}".format(speed))
+            rtn["before"].setValue("continue")
+            rtn["after"].setValue("continue")
+            rtn["input.first_lock"].setValue(True)
+            rtn["input.first"].setValue(
+                self.handle_start + self.first_frame
+            )
+
+        if time_warp_nodes != []:
+            for timewarp in time_warp_nodes:
+                twn = nuke.createNode(timewarp["Class"],
+                                      "name {}".format(timewarp["name"]))
+                if isinstance(timewarp["lookup"], list):
+                    # if array for animation
+                    twn["lookup"].setAnimated()
+                    for i, value in enumerate(timewarp["lookup"]):
+                        twn["lookup"].setValueAt(
+                            (self.first_frame + i) + value,
+                            (self.first_frame + i))
+                else:
+                    # if static value `int`
+                    twn["lookup"].setValue(timewarp["lookup"])
+
+    def loader_shift(self, read_node, workfile_start=False):
+        """ Set start frame of read node to a workfile start
+
+        Args:
+            read_node (nuke.Node): The nuke's read node
+            workfile_start (bool): set workfile start frame if true
+
+        """
+        if workfile_start:
+            read_node['frame_mode'].setValue("start at")
+            read_node['frame'].setValue(str(self.script_start))
```
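The new `loader_shift` helper only moves the Read node when a workfile start is requested; otherwise the node keeps the frame it already carries. The control flow can be sketched without the Nuke API by using a plain dict as a stand-in for the node's knobs (all names in this sketch are illustrative):

```python
def loader_shift(read_node, script_start, workfile_start=False):
    """Sketch of the added helper: when workfile_start is set, switch the
    node to "start at" mode at the workfile's first frame; otherwise do
    nothing and leave the node's current frame untouched."""
    if workfile_start:
        read_node["frame_mode"] = "start at"
        read_node["frame"] = str(script_start)
    return read_node
```

On update, the PR derives the flag from the node itself — `bool("start at" in read_node['frame_mode'].value())` — so a node originally loaded at workfile start keeps that behaviour across version switches.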
```diff
@@ -0,0 +1,9 @@
+def add_implementation_envs(env, _app):
+    """Modify environments to contain all required for implementation."""
+    defaults = {
+        "OPENPYPE_LOG_NO_COLORS": "True",
+        "WEBSOCKET_URL": "ws://localhost:8099/ws/"
+    }
+    for key, value in defaults.items():
+        if not env.get(key):
+            env[key] = value
```
```diff
@@ -0,0 +1,8 @@
+def add_implementation_envs(env, _app):
+    """Modify environments to contain all required for implementation."""
+    defaults = {
+        "OPENPYPE_LOG_NO_COLORS": "True"
+    }
+    for key, value in defaults.items():
+        if not env.get(key):
+            env[key] = value
```
```diff
@@ -0,0 +1,18 @@
+import os
+
+
+def add_implementation_envs(env, _app):
+    """Modify environments to contain all required for implementation."""
+    # Set AVALON_UNREAL_PLUGIN required for Unreal implementation
+    unreal_plugin_path = os.path.join(
+        os.environ["OPENPYPE_REPOS_ROOT"], "repos", "avalon-unreal-integration"
+    )
+    env["AVALON_UNREAL_PLUGIN"] = unreal_plugin_path
+
+    # Set default environments if they are not set via settings
+    defaults = {
+        "OPENPYPE_LOG_NO_COLORS": "True"
+    }
+    for key, value in defaults.items():
+        if not env.get(key):
+            env[key] = value
```
```diff
@@ -78,14 +78,14 @@ class ExtractLayout(openpype.api.Extractor):

             json_element["transform"] = {
                 "translation": {
-                    "x": transform.translation.x,
+                    "x": -transform.translation.x,
                     "y": transform.translation.y,
                     "z": transform.translation.z
                 },
                 "rotation": {
-                    "x": math.radians(transform.rotation.euler().x),
+                    "x": math.radians(transform.rotation.euler().x + 90.0),
                     "y": math.radians(transform.rotation.euler().y),
-                    "z": math.radians(transform.rotation.euler().z),
+                    "z": math.radians(180.0 - transform.rotation.euler().z)
                 },
                 "scale": {
                     "x": transform.scale3d.x,
```
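The sign flip on the translation `x` and the angle offsets on the Euler rotation convert the extracted layout between two coordinate conventions (the hunk does not name which applications are involved). The arithmetic in isolation, with the helper name ours:

```python
import math

def convert_rotation(euler_x_deg, euler_y_deg, euler_z_deg):
    """Apply the offsets from the hunk and return the angles in radians:
    x is offset by +90 degrees, z is mirrored around 180 degrees."""
    return (
        math.radians(euler_x_deg + 90.0),
        math.radians(euler_y_deg),
        math.radians(180.0 - euler_z_deg),
    )
```

A zero rotation in the source convention therefore maps to (pi/2, 0, pi) in the target one.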
```diff
@@ -58,6 +58,10 @@ from .python_module_tools import (
 )

 from .avalon_context import (
+    CURRENT_DOC_SCHEMAS,
+    PROJECT_NAME_ALLOWED_SYMBOLS,
+    PROJECT_NAME_REGEX,
+    create_project,
     is_latest,
     any_outdated,
     get_asset,
```
```diff
@@ -163,6 +167,10 @@ __all__ = [
     "recursive_bases_from_class",
     "classes_from_module",

+    "CURRENT_DOC_SCHEMAS",
+    "PROJECT_NAME_ALLOWED_SYMBOLS",
+    "PROJECT_NAME_REGEX",
+    "create_project",
     "is_latest",
     "any_outdated",
     "get_asset",
```
```diff
@@ -3,7 +3,6 @@ import re
 import copy
 import json
 import platform
 import getpass
 import collections
 import inspect
 import subprocess
```
```diff
@@ -362,7 +361,6 @@ class ApplicationManager:
         context = ApplicationLaunchContext(
             app, executable, **data
         )
-        # TODO pass context through launch hooks
         return context.launch()
```
```diff
@@ -626,7 +624,7 @@ class ApplicationLaunchContext:

         # Logger
         logger_name = "{}-{}".format(self.__class__.__name__, self.app_name)
-        self.log = PypeLogger().get_logger(logger_name)
+        self.log = PypeLogger.get_logger(logger_name)

         self.executable = executable
```
```diff
@@ -1033,7 +1031,7 @@ def _merge_env(env, current_env):
     return result


-def prepare_host_environments(data):
+def prepare_host_environments(data, implementation_envs=True):
     """Modify launch environments based on launched app and context.

     Args:
```
```diff
@@ -1089,7 +1087,24 @@ def prepare_host_environments(data):
     # Merge dictionaries
     env_values = _merge_env(tool_env, env_values)

-    final_env = _merge_env(acre.compute(env_values), data["env"])
+    loaded_env = _merge_env(acre.compute(env_values), data["env"])

+    final_env = None
+    # Add host specific environments
+    if app.host_name and implementation_envs:
+        module = __import__("openpype.hosts", fromlist=[app.host_name])
+        host_module = getattr(module, app.host_name, None)
+        add_implementation_envs = None
+        if host_module:
+            add_implementation_envs = getattr(
+                host_module, "add_implementation_envs", None
+            )
+        if add_implementation_envs:
+            # Function may only modify passed dict without returning value
+            final_env = add_implementation_envs(loaded_env, app)
+
+    if final_env is None:
+        final_env = loaded_env

     # Update env
     data["env"].update(final_env)
```
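`prepare_host_environments` now looks up an optional `add_implementation_envs` callable on the host package and treats a `None` return as "the dict was modified in place". The lookup-and-fallback shape can be exercised with a plain object standing in for the host module (the wrapper function name is ours):

```python
def run_implementation_envs(loaded_env, host_module, app=None):
    """Sketch: call the host's add_implementation_envs hook if present;
    fall back to the merged env when the hook is absent or returns None."""
    final_env = None
    add_implementation_envs = getattr(
        host_module, "add_implementation_envs", None)
    if add_implementation_envs:
        # the hook may mutate the dict in place and return nothing
        final_env = add_implementation_envs(loaded_env, app)
    if final_env is None:
        final_env = loaded_env
    return final_env
```

This is why the new hook files in this PR simply mutate `env` and return nothing: the caller falls back to the (now mutated) merged environment.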
```diff
@@ -17,6 +17,99 @@ avalon = None
 log = logging.getLogger("AvalonContext")


+CURRENT_DOC_SCHEMAS = {
+    "project": "openpype:project-3.0",
+    "asset": "openpype:asset-3.0",
+    "config": "openpype:config-2.0"
+}
+PROJECT_NAME_ALLOWED_SYMBOLS = "a-zA-Z0-9_"
+PROJECT_NAME_REGEX = re.compile(
+    "^[{}]+$".format(PROJECT_NAME_ALLOWED_SYMBOLS)
+)
+
+
+def create_project(
+    project_name, project_code, library_project=False, dbcon=None
+):
+    """Create project using OpenPype settings.
+
+    This project creation function is not validating project document on
+    creation. It is because project document is created blindly with only
+    minimum required information about project which is its name, code, type
+    and schema.
+
+    Entered project name must be unique and project must not exist yet.
+
+    Args:
+        project_name(str): New project name. Should be unique.
+        project_code(str): Project's code should be unique too.
+        library_project(bool): Project is library project.
+        dbcon(AvalonMongoDB): Object of connection to MongoDB.
+
+    Raises:
+        ValueError: When project name already exists in MongoDB.
+
+    Returns:
+        dict: Created project document.
+    """
+    from openpype.settings import ProjectSettings, SaveWarningExc
+    from avalon.api import AvalonMongoDB
+    from avalon.schema import validate
+
+    if dbcon is None:
+        dbcon = AvalonMongoDB()
+
+    if not PROJECT_NAME_REGEX.match(project_name):
+        raise ValueError((
+            "Project name \"{}\" contains invalid characters"
+        ).format(project_name))
+
+    database = dbcon.database
+    project_doc = database[project_name].find_one(
+        {"type": "project"},
+        {"name": 1}
+    )
+    if project_doc:
+        raise ValueError("Project with name \"{}\" already exists".format(
+            project_name
+        ))
+
+    project_doc = {
+        "type": "project",
+        "name": project_name,
+        "data": {
+            "code": project_code,
+            "library_project": library_project
+        },
+        "schema": CURRENT_DOC_SCHEMAS["project"]
+    }
+    # Insert document with basic data
+    database[project_name].insert_one(project_doc)
+    # Load ProjectSettings for the project and save it to store all attributes
+    # and Anatomy
+    try:
+        project_settings_entity = ProjectSettings(project_name)
+        project_settings_entity.save()
+    except SaveWarningExc as exc:
+        print(str(exc))
+    except Exception:
+        database[project_name].delete_one({"type": "project"})
+        raise
+
+    project_doc = database[project_name].find_one({"type": "project"})
+
+    try:
+        # Validate created project document
+        validate(project_doc)
+    except Exception:
+        # Remove project if it is not valid
+        database[project_name].delete_one({"type": "project"})
+        raise
+
+    return project_doc
+
+
 def with_avalon(func):
     @functools.wraps(func)
     def wrap_avalon(*args, **kwargs):
```
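`PROJECT_NAME_REGEX` anchors the allowed-symbols character class with `^...$`, so a single disallowed character (a space, a dash, an empty name) rejects the whole project name. A quick check of the pattern exactly as defined above:

```python
import re

PROJECT_NAME_ALLOWED_SYMBOLS = "a-zA-Z0-9_"
PROJECT_NAME_REGEX = re.compile(
    "^[{}]+$".format(PROJECT_NAME_ALLOWED_SYMBOLS)
)

def is_valid_project_name(name):
    """True when the name is non-empty and uses only allowed symbols."""
    return bool(PROJECT_NAME_REGEX.match(name))
```

Note the `+` quantifier: it also rules out the empty string, so `create_project` cannot be called with a blank name.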
```diff
@@ -26,9 +26,7 @@ from openpype.modules.ftrack.lib import (
     BaseEvent
 )
-from openpype.modules.ftrack.lib.avalon_sync import (
-    EntitySchemas
-)
+from openpype.lib import CURRENT_DOC_SCHEMAS


 class SyncToAvalonEvent(BaseEvent):
```
```diff
@@ -1128,7 +1126,7 @@ class SyncToAvalonEvent(BaseEvent):
             "_id": mongo_id,
             "name": name,
             "type": "asset",
-            "schema": EntitySchemas["asset"],
+            "schema": CURRENT_DOC_SCHEMAS["asset"],
             "parent": proj["_id"],
             "data": {
                 "ftrackId": ftrack_ent["id"],
```
```diff
@@ -34,7 +34,7 @@ log = Logger.get_logger(__name__)


 # Current schemas for avalon types
-EntitySchemas = {
+CURRENT_DOC_SCHEMAS = {
     "project": "openpype:project-3.0",
     "asset": "openpype:asset-3.0",
     "config": "openpype:config-2.0"
```
```diff
@@ -1862,7 +1862,7 @@ class SyncEntitiesFactory:

         item["_id"] = new_id
         item["parent"] = self.avalon_project_id
-        item["schema"] = EntitySchemas["asset"]
+        item["schema"] = CURRENT_DOC_SCHEMAS["asset"]
         item["data"]["visualParent"] = avalon_parent

         new_id_str = str(new_id)
```
```diff
@@ -2003,8 +2003,8 @@ class SyncEntitiesFactory:

         project_item["_id"] = new_id
         project_item["parent"] = None
-        project_item["schema"] = EntitySchemas["project"]
-        project_item["config"]["schema"] = EntitySchemas["config"]
+        project_item["schema"] = CURRENT_DOC_SCHEMAS["project"]
+        project_item["config"]["schema"] = CURRENT_DOC_SCHEMAS["config"]

         self.ftrack_avalon_mapper[self.ft_project_id] = new_id
         self.avalon_ftrack_mapper[new_id] = self.ft_project_id
```
```diff
@@ -7,6 +7,8 @@ log = Logger().get_logger("SyncServer")

 @six.add_metaclass(abc.ABCMeta)
 class AbstractProvider:
+    CODE = ''
+    LABEL = ''

     def __init__(self, project_name, site_name, tree=None, presets=None):
         self.presets = None
```
```diff
@@ -25,6 +27,17 @@ class AbstractProvider:
             (boolean)
         """

+    @classmethod
+    @abc.abstractmethod
+    def get_configurable_items(cls):
+        """
+        Returns filtered dict of editable properties
+
+
+        Returns:
+            (dict)
+        """
+
     @abc.abstractmethod
     def upload_file(self, source_path, path,
                     server, collection, file, representation, site,
```
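Declaring `get_configurable_items` as both a `classmethod` and an `abc.abstractmethod` forces every provider subclass to expose its editable properties without needing an instance. The original code uses `six.add_metaclass` for Python 2 compatibility; a minimal Python 3-only reproduction of the same pattern (the `DummyProvider` is illustrative):

```python
import abc

class AbstractProvider(abc.ABC):
    CODE = ''
    LABEL = ''

    @classmethod
    @abc.abstractmethod
    def get_configurable_items(cls):
        """Return dict of editable properties."""

class DummyProvider(AbstractProvider):
    CODE = 'dummy'

    @classmethod
    def get_configurable_items(cls):
        # answered per class, no provider instance required
        return {"root": {"label": "Roots", "type": "dict"}}
```

The decorator order matters: `classmethod` must wrap `abstractmethod`, not the other way around, for the abstractness to register on older Python 3 versions.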
```diff
@@ -1,22 +1,33 @@
 from __future__ import print_function
 import os.path
-from googleapiclient.discovery import build
-import google.oauth2.service_account as service_account
-from googleapiclient import errors
-from .abstract_provider import AbstractProvider
-from googleapiclient.http import MediaFileUpload, MediaIoBaseDownload
 import time
+import sys
+import six
+import platform

 from openpype.api import Logger
 from openpype.api import get_system_settings
-from ..utils import time_function, ResumableError
-import time
+from .abstract_provider import AbstractProvider
+from ..utils import time_function, ResumableError, EditableScopes

+log = Logger().get_logger("SyncServer")

+try:
+    from googleapiclient.discovery import build
+    import google.oauth2.service_account as service_account
+    from googleapiclient import errors
+    from googleapiclient.http import MediaFileUpload, MediaIoBaseDownload
+except (ImportError, SyntaxError):
+    if six.PY3:
+        six.reraise(*sys.exc_info())
+
+    # handle imports from Python 2 hosts - in those only basic methods are used
+    log.warning("Import failed, imported from Python 2, operations will fail.")

 SCOPES = ['https://www.googleapis.com/auth/drive.metadata.readonly',
           'https://www.googleapis.com/auth/drive.file',
           'https://www.googleapis.com/auth/drive.readonly']  # for write|delete

-log = Logger().get_logger("SyncServer")


 class GDriveHandler(AbstractProvider):
     """
```
```diff
@@ -42,15 +53,20 @@ class GDriveHandler(AbstractProvider):
         }
     }
     """
+    CODE = 'gdrive'
+    LABEL = 'Google Drive'
+
     FOLDER_STR = 'application/vnd.google-apps.folder'
     MY_DRIVE_STR = 'My Drive'  # name of root folder of regular Google drive
-    CHUNK_SIZE = 2097152  # must be divisible by 256!
+    CHUNK_SIZE = 2097152  # must be divisible by 256! used for upload chunks

     def __init__(self, project_name, site_name, tree=None, presets=None):
         self.presets = None
         self.active = False
         self.project_name = project_name
         self.site_name = site_name
+        self.service = None
+        self.root = None

         self.presets = presets
         if not self.presets:
```
@ -58,18 +74,15 @@ class GDriveHandler(AbstractProvider):
|
|||
format(site_name))
|
||||
return
|
||||
|
||||
if not os.path.exists(self.presets["credentials_url"]):
|
||||
log.info("Sync Server: No credentials for Gdrive provider! ")
|
||||
cred_path = self.presets.get("credentials_url", {}).\
|
||||
get(platform.system().lower()) or ''
|
||||
if not os.path.exists(cred_path):
|
||||
msg = "Sync Server: No credentials for gdrive provider " + \
|
||||
"for '{}' on path '{}'!".format(site_name, cred_path)
|
||||
log.info(msg)
|
||||
return
|
||||
|
||||
self.service = self._get_gd_service()
|
||||
try:
|
||||
self.root = self._prepare_root_info()
|
||||
except errors.HttpError:
|
||||
log.warning("HttpError in sync loop, "
|
||||
"trying next loop",
|
||||
exc_info=True)
|
||||
raise ResumableError
|
||||
self.service = self._get_gd_service(cred_path)
|
||||
|
||||
self._tree = tree
|
||||
self.active = True
|
||||
|
|
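The credentials path is now stored per platform in the site presets, so the handler resolves the entry for the current OS and falls back to an empty string (which then fails the `os.path.exists` check with a descriptive message). The resolution step in isolation — the preset layout mirrors the hunk, the helper name is ours:

```python
import platform

def resolve_credentials_path(presets, system=None):
    """Pick the credentials path for the given (or current) OS.

    presets["credentials_url"] is expected to map lowercase platform
    names ("windows", "linux", "darwin") to file paths.
    """
    system = (system or platform.system()).lower()
    return presets.get("credentials_url", {}).get(system) or ""
```

Using chained `.get(...)` calls means a site with no `credentials_url` section at all degrades to `""` instead of raising `KeyError`, which is exactly the change the hunk makes over the old `self.presets["credentials_url"]` lookup.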
```diff
@@ -80,7 +93,34 @@ class GDriveHandler(AbstractProvider):
         Returns:
             (boolean)
         """
-        return self.active
+        return self.service is not None
+
+    @classmethod
+    def get_configurable_items(cls):
+        """
+        Returns filtered dict of editable properties.
+
+
+        Returns:
+            (dict)
+        """
+        # {platform} tells that value is multiplatform and only specific OS
+        # should be returned
+        editable = {
+            # credentials could be overridden on Project or User level
+            'credentials_url': {
+                'scope': [EditableScopes.PROJECT,
+                          EditableScopes.LOCAL],
+                'label': "Credentials url",
+                'type': 'text',
+                'namespace': '{project_settings}/global/sync_server/sites/{site}/credentials_url/{platform}'  # noqa: E501
+            },
+            # roots could be overridden only on Project level, User cannot
+            'root': {'scope': [EditableScopes.PROJECT],
+                     'label': "Roots",
+                     'type': 'dict'}
+        }
+        return editable

     def get_roots_config(self, anatomy=None):
         """
```
```diff
@@ -537,7 +577,7 @@ class GDriveHandler(AbstractProvider):
             return
         return provider_presets

-    def _get_gd_service(self):
+    def _get_gd_service(self, credentials_path):
         """
         Authorize client with 'credentials.json', uses service account.
         Service account needs to have target folder shared with.
```
```diff
@@ -546,11 +586,18 @@ class GDriveHandler(AbstractProvider):
         Returns:
             None
         """
-        creds = service_account.Credentials.from_service_account_file(
-            self.presets["credentials_url"],
-            scopes=SCOPES)
-        service = build('drive', 'v3',
-                        credentials=creds, cache_discovery=False)
+        service = None
+        try:
+            creds = service_account.Credentials.from_service_account_file(
+                credentials_path,
+                scopes=SCOPES)
+            service = build('drive', 'v3',
+                            credentials=creds, cache_discovery=False)
+        except Exception:
+            log.error("Connection failed, " +
+                      "check '{}' credentials file".format(credentials_path),
+                      exc_info=True)

         return service

     def _prepare_root_info(self):
```
@@ -562,39 +609,47 @@ class GDriveHandler(AbstractProvider):
 
         Returns:
             (dicts) of dicts where root folders are keys
            throws ResumableError in case of errors.HttpError
         """
         roots = {}
         config_roots = self.get_roots_config()
-        for path in config_roots.values():
-            if self.MY_DRIVE_STR in path:
-                roots[self.MY_DRIVE_STR] = self.service.files()\
-                    .get(fileId='root').execute()
-            else:
-                shared_drives = []
-                page_token = None
+        try:
+            for path in config_roots.values():
+                if self.MY_DRIVE_STR in path:
+                    roots[self.MY_DRIVE_STR] = self.service.files()\
+                        .get(fileId='root')\
+                        .execute()
+                else:
+                    shared_drives = []
+                    page_token = None
 
-                while True:
-                    response = self.service.drives().list(
-                        pageSize=100,
-                        pageToken=page_token).execute()
-                    shared_drives.extend(response.get('drives', []))
-                    page_token = response.get('nextPageToken', None)
-                    if page_token is None:
-                        break
+                    while True:
+                        response = self.service.drives().list(
+                            pageSize=100,
+                            pageToken=page_token).execute()
+                        shared_drives.extend(response.get('drives', []))
+                        page_token = response.get('nextPageToken', None)
+                        if page_token is None:
+                            break
 
-                folders = path.split('/')
-                if len(folders) < 2:
-                    raise ValueError("Wrong root folder definition {}".
-                                     format(path))
+                    folders = path.split('/')
+                    if len(folders) < 2:
+                        raise ValueError("Wrong root folder definition {}".
+                                         format(path))
 
-                for shared_drive in shared_drives:
-                    if folders[1] in shared_drive["name"]:
-                        roots[shared_drive["name"]] = {
-                            "name": shared_drive["name"],
-                            "id": shared_drive["id"]}
-        if self.MY_DRIVE_STR not in roots:  # add My Drive always
-            roots[self.MY_DRIVE_STR] = self.service.files() \
-                .get(fileId='root').execute()
+                    for shared_drive in shared_drives:
+                        if folders[1] in shared_drive["name"]:
+                            roots[shared_drive["name"]] = {
+                                "name": shared_drive["name"],
+                                "id": shared_drive["id"]}
+            if self.MY_DRIVE_STR not in roots:  # add My Drive always
+                roots[self.MY_DRIVE_STR] = self.service.files() \
+                    .get(fileId='root').execute()
+        except errors.HttpError:
+            log.warning("HttpError in sync loop, "
+                        "trying next loop",
+                        exc_info=True)
+            raise ResumableError
 
         return roots
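The `while True` loop in the hunk above is the standard Google API page-token pattern: keep calling `drives().list` until the response carries no `nextPageToken`. A minimal standalone sketch of that drain loop, with a hypothetical `list_page` callable standing in for the real `service.drives().list(...).execute()` call:

```python
def collect_all(list_page):
    """Drain a page-token style API: call until 'nextPageToken' is absent.

    'list_page' is a hypothetical stand-in for calls like
    service.drives().list(pageSize=100, pageToken=...).execute().
    """
    items = []
    page_token = None
    while True:
        response = list_page(page_token)
        items.extend(response.get('drives', []))
        page_token = response.get('nextPageToken', None)
        if page_token is None:
            break
    return items


# Fake two-page API for demonstration (illustrative data, not real Drive).
pages = {
    None: {'drives': [{'name': 'A'}], 'nextPageToken': 'p2'},
    'p2': {'drives': [{'name': 'B'}]},
}
drives = collect_all(lambda token: pages[token])
print([d['name'] for d in drives])  # → ['A', 'B']
```

Wrapping the whole loop in a single `try`/`except errors.HttpError` (as the hunk does) converts any transient HTTP failure into `ResumableError`, so one bad page aborts the sync pass instead of leaving a half-built root map.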
@@ -615,6 +670,9 @@ class GDriveHandler(AbstractProvider):
             (dictionary) path as a key, folder id as a value
         """
         log.debug("build_tree len {}".format(len(folders)))
+        if not self.root:  # build only when necessary, could be expensive
+            self.root = self._prepare_root_info()
+
         root_ids = []
         default_root_id = None
         tree = {}
@@ -65,6 +65,17 @@ class ProviderFactory:
         info = self._get_creator_info(provider)
         return info[1]
 
+    def get_provider_configurable_items(self, provider):
+        """
+        Returns dict of modifiable properties for 'provider'.
+
+        Provider contains information which its properties and on what
+        level could be override
+        """
+        provider_info = self._get_creator_info(provider)
+
+        return provider_info[0].get_configurable_items()
+
     def _get_creator_info(self, provider):
         """
         Collect all necessary info for provider. Currently only creator
@@ -91,5 +102,5 @@ factory = ProviderFactory()
 # there is implementing 'GDriveHandler' class
 # 7 denotes number of files that could be synced in single loop - learned by
 # trial and error
-factory.register_provider('gdrive', GDriveHandler, 7)
-factory.register_provider('local_drive', LocalDriveHandler, 50)
+factory.register_provider(GDriveHandler.CODE, GDriveHandler, 7)
+factory.register_provider(LocalDriveHandler.CODE, LocalDriveHandler, 50)
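The change above swaps hard-coded string keys for the handler's own `CODE` attribute, so the registration key and the class can never drift apart. A minimal sketch of such a factory (class and method names here are illustrative, not the OpenPype API):

```python
class _Handler:
    # hypothetical handler; real handlers declare CODE as in the diff above
    CODE = 'example'


class ProviderFactorySketch:
    def __init__(self):
        self._providers = {}

    def register_provider(self, code, creator, batch_limit):
        # keyed by the handler's own CODE so string and class cannot drift
        self._providers[code] = (creator, batch_limit)

    def get_provider_batch_limit(self, code):
        return self._providers[code][1]


factory = ProviderFactorySketch()
factory.register_provider(_Handler.CODE, _Handler, 7)
print(factory.get_provider_batch_limit('example'))  # → 7
```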
@@ -7,22 +7,43 @@ import time
 from openpype.api import Logger, Anatomy
 from .abstract_provider import AbstractProvider
+
+from ..utils import EditableScopes
+
 log = Logger().get_logger("SyncServer")
 
 
 class LocalDriveHandler(AbstractProvider):
+    CODE = 'local_drive'
+    LABEL = 'Local drive'
+
     """ Handles required operations on mounted disks with OS """
     def __init__(self, project_name, site_name, tree=None, presets=None):
+        self.presets = None
+        self.active = False
+        self.project_name = project_name
+        self.site_name = site_name
+        self._editable_properties = {}
+
+        self.active = self.is_active()
+
+    def is_active(self):
+        return True
+
+    @classmethod
+    def get_configurable_items(cls):
+        """
+        Returns filtered dict of editable properties
+
+        Returns:
+            (dict)
+        """
+        editable = {
+            'root': {'scope': [EditableScopes.LOCAL],
+                     'label': "Roots",
+                     'type': 'dict'}
+        }
+        return editable
+
     def upload_file(self, source_path, target_path,
                     server, collection, file, representation, site,
                     overwrite=False, direction="Upload"):
@@ -149,7 +170,10 @@ class LocalDriveHandler(AbstractProvider):
                 site=site,
                 progress=status_val
             )
-            target_file_size = os.path.getsize(target_path)
+            try:
+                target_file_size = os.path.getsize(target_path)
+            except FileNotFoundError:
+                pass
             time.sleep(0.5)
 
     def _normalize_site_name(self, site_name):
@@ -206,14 +206,14 @@ def _get_configured_sites_from_setting(module, project_name, project_setting):
     all_sites = module._get_default_site_configs()
     all_sites.update(project_setting.get("sites"))
     for site_name, config in all_sites.items():
-        handler = initiated_handlers. \
-            get((config["provider"], site_name))
+        provider = module.get_provider_for_site(site=site_name)
+        handler = initiated_handlers.get((provider, site_name))
         if not handler:
-            handler = lib.factory.get_provider(config["provider"],
+            handler = lib.factory.get_provider(provider,
                                                project_name,
                                                site_name,
                                                presets=config)
-            initiated_handlers[(config["provider"], site_name)] = \
+            initiated_handlers[(provider, site_name)] = \
                 handler
 
         if handler.is_active():
@@ -274,6 +274,9 @@ class SyncServerThread(threading.Thread):
                 self.module.set_sync_project_settings()  # clean cache
                 for collection, preset in self.module.sync_project_settings.\
                         items():
+                    if collection not in self.module.get_enabled_projects():
+                        continue
+
                     start_time = time.time()
                     local_site, remote_site = self._working_sites(collection)
                     if not all([local_site, remote_site]):
@@ -295,13 +298,14 @@ class SyncServerThread(threading.Thread):
                     processed_file_path = set()
 
                     site_preset = preset.get('sites')[remote_site]
-                    remote_provider = site_preset['provider']
+                    remote_provider = \
+                        self.module.get_provider_for_site(site=remote_site)
                     handler = lib.factory.get_provider(remote_provider,
                                                        collection,
                                                        remote_site,
                                                        presets=site_preset)
                     limit = lib.factory.get_provider_batch_limit(
-                        site_preset['provider'])
+                        remote_provider)
                     # first call to get_provider could be expensive, its
                     # building folder tree structure in memory
                     # call only if needed, eg. DO_UPLOAD or DO_DOWNLOAD
@@ -451,8 +455,9 @@ class SyncServerThread(threading.Thread):
                                                            remote_site))
                 return None, None
 
-            if not all([site_is_working(self.module, collection, local_site),
-                        site_is_working(self.module, collection, remote_site)]):
+            configured_sites = _get_configured_sites(self.module, collection)
+            if not all([local_site in configured_sites,
+                        remote_site in configured_sites]):
                 log.debug("Some of the sites {} - {} is not ".format(local_site,
                                                                      remote_site) +
                           "working properly")
@@ -2,6 +2,8 @@ import os
 from bson.objectid import ObjectId
 from datetime import datetime
 import threading
+import platform
+import copy
 
 from avalon.api import AvalonMongoDB
 
@@ -9,12 +11,18 @@ from .. import PypeModule, ITrayModule
 from openpype.api import (
     Anatomy,
     get_project_settings,
+    get_system_settings,
     get_local_site_id)
 from openpype.lib import PypeLogger
+from openpype.settings.lib import (
+    get_default_project_settings,
+    get_default_anatomy_settings,
+    get_anatomy_settings)
+
+from .providers.local_drive import LocalDriveHandler
 from .providers import lib
 
-from .utils import time_function, SyncStatus
+from .utils import time_function, SyncStatus, EditableScopes
 
 
 log = PypeLogger().get_logger("SyncServer")
@@ -340,18 +348,6 @@ class SyncServerModule(PypeModule, ITrayModule):
 
         return self._get_enabled_sites_from_settings(sync_settings)
 
-    def get_configurable_items_for_site(self, project_name, site_name):
-        """
-        Returns list of items that should be configurable by User
-
-        Returns:
-            (list of dict)
-            [{key:"root", label:"root", value:"valueFromSettings"}]
-        """
-        # if project_name is None: ..for get_default_project_settings
-        # return handler.get_configurable_items()
-        pass
-
     def get_active_site(self, project_name):
         """
         Returns active (mine) site for 'project_name' from settings
@@ -402,6 +398,205 @@ class SyncServerModule(PypeModule, ITrayModule):
 
         return remote_site
 
+    def get_local_settings_schema(self):
+        """Wrapper for Local settings - all projects incl. Default"""
+        return self.get_configurable_items(EditableScopes.LOCAL)
+
+    def get_configurable_items(self, scope=None):
+        """
+        Returns list of sites that could be configurable for all projects.
+
+        Could be filtered by 'scope' argument (list)
+
+        Args:
+            scope (list of utils.EditableScope)
+
+        Returns:
+            (dict of list of dict)
+            {
+                siteA : [
+                    {
+                        key:"root", label:"root",
+                        "value":"{'work': 'c:/projects'}",
+                        "type": "dict",
+                        "children":[
+                            { "key": "work",
+                              "type": "text",
+                              "value": "c:/projects"}
+                        ]
+                    },
+                    {
+                        key:"credentials_url", label:"Credentials url",
+                        "value":"'c:/projects/cred.json'", "type": "text",
+                        "namespace": "{project_setting}/global/sync_server/
+                             sites"
+                    }
+                ]
+            }
+        """
+        editable = {}
+        applicable_projects = list(self.connection.projects())
+        applicable_projects.append(None)
+        for project in applicable_projects:
+            project_name = None
+            if project:
+                project_name = project["name"]
+
+            items = self.get_configurable_items_for_project(project_name,
+                                                            scope)
+            editable.update(items)
+
+        return editable
+
+    def get_local_settings_schema_for_project(self, project_name):
+        """Wrapper for Local settings - for specific 'project_name'"""
+        return self.get_configurable_items_for_project(project_name,
+                                                       EditableScopes.LOCAL)
+
+    def get_configurable_items_for_project(self, project_name=None,
+                                           scope=None):
+        """
+        Returns list of items that could be configurable for specific
+        'project_name'
+
+        Args:
+            project_name (str) - None > default project,
+            scope (list of utils.EditableScope)
+                (optional, None is all scopes, default is LOCAL)
+
+        Returns:
+            (dict of list of dict)
+            {
+                siteA : [
+                    {
+                        key:"root", label:"root",
+                        "type": "dict",
+                        "children":[
+                            { "key": "work",
+                              "type": "text",
+                              "value": "c:/projects"}
+                        ]
+                    },
+                    {
+                        key:"credentials_url", label:"Credentials url",
+                        "value":"'c:/projects/cred.json'", "type": "text",
+                        "namespace": "{project_setting}/global/sync_server/
+                             sites"
+                    }
+                ]
+            }
+        """
+        allowed_sites = set()
+        sites = self.get_all_site_configs(project_name)
+        if project_name:
+            # Local Settings can select only from allowed sites for project
+            allowed_sites.update(set(self.get_active_sites(project_name)))
+            allowed_sites.update(set(self.get_remote_sites(project_name)))
+
+        editable = {}
+        for site_name in sites.keys():
+            if allowed_sites and site_name not in allowed_sites:
+                continue
+
+            items = self.get_configurable_items_for_site(project_name,
+                                                         site_name,
+                                                         scope)
+            # Local Settings need 'local' instead of real value
+            site_name = site_name.replace(get_local_site_id(), 'local')
+            editable[site_name] = items
+
+        return editable
+
+    def get_local_settings_schema_for_site(self, project_name, site_name):
+        """Wrapper for Local settings - for particular 'site_name and proj."""
+        return self.get_configurable_items_for_site(project_name,
+                                                    site_name,
+                                                    EditableScopes.LOCAL)
+
+    def get_configurable_items_for_site(self, project_name=None,
+                                        site_name=None,
+                                        scope=None):
+        """
+        Returns list of items that could be configurable.
+
+        Args:
+            project_name (str) - None > default project
+            site_name (str)
+            scope (list of utils.EditableScope)
+                (optional, None is all scopes)
+
+        Returns:
+            (list)
+            [
+                {
+                    key:"root", label:"root", type:"dict",
+                    "children":[
+                        { "key": "work",
+                          "type": "text",
+                          "value": "c:/projects"}
+                    ]
+                }, ...
+            ]
+        """
+        provider_name = self.get_provider_for_site(site=site_name)
+        items = lib.factory.get_provider_configurable_items(provider_name)
+
+        if project_name:
+            sync_s = self.get_sync_project_setting(project_name,
+                                                   exclude_locals=True,
+                                                   cached=False)
+        else:
+            sync_s = get_default_project_settings(exclude_locals=True)
+            sync_s = sync_s["global"]["sync_server"]
+            sync_s["sites"].update(
+                self._get_default_site_configs(self.enabled))
+
+        editable = []
+        if type(scope) is not list:
+            scope = [scope]
+        scope = set(scope)
+        for key, properties in items.items():
+            if scope is None or scope.intersection(set(properties["scope"])):
+                val = sync_s.get("sites", {}).get(site_name, {}).get(key)
+
+                item = {
+                    "key": key,
+                    "label": properties["label"],
+                    "type": properties["type"]
+                }
+
+                if properties.get("namespace"):
+                    item["namespace"] = properties.get("namespace")
+                    if "platform" in item["namespace"]:
+                        try:
+                            if val:
+                                val = val[platform.system().lower()]
+                        except KeyError:
+                            st = "{}'s field value {} should be".format(key, val)  # noqa: E501
+                            log.error(st + " multiplatform dict")
+
+                    item["namespace"] = item["namespace"].replace('{site}',
+                                                                  site_name)
+                children = []
+                if properties["type"] == "dict":
+                    if val:
+                        for val_key, val_val in val.items():
+                            child = {
+                                "type": "text",
+                                "key": val_key,
+                                "value": val_val
+                            }
+                            children.append(child)
+
+                if properties["type"] == "dict":
+                    item["children"] = children
+                else:
+                    item["value"] = val
+
+                editable.append(item)
+
+        return editable
+
     def reset_timer(self):
         """
         Called when waiting for next loop should be skipped.
@@ -418,7 +613,7 @@ class SyncServerModule(PypeModule, ITrayModule):
         for project in self.connection.projects():
             project_name = project["name"]
             project_settings = self.get_sync_project_setting(project_name)
-            if project_settings:
+            if project_settings and project_settings.get("enabled"):
                 enabled_projects.append(project_name)
 
         return enabled_projects
@@ -570,75 +765,145 @@ class SyncServerModule(PypeModule, ITrayModule):
 
         return self._sync_project_settings
 
-    def set_sync_project_settings(self):
+    def set_sync_project_settings(self, exclude_locals=False):
         """
         Set sync_project_settings for all projects (caching)
+
+        Args:
+            exclude_locals (bool): ignore overrides from Local Settings
+                For performance
         """
-        sync_project_settings = {}
-
-        for collection in self.connection.database.collection_names(False):
-            sync_settings = self._parse_sync_settings_from_settings(
-                get_project_settings(collection))
-            if sync_settings:
-                default_sites = self._get_default_site_configs()
-                sync_settings['sites'].update(default_sites)
-                sync_project_settings[collection] = sync_settings
-
-        if not sync_project_settings:
-            log.info("No enabled and configured projects for sync.")
+        sync_project_settings = self._prepare_sync_project_settings(
+            exclude_locals)
 
         self._sync_project_settings = sync_project_settings
 
-    def get_sync_project_setting(self, project_name):
+    def _prepare_sync_project_settings(self, exclude_locals):
+        sync_project_settings = {}
+        system_sites = self.get_all_site_configs()
+        for collection in self.connection.database.collection_names(False):
+            sites = copy.deepcopy(system_sites)  # get all configured sites
+            proj_settings = self._parse_sync_settings_from_settings(
+                get_project_settings(collection,
+                                     exclude_locals=exclude_locals))
+            sites.update(self._get_default_site_configs(
+                proj_settings["enabled"], collection))
+            sites.update(proj_settings['sites'])
+            proj_settings["sites"] = sites
+
+            sync_project_settings[collection] = proj_settings
+        if not sync_project_settings:
+            log.info("No enabled and configured projects for sync.")
+        return sync_project_settings
+
+    def get_sync_project_setting(self, project_name, exclude_locals=False,
+                                 cached=True):
         """ Handles pulling sync_server's settings for enabled 'project_name'
 
         Args:
             project_name (str): used in project settings
+            exclude_locals (bool): ignore overrides from Local Settings
+            cached (bool): use pre-cached values, or return fresh ones
+                cached values needed for single loop (with all overrides)
+                fresh values needed for Local settings (without overrides)
         Returns:
             (dict): settings dictionary for the enabled project,
                 empty if no settings or sync is disabled
         """
         # presets set already, do not call again and again
         # self.log.debug("project preset {}".format(self.presets))
-        if self.sync_project_settings and \
-                self.sync_project_settings.get(project_name):
-            return self.sync_project_settings.get(project_name)
+        if not cached:
+            return self._prepare_sync_project_settings(exclude_locals)\
+                [project_name]
 
-        settings = get_project_settings(project_name)
-        return self._parse_sync_settings_from_settings(settings)
+        if not self.sync_project_settings or \
+                not self.sync_project_settings.get(project_name):
+            self.set_sync_project_settings(exclude_locals)
+
+        return self.sync_project_settings.get(project_name)
 
     def _parse_sync_settings_from_settings(self, settings):
         """ settings from api.get_project_settings, TOOD rename """
         sync_settings = settings.get("global").get("sync_server")
         if not sync_settings:
             log.info("No project setting not syncing.")
             return {}
-        if sync_settings.get("enabled"):
-            return sync_settings
-
-        return {}
+        return sync_settings
 
-    def _get_default_site_configs(self):
+    def get_all_site_configs(self, project_name=None):
         """
-        Returns skeleton settings for 'studio' and user's local site
+        Returns (dict) with all sites configured system wide.
+
+        Args:
+            project_name (str)(optional): if present, check if not disabled
+
+        Returns:
+            (dict): {'studio': {'provider':'local_drive'...},
+                     'MY_LOCAL': {'provider':....}}
         """
-        default_config = {'provider': 'local_drive'}
-        all_sites = {self.DEFAULT_SITE: default_config,
-                     get_local_site_id(): default_config}
+        sys_sett = get_system_settings()
+        sync_sett = sys_sett["modules"].get("sync_server")
+
+        project_enabled = True
+        if project_name:
+            project_enabled = project_name in self.get_enabled_projects()
+        sync_enabled = sync_sett["enabled"] and project_enabled
+
+        system_sites = {}
+        if sync_enabled:
+            for site, detail in sync_sett.get("sites", {}).items():
+                system_sites[site] = detail
+
+        system_sites.update(self._get_default_site_configs(sync_enabled,
+                                                           project_name))
+
+        return system_sites
+
+    def _get_default_site_configs(self, sync_enabled=True, project_name=None):
+        """
+        Returns settings for 'studio' and user's local site
+
+        Returns base values from setting, not overriden by Local Settings,
+        eg. value used to push TO LS not to get actual value for syncing.
+        """
+        if not project_name:
+            anatomy_sett = get_default_anatomy_settings(exclude_locals=True)
+        else:
+            anatomy_sett = get_anatomy_settings(project_name,
+                                                exclude_locals=True)
+        roots = {}
+        for root, config in anatomy_sett["roots"].items():
+            roots[root] = config[platform.system().lower()]
+        studio_config = {
+            'provider': 'local_drive',
+            "root": roots
+        }
+        all_sites = {self.DEFAULT_SITE: studio_config}
+        if sync_enabled:
+            all_sites[get_local_site_id()] = {'provider': 'local_drive'}
         return all_sites
 
-    def get_provider_for_site(self, project_name, site):
+    def get_provider_for_site(self, project_name=None, site=None):
         """
-        Return provider name for site.
+        Return provider name for site (unique name across all projects.
         """
-        site_preset = self.get_sync_project_setting(project_name)["sites"].\
-            get(site)
-        if site_preset:
-            return site_preset["provider"]
+        sites = {self.DEFAULT_SITE: "local_drive",
+                 self.LOCAL_SITE: "local_drive",
+                 get_local_site_id(): "local_drive"}
 
-        return "NA"
+        if site in sites.keys():
+            return sites[site]
+
+        if project_name:  # backward compatibility
+            proj_settings = self.get_sync_project_setting(project_name)
+            provider = proj_settings.get("sites", {}).get(site, {}).\
+                get("provider")
+            if provider:
+                return provider
+
+        sys_sett = get_system_settings()
+        sync_sett = sys_sett["modules"].get("sync_server")
+        for site, detail in sync_sett.get("sites", {}).items():
+            sites[site] = detail.get("provider")
+
+        return sites.get(site, 'N/A')
 
     @time_function
     def get_sync_representations(self, collection, active_site, remote_site):
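The reworked `get_provider_for_site` above resolves a provider in three steps: built-in default sites first, then per-project settings (kept for backward compatibility), then system-wide site configs, falling back to `'N/A'`. The lookup order can be sketched independently of the module (all names and data here are illustrative):

```python
def resolve_provider(site, project_sites=None, system_sites=None):
    """Sketch of the three-step provider lookup; not the OpenPype API."""
    # 1. built-in sites always map to the local drive provider
    defaults = {'studio': 'local_drive', 'local': 'local_drive'}
    if site in defaults:
        return defaults[site]
    # 2. per-project override, kept for backward compatibility
    provider = (project_sites or {}).get(site, {}).get('provider')
    if provider:
        return provider
    # 3. system-wide configuration, else 'N/A'
    return (system_sites or {}).get(site, {}).get('provider', 'N/A')
```

For example, `resolve_provider('studio')` short-circuits at step 1, while a site only present in system settings is answered at step 3.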
@@ -757,6 +1022,15 @@ class SyncServerModule(PypeModule, ITrayModule):
             Always is comparing local record, eg. site with
             'name' == self.presets[PROJECT_NAME]['config']["active_site"]
 
+            This leads to trigger actual upload or download, there is
+            a use case 'studio' <> 'remote' where user should publish
+            to 'studio', see progress in Tray GUI, but do not do
+            physical upload/download
+            (as multiple user would be doing that).
+
+            Do physical U/D only when any of the sites is user's local, in that
+            case only user has the data and must U/D.
+
         Args:
             file (dictionary): of file from representation in Mongo
             local_site (string): - local side of compare (usually 'studio')
@@ -766,8 +1040,12 @@ class SyncServerModule(PypeModule, ITrayModule):
             (string) - one of SyncStatus
         """
         sites = file.get("sites") or []
-        # if isinstance(sites, list):  # temporary, old format of 'sites'
-        #     return SyncStatus.DO_NOTHING
+
+        if get_local_site_id() not in (local_site, remote_site):
+            # don't do upload/download for studio sites
+            log.debug("No local site {} - {}".format(local_site, remote_site))
+            return SyncStatus.DO_NOTHING
 
         _, remote_rec = self._get_site_rec(sites, remote_site) or {}
         if remote_rec:  # sync remote target
             created_dt = remote_rec.get("created_dt")
@@ -1116,7 +1394,7 @@ class SyncServerModule(PypeModule, ITrayModule):
                       format(site_name))
             return
 
-        provider_name = self.get_provider_for_site(collection, site_name)
+        provider_name = self.get_provider_for_site(site=site_name)
 
         if provider_name == 'local_drive':
             query = {
@@ -158,7 +158,7 @@ def translate_provider_for_icon(sync_server, project, site):
     """
     if site == sync_server.DEFAULT_SITE:
         return sync_server.DEFAULT_SITE
-    return sync_server.get_provider_for_site(project, site)
+    return sync_server.get_provider_for_site(site=site)
 
 
 def get_item_by_id(model, object_id):
@@ -236,7 +236,7 @@ class _SyncRepresentationWidget(QtWidgets.QWidget):
 
         for site, progress in {active_site: local_progress,
                                remote_site: remote_progress}.items():
-            provider = self.sync_server.get_provider_for_site(project, site)
+            provider = self.sync_server.get_provider_for_site(site=site)
             if provider == 'local_drive':
                 if 'studio' in site:
                     txt = " studio version"
@@ -33,3 +33,9 @@ def time_function(method):
         return result
 
     return timed
+
+
+class EditableScopes:
+    SYSTEM = 0
+    PROJECT = 1
+    LOCAL = 2
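`EditableScopes` added above is a plain enumeration; the settings code filters a provider's editable properties by intersecting the requested scopes with each property's `scope` list. A reduced sketch of that filtering (data layout follows the `get_configurable_items()` docstring examples; `filter_by_scope` is an illustrative name, not the module API). Note that in `get_configurable_items_for_site` above, `scope = set(scope)` runs even when `scope` is None, so the later `scope is None` branch appears unreachable; the sketch checks for None first:

```python
class EditableScopes:
    SYSTEM = 0
    PROJECT = 1
    LOCAL = 2


def filter_by_scope(items, scope=None):
    """Keep properties whose 'scope' list intersects the requested scopes.

    None means "all scopes"; a single scope value is promoted to a list.
    """
    if scope is not None and not isinstance(scope, list):
        scope = [scope]
    result = {}
    for key, props in items.items():
        if scope is None or set(scope) & set(props['scope']):
            result[key] = props
    return result


# Illustrative provider properties, mirroring the handlers in the diff.
items = {
    'root': {'scope': [EditableScopes.LOCAL], 'label': 'Roots'},
    'credentials_url': {'scope': [EditableScopes.PROJECT],
                        'label': 'Credentials url'},
}
print(list(filter_by_scope(items, EditableScopes.LOCAL)))  # → ['root']
```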
@@ -55,7 +55,7 @@ class ExtractReview(pyblish.api.InstancePlugin):
     profiles = None
 
     def process(self, instance):
-        self.log.debug(instance.data["representations"])
+        self.log.debug(str(instance.data["representations"]))
         # Skip review when requested.
         if not instance.data.get("review", True):
             return
@@ -110,6 +110,12 @@ class PypeCommands:
         with open(output_json_path, "w") as file_stream:
             json.dump(env, file_stream, indent=4)
 
+    @staticmethod
+    def launch_project_manager():
+        from openpype.tools import project_manager
+
+        project_manager.main()
+
     def texture_copy(self, project, asset, path):
         pass
@@ -0,0 +1,18 @@
+{
+    "publish": {
+        "ValidateSceneSettings": {
+            "enabled": true,
+            "optional": true,
+            "active": true,
+            "skip_resolution_check": [".*"],
+            "skip_timelines_check": [".*"]
+        },
+        "AfterEffectsSubmitDeadline": {
+            "use_published": true,
+            "priority": 50,
+            "primary_pool": "",
+            "secondary_pool": "",
+            "chunk_size": 1000000
+        }
+    }
+}
@@ -1,7 +1,6 @@
 {
     "events": {
         "sync_to_avalon": {
             "enabled": true,
             "statuses_name_change": [
                 "ready",
                 "not ready"
@@ -267,13 +267,6 @@
             "remote_site": "studio"
         },
-        "sites": {
-            "gdrive": {
-                "provider": "gdrive",
-                "credentials_url": "",
-                "root": {
-                    "work": ""
-                }
-            }
-        }
     },
     "project_plugins": {
@@ -1,14 +1,18 @@
 {
+    "general": {
+        "skip_resolution_check": [],
+        "skip_timelines_check": []
+    },
     "publish": {
         "CollectPalettes": {
             "allowed_tasks": [
-                "."
+                ".*"
             ]
         },
         "ValidateSceneSettings": {
             "enabled": true,
             "optional": true,
             "active": true,
+            "frame_check_filter": [],
             "skip_resolution_check": [],
             "skip_timelines_check": []
         },
         "HarmonySubmitDeadline": {
             "use_published": false,
             "priority": 50,
@@ -135,6 +135,12 @@
             "enabled": false,
             "attributes": {}
         },
+        "ValidateRenderSettings": {
+            "arnold_render_attributes": [],
+            "vray_render_attributes": [],
+            "redshift_render_attributes": [],
+            "renderman_render_attributes": []
+        },
         "ValidateModelName": {
             "enabled": false,
             "material_file": {
@@ -5,18 +5,11 @@
         "icon": "{}/app_icons/maya.png",
         "host_name": "maya",
         "environment": {
-            "PYTHONPATH": [
-                "{OPENPYPE_REPOS_ROOT}/openpype/hosts/maya/startup",
-                "{OPENPYPE_REPOS_ROOT}/repos/avalon-core/setup/maya",
-                "{OPENPYPE_REPOS_ROOT}/repos/maya-look-assigner",
-                "{PYTHONPATH}"
-            ],
             "MAYA_DISABLE_CLIC_IPM": "Yes",
             "MAYA_DISABLE_CIP": "Yes",
             "MAYA_DISABLE_CER": "Yes",
             "PYMEL_SKIP_MEL_INIT": "Yes",
-            "LC_ALL": "C",
-            "OPENPYPE_LOG_NO_COLORS": "Yes"
+            "LC_ALL": "C"
         },
         "variants": {
             "2022": {
@@ -110,15 +103,7 @@
         "icon": "{}/app_icons/nuke.png",
         "host_name": "nuke",
         "environment": {
-            "NUKE_PATH": [
-                "{OPENPYPE_REPOS_ROOT}/repos/avalon-core/setup/nuke/nuke_path",
-                "{OPENPYPE_REPOS_ROOT}/openpype/hosts/nuke/startup",
-                "{OPENPYPE_STUDIO_PLUGINS}/nuke"
-            ],
-            "PATH": {
-                "windows": "C:/Program Files (x86)/QuickTime/QTSystem/;{PATH}"
-            },
-            "LOGLEVEL": "DEBUG"
+            "NUKE_PATH": "{OPENPYPE_STUDIO_PLUGINS}/nuke"
         },
         "variants": {
             "13-0": {
@@ -224,15 +209,7 @@
         "icon": "{}/app_icons/nuke.png",
         "host_name": "nuke",
         "environment": {
-            "NUKE_PATH": [
-                "{OPENPYPE_REPOS_ROOT}/repos/avalon-core/setup/nuke/nuke_path",
-                "{OPENPYPE_REPOS_ROOT}/openpype/hosts/nuke/startup",
-                "{OPENPYPE_STUDIO_PLUGINS}/nuke"
-            ],
-            "PATH": {
-                "windows": "C:/Program Files (x86)/QuickTime/QTSystem/;{PATH}"
-            },
-            "LOGLEVEL": "DEBUG"
+            "NUKE_PATH": "{OPENPYPE_STUDIO_PLUGINS}/nuke"
         },
         "variants": {
             "13-0": {
@@ -368,15 +345,8 @@
         "icon": "{}/app_icons/nuke.png",
         "host_name": "hiero",
         "environment": {
-            "HIERO_PLUGIN_PATH": [
-                "{OPENPYPE_REPOS_ROOT}/openpype/hosts/hiero/startup"
-            ],
-            "PATH": {
-                "windows": "C:/Program Files (x86)/QuickTime/QTSystem/;{PATH}"
-            },
             "WORKFILES_STARTUP": "0",
-            "TAG_ASSETBUILD_STARTUP": "0",
-            "LOGLEVEL": "DEBUG"
+            "TAG_ASSETBUILD_STARTUP": "0"
         },
         "variants": {
             "13-0": {
@@ -510,15 +480,8 @@
         "icon": "{}/app_icons/hiero.png",
         "host_name": "hiero",
         "environment": {
-            "HIERO_PLUGIN_PATH": [
-                "{OPENPYPE_REPOS_ROOT}/openpype/hosts/hiero/startup"
-            ],
-            "PATH": {
-                "windows": "C:/Program Files (x86)/QuickTime/QTSystem/;{PATH}"
-            },
             "WORKFILES_STARTUP": "0",
-            "TAG_ASSETBUILD_STARTUP": "0",
-            "LOGLEVEL": "DEBUG"
+            "TAG_ASSETBUILD_STARTUP": "0"
         },
         "variants": {
             "13-0": {
@@ -783,18 +746,7 @@
         "label": "Houdini",
         "icon": "{}/app_icons/houdini.png",
         "host_name": "houdini",
-        "environment": {
-            "HOUDINI_PATH": {
-                "darwin": "{OPENPYPE_REPOS_ROOT}/openpype/hosts/houdini/startup:&",
-                "linux": "{OPENPYPE_REPOS_ROOT}/openpype/hosts/houdini/startup:&",
-                "windows": "{OPENPYPE_REPOS_ROOT}/openpype/hosts/houdini/startup;&"
-            },
-            "HOUDINI_MENU_PATH": {
-                "darwin": "{OPENPYPE_REPOS_ROOT}/openpype/hosts/houdini/startup:&",
-                "linux": "{OPENPYPE_REPOS_ROOT}/openpype/hosts/houdini/startup:&",
-                "windows": "{OPENPYPE_REPOS_ROOT}/openpype/hosts/houdini/startup;&"
-            }
-        },
+        "environment": {},
         "variants": {
             "18-5": {
                 "use_python_2": true,
@@ -852,14 +804,7 @@
         "label": "Blender",
         "icon": "{}/app_icons/blender.png",
         "host_name": "blender",
-        "environment": {
-            "BLENDER_USER_SCRIPTS": "{OPENPYPE_REPOS_ROOT}/repos/avalon-core/setup/blender",
-            "PYTHONPATH": [
-                "{OPENPYPE_REPOS_ROOT}/repos/avalon-core/setup/blender",
-                "{PYTHONPATH}"
-            ],
-            "QT_PREFERRED_BINDING": "PySide2"
-        },
+        "environment": {},
         "variants": {
             "2-83": {
                 "use_python_2": false,
@ -940,8 +885,7 @@
|
|||
"icon": "{}/app_icons/harmony.png",
|
||||
"host_name": "harmony",
|
||||
"environment": {
|
||||
"AVALON_HARMONY_WORKFILES_ON_LAUNCH": "1",
|
||||
"LIB_OPENHARMONY_PATH": "{OPENPYPE_REPOS_ROOT}/pype/vendor/OpenHarmony"
|
||||
"AVALON_HARMONY_WORKFILES_ON_LAUNCH": "1"
|
||||
},
|
||||
"variants": {
|
||||
"20": {
|
||||
|
|
@ -985,9 +929,7 @@
|
|||
"label": "TVPaint",
|
||||
"icon": "{}/app_icons/tvpaint.png",
|
||||
"host_name": "tvpaint",
|
||||
"environment": {
|
||||
"OPENPYPE_LOG_NO_COLORS": "True"
|
||||
},
|
||||
"environment": {},
|
||||
"variants": {
|
||||
"animation_11-64bits": {
|
||||
"use_python_2": false,
|
||||
|
|
@ -1034,8 +976,6 @@
|
|||
"host_name": "photoshop",
|
||||
"environment": {
|
||||
"AVALON_PHOTOSHOP_WORKFILES_ON_LAUNCH": "1",
|
||||
"OPENPYPE_LOG_NO_COLORS": "Yes",
|
||||
"WEBSOCKET_URL": "ws://localhost:8099/ws/",
|
||||
"WORKFILES_SAVE_AS": "Yes"
|
||||
},
|
||||
"variants": {
|
||||
|
|
@ -1084,8 +1024,6 @@
|
|||
"host_name": "aftereffects",
|
||||
"environment": {
|
||||
"AVALON_AFTEREFFECTS_WORKFILES_ON_LAUNCH": "1",
|
||||
"OPENPYPE_LOG_NO_COLORS": "Yes",
|
||||
"WEBSOCKET_URL": "ws://localhost:8097/ws/",
|
||||
"WORKFILES_SAVE_AS": "Yes"
|
||||
},
|
||||
"variants": {
|
||||
|
|
@ -1159,10 +1097,7 @@
|
|||
"label": "Unreal Editor",
|
||||
"icon": "{}/app_icons/ue4.png'",
|
||||
"host_name": "unreal",
|
||||
"environment": {
|
||||
"AVALON_UNREAL_PLUGIN": "{OPENPYPE_REPOS_ROOT}/repos/avalon-unreal-integration",
|
||||
"OPENPYPE_LOG_NO_COLORS": "True"
|
||||
},
|
||||
"environment": {},
|
||||
"variants": {
|
||||
"4-26": {
|
||||
"use_python_2": false,
|
||||
|
|
|
|||
|
|
@@ -135,7 +135,8 @@
        "workspace_name": ""
    },
    "sync_server": {
        "enabled": false
        "enabled": false,
        "sites": {}
    },
    "deadline": {
        "enabled": true,
@@ -101,7 +101,8 @@ from .enum_entity import (
    BaseEnumEntity,
    EnumEntity,
    AppsEnumEntity,
    ToolsEnumEntity
    ToolsEnumEntity,
    ProvidersEnum
)

from .list_entity import ListEntity
@@ -149,6 +150,7 @@ __all__ = (
    "EnumEntity",
    "AppsEnumEntity",
    "ToolsEnumEntity",
    "ProvidersEnum",

    "ListEntity",
@@ -434,8 +434,19 @@ class DictMutableKeysEntity(EndpointEntity):
        if using_values_from_state:
            if _settings_value is NOT_SET:
                initial_value = NOT_SET

            elif self.store_as_list:
                new_initial_value = []
                for key, value in _settings_value:
                    if key in initial_value:
                        new_initial_value.append([key, initial_value.pop(key)])

                for key, value in initial_value.items():
                    new_initial_value.append([key, value])
                initial_value = new_initial_value
            else:
                initial_value = _settings_value

        self.initial_value = initial_value

    def _convert_to_regex_valid_key(self, key):
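The `store_as_list` branch above merges a stored list of `[key, value]` pairs with a dict of override values, preferring the order of the stored pairs. A minimal standalone sketch of that merge (the helper name `merge_as_list` is hypothetical, not part of the codebase):

```python
def merge_as_list(settings_value, initial_value):
    """Merge a list of [key, value] pairs with a dict of overrides.

    Keys present in ``settings_value`` keep its ordering and take the
    override value from ``initial_value``; remaining overrides are
    appended afterwards, mirroring the diff above.
    """
    initial_value = dict(initial_value)  # do not mutate the caller's dict
    merged = []
    for key, _value in settings_value:
        if key in initial_value:
            merged.append([key, initial_value.pop(key)])
    for key, value in initial_value.items():
        merged.append([key, value])
    return merged


print(merge_as_list([["a", 1], ["b", 2]], {"b": 20, "c": 30}))
# → [['b', 20], ['c', 30]]
```

Note that, as in the diff, pairs whose key has no entry in the override dict are dropped rather than carried over.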
@@ -217,3 +217,41 @@ class ToolsEnumEntity(BaseEnumEntity):
            if key in self.valid_keys:
                new_value.append(key)
        self._current_value = new_value


class ProvidersEnum(BaseEnumEntity):
    schema_types = ["providers-enum"]

    def _item_initalization(self):
        self.multiselection = False
        self.value_on_not_set = ""
        self.enum_items = []
        self.valid_keys = set()
        self.valid_value_types = (str, )
        self.placeholder = None

    def _get_enum_values(self):
        from openpype.modules.sync_server.providers import lib as lib_providers

        providers = lib_providers.factory.providers

        valid_keys = set()
        valid_keys.add('')
        enum_items = [{'': 'Choose Provider'}]
        for provider_code, provider_info in providers.items():
            provider, _ = provider_info
            enum_items.append({provider_code: provider.LABEL})
            valid_keys.add(provider_code)

        return enum_items, valid_keys

    def set_override_state(self, *args, **kwargs):
        super(ProvidersEnum, self).set_override_state(*args, **kwargs)

        self.enum_items, self.valid_keys = self._get_enum_values()

        value_on_not_set = list(self.valid_keys)[0]
        if self._current_value is NOT_SET:
            self._current_value = value_on_not_set

        self.value_on_not_set = value_on_not_set
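The `_get_enum_values` pattern above can be exercised outside OpenPype with a stand-in provider registry. `_FakeFactory`, `build_enum`, and the provider labels below are hypothetical; only the dict shape (`code -> (provider_class, info)`) mirrors the real `factory.providers`:

```python
# Hypothetical stand-in for openpype.modules.sync_server.providers.lib.factory
class _FakeFactory:
    providers = {
        "gdrive": (type("GDrive", (), {"LABEL": "Google Drive"}), 1),
        "local_drive": (type("Local", (), {"LABEL": "Local drive"}), 1),
    }


def build_enum(providers):
    # Same logic as ProvidersEnum._get_enum_values: an empty-key
    # placeholder item plus one {code: label} item per provider.
    valid_keys = {""}
    enum_items = [{"": "Choose Provider"}]
    for provider_code, provider_info in providers.items():
        provider, _ = provider_info
        enum_items.append({provider_code: provider.LABEL})
        valid_keys.add(provider_code)
    return enum_items, valid_keys


items, keys = build_enum(_FakeFactory.providers)
```

The empty string doubles as the "nothing chosen" value, which is why it is both a valid key and the first enum item.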
@@ -78,6 +78,10 @@
        "type": "schema",
        "name": "schema_project_hiero"
    },
    {
        "type": "schema",
        "name": "schema_project_aftereffects"
    },
    {
        "type": "schema",
        "name": "schema_project_harmony"
@@ -0,0 +1,90 @@
{
    "type": "dict",
    "collapsible": true,
    "key": "aftereffects",
    "label": "AfterEffects",
    "is_file": true,
    "children": [
        {
            "type": "dict",
            "collapsible": true,
            "key": "publish",
            "label": "Publish plugins",
            "children": [
                {
                    "type": "dict",
                    "collapsible": true,
                    "key": "ValidateSceneSettings",
                    "label": "Validate Scene Settings",
                    "checkbox_key": "enabled",
                    "children": [
                        {
                            "type": "boolean",
                            "key": "enabled",
                            "label": "Enabled"
                        },
                        {
                            "type": "boolean",
                            "key": "optional",
                            "label": "Optional"
                        },
                        {
                            "type": "boolean",
                            "key": "active",
                            "label": "Active"
                        },
                        {
                            "type": "label",
                            "label": "Validate if FPS and Resolution match shot data"
                        },
                        {
                            "type": "list",
                            "key": "skip_resolution_check",
                            "object_type": "text",
                            "label": "Skip Resolution Check for Tasks"
                        },
                        {
                            "type": "list",
                            "key": "skip_timelines_check",
                            "object_type": "text",
                            "label": "Skip Timeline Check for Tasks"
                        }
                    ]
                },
                {
                    "type": "dict",
                    "collapsible": true,
                    "key": "AfterEffectsSubmitDeadline",
                    "label": "AfterEffects Submit to Deadline",
                    "children": [
                        {
                            "type": "boolean",
                            "key": "use_published",
                            "label": "Use Published scene"
                        },
                        {
                            "type": "number",
                            "key": "priority",
                            "label": "Priority"
                        },
                        {
                            "type": "text",
                            "key": "primary_pool",
                            "label": "Primary Pool"
                        },
                        {
                            "type": "text",
                            "key": "secondary_pool",
                            "label": "Secondary Pool"
                        },
                        {
                            "type": "number",
                            "key": "chunk_size",
                            "label": "Frames Per Task"
                        }
                    ]
                }
            ]
        }
    ]
}
@@ -14,13 +14,7 @@
    "type": "dict",
    "key": "sync_to_avalon",
    "label": "Sync to avalon",
    "checkbox_key": "enabled",
    "children": [
        {
            "type": "boolean",
            "key": "enabled",
            "label": "Enabled"
        },
        {
            "type": "label",
            "label": "Allow name and hierarchy change only if following statuses are on all children tasks"
@@ -5,26 +5,6 @@
    "label": "Harmony",
    "is_file": true,
    "children": [
        {
            "type": "dict",
            "collapsible": true,
            "key": "general",
            "label": "General",
            "children": [
                {
                    "type": "list",
                    "key": "skip_resolution_check",
                    "object_type": "text",
                    "label": "Skip Resolution Check for Tasks"
                },
                {
                    "type": "list",
                    "key": "skip_timelines_check",
                    "object_type": "text",
                    "label": "Skip Timeliene Check for Tasks"
                }
            ]
        },
        {
            "type": "dict",
            "collapsible": true,
@@ -45,6 +25,52 @@
            }
        ]
    },
    {
        "type": "dict",
        "collapsible": true,
        "key": "ValidateSceneSettings",
        "label": "Validate Scene Settings",
        "checkbox_key": "enabled",
        "children": [
            {
                "type": "boolean",
                "key": "enabled",
                "label": "Enabled"
            },
            {
                "type": "boolean",
                "key": "optional",
                "label": "Optional"
            },
            {
                "type": "boolean",
                "key": "active",
                "label": "Active"
            },
            {
                "type": "label",
                "label": "Validate if FrameStart, FrameEnd and Resolution match shot data"
            },
            {
                "type": "list",
                "key": "frame_check_filter",
                "label": "Skip Frame check for Assets with",
                "object_type": "text"
            },
            {
                "type": "list",
                "key": "skip_resolution_check",
                "object_type": "text",
                "label": "Skip Resolution Check for Tasks"
            },
            {
                "type": "list",
                "key": "skip_timelines_check",
                "object_type": "text",
                "label": "Skip Timeline Check for Tasks"
            }
        ]
    },
    {
        "type": "dict",
        "collapsible": true,
@@ -59,7 +85,7 @@
        {
            "type": "number",
            "key": "priority",
            "label": "priority"
            "label": "Priority"
        },
        {
            "type": "text",
@@ -74,7 +100,7 @@
        {
            "type": "number",
            "key": "chunk_size",
            "label": "Chunk Size"
            "label": "Frames Per Task"
        }
    ]
}
@@ -50,14 +50,10 @@
    "type": "dict",
    "children": [
        {
            "type": "text",
            "key": "provider",
            "label": "Provider"
        },
        {
            "type": "text",
            "type": "path",
            "key": "credentials_url",
            "label": "Credentials url"
            "label": "Credentials url",
            "multiplatform": true
        },
        {
            "type": "dict-modifiable",
@@ -72,6 +72,56 @@
            }
        ]
    },

    {
        "type": "dict",
        "collapsible": true,
        "key": "ValidateRenderSettings",
        "label": "ValidateRenderSettings",
        "children": [
            {
                "type": "dict-modifiable",
                "store_as_list": true,
                "key": "arnold_render_attributes",
                "label": "Arnold Render Attributes",
                "use_label_wrap": true,
                "object_type": {
                    "type": "text"
                }
            },
            {
                "type": "dict-modifiable",
                "store_as_list": true,
                "key": "vray_render_attributes",
                "label": "Vray Render Attributes",
                "use_label_wrap": true,
                "object_type": {
                    "type": "text"
                }
            },
            {
                "type": "dict-modifiable",
                "store_as_list": true,
                "key": "redshift_render_attributes",
                "label": "Redshift Render Attributes",
                "use_label_wrap": true,
                "object_type": {
                    "type": "text"
                }
            },
            {
                "type": "dict-modifiable",
                "store_as_list": true,
                "key": "renderman_render_attributes",
                "label": "Renderman Render Attributes",
                "use_label_wrap": true,
                "object_type": {
                    "type": "text"
                }
            }
        ]
    },

    {
        "type": "collapsible-wrap",
        "label": "Model",
@@ -85,11 +85,32 @@
    "label": "Site Sync",
    "collapsible": true,
    "checkbox_key": "enabled",
    "children": [{
        "type": "boolean",
        "key": "enabled",
        "label": "Enabled"
    }]
    "children": [
        {
            "type": "boolean",
            "key": "enabled",
            "label": "Enabled"
        },
        {
            "type": "dict-modifiable",
            "collapsible": true,
            "key": "sites",
            "label": "Sites",
            "collapsible_key": false,
            "is_file": true,
            "object_type":
            {
                "type": "dict",
                "children": [
                    {
                        "type": "providers-enum",
                        "key": "provider",
                        "label": "Provider"
                    }
                ]
            }
        }
    ]
},{
    "type": "dict",
    "key": "deadline",
@@ -192,7 +192,7 @@ class App(QtWidgets.QWidget):
        for i, (asset, item) in enumerate(asset_nodes.items()):

            # Label prefix
            prefix = "({}/{})".format(i+1, len(asset_nodes))
            prefix = "({}/{})".format(i + 1, len(asset_nodes))

            # Assign the first matching look relevant for this asset
            # (since assigning multiple to the same nodes makes no sense)
@@ -212,18 +212,19 @@ class App(QtWidgets.QWidget):
            self.echo("{} Assigning {} to {}\t".format(prefix,
                                                       subset_name,
                                                       asset))
            nodes = item["nodes"]

            self.echo("Getting vray proxy nodes ...")
            vray_proxies = set(cmds.ls(type="VRayProxy"))
            nodes = set(item["nodes"]).difference(vray_proxies)
            if cmds.pluginInfo('vrayformaya', query=True, loaded=True):
                self.echo("Getting vray proxy nodes ...")
                vray_proxies = set(cmds.ls(type="VRayProxy"))
                nodes = list(set(item["nodes"]).difference(vray_proxies))
                if vray_proxies:
                    for vp in vray_proxies:
                        vrayproxy_assign_look(vp, subset_name)

            # Assign look
            if nodes:
                assign_look_by_version([nodes], version_id=version["_id"])

                if vray_proxies:
                    for vp in vray_proxies:
                        vrayproxy_assign_look(vp, subset_name)
                assign_look_by_version(nodes, version_id=version["_id"])

            end = time.time()
10  openpype/tools/project_manager/__init__.py  Normal file
@@ -0,0 +1,10 @@
from .project_manager import (
    ProjectManagerWindow,
    main
)


__all__ = (
    "ProjectManagerWindow",
    "main"
)
5  openpype/tools/project_manager/__main__.py  Normal file
@@ -0,0 +1,5 @@
from .project_manager import main


if __name__ == "__main__":
    main()
50  openpype/tools/project_manager/project_manager/__init__.py  Normal file
@@ -0,0 +1,50 @@
__all__ = (
    "IDENTIFIER_ROLE",

    "HierarchyView",

    "ProjectModel",
    "CreateProjectDialog",

    "HierarchyModel",
    "HierarchySelectionModel",
    "BaseItem",
    "RootItem",
    "ProjectItem",
    "AssetItem",
    "TaskItem",

    "ProjectManagerWindow",
    "main"
)


from .constants import (
    IDENTIFIER_ROLE
)
from .widgets import CreateProjectDialog
from .view import HierarchyView
from .model import (
    ProjectModel,

    HierarchyModel,
    HierarchySelectionModel,
    BaseItem,
    RootItem,
    ProjectItem,
    AssetItem,
    TaskItem
)
from .window import ProjectManagerWindow


def main():
    import sys
    from Qt import QtWidgets

    app = QtWidgets.QApplication([])

    window = ProjectManagerWindow()
    window.show()

    sys.exit(app.exec_())
13  openpype/tools/project_manager/project_manager/constants.py  Normal file
@@ -0,0 +1,13 @@
import re
from Qt import QtCore


IDENTIFIER_ROLE = QtCore.Qt.UserRole + 1
DUPLICATED_ROLE = QtCore.Qt.UserRole + 2
HIERARCHY_CHANGE_ABLE_ROLE = QtCore.Qt.UserRole + 3
REMOVED_ROLE = QtCore.Qt.UserRole + 4
ITEM_TYPE_ROLE = QtCore.Qt.UserRole + 5
EDITOR_OPENED_ROLE = QtCore.Qt.UserRole + 6

NAME_ALLOWED_SYMBOLS = "a-zA-Z0-9_"
NAME_REGEX = re.compile("^[" + NAME_ALLOWED_SYMBOLS + "]*$")
159  openpype/tools/project_manager/project_manager/delegates.py  Normal file
@@ -0,0 +1,159 @@
from Qt import QtWidgets, QtCore

from .widgets import (
    NameTextEdit,
    FilterComboBox
)
from .multiselection_combobox import MultiSelectionComboBox


class ResizeEditorDelegate(QtWidgets.QStyledItemDelegate):
    @staticmethod
    def _q_smart_min_size(editor):
        min_size_hint = editor.minimumSizeHint()
        size_policy = editor.sizePolicy()
        width = 0
        height = 0
        if size_policy.horizontalPolicy() != QtWidgets.QSizePolicy.Ignored:
            if (
                size_policy.horizontalPolicy()
                & QtWidgets.QSizePolicy.ShrinkFlag
            ):
                width = min_size_hint.width()
            else:
                width = max(
                    editor.sizeHint().width(),
                    min_size_hint.width()
                )

        if size_policy.verticalPolicy() != QtWidgets.QSizePolicy.Ignored:
            if size_policy.verticalPolicy() & QtWidgets.QSizePolicy.ShrinkFlag:
                height = min_size_hint.height()
            else:
                height = max(
                    editor.sizeHint().height(),
                    min_size_hint.height()
                )

        output = QtCore.QSize(width, height).boundedTo(editor.maximumSize())
        min_size = editor.minimumSize()
        if min_size.width() > 0:
            output.setWidth(min_size.width())
        if min_size.height() > 0:
            output.setHeight(min_size.height())

        return output.expandedTo(QtCore.QSize(0, 0))

    def updateEditorGeometry(self, editor, option, index):
        self.initStyleOption(option, index)

        option.showDecorationSelected = editor.style().styleHint(
            QtWidgets.QStyle.SH_ItemView_ShowDecorationSelected, None, editor
        )

        widget = option.widget

        style = widget.style() if widget else QtWidgets.QApplication.style()
        geo = style.subElementRect(
            QtWidgets.QStyle.SE_ItemViewItemText, option, widget
        )
        delta = self._q_smart_min_size(editor).width() - geo.width()
        if delta > 0:
            if editor.layoutDirection() == QtCore.Qt.RightToLeft:
                geo.adjust(-delta, 0, 0, 0)
            else:
                geo.adjust(0, 0, delta, 0)
        editor.setGeometry(geo)


class NumberDelegate(QtWidgets.QStyledItemDelegate):
    def __init__(self, minimum, maximum, decimals, *args, **kwargs):
        super(NumberDelegate, self).__init__(*args, **kwargs)
        self.minimum = minimum
        self.maximum = maximum
        self.decimals = decimals

    def createEditor(self, parent, option, index):
        if self.decimals > 0:
            editor = QtWidgets.QDoubleSpinBox(parent)
        else:
            editor = QtWidgets.QSpinBox(parent)

        editor.setObjectName("NumberEditor")
        editor.setMinimum(self.minimum)
        editor.setMaximum(self.maximum)
        editor.setButtonSymbols(QtWidgets.QSpinBox.NoButtons)

        value = index.data(QtCore.Qt.EditRole)
        if value is not None:
            try:
                if isinstance(value, str):
                    value = float(value)
                editor.setValue(value)

            except Exception:
                print("Couldn't set invalid value \"{}\"".format(str(value)))

        return editor


class NameDelegate(QtWidgets.QStyledItemDelegate):
    def createEditor(self, parent, option, index):
        editor = NameTextEdit(parent)
        editor.setObjectName("NameEditor")
        value = index.data(QtCore.Qt.EditRole)
        if value is not None:
            editor.setText(str(value))
        return editor


class TypeDelegate(QtWidgets.QStyledItemDelegate):
    def __init__(self, project_doc_cache, *args, **kwargs):
        self._project_doc_cache = project_doc_cache
        super(TypeDelegate, self).__init__(*args, **kwargs)

    def createEditor(self, parent, option, index):
        editor = FilterComboBox(parent)
        editor.setObjectName("TypeEditor")
        editor.style().polish(editor)
        if not self._project_doc_cache.project_doc:
            return editor

        task_type_defs = self._project_doc_cache.project_doc["config"]["tasks"]
        editor.addItems(list(task_type_defs.keys()))

        return editor

    def setEditorData(self, editor, index):
        value = index.data(QtCore.Qt.EditRole)
        index = editor.findText(value)
        if index >= 0:
            editor.setCurrentIndex(index)

    def setModelData(self, editor, model, index):
        editor.value_cleanup()
        super(TypeDelegate, self).setModelData(editor, model, index)


class ToolsDelegate(QtWidgets.QStyledItemDelegate):
    def __init__(self, tools_cache, *args, **kwargs):
        self._tools_cache = tools_cache
        super(ToolsDelegate, self).__init__(*args, **kwargs)

    def createEditor(self, parent, option, index):
        editor = MultiSelectionComboBox(parent)
        editor.setObjectName("ToolEditor")
        if not self._tools_cache.tools_data:
            return editor

        for key, label in self._tools_cache.tools_data:
            editor.addItem(label, key)

        return editor

    def setEditorData(self, editor, index):
        value = index.data(QtCore.Qt.EditRole)
        editor.set_value(value)

    def setModelData(self, editor, model, index):
        model.setData(index, editor.value(), QtCore.Qt.EditRole)
2004  openpype/tools/project_manager/project_manager/model.py  Normal file
File diff suppressed because it is too large
@@ -0,0 +1,215 @@
from Qt import QtCore, QtGui, QtWidgets


class ComboItemDelegate(QtWidgets.QStyledItemDelegate):
    """
    Helper styled delegate (mostly based on existing private Qt's
    delegate used by the QtWidgets.QComboBox). Used to style the popup like a
    list view (e.g windows style).
    """

    def paint(self, painter, option, index):
        option = QtWidgets.QStyleOptionViewItem(option)
        option.showDecorationSelected = True

        # option.state &= (
        #     ~QtWidgets.QStyle.State_HasFocus
        #     & ~QtWidgets.QStyle.State_MouseOver
        # )
        super(ComboItemDelegate, self).paint(painter, option, index)


class MultiSelectionComboBox(QtWidgets.QComboBox):
    value_changed = QtCore.Signal()
    ignored_keys = {
        QtCore.Qt.Key_Up,
        QtCore.Qt.Key_Down,
        QtCore.Qt.Key_PageDown,
        QtCore.Qt.Key_PageUp,
        QtCore.Qt.Key_Home,
        QtCore.Qt.Key_End
    }

    def __init__(self, parent=None, **kwargs):
        super(MultiSelectionComboBox, self).__init__(parent=parent, **kwargs)
        self.setObjectName("MultiSelectionComboBox")
        self.setFocusPolicy(QtCore.Qt.StrongFocus)

        self._popup_is_shown = False
        self._block_mouse_release_timer = QtCore.QTimer(self, singleShot=True)
        self._initial_mouse_pos = None
        self._delegate = ComboItemDelegate(self)
        self.setItemDelegate(self._delegate)

    def mousePressEvent(self, event):
        """Reimplemented."""
        self._popup_is_shown = False
        super(MultiSelectionComboBox, self).mousePressEvent(event)
        if self._popup_is_shown:
            self._initial_mouse_pos = self.mapToGlobal(event.pos())
            self._block_mouse_release_timer.start(
                QtWidgets.QApplication.doubleClickInterval()
            )

    def showPopup(self):
        """Reimplemented."""
        super(MultiSelectionComboBox, self).showPopup()
        view = self.view()
        view.installEventFilter(self)
        view.viewport().installEventFilter(self)
        self._popup_is_shown = True

    def hidePopup(self):
        """Reimplemented."""
        self.view().removeEventFilter(self)
        self.view().viewport().removeEventFilter(self)
        self._popup_is_shown = False
        self._initial_mouse_pos = None
        super(MultiSelectionComboBox, self).hidePopup()
        self.view().clearFocus()

    def _event_popup_shown(self, obj, event):
        if not self._popup_is_shown:
            return

        current_index = self.view().currentIndex()
        model = self.model()

        if event.type() == QtCore.QEvent.MouseMove:
            if (
                self.view().isVisible()
                and self._initial_mouse_pos is not None
                and self._block_mouse_release_timer.isActive()
            ):
                diff = obj.mapToGlobal(event.pos()) - self._initial_mouse_pos
                if diff.manhattanLength() > 9:
                    self._block_mouse_release_timer.stop()
            return

        index_flags = current_index.flags()
        state = current_index.data(QtCore.Qt.CheckStateRole)
        new_state = None

        if event.type() == QtCore.QEvent.MouseButtonRelease:
            if (
                self._block_mouse_release_timer.isActive()
                or not current_index.isValid()
                or not self.view().isVisible()
                or not self.view().rect().contains(event.pos())
                or not index_flags & QtCore.Qt.ItemIsSelectable
                or not index_flags & QtCore.Qt.ItemIsEnabled
                or not index_flags & QtCore.Qt.ItemIsUserCheckable
            ):
                return

            if state == QtCore.Qt.Unchecked:
                new_state = QtCore.Qt.Checked
            else:
                new_state = QtCore.Qt.Unchecked

        elif event.type() == QtCore.QEvent.KeyPress:
            # TODO: handle QtCore.Qt.Key_Enter, Key_Return?
            if event.key() == QtCore.Qt.Key_Space:
                # toggle the current item's check state
                if (
                    index_flags & QtCore.Qt.ItemIsUserCheckable
                    and index_flags & QtCore.Qt.ItemIsTristate
                ):
                    new_state = QtCore.Qt.CheckState((int(state) + 1) % 3)

                elif index_flags & QtCore.Qt.ItemIsUserCheckable:
                    if state != QtCore.Qt.Checked:
                        new_state = QtCore.Qt.Checked
                    else:
                        new_state = QtCore.Qt.Unchecked

        if new_state is not None:
            model.setData(current_index, new_state, QtCore.Qt.CheckStateRole)
            self.view().update(current_index)
            self.value_changed.emit()
            return True

    def eventFilter(self, obj, event):
        """Reimplemented."""
        result = self._event_popup_shown(obj, event)
        if result is not None:
            return result

        return super(MultiSelectionComboBox, self).eventFilter(obj, event)

    def addItem(self, *args, **kwargs):
        idx = self.count()
        super(MultiSelectionComboBox, self).addItem(*args, **kwargs)
        self.model().item(idx).setCheckable(True)

    def paintEvent(self, event):
        """Reimplemented."""
        painter = QtWidgets.QStylePainter(self)
        option = QtWidgets.QStyleOptionComboBox()
        self.initStyleOption(option)
        painter.drawComplexControl(QtWidgets.QStyle.CC_ComboBox, option)

        # draw the icon and text
        items = self.checked_items_text()
        if not items:
            return

        text_rect = self.style().subControlRect(
            QtWidgets.QStyle.CC_ComboBox,
            option,
            QtWidgets.QStyle.SC_ComboBoxEditField
        )
        text = ", ".join(items)
        new_text = self.fontMetrics().elidedText(
            text, QtCore.Qt.ElideRight, text_rect.width()
        )
        painter.drawText(
            text_rect,
            QtCore.Qt.AlignLeft | QtCore.Qt.AlignVCenter,
            new_text
        )

    def setItemCheckState(self, index, state):
        self.setItemData(index, state, QtCore.Qt.CheckStateRole)

    def set_value(self, values):
        for idx in range(self.count()):
            value = self.itemData(idx, role=QtCore.Qt.UserRole)
            if value in values:
                check_state = QtCore.Qt.Checked
            else:
                check_state = QtCore.Qt.Unchecked
            self.setItemData(idx, check_state, QtCore.Qt.CheckStateRole)

    def value(self):
        items = list()
        for idx in range(self.count()):
            state = self.itemData(idx, role=QtCore.Qt.CheckStateRole)
            if state == QtCore.Qt.Checked:
                items.append(
                    self.itemData(idx, role=QtCore.Qt.UserRole)
                )
        return items

    def checked_items_text(self):
        items = list()
        for idx in range(self.count()):
            state = self.itemData(idx, role=QtCore.Qt.CheckStateRole)
            if state == QtCore.Qt.Checked:
                items.append(self.itemText(idx))
        return items

    def wheelEvent(self, event):
        event.ignore()

    def keyPressEvent(self, event):
        if (
            event.key() == QtCore.Qt.Key_Down
            and event.modifiers() & QtCore.Qt.AltModifier
        ):
            return self.showPopup()

        if event.key() in self.ignored_keys:
            return event.ignore()

        return super(MultiSelectionComboBox, self).keyPressEvent(event)
@@ -0,0 +1,98 @@
import os
from openpype import resources
from avalon.vendor import qtawesome


class ResourceCache:
    colors = {
        "standard": "#333333",
        "new": "#2d9a4c",
        "warning": "#c83232"
    }
    icons = None

    @classmethod
    def get_icon(cls, *keys):
        output = cls.get_icons()
        for key in keys:
            output = output[key]
        return output

    @classmethod
    def get_icons(cls):
        if cls.icons is None:
            cls.icons = {
                "asset": {
                    "default": qtawesome.icon(
                        "fa.folder",
                        color=cls.colors["standard"]
                    ),
                    "new": qtawesome.icon(
                        "fa.folder",
                        color=cls.colors["new"]
                    ),
                    "invalid": qtawesome.icon(
                        "fa.exclamation-triangle",
                        color=cls.colors["warning"]
                    ),
                    "removed": qtawesome.icon(
                        "fa.trash",
                        color=cls.colors["warning"]
                    )
                },
                "task": {
                    "default": qtawesome.icon(
                        "fa.check-circle-o",
                        color=cls.colors["standard"]
                    ),
                    "new": qtawesome.icon(
                        "fa.check-circle",
                        color=cls.colors["new"]
                    ),
                    "invalid": qtawesome.icon(
                        "fa.exclamation-circle",
                        color=cls.colors["warning"]
                    ),
                    "removed": qtawesome.icon(
                        "fa.trash",
                        color=cls.colors["warning"]
                    )
                },
                "refresh": qtawesome.icon(
                    "fa.refresh",
                    color=cls.colors["standard"]
                )
            }
        return cls.icons

    @classmethod
    def get_color(cls, color_name):
        return cls.colors[color_name]

    @classmethod
    def style_fill_data(cls):
        output = {}
        for color_name, color_value in cls.colors.items():
            key = "color:{}".format(color_name)
            output[key] = color_value
        return output


def load_stylesheet():
    from . import qrc_resources

    qrc_resources.qInitResources()

    current_dir = os.path.dirname(os.path.abspath(__file__))
    style_path = os.path.join(current_dir, "style.css")
    with open(style_path, "r") as style_file:
        stylesheet = style_file.read()

    for key, value in ResourceCache.style_fill_data().items():
        replacement_key = "{" + key + "}"
        stylesheet = stylesheet.replace(replacement_key, value)
    return stylesheet


def app_icon_path():
    return resources.pype_icon_filepath()
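The `style_fill_data` / `load_stylesheet` pair above substitutes `{color:<name>}` placeholders in the stylesheet with the values from `ResourceCache.colors`. A self-contained sketch of that substitution, using an inline stylesheet instead of the real `style.css` (the standalone helper names are hypothetical):

```python
colors = {"standard": "#333333", "new": "#2d9a4c", "warning": "#c83232"}


def style_fill_data(colors):
    # Same shape ResourceCache.style_fill_data produces: {"color:new": "#2d9a4c", ...}
    return {"color:{}".format(name): value for name, value in colors.items()}


def fill_stylesheet(stylesheet, fill_data):
    # Replace each "{color:<name>}" placeholder with its hex value.
    for key, value in fill_data.items():
        stylesheet = stylesheet.replace("{" + key + "}", value)
    return stylesheet


css = "QLabel#New { color: {color:new}; }"
print(fill_stylesheet(css, style_fill_data(colors)))
# → QLabel#New { color: #2d9a4c; }
```

Plain `str.replace` works here because the placeholder syntax `{color:…}` cannot collide with ordinary CSS braces, which are always followed by whitespace or a property name.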
Binary file not shown. (After: Size 166 B)
Binary file not shown. (After: Size 165 B)
|
|
@ -0,0 +1,105 @@
|
|||
# -*- coding: utf-8 -*-

# Resource object code
#
# Created by: The Resource Compiler for PyQt5 (Qt v5.15.2)
#
# WARNING! All changes made in this file will be lost!

from PyQt5 import QtCore


qt_resource_data = b"\
\x00\x00\x00\xa5\
\x89\
\x50\x4e\x47\x0d\x0a\x1a\x0a\x00\x00\x00\x0d\x49\x48\x44\x52\x00\
\x00\x00\x09\x00\x00\x00\x06\x08\x04\x00\x00\x00\xbb\xce\x7c\x4e\
\x00\x00\x00\x01\x73\x52\x47\x42\x00\xae\xce\x1c\xe9\x00\x00\x00\
\x02\x62\x4b\x47\x44\x00\x9c\x53\x34\xfc\x5d\x00\x00\x00\x09\x70\
\x48\x59\x73\x00\x00\x0b\x13\x00\x00\x0b\x13\x01\x00\x9a\x9c\x18\
\x00\x00\x00\x07\x74\x49\x4d\x45\x07\xdc\x08\x17\x0b\x02\x04\x6d\
\x98\x1b\x69\x00\x00\x00\x29\x49\x44\x41\x54\x08\xd7\x63\x60\xc0\
\x00\x8c\x0c\x0c\xff\xcf\xa3\x08\x18\x32\x32\x30\x20\x0b\x32\x1a\
\x32\x30\x30\x42\x98\x10\x41\x46\x43\x14\x13\x50\xb5\xa3\x01\x00\
\xd6\x10\x07\xd2\x2f\x48\xdf\x4a\x00\x00\x00\x00\x49\x45\x4e\x44\
\xae\x42\x60\x82\
\x00\x00\x00\xa6\
\x89\
\x50\x4e\x47\x0d\x0a\x1a\x0a\x00\x00\x00\x0d\x49\x48\x44\x52\x00\
\x00\x00\x09\x00\x00\x00\x06\x08\x04\x00\x00\x00\xbb\xce\x7c\x4e\
\x00\x00\x00\x01\x73\x52\x47\x42\x00\xae\xce\x1c\xe9\x00\x00\x00\
\x02\x62\x4b\x47\x44\x00\xff\x87\x8f\xcc\xbf\x00\x00\x00\x09\x70\
\x48\x59\x73\x00\x00\x0b\x13\x00\x00\x0b\x13\x01\x00\x9a\x9c\x18\
\x00\x00\x00\x07\x74\x49\x4d\x45\x07\xdc\x08\x17\x08\x15\x3b\xdc\
\x3b\x0c\x9b\x00\x00\x00\x2a\x49\x44\x41\x54\x08\xd7\x63\x60\xc0\
\x00\x8c\x0c\x0c\x73\x3e\x20\x0b\xa4\x08\x30\x32\x30\x20\x0b\xa6\
\x08\x30\x30\x30\x42\x98\x10\xc1\x14\x01\x14\x13\x50\xb5\xa3\x01\
\x00\xc6\xb9\x07\x90\x5d\x66\x1f\x83\x00\x00\x00\x00\x49\x45\x4e\
\x44\xae\x42\x60\x82\
"


qt_resource_name = b"\
\x00\x08\
\x06\xc5\x8e\xa5\
\x00\x6f\
\x00\x70\x00\x65\x00\x6e\x00\x70\x00\x79\x00\x70\x00\x65\
\x00\x06\
\x07\x03\x7d\xc3\
\x00\x69\
\x00\x6d\x00\x61\x00\x67\x00\x65\x00\x73\
\x00\x12\
\x01\x2e\x03\x27\
\x00\x63\
\x00\x6f\x00\x6d\x00\x62\x00\x6f\x00\x62\x00\x6f\x00\x78\x00\x5f\x00\x61\x00\x72\x00\x72\x00\x6f\x00\x77\x00\x2e\x00\x70\x00\x6e\
\x00\x67\
\x00\x1b\
\x03\x5a\x32\x27\
\x00\x63\
\x00\x6f\x00\x6d\x00\x62\x00\x6f\x00\x62\x00\x6f\x00\x78\x00\x5f\x00\x61\x00\x72\x00\x72\x00\x6f\x00\x77\x00\x5f\x00\x64\x00\x69\
\x00\x73\x00\x61\x00\x62\x00\x6c\x00\x65\x00\x64\x00\x2e\x00\x70\x00\x6e\x00\x67\
"


qt_resource_struct_v1 = b"\
\x00\x00\x00\x00\x00\x02\x00\x00\x00\x01\x00\x00\x00\x01\
\x00\x00\x00\x00\x00\x02\x00\x00\x00\x01\x00\x00\x00\x02\
\x00\x00\x00\x16\x00\x02\x00\x00\x00\x02\x00\x00\x00\x03\
\x00\x00\x00\x28\x00\x00\x00\x00\x00\x01\x00\x00\x00\x00\
\x00\x00\x00\x52\x00\x00\x00\x00\x00\x01\x00\x00\x00\xa9\
"


qt_resource_struct_v2 = b"\
\x00\x00\x00\x00\x00\x02\x00\x00\x00\x01\x00\x00\x00\x01\
\x00\x00\x00\x00\x00\x00\x00\x00\
\x00\x00\x00\x00\x00\x02\x00\x00\x00\x01\x00\x00\x00\x02\
\x00\x00\x00\x00\x00\x00\x00\x00\
\x00\x00\x00\x16\x00\x02\x00\x00\x00\x02\x00\x00\x00\x03\
\x00\x00\x00\x00\x00\x00\x00\x00\
\x00\x00\x00\x28\x00\x00\x00\x00\x00\x01\x00\x00\x00\x00\
\x00\x00\x01\x76\x41\x9d\xa2\x35\
\x00\x00\x00\x52\x00\x00\x00\x00\x00\x01\x00\x00\x00\xa9\
\x00\x00\x01\x76\x41\x9d\xa2\x35\
"


qt_version = [int(v) for v in QtCore.qVersion().split('.')]
if qt_version < [5, 8, 0]:
    rcc_version = 1
    qt_resource_struct = qt_resource_struct_v1
else:
    rcc_version = 2
    qt_resource_struct = qt_resource_struct_v2


def qInitResources():
    QtCore.qRegisterResourceData(
        rcc_version, qt_resource_struct, qt_resource_name, qt_resource_data
    )


def qCleanupResources():
    QtCore.qUnregisterResourceData(
        rcc_version, qt_resource_struct, qt_resource_name, qt_resource_data
    )
@@ -0,0 +1,84 @@
# Resource object code (Python 3)
# Created by: object code
# Created by: The Resource Compiler for Qt version 5.15.2
# WARNING! All changes made in this file will be lost!

from PySide2 import QtCore


qt_resource_data = b"\
\x00\x00\x00\xa5\
\x89\
PNG\x0d\x0a\x1a\x0a\x00\x00\x00\x0dIHDR\x00\
\x00\x00\x09\x00\x00\x00\x06\x08\x04\x00\x00\x00\xbb\xce|N\
\x00\x00\x00\x01sRGB\x00\xae\xce\x1c\xe9\x00\x00\x00\
\x02bKGD\x00\x9cS4\xfc]\x00\x00\x00\x09p\
HYs\x00\x00\x0b\x13\x00\x00\x0b\x13\x01\x00\x9a\x9c\x18\
\x00\x00\x00\x07tIME\x07\xdc\x08\x17\x0b\x02\x04m\
\x98\x1bi\x00\x00\x00)IDAT\x08\xd7c`\xc0\
\x00\x8c\x0c\x0c\xff\xcf\xa3\x08\x18220 \x0b2\x1a\
200B\x98\x10AFC\x14\x13P\xb5\xa3\x01\x00\
\xd6\x10\x07\xd2/H\xdfJ\x00\x00\x00\x00IEND\
\xaeB`\x82\
\x00\x00\x00\xa6\
\x89\
PNG\x0d\x0a\x1a\x0a\x00\x00\x00\x0dIHDR\x00\
\x00\x00\x09\x00\x00\x00\x06\x08\x04\x00\x00\x00\xbb\xce|N\
\x00\x00\x00\x01sRGB\x00\xae\xce\x1c\xe9\x00\x00\x00\
\x02bKGD\x00\xff\x87\x8f\xcc\xbf\x00\x00\x00\x09p\
HYs\x00\x00\x0b\x13\x00\x00\x0b\x13\x01\x00\x9a\x9c\x18\
\x00\x00\x00\x07tIME\x07\xdc\x08\x17\x08\x15;\xdc\
;\x0c\x9b\x00\x00\x00*IDAT\x08\xd7c`\xc0\
\x00\x8c\x0c\x0cs> \x0b\xa4\x08020 \x0b\xa6\
\x08000B\x98\x10\xc1\x14\x01\x14\x13P\xb5\xa3\x01\
\x00\xc6\xb9\x07\x90]f\x1f\x83\x00\x00\x00\x00IEN\
D\xaeB`\x82\
"


qt_resource_name = b"\
\x00\x08\
\x06\xc5\x8e\xa5\
\x00o\
\x00p\x00e\x00n\x00p\x00y\x00p\x00e\
\x00\x06\
\x07\x03}\xc3\
\x00i\
\x00m\x00a\x00g\x00e\x00s\
\x00\x12\
\x01.\x03'\
\x00c\
\x00o\x00m\x00b\x00o\x00b\x00o\x00x\x00_\x00a\x00r\x00r\x00o\x00w\x00.\x00p\x00n\
\x00g\
\x00\x1b\
\x03Z2'\
\x00c\
\x00o\x00m\x00b\x00o\x00b\x00o\x00x\x00_\x00a\x00r\x00r\x00o\x00w\x00_\x00d\x00i\
\x00s\x00a\x00b\x00l\x00e\x00d\x00.\x00p\x00n\x00g\
"


qt_resource_struct = b"\
\x00\x00\x00\x00\x00\x02\x00\x00\x00\x01\x00\x00\x00\x01\
\x00\x00\x00\x00\x00\x00\x00\x00\
\x00\x00\x00\x00\x00\x02\x00\x00\x00\x01\x00\x00\x00\x02\
\x00\x00\x00\x00\x00\x00\x00\x00\
\x00\x00\x00\x16\x00\x02\x00\x00\x00\x02\x00\x00\x00\x03\
\x00\x00\x00\x00\x00\x00\x00\x00\
\x00\x00\x00(\x00\x00\x00\x00\x00\x01\x00\x00\x00\x00\
\x00\x00\x01vA\x9d\xa25\
\x00\x00\x00R\x00\x00\x00\x00\x00\x01\x00\x00\x00\xa9\
\x00\x00\x01vA\x9d\xa25\
"


def qInitResources():
    QtCore.qRegisterResourceData(
        0x03, qt_resource_struct, qt_resource_name, qt_resource_data
    )


def qCleanupResources():
    QtCore.qUnregisterResourceData(
        0x03, qt_resource_struct, qt_resource_name, qt_resource_data
    )
@@ -0,0 +1,32 @@
import Qt


initialized = False
resources = None
if Qt.__binding__ == "PySide2":
    from . import pyside2_resources as resources
elif Qt.__binding__ == "PyQt5":
    from . import pyqt5_resources as resources


def qInitResources():
    global resources
    global initialized
    if resources is not None and not initialized:
        initialized = True
        resources.qInitResources()


def qCleanupResources():
    global resources
    global initialized
    if resources is not None:
        initialized = False
        resources.qCleanupResources()


__all__ = (
    "resources",
    "qInitResources",
    "qCleanupResources"
)
@@ -0,0 +1,6 @@
<RCC>
    <qresource prefix="/openpype">
        <file>images/combobox_arrow.png</file>
        <file>images/combobox_arrow_disabled.png</file>
    </qresource>
</RCC>
@@ -0,0 +1,21 @@
QTreeView::item {
    padding-top: 3px;
    padding-bottom: 3px;
    padding-right: 3px;
}


QTreeView::item:selected, QTreeView::item:selected:!active {
    background: rgba(0, 122, 204, 127);
    color: black;
}

#RefreshBtn {
    padding: 2px;
}

#TypeEditor, #ToolEditor, #NameEditor, #NumberEditor {
    background: transparent;
    border: 1px solid #005c99;
    border-radius: 0.3em;
}
643 openpype/tools/project_manager/project_manager/view.py Normal file

@@ -0,0 +1,643 @@
import collections
from queue import Queue

from Qt import QtWidgets, QtCore, QtGui

from .delegates import (
    NumberDelegate,
    NameDelegate,
    TypeDelegate,
    ToolsDelegate
)

from openpype.lib import ApplicationManager
from .constants import (
    REMOVED_ROLE,
    IDENTIFIER_ROLE,
    ITEM_TYPE_ROLE,
    HIERARCHY_CHANGE_ABLE_ROLE,
    EDITOR_OPENED_ROLE
)


class NameDef:
    pass


class NumberDef:
    def __init__(self, minimum=None, maximum=None, decimals=None):
        self.minimum = 0 if minimum is None else minimum
        self.maximum = 999999 if maximum is None else maximum
        self.decimals = 0 if decimals is None else decimals


class TypeDef:
    pass


class ToolsDef:
    pass


class ProjectDocCache:
    def __init__(self, dbcon):
        self.dbcon = dbcon
        self.project_doc = None

    def set_project(self, project_name):
        self.project_doc = None

        if not project_name:
            return

        self.project_doc = self.dbcon.database[project_name].find_one(
            {"type": "project"}
        )


class ToolsCache:
    def __init__(self):
        self.tools_data = []

    def refresh(self):
        app_manager = ApplicationManager()
        tools_data = []
        for tool_name, tool in app_manager.tools.items():
            tools_data.append(
                (tool_name, tool.label)
            )
        self.tools_data = tools_data


class HierarchyView(QtWidgets.QTreeView):
    """A tree view that deselects on clicking on an empty area in the view."""

    column_delegate_defs = {
        "name": NameDef(),
        "type": TypeDef(),
        "frameStart": NumberDef(1),
        "frameEnd": NumberDef(1),
        "fps": NumberDef(1, decimals=2),
        "resolutionWidth": NumberDef(0),
        "resolutionHeight": NumberDef(0),
        "handleStart": NumberDef(0),
        "handleEnd": NumberDef(0),
        "clipIn": NumberDef(1),
        "clipOut": NumberDef(1),
        "pixelAspect": NumberDef(0, decimals=2),
        "tools_env": ToolsDef()
    }

    columns_sizes = {
        "default": {
            "stretch": QtWidgets.QHeaderView.ResizeToContents
        },
        "name": {
            "stretch": QtWidgets.QHeaderView.Stretch
        },
        "type": {
            "stretch": QtWidgets.QHeaderView.Interactive,
            "width": 100
        },
        "tools_env": {
            "stretch": QtWidgets.QHeaderView.Interactive,
            "width": 140
        },
        "pixelAspect": {
            "stretch": QtWidgets.QHeaderView.Interactive,
            "width": 80
        }
    }
    persistent_columns = {
        "type",
        "frameStart",
        "frameEnd",
        "fps",
        "resolutionWidth",
        "resolutionHeight",
        "handleStart",
        "handleEnd",
        "clipIn",
        "clipOut",
        "pixelAspect",
        "tools_env"
    }

    def __init__(self, dbcon, source_model, parent):
        super(HierarchyView, self).__init__(parent)
        # Direct access to model
        self._source_model = source_model
        self._editors_mapping = {}
        self._persistent_editors = set()
        # Access to parent because of `show_message` method
        self._parent = parent

        project_doc_cache = ProjectDocCache(dbcon)
        tools_cache = ToolsCache()

        main_delegate = QtWidgets.QStyledItemDelegate()
        self.setItemDelegate(main_delegate)
        self.setAlternatingRowColors(True)
        self.setSelectionMode(HierarchyView.ExtendedSelection)
        self.setContextMenuPolicy(QtCore.Qt.CustomContextMenu)

        column_delegates = {}
        column_key_to_index = {}
        for key, item_type in self.column_delegate_defs.items():
            if isinstance(item_type, NameDef):
                delegate = NameDelegate()

            elif isinstance(item_type, NumberDef):
                delegate = NumberDelegate(
                    item_type.minimum,
                    item_type.maximum,
                    item_type.decimals
                )

            elif isinstance(item_type, TypeDef):
                delegate = TypeDelegate(project_doc_cache)

            elif isinstance(item_type, ToolsDef):
                delegate = ToolsDelegate(tools_cache)

            column = self._source_model.columns.index(key)
            self.setItemDelegateForColumn(column, delegate)
            column_delegates[key] = delegate
            column_key_to_index[key] = column

        source_model.index_moved.connect(self._on_rows_moved)
        self.customContextMenuRequested.connect(self._on_context_menu)
        self._source_model.project_changed.connect(self._on_project_reset)

        self._project_doc_cache = project_doc_cache
        self._tools_cache = tools_cache

        self._delegate = main_delegate
        self._column_delegates = column_delegates
        self._column_key_to_index = column_key_to_index

    def header_init(self):
        header = self.header()
        header.setStretchLastSection(False)

        default_behavior = self.columns_sizes["default"]
        widths_by_idx = {}
        for idx in range(header.count()):
            key = self._source_model.columns[idx]
            behavior = self.columns_sizes.get(key, default_behavior)
            logical_index = header.logicalIndex(idx)
            stretch = behavior["stretch"]
            header.setSectionResizeMode(logical_index, stretch)
            width = behavior.get("width")
            if width is not None:
                widths_by_idx[idx] = width

        for idx, width in widths_by_idx.items():
            self.setColumnWidth(idx, width)

    def set_project(self, project_name):
        # Trigger helpers first
        self._project_doc_cache.set_project(project_name)
        self._tools_cache.refresh()

        # Trigger update of model after all data for delegates are filled
        self._source_model.set_project(project_name)

    def _on_project_reset(self):
        self.header_init()

        self.collapseAll()

        project_item = self._source_model.project_item
        if project_item:
            index = self._source_model.index_for_item(project_item)
            self.expand(index)

    def _on_rows_moved(self, index):
        parent_index = index.parent()
        if not self.isExpanded(parent_index):
            self.expand(parent_index)

    def commitData(self, editor):
        super(HierarchyView, self).commitData(editor)
        current_index = self.currentIndex()
        column = current_index.column()
        row = current_index.row()
        skipped_index = None
        # Change column from "type" to "name"
        if column == 1:
            new_index = self._source_model.index(
                current_index.row(),
                0,
                current_index.parent()
            )
            self.setCurrentIndex(new_index)
        elif column > 0:
            indexes = []
            for index in self.selectedIndexes():
                if index.column() == column:
                    if index.row() == row:
                        skipped_index = index
                    else:
                        indexes.append(index)

            if skipped_index is not None:
                value = current_index.data(QtCore.Qt.EditRole)
                for index in indexes:
                    index.model().setData(index, value, QtCore.Qt.EditRole)

        # Update children data
        self.updateEditorData()

    def _deselect_editor(self, editor):
        if editor:
            if isinstance(
                editor, (QtWidgets.QSpinBox, QtWidgets.QDoubleSpinBox)
            ):
                line_edit = editor.findChild(QtWidgets.QLineEdit)
                line_edit.deselect()

            elif isinstance(editor, QtWidgets.QLineEdit):
                editor.deselect()

    def edit(self, index, *args, **kwargs):
        result = super(HierarchyView, self).edit(index, *args, **kwargs)
        if result:
            # Mark index to not return text for DisplayRole
            editor = self.indexWidget(index)
            if (
                editor not in self._persistent_editors
                and editor not in self._editors_mapping
            ):
                self._editors_mapping[editor] = index
                self._source_model.setData(index, True, EDITOR_OPENED_ROLE)
            # Deselect content of editor
            # QUESTION not sure if we want do this all the time
            self._deselect_editor(editor)
        return result

    def closeEditor(self, editor, hint):
        if (
            editor not in self._persistent_editors
            and editor in self._editors_mapping
        ):
            index = self._editors_mapping.pop(editor)
            self._source_model.setData(index, False, EDITOR_OPENED_ROLE)
        super(HierarchyView, self).closeEditor(editor, hint)

    def openPersistentEditor(self, index):
        self._source_model.setData(index, True, EDITOR_OPENED_ROLE)
        super(HierarchyView, self).openPersistentEditor(index)
        editor = self.indexWidget(index)
        self._persistent_editors.add(editor)
        self._deselect_editor(editor)

    def closePersistentEditor(self, index):
        self._source_model.setData(index, False, EDITOR_OPENED_ROLE)
        editor = self.indexWidget(index)
        self._persistent_editors.remove(editor)
        super(HierarchyView, self).closePersistentEditor(index)

    def rowsInserted(self, parent_index, start, end):
        super(HierarchyView, self).rowsInserted(parent_index, start, end)

        for row in range(start, end + 1):
            for key, column in self._column_key_to_index.items():
                if key not in self.persistent_columns:
                    continue
                col_index = self._source_model.index(row, column, parent_index)
                if bool(
                    self._source_model.flags(col_index)
                    & QtCore.Qt.ItemIsEditable
                ):
                    self.openPersistentEditor(col_index)

        # Expand parent on insert
        if not self.isExpanded(parent_index):
            self.expand(parent_index)

    def mousePressEvent(self, event):
        index = self.indexAt(event.pos())
        if not index.isValid():
            # Clear the selection
            self.clearSelection()
            # Clear the current index
            self.setCurrentIndex(QtCore.QModelIndex())

        super(HierarchyView, self).mousePressEvent(event)

    def keyPressEvent(self, event):
        call_super = False
        if event.key() == QtCore.Qt.Key_Delete:
            self._delete_items()

        elif event.matches(QtGui.QKeySequence.Copy):
            self._copy_items()

        elif event.matches(QtGui.QKeySequence.Paste):
            self._paste_items()

        elif event.key() in (QtCore.Qt.Key_Return, QtCore.Qt.Key_Enter):
            mdfs = event.modifiers()
            if mdfs == (QtCore.Qt.ShiftModifier | QtCore.Qt.ControlModifier):
                self._on_ctrl_shift_enter_pressed()
            elif mdfs == QtCore.Qt.ShiftModifier:
                self._on_shift_enter_pressed()
            else:
                if self.state() == HierarchyView.NoState:
                    self._on_enter_pressed()

        elif event.modifiers() == QtCore.Qt.ControlModifier:
            if event.key() == QtCore.Qt.Key_Left:
                self._on_left_ctrl_pressed()
            elif event.key() == QtCore.Qt.Key_Right:
                self._on_right_ctrl_pressed()
            elif event.key() == QtCore.Qt.Key_Up:
                self._on_up_ctrl_pressed()
            elif event.key() == QtCore.Qt.Key_Down:
                self._on_down_ctrl_pressed()
        else:
            call_super = True

        if call_super:
            super(HierarchyView, self).keyPressEvent(event)
        else:
            event.accept()

    def _copy_items(self, indexes=None):
        try:
            if indexes is None:
                indexes = self.selectedIndexes()
            mime_data = self._source_model.copy_mime_data(indexes)

            QtWidgets.QApplication.clipboard().setMimeData(mime_data)
            self._show_message("Tasks copied")
        except ValueError as exc:
            self._show_message(str(exc))

    def _paste_items(self):
        index = self.currentIndex()
        mime_data = QtWidgets.QApplication.clipboard().mimeData()
        self._source_model.paste_mime_data(index, mime_data)

    def _delete_items(self, indexes=None):
        if indexes is None:
            indexes = self.selectedIndexes()
        self._source_model.delete_indexes(indexes)

    def _on_ctrl_shift_enter_pressed(self):
        self._add_task_and_edit()

    def add_asset(self, parent_index=None):
        if parent_index is None:
            parent_index = self.currentIndex()

        if not parent_index.isValid():
            return

        # Stop editing
        self.setState(HierarchyView.NoState)
        QtWidgets.QApplication.processEvents()

        return self._source_model.add_new_asset(parent_index)

    def add_task(self, parent_index=None):
        if parent_index is None:
            parent_index = self.currentIndex()

        if not parent_index.isValid():
            return

        return self._source_model.add_new_task(parent_index)

    def _add_asset_and_edit(self, parent_index=None):
        new_index = self.add_asset(parent_index)
        if new_index is None:
            return

        # Change current index
        self.selectionModel().setCurrentIndex(
            new_index,
            QtCore.QItemSelectionModel.Clear
            | QtCore.QItemSelectionModel.Select
        )
        # Start editing
        self.edit(new_index)

    def _add_task_and_edit(self):
        new_index = self.add_task()
        if new_index is None:
            return

        # Stop editing
        self.setState(HierarchyView.NoState)
        QtWidgets.QApplication.processEvents()

        # TODO change hardcoded column index to coded
        task_type_index = self._source_model.index(
            new_index.row(), 1, new_index.parent()
        )
        # Change current index
        self.selectionModel().setCurrentIndex(
            task_type_index,
            QtCore.QItemSelectionModel.Clear
            | QtCore.QItemSelectionModel.Select
        )
        # Start editing
        self.edit(task_type_index)

    def _on_shift_enter_pressed(self):
        parent_index = self.currentIndex()
        if not parent_index.isValid():
            return

        if parent_index.data(ITEM_TYPE_ROLE) == "asset":
            parent_index = parent_index.parent()
        self._add_asset_and_edit(parent_index)

    def _on_up_ctrl_pressed(self):
        indexes = self.selectedIndexes()
        self._source_model.move_vertical(indexes, -1)

    def _on_down_ctrl_pressed(self):
        indexes = self.selectedIndexes()
        self._source_model.move_vertical(indexes, 1)

    def _on_left_ctrl_pressed(self):
        indexes = self.selectedIndexes()
        self._source_model.move_horizontal(indexes, -1)

    def _on_right_ctrl_pressed(self):
        indexes = self.selectedIndexes()
        self._source_model.move_horizontal(indexes, 1)

    def _on_enter_pressed(self):
        index = self.currentIndex()
        if (
            index.isValid()
            and index.flags() & QtCore.Qt.ItemIsEditable
        ):
            self.edit(index)

    def _remove_delete_flag(self, item_ids):
        """Remove deletion flag on items marked for deletion."""
        self._source_model.remove_delete_flag(item_ids)

    def _expand_items(self, indexes):
        """Expand multiple items with all their children.

        Args:
            indexes (list): List of QModelIndex that should be expanded.
        """
        process_queue = Queue()
        for index in indexes:
            if index.column() == 0:
                process_queue.put(index)

        item_ids = set()
        # Use deque as expanding not yet visible items first is faster
        indexes_deque = collections.deque()
        while not process_queue.empty():
            index = process_queue.get()
            item_id = index.data(IDENTIFIER_ROLE)
            if item_id in item_ids:
                continue
            item_ids.add(item_id)

            indexes_deque.append(index)

            for row in range(self._source_model.rowCount(index)):
                process_queue.put(self._source_model.index(
                    row, 0, index
                ))

        while indexes_deque:
            self.expand(indexes_deque.pop())

    def _collapse_items(self, indexes):
        """Collapse multiple items with all their children.

        Args:
            indexes (list): List of QModelIndex that should be collapsed.
        """
        item_ids = set()
        process_queue = Queue()
        for index in indexes:
            if index.column() == 0:
                process_queue.put(index)

        while not process_queue.empty():
            index = process_queue.get()
            item_id = index.data(IDENTIFIER_ROLE)
            if item_id in item_ids:
                continue
            item_ids.add(item_id)

            self.collapse(index)

            for row in range(self._source_model.rowCount(index)):
                process_queue.put(self._source_model.index(
                    row, 0, index
                ))

    def _show_message(self, message):
        """Show message to user."""
        self._parent.show_message(message)

    def _on_context_menu(self, point):
        """Context menu on right click.

        Currently the menu is shown only on the "name" column.
        """
        index = self.indexAt(point)
        column = index.column()
        if column != 0:
            return

        actions = []

        context_menu = QtWidgets.QMenu(self)

        indexes = self.selectedIndexes()

        items_by_id = {}
        for index in indexes:
            if index.column() != column:
                continue

            item_id = index.data(IDENTIFIER_ROLE)
            items_by_id[item_id] = self._source_model.items_by_id[item_id]

        item_ids = tuple(items_by_id.keys())
        if len(item_ids) == 1:
            item = items_by_id[item_ids[0]]
            item_type = item.data(ITEM_TYPE_ROLE)
            if item_type in ("asset", "project"):
                add_asset_action = QtWidgets.QAction("Add Asset", context_menu)
                add_asset_action.triggered.connect(
                    self._add_asset_and_edit
                )
                actions.append(add_asset_action)

            if item_type in ("asset", "task"):
                add_task_action = QtWidgets.QAction("Add Task", context_menu)
                add_task_action.triggered.connect(
                    self._add_task_and_edit
                )
                actions.append(add_task_action)

        # Remove delete tag on items
        removed_item_ids = []
        show_delete_items = False
        for item_id, item in items_by_id.items():
            if item.data(REMOVED_ROLE):
                removed_item_ids.append(item_id)
            elif (
                not show_delete_items
                and item.data(ITEM_TYPE_ROLE) != "project"
                and item.data(HIERARCHY_CHANGE_ABLE_ROLE)
            ):
                show_delete_items = True

        if show_delete_items:
            action = QtWidgets.QAction("Delete items", context_menu)
            action.triggered.connect(
                lambda: self._delete_items()
            )
            actions.append(action)

        if removed_item_ids:
            action = QtWidgets.QAction("Keep items", context_menu)
            action.triggered.connect(
                lambda: self._remove_delete_flag(removed_item_ids)
            )
            actions.append(action)

        # Collapse/Expand action
        show_collapse_expand_action = False
        for item_id in item_ids:
            item = items_by_id[item_id]
            item_type = item.data(ITEM_TYPE_ROLE)
            if item_type != "task":
                show_collapse_expand_action = True
                break

        if show_collapse_expand_action:
            expand_action = QtWidgets.QAction("Expand all", context_menu)
            collapse_action = QtWidgets.QAction("Collapse all", context_menu)
            expand_action.triggered.connect(
                lambda: self._expand_items(indexes)
            )
            collapse_action.triggered.connect(
                lambda: self._collapse_items(indexes)
            )
            actions.append(expand_action)
            actions.append(collapse_action)

        if not actions:
            return

        for action in actions:
            context_menu.addAction(action)

        global_point = self.viewport().mapToGlobal(point)
        context_menu.exec_(global_point)
281 openpype/tools/project_manager/project_manager/widgets.py Normal file

@@ -0,0 +1,281 @@
import re
|
||||
|
||||
from .constants import (
|
||||
NAME_ALLOWED_SYMBOLS,
|
||||
NAME_REGEX
|
||||
)
|
||||
from openpype.lib import (
|
||||
create_project,
|
||||
PROJECT_NAME_ALLOWED_SYMBOLS,
|
||||
PROJECT_NAME_REGEX
|
||||
)
|
||||
from avalon.api import AvalonMongoDB
|
||||
|
||||
from Qt import QtWidgets, QtCore
|
||||
|
||||
|
||||
class NameTextEdit(QtWidgets.QLineEdit):
|
||||
def __init__(self, *args, **kwargs):
|
||||
super(NameTextEdit, self).__init__(*args, **kwargs)
|
||||
|
||||
self.textChanged.connect(self._on_text_change)
|
||||
|
||||
def _on_text_change(self, text):
|
||||
if NAME_REGEX.match(text):
|
||||
return
|
||||
|
||||
idx = self.cursorPosition()
|
||||
before_text = text[0:idx]
|
||||
after_text = text[idx:len(text)]
|
||||
sub_regex = "[^{}]+".format(NAME_ALLOWED_SYMBOLS)
|
||||
new_before_text = re.sub(sub_regex, "", before_text)
|
        new_after_text = re.sub(sub_regex, "", after_text)
        idx -= (len(before_text) - len(new_before_text))

        self.setText(new_before_text + new_after_text)
        self.setCursorPosition(idx)


class FilterComboBox(QtWidgets.QComboBox):
    def __init__(self, parent=None):
        super(FilterComboBox, self).__init__(parent)

        self._last_value = None

        self.setFocusPolicy(QtCore.Qt.StrongFocus)
        self.setEditable(True)

        filter_proxy_model = QtCore.QSortFilterProxyModel(self)
        filter_proxy_model.setFilterCaseSensitivity(QtCore.Qt.CaseInsensitive)
        filter_proxy_model.setSourceModel(self.model())

        completer = QtWidgets.QCompleter(filter_proxy_model, self)
        completer.setCompletionMode(
            QtWidgets.QCompleter.UnfilteredPopupCompletion
        )
        self.setCompleter(completer)

        self.lineEdit().textEdited.connect(
            filter_proxy_model.setFilterFixedString
        )
        completer.activated.connect(self.on_completer_activated)

        self._completer = completer
        self._filter_proxy_model = filter_proxy_model

    def focusInEvent(self, event):
        super(FilterComboBox, self).focusInEvent(event)
        self._last_value = self.lineEdit().text()
        self.lineEdit().selectAll()

    def value_cleanup(self):
        text = self.lineEdit().text()
        idx = self.findText(text)
        if idx < 0:
            count = self._completer.completionModel().rowCount()
            if count > 0:
                index = self._completer.completionModel().index(0, 0)
                text = index.data(QtCore.Qt.DisplayRole)
                idx = self.findText(text)
            elif self._last_value is not None:
                idx = self.findText(self._last_value)

        if idx < 0:
            idx = 0
        self.setCurrentIndex(idx)

    def on_completer_activated(self, text):
        if text:
            index = self.findText(text)
            self.setCurrentIndex(index)

    def keyPressEvent(self, event):
        if event.key() in (QtCore.Qt.Key_Return, QtCore.Qt.Key_Enter):
            self.value_cleanup()

        super(FilterComboBox, self).keyPressEvent(event)

    def setModel(self, model):
        super(FilterComboBox, self).setModel(model)
        self._filter_proxy_model.setSourceModel(model)
        self._completer.setModel(self._filter_proxy_model)

    def setModelColumn(self, column):
        self._completer.setCompletionColumn(column)
        self._filter_proxy_model.setFilterKeyColumn(column)
        super(FilterComboBox, self).setModelColumn(column)

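The completer above relies on `QSortFilterProxyModel.setFilterFixedString` with case-insensitive matching. Independent of Qt, the matching rule it applies is plain case-insensitive substring containment; a minimal sketch of that rule (`filter_fixed_string` is a hypothetical helper for illustration, not part of this codebase):

```python
def filter_fixed_string(items, pattern):
    """Case-insensitive substring match -- the rule QSortFilterProxyModel
    applies for setFilterFixedString with Qt.CaseInsensitive."""
    pattern = pattern.lower()
    return [item for item in items if pattern in item.lower()]


projects = ["Alpha", "Beta", "alphabet", "Gamma"]
print(filter_fixed_string(projects, "alp"))  # ['Alpha', 'alphabet']
```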

class CreateProjectDialog(QtWidgets.QDialog):
    def __init__(self, parent=None, dbcon=None):
        super(CreateProjectDialog, self).__init__(parent)

        self.setWindowTitle("Create Project")

        self.allowed_regex = "[^{}]+".format(PROJECT_NAME_ALLOWED_SYMBOLS)

        if dbcon is None:
            dbcon = AvalonMongoDB()

        self.dbcon = dbcon
        self._ignore_code_change = False
        self._project_name_is_valid = False
        self._project_code_is_valid = False
        self._project_code_value = None

        project_names, project_codes = self._get_existing_projects()

        inputs_widget = QtWidgets.QWidget(self)
        project_name_input = QtWidgets.QLineEdit(inputs_widget)
        project_code_input = QtWidgets.QLineEdit(inputs_widget)
        library_project_input = QtWidgets.QCheckBox(inputs_widget)

        inputs_layout = QtWidgets.QFormLayout(inputs_widget)
        inputs_layout.setContentsMargins(0, 0, 0, 0)
        inputs_layout.addRow("Project name:", project_name_input)
        inputs_layout.addRow("Project code:", project_code_input)
        inputs_layout.addRow("Library project:", library_project_input)

        project_name_label = QtWidgets.QLabel(self)
        project_code_label = QtWidgets.QLabel(self)

        btns_widget = QtWidgets.QWidget(self)
        ok_btn = QtWidgets.QPushButton("Ok", btns_widget)
        ok_btn.setEnabled(False)
        cancel_btn = QtWidgets.QPushButton("Cancel", btns_widget)
        btns_layout = QtWidgets.QHBoxLayout(btns_widget)
        btns_layout.setContentsMargins(0, 0, 0, 0)
        btns_layout.addStretch(1)
        btns_layout.addWidget(ok_btn)
        btns_layout.addWidget(cancel_btn)

        main_layout = QtWidgets.QVBoxLayout(self)
        main_layout.addWidget(inputs_widget, 0)
        main_layout.addWidget(project_name_label, 1)
        main_layout.addWidget(project_code_label, 1)
        main_layout.addStretch(1)
        main_layout.addWidget(btns_widget, 0)

        project_name_input.textChanged.connect(self._on_project_name_change)
        project_code_input.textChanged.connect(self._on_project_code_change)
        ok_btn.clicked.connect(self._on_ok_clicked)
        cancel_btn.clicked.connect(self._on_cancel_clicked)

        self.invalid_project_names = project_names
        self.invalid_project_codes = project_codes

        self.project_name_label = project_name_label
        self.project_code_label = project_code_label

        self.project_name_input = project_name_input
        self.project_code_input = project_code_input
        self.library_project_input = library_project_input

        self.ok_btn = ok_btn

    @property
    def project_name(self):
        return self.project_name_input.text()

    def _on_project_name_change(self, value):
        if self._project_code_value is None:
            self._ignore_code_change = True
            self.project_code_input.setText(value.lower())
            self._ignore_code_change = False

        self._update_valid_project_name(value)

    def _on_project_code_change(self, value):
        if not value:
            value = None

        self._update_valid_project_code(value)

        if not self._ignore_code_change:
            self._project_code_value = value

    def _update_valid_project_name(self, value):
        message = ""
        is_valid = True
        if not value:
            message = "Project name is empty"
            is_valid = False

        elif value in self.invalid_project_names:
            message = "Project name \"{}\" already exists".format(value)
            is_valid = False

        elif not PROJECT_NAME_REGEX.match(value):
            message = (
                "Project name \"{}\" contains unsupported symbols"
            ).format(value)
            is_valid = False

        self._project_name_is_valid = is_valid
        self.project_name_label.setText(message)
        self.project_name_label.setVisible(bool(message))
        self._enable_button()

    def _update_valid_project_code(self, value):
        message = ""
        is_valid = True
        if not value:
            message = "Project code is empty"
            is_valid = False

        elif value in self.invalid_project_codes:
            message = "Project code \"{}\" already exists".format(value)
            is_valid = False

        elif not PROJECT_NAME_REGEX.match(value):
            message = (
                "Project code \"{}\" contains unsupported symbols"
            ).format(value)
            is_valid = False

        self._project_code_is_valid = is_valid
        self.project_code_label.setText(message)
        self._enable_button()

    def _enable_button(self):
        self.ok_btn.setEnabled(
            self._project_name_is_valid and self._project_code_is_valid
        )

    def _on_cancel_clicked(self):
        self.done(0)

    def _on_ok_clicked(self):
        if not self._project_name_is_valid or not self._project_code_is_valid:
            return

        project_name = self.project_name_input.text()
        project_code = self.project_code_input.text()
        library_project = self.library_project_input.isChecked()
        create_project(project_name, project_code, library_project, self.dbcon)

        self.done(1)

    def _get_existing_projects(self):
        project_names = set()
        project_codes = set()
        for project_name in self.dbcon.database.collection_names():
            # Each collection will have exactly one project document
            project_doc = self.dbcon.database[project_name].find_one(
                {"type": "project"},
                {"name": 1, "data.code": 1}
            )
            if not project_doc:
                continue

            project_name = project_doc.get("name")
            if not project_name:
                continue

            project_names.add(project_name)
            project_code = project_doc.get("data", {}).get("code")
            if not project_code:
                project_code = project_name

            project_codes.add(project_code)
        return project_names, project_codes
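The dialog's validation chain (empty, duplicate, unsupported symbols) can be exercised outside Qt. A minimal sketch, assuming a simple alphanumeric-plus-underscore pattern for `PROJECT_NAME_REGEX` (the real constant lives elsewhere in openpype and may differ):

```python
import re

# Hypothetical stand-ins for the module-level constants the dialog imports;
# the real values are defined elsewhere in openpype.
PROJECT_NAME_ALLOWED_SYMBOLS = "a-zA-Z0-9_"
PROJECT_NAME_REGEX = re.compile("^[{}]+$".format(PROJECT_NAME_ALLOWED_SYMBOLS))


def validate_project_name(value, existing_names):
    """Mirror the dialog's three checks: empty, duplicate, bad symbols."""
    if not value:
        return False, "Project name is empty"
    if value in existing_names:
        return False, "Project name \"{}\" already exists".format(value)
    if not PROJECT_NAME_REGEX.match(value):
        return False, (
            "Project name \"{}\" contains unsupported symbols".format(value)
        )
    return True, ""


print(validate_project_name("my_project", {"other"}))  # (True, '')
print(validate_project_name("my project", {"other"}))  # space is rejected
```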
openpype/tools/project_manager/project_manager/window.py  (new file, 176 lines)

@@ -0,0 +1,176 @@
from Qt import QtWidgets, QtCore, QtGui

from . import (
    ProjectModel,

    HierarchyModel,
    HierarchySelectionModel,
    HierarchyView,

    CreateProjectDialog
)
from .style import load_stylesheet, ResourceCache

from openpype import resources
from avalon.api import AvalonMongoDB


class ProjectManagerWindow(QtWidgets.QWidget):
    def __init__(self, parent=None):
        super(ProjectManagerWindow, self).__init__(parent)

        self.setWindowTitle("OpenPype Project Manager")
        self.setWindowIcon(QtGui.QIcon(resources.pype_icon_filepath()))

        # Top part of window
        top_part_widget = QtWidgets.QWidget(self)

        # Project selection
        project_widget = QtWidgets.QWidget(top_part_widget)

        dbcon = AvalonMongoDB()

        project_model = ProjectModel(dbcon)
        project_combobox = QtWidgets.QComboBox(project_widget)
        project_combobox.setModel(project_model)
        project_combobox.setRootModelIndex(QtCore.QModelIndex())

        refresh_projects_btn = QtWidgets.QPushButton(project_widget)
        refresh_projects_btn.setIcon(ResourceCache.get_icon("refresh"))
        refresh_projects_btn.setToolTip("Refresh projects")
        refresh_projects_btn.setObjectName("RefreshBtn")

        create_project_btn = QtWidgets.QPushButton(
            "Create project...", project_widget
        )

        project_layout = QtWidgets.QHBoxLayout(project_widget)
        project_layout.setContentsMargins(0, 0, 0, 0)
        project_layout.addWidget(project_combobox, 0)
        project_layout.addWidget(refresh_projects_btn, 0)
        project_layout.addWidget(create_project_btn, 0)
        project_layout.addStretch(1)

        # Helper buttons
        helper_btns_widget = QtWidgets.QWidget(top_part_widget)

        helper_label = QtWidgets.QLabel("Add:", helper_btns_widget)
        add_asset_btn = QtWidgets.QPushButton(
            ResourceCache.get_icon("asset", "default"),
            "Asset",
            helper_btns_widget
        )
        add_task_btn = QtWidgets.QPushButton(
            ResourceCache.get_icon("task", "default"),
            "Task",
            helper_btns_widget
        )

        helper_btns_layout = QtWidgets.QHBoxLayout(helper_btns_widget)
        helper_btns_layout.setContentsMargins(0, 0, 0, 0)
        helper_btns_layout.addWidget(helper_label)
        helper_btns_layout.addWidget(add_asset_btn)
        helper_btns_layout.addWidget(add_task_btn)
        helper_btns_layout.addStretch(1)

        # Add widgets to top widget layout
        top_part_layout = QtWidgets.QVBoxLayout(top_part_widget)
        top_part_layout.setContentsMargins(0, 0, 0, 0)
        top_part_layout.addWidget(project_widget)
        top_part_layout.addWidget(helper_btns_widget)

        hierarchy_model = HierarchyModel(dbcon)

        hierarchy_view = HierarchyView(dbcon, hierarchy_model, self)
        hierarchy_view.setModel(hierarchy_model)

        _selection_model = HierarchySelectionModel(
            hierarchy_model.multiselection_column_indexes
        )
        _selection_model.setModel(hierarchy_view.model())
        hierarchy_view.setSelectionModel(_selection_model)

        buttons_widget = QtWidgets.QWidget(self)

        message_label = QtWidgets.QLabel(buttons_widget)
        save_btn = QtWidgets.QPushButton("Save", buttons_widget)

        buttons_layout = QtWidgets.QHBoxLayout(buttons_widget)
        buttons_layout.setContentsMargins(0, 0, 0, 0)
        buttons_layout.addWidget(message_label)
        buttons_layout.addStretch(1)
        buttons_layout.addWidget(save_btn)

        main_layout = QtWidgets.QVBoxLayout(self)
        main_layout.addWidget(top_part_widget)
        main_layout.addWidget(hierarchy_view)
        main_layout.addWidget(buttons_widget)

        refresh_projects_btn.clicked.connect(self._on_project_refresh)
        create_project_btn.clicked.connect(self._on_project_create)
        project_combobox.currentIndexChanged.connect(self._on_project_change)
        save_btn.clicked.connect(self._on_save_click)
        add_asset_btn.clicked.connect(self._on_add_asset)
        add_task_btn.clicked.connect(self._on_add_task)

        self.project_model = project_model
        self.project_combobox = project_combobox

        self.hierarchy_view = hierarchy_view
        self.hierarchy_model = hierarchy_model

        self.message_label = message_label

        self.resize(1200, 600)
        self.setStyleSheet(load_stylesheet())

        self.refresh_projects()

    def _set_project(self, project_name=None):
        self.hierarchy_view.set_project(project_name)

    def refresh_projects(self, project_name=None):
        if project_name is None:
            if self.project_combobox.count() > 0:
                project_name = self.project_combobox.currentText()

        self.project_model.refresh()

        if self.project_combobox.count() == 0:
            return self._set_project()

        if project_name:
            row = self.project_combobox.findText(project_name)
            if row >= 0:
                self.project_combobox.setCurrentIndex(row)

        self._set_project(self.project_combobox.currentText())

    def _on_project_change(self):
        self._set_project(self.project_combobox.currentText())

    def _on_project_refresh(self):
        self.refresh_projects()

    def _on_save_click(self):
        self.hierarchy_model.save()

    def _on_add_asset(self):
        self.hierarchy_view.add_asset()

    def _on_add_task(self):
        self.hierarchy_view.add_task()

    def show_message(self, message):
        # TODO add nicer message popup
        self.message_label.setText(message)

    def _on_project_create(self):
        dialog = CreateProjectDialog(self)
        dialog.exec_()
        if dialog.result() != 1:
            return

        project_name = dialog.project_name
        self.show_message("Created project \"{}\"".format(project_name))
        self.refresh_projects(project_name)
@@ -1,3 +1,3 @@
 # -*- coding: utf-8 -*-
 """Package declaring Pype version."""
-__version__ = "3.0.0-rc4"
+__version__ = "3.0.0-rc.5"
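The bump above also switches the pre-release tag from `rc4` to the dotted `rc.5` form, and the dot matters under SemVer precedence: `rc.5` splits into the identifiers `rc` and `5`, while `rc4` is a single identifier that sorts lexically after `rc`, so strict SemVer tooling orders `3.0.0-rc.5` *before* `3.0.0-rc4`. A sketch of the identifier comparison (illustrative, not the project's code):

```python
def _cmp_ident(a, b):
    # Numeric identifiers compare numerically and rank below alphanumeric ones.
    a_num, b_num = a.isdigit(), b.isdigit()
    if a_num and b_num:
        return (int(a) > int(b)) - (int(a) < int(b))
    if a_num != b_num:
        return -1 if a_num else 1
    return (a > b) - (a < b)


def cmp_prerelease(a, b):
    """Compare two pre-release strings per SemVer precedence (item 11)."""
    a_ids, b_ids = a.split("."), b.split(".")
    for x, y in zip(a_ids, b_ids):
        c = _cmp_ident(x, y)
        if c:
            return c
    # When the shorter one is a prefix, the longer identifier set ranks higher.
    return (len(a_ids) > len(b_ids)) - (len(a_ids) < len(b_ids))


print(cmp_prerelease("rc.4", "rc.5"))  # -1: rc.4 precedes rc.5
print(cmp_prerelease("rc4", "rc.5"))   # 1: "rc4" sorts after the identifier "rc"
```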
389  poetry.lock  (generated)
@ -80,11 +80,11 @@ python-dateutil = ">=2.7.0"
|
|||
|
||||
[[package]]
|
||||
name = "astroid"
|
||||
version = "2.5.3"
|
||||
version = "2.5.6"
|
||||
description = "An abstract syntax tree for Python with inference support."
|
||||
category = "dev"
|
||||
optional = false
|
||||
python-versions = ">=3.6"
|
||||
python-versions = "~=3.6"
|
||||
|
||||
[package.dependencies]
|
||||
lazy-object-proxy = ">=1.4.0"
|
||||
|
|
@ -109,21 +109,21 @@ python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
|
|||
|
||||
[[package]]
|
||||
name = "attrs"
|
||||
version = "20.3.0"
|
||||
version = "21.2.0"
|
||||
description = "Classes Without Boilerplate"
|
||||
category = "main"
|
||||
optional = false
|
||||
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
|
||||
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
|
||||
|
||||
[package.extras]
|
||||
dev = ["coverage[toml] (>=5.0.2)", "hypothesis", "pympler", "pytest (>=4.3.0)", "six", "zope.interface", "furo", "sphinx", "pre-commit"]
|
||||
docs = ["furo", "sphinx", "zope.interface"]
|
||||
tests = ["coverage[toml] (>=5.0.2)", "hypothesis", "pympler", "pytest (>=4.3.0)", "six", "zope.interface"]
|
||||
tests_no_zope = ["coverage[toml] (>=5.0.2)", "hypothesis", "pympler", "pytest (>=4.3.0)", "six"]
|
||||
dev = ["coverage[toml] (>=5.0.2)", "hypothesis", "pympler", "pytest (>=4.3.0)", "six", "mypy", "pytest-mypy-plugins", "zope.interface", "furo", "sphinx", "sphinx-notfound-page", "pre-commit"]
|
||||
docs = ["furo", "sphinx", "zope.interface", "sphinx-notfound-page"]
|
||||
tests = ["coverage[toml] (>=5.0.2)", "hypothesis", "pympler", "pytest (>=4.3.0)", "six", "mypy", "pytest-mypy-plugins", "zope.interface"]
|
||||
tests_no_zope = ["coverage[toml] (>=5.0.2)", "hypothesis", "pympler", "pytest (>=4.3.0)", "six", "mypy", "pytest-mypy-plugins"]
|
||||
|
||||
[[package]]
|
||||
name = "autopep8"
|
||||
version = "1.5.6"
|
||||
version = "1.5.7"
|
||||
description = "A tool that automatically formats Python code to conform to the PEP 8 style guide"
|
||||
category = "dev"
|
||||
optional = false
|
||||
|
|
@ -135,7 +135,7 @@ toml = "*"
|
|||
|
||||
[[package]]
|
||||
name = "babel"
|
||||
version = "2.9.0"
|
||||
version = "2.9.1"
|
||||
description = "Internationalization utilities"
|
||||
category = "dev"
|
||||
optional = false
|
||||
|
|
@ -159,7 +159,7 @@ wcwidth = ">=0.1.4"
|
|||
|
||||
[[package]]
|
||||
name = "cachetools"
|
||||
version = "4.2.1"
|
||||
version = "4.2.2"
|
||||
description = "Extensible memoizing collections and decorators"
|
||||
category = "main"
|
||||
optional = false
|
||||
|
|
@ -335,7 +335,7 @@ python-versions = "*"
|
|||
|
||||
[[package]]
|
||||
name = "flake8"
|
||||
version = "3.9.1"
|
||||
version = "3.9.2"
|
||||
description = "the modular source code checker: pep8 pyflakes and co"
|
||||
category = "dev"
|
||||
optional = false
|
||||
|
|
@ -413,7 +413,7 @@ uritemplate = ">=3.0.0,<4dev"
|
|||
|
||||
[[package]]
|
||||
name = "google-auth"
|
||||
version = "1.29.0"
|
||||
version = "1.30.0"
|
||||
description = "Google Authentication Library"
|
||||
category = "main"
|
||||
optional = false
|
||||
|
|
@ -486,7 +486,7 @@ python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
|
|||
|
||||
[[package]]
|
||||
name = "importlib-metadata"
|
||||
version = "4.0.0"
|
||||
version = "4.0.1"
|
||||
description = "Read metadata from Python packages"
|
||||
category = "main"
|
||||
optional = false
|
||||
|
|
@ -736,7 +736,7 @@ python-versions = "*"
|
|||
|
||||
[[package]]
|
||||
name = "protobuf"
|
||||
version = "3.15.8"
|
||||
version = "3.17.0"
|
||||
description = "Protocol Buffers"
|
||||
category = "main"
|
||||
optional = false
|
||||
|
|
@ -826,7 +826,7 @@ python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
|
|||
|
||||
[[package]]
|
||||
name = "pygments"
|
||||
version = "2.8.1"
|
||||
version = "2.9.0"
|
||||
description = "Pygments is a syntax highlighting package written in Python."
|
||||
category = "dev"
|
||||
optional = false
|
||||
|
|
@ -834,25 +834,22 @@ python-versions = ">=3.5"
|
|||
|
||||
[[package]]
|
||||
name = "pylint"
|
||||
version = "2.7.4"
|
||||
version = "2.8.2"
|
||||
description = "python code static checker"
|
||||
category = "dev"
|
||||
optional = false
|
||||
python-versions = "~=3.6"
|
||||
|
||||
[package.dependencies]
|
||||
astroid = ">=2.5.2,<2.7"
|
||||
astroid = ">=2.5.6,<2.7"
|
||||
colorama = {version = "*", markers = "sys_platform == \"win32\""}
|
||||
isort = ">=4.2.5,<6"
|
||||
mccabe = ">=0.6,<0.7"
|
||||
toml = ">=0.7.1"
|
||||
|
||||
[package.extras]
|
||||
docs = ["sphinx (==3.5.1)", "python-docs-theme (==2020.12)"]
|
||||
|
||||
[[package]]
|
||||
name = "pymongo"
|
||||
version = "3.11.3"
|
||||
version = "3.11.4"
|
||||
description = "Python driver for MongoDB <http://www.mongodb.org>"
|
||||
category = "main"
|
||||
optional = false
|
||||
|
|
@ -884,7 +881,7 @@ six = "*"
|
|||
|
||||
[[package]]
|
||||
name = "pyobjc-core"
|
||||
version = "7.1"
|
||||
version = "7.2"
|
||||
description = "Python<->ObjC Interoperability Module"
|
||||
category = "main"
|
||||
optional = false
|
||||
|
|
@ -892,26 +889,26 @@ python-versions = ">=3.6"
|
|||
|
||||
[[package]]
|
||||
name = "pyobjc-framework-cocoa"
|
||||
version = "7.1"
|
||||
version = "7.2"
|
||||
description = "Wrappers for the Cocoa frameworks on macOS"
|
||||
category = "main"
|
||||
optional = false
|
||||
python-versions = ">=3.6"
|
||||
|
||||
[package.dependencies]
|
||||
pyobjc-core = ">=7.1"
|
||||
pyobjc-core = ">=7.2"
|
||||
|
||||
[[package]]
|
||||
name = "pyobjc-framework-quartz"
|
||||
version = "7.1"
|
||||
version = "7.2"
|
||||
description = "Wrappers for the Quartz frameworks on macOS"
|
||||
category = "main"
|
||||
optional = false
|
||||
python-versions = ">=3.6"
|
||||
|
||||
[package.dependencies]
|
||||
pyobjc-core = ">=7.1"
|
||||
pyobjc-framework-Cocoa = ">=7.1"
|
||||
pyobjc-core = ">=7.2"
|
||||
pyobjc-framework-Cocoa = ">=7.2"
|
||||
|
||||
[[package]]
|
||||
name = "pyparsing"
|
||||
|
|
@ -943,7 +940,7 @@ python-versions = "*"
|
|||
|
||||
[[package]]
|
||||
name = "pyqt5-sip"
|
||||
version = "12.8.1"
|
||||
version = "12.9.0"
|
||||
description = "The sip module support for PyQt5"
|
||||
category = "main"
|
||||
optional = false
|
||||
|
|
@ -959,7 +956,7 @@ python-versions = ">=3.5"
|
|||
|
||||
[[package]]
|
||||
name = "pytest"
|
||||
version = "6.2.3"
|
||||
version = "6.2.4"
|
||||
description = "pytest: simple powerful testing with Python"
|
||||
category = "dev"
|
||||
optional = false
|
||||
|
|
@ -1124,9 +1121,17 @@ python-versions = ">=3.6"
|
|||
cryptography = ">=2.0"
|
||||
jeepney = ">=0.6"
|
||||
|
||||
[[package]]
|
||||
name = "semver"
|
||||
version = "2.13.0"
|
||||
description = "Python helper for Semantic Versioning (http://semver.org/)"
|
||||
category = "main"
|
||||
optional = false
|
||||
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
|
||||
|
||||
[[package]]
|
||||
name = "six"
|
||||
version = "1.15.0"
|
||||
version = "1.16.0"
|
||||
description = "Python 2 and 3 compatibility utilities"
|
||||
category = "main"
|
||||
optional = false
|
||||
|
|
@ -1150,19 +1155,20 @@ python-versions = "*"
|
|||
|
||||
[[package]]
|
||||
name = "sphinx"
|
||||
version = "3.5.4"
|
||||
version = "4.0.1"
|
||||
description = "Python documentation generator"
|
||||
category = "dev"
|
||||
optional = false
|
||||
python-versions = ">=3.5"
|
||||
python-versions = ">=3.6"
|
||||
|
||||
[package.dependencies]
|
||||
alabaster = ">=0.7,<0.8"
|
||||
babel = ">=1.3"
|
||||
colorama = {version = ">=0.3.5", markers = "sys_platform == \"win32\""}
|
||||
docutils = ">=0.12,<0.17"
|
||||
docutils = ">=0.14,<0.18"
|
||||
imagesize = "*"
|
||||
Jinja2 = ">=2.3"
|
||||
Jinja2 = ">=2.3,<3.0"
|
||||
MarkupSafe = "<2.0"
|
||||
packaging = "*"
|
||||
Pygments = ">=2.0"
|
||||
requests = ">=2.5.0"
|
||||
|
|
@ -1318,7 +1324,7 @@ python-versions = "*"
|
|||
|
||||
[[package]]
|
||||
name = "typing-extensions"
|
||||
version = "3.7.4.3"
|
||||
version = "3.10.0.0"
|
||||
description = "Backported and Experimental Type Hints for Python 3.5+"
|
||||
category = "main"
|
||||
optional = false
|
||||
|
|
@ -1355,7 +1361,7 @@ python-versions = "*"
|
|||
|
||||
[[package]]
|
||||
name = "websocket-client"
|
||||
version = "0.58.0"
|
||||
version = "0.59.0"
|
||||
description = "WebSocket client for Python with low level API options"
|
||||
category = "main"
|
||||
optional = false
|
||||
|
|
@ -1417,7 +1423,7 @@ testing = ["pytest (>=4.6)", "pytest-checkdocs (>=1.2.3)", "pytest-flake8", "pyt
|
|||
[metadata]
|
||||
lock-version = "1.1"
|
||||
python-versions = "3.7.*"
|
||||
content-hash = "80fde42aade7fc90bb68d85f0d9b3feb27fc3744d72eb5af6a11b6c9d9836aca"
|
||||
content-hash = "9e067714903bf7e438bc11556b58b6b96be6b079e9a245690c84de8493fa516e"
|
||||
|
||||
[metadata.files]
|
||||
acre = []
|
||||
|
|
@ -1481,8 +1487,8 @@ arrow = [
|
|||
{file = "arrow-0.17.0.tar.gz", hash = "sha256:ff08d10cda1d36c68657d6ad20d74fbea493d980f8b2d45344e00d6ed2bf6ed4"},
|
||||
]
|
||||
astroid = [
|
||||
{file = "astroid-2.5.3-py3-none-any.whl", hash = "sha256:bea3f32799fbb8581f58431c12591bc20ce11cbc90ad82e2ea5717d94f2080d5"},
|
||||
{file = "astroid-2.5.3.tar.gz", hash = "sha256:ad63b8552c70939568966811a088ef0bc880f99a24a00834abd0e3681b514f91"},
|
||||
{file = "astroid-2.5.6-py3-none-any.whl", hash = "sha256:4db03ab5fc3340cf619dbc25e42c2cc3755154ce6009469766d7143d1fc2ee4e"},
|
||||
{file = "astroid-2.5.6.tar.gz", hash = "sha256:8a398dfce302c13f14bab13e2b14fe385d32b73f4e4853b9bdfb64598baa1975"},
|
||||
]
|
||||
async-timeout = [
|
||||
{file = "async-timeout-3.0.1.tar.gz", hash = "sha256:0c3c816a028d47f659d6ff5c745cb2acf1f966da1fe5c19c77a70282b25f4c5f"},
|
||||
|
|
@ -1493,24 +1499,24 @@ atomicwrites = [
|
|||
{file = "atomicwrites-1.4.0.tar.gz", hash = "sha256:ae70396ad1a434f9c7046fd2dd196fc04b12f9e91ffb859164193be8b6168a7a"},
|
||||
]
|
||||
attrs = [
|
||||
{file = "attrs-20.3.0-py2.py3-none-any.whl", hash = "sha256:31b2eced602aa8423c2aea9c76a724617ed67cf9513173fd3a4f03e3a929c7e6"},
|
||||
{file = "attrs-20.3.0.tar.gz", hash = "sha256:832aa3cde19744e49938b91fea06d69ecb9e649c93ba974535d08ad92164f700"},
|
||||
{file = "attrs-21.2.0-py2.py3-none-any.whl", hash = "sha256:149e90d6d8ac20db7a955ad60cf0e6881a3f20d37096140088356da6c716b0b1"},
|
||||
{file = "attrs-21.2.0.tar.gz", hash = "sha256:ef6aaac3ca6cd92904cdd0d83f629a15f18053ec84e6432106f7a4d04ae4f5fb"},
|
||||
]
|
||||
autopep8 = [
|
||||
{file = "autopep8-1.5.6-py2.py3-none-any.whl", hash = "sha256:f01b06a6808bc31698db907761e5890eb2295e287af53f6693b39ce55454034a"},
|
||||
{file = "autopep8-1.5.6.tar.gz", hash = "sha256:5454e6e9a3d02aae38f866eec0d9a7de4ab9f93c10a273fb0340f3d6d09f7514"},
|
||||
{file = "autopep8-1.5.7-py2.py3-none-any.whl", hash = "sha256:aa213493c30dcdac99537249ee65b24af0b2c29f2e83cd8b3f68760441ed0db9"},
|
||||
{file = "autopep8-1.5.7.tar.gz", hash = "sha256:276ced7e9e3cb22e5d7c14748384a5cf5d9002257c0ed50c0e075b68011bb6d0"},
|
||||
]
|
||||
babel = [
|
||||
{file = "Babel-2.9.0-py2.py3-none-any.whl", hash = "sha256:9d35c22fcc79893c3ecc85ac4a56cde1ecf3f19c540bba0922308a6c06ca6fa5"},
|
||||
{file = "Babel-2.9.0.tar.gz", hash = "sha256:da031ab54472314f210b0adcff1588ee5d1d1d0ba4dbd07b94dba82bde791e05"},
|
||||
{file = "Babel-2.9.1-py2.py3-none-any.whl", hash = "sha256:ab49e12b91d937cd11f0b67cb259a57ab4ad2b59ac7a3b41d6c06c0ac5b0def9"},
|
||||
{file = "Babel-2.9.1.tar.gz", hash = "sha256:bc0c176f9f6a994582230df350aa6e05ba2ebe4b3ac317eab29d9be5d2768da0"},
|
||||
]
|
||||
blessed = [
|
||||
{file = "blessed-1.18.0-py2.py3-none-any.whl", hash = "sha256:5b5e2f0563d5a668c282f3f5946f7b1abb70c85829461900e607e74d7725106e"},
|
||||
{file = "blessed-1.18.0.tar.gz", hash = "sha256:1312879f971330a1b7f2c6341f2ae7e2cbac244bfc9d0ecfbbecd4b0293bc755"},
|
||||
]
|
||||
cachetools = [
|
||||
{file = "cachetools-4.2.1-py3-none-any.whl", hash = "sha256:1d9d5f567be80f7c07d765e21b814326d78c61eb0c3a637dffc0e5d1796cb2e2"},
|
||||
{file = "cachetools-4.2.1.tar.gz", hash = "sha256:f469e29e7aa4cff64d8de4aad95ce76de8ea1125a16c68e0d93f65c3c3dc92e9"},
|
||||
{file = "cachetools-4.2.2-py3-none-any.whl", hash = "sha256:2cc0b89715337ab6dbba85b5b50effe2b0c74e035d83ee8ed637cf52f12ae001"},
|
||||
{file = "cachetools-4.2.2.tar.gz", hash = "sha256:61b5ed1e22a0924aed1d23b478f37e8d52549ff8a961de2909c69bf950020cff"},
|
||||
]
|
||||
certifi = [
|
||||
{file = "certifi-2020.12.5-py2.py3-none-any.whl", hash = "sha256:719a74fb9e33b9bd44cc7f3a8d94bc35e4049deebe19ba7d8e108280cfd59830"},
|
||||
|
|
@ -1689,8 +1695,8 @@ evdev = [
|
|||
{file = "evdev-1.4.0.tar.gz", hash = "sha256:8782740eb1a86b187334c07feb5127d3faa0b236e113206dfe3ae8f77fb1aaf1"},
|
||||
]
|
||||
flake8 = [
|
||||
{file = "flake8-3.9.1-py2.py3-none-any.whl", hash = "sha256:3b9f848952dddccf635be78098ca75010f073bfe14d2c6bda867154bea728d2a"},
|
||||
{file = "flake8-3.9.1.tar.gz", hash = "sha256:1aa8990be1e689d96c745c5682b687ea49f2e05a443aff1f8251092b0014e378"},
|
||||
{file = "flake8-3.9.2-py2.py3-none-any.whl", hash = "sha256:bf8fd333346d844f616e8d47905ef3a3384edae6b4e9beb0c5101e25e3110907"},
|
||||
{file = "flake8-3.9.2.tar.gz", hash = "sha256:07528381786f2a6237b061f6e96610a4167b226cb926e2aa2b6b1d78057c576b"},
|
||||
]
|
||||
ftrack-python-api = [
|
||||
{file = "ftrack-python-api-2.0.0.tar.gz", hash = "sha256:dd6f02c31daf5a10078196dc9eac4671e4297c762fbbf4df98de668ac12281d9"},
|
||||
|
|
@ -1708,8 +1714,8 @@ google-api-python-client = [
|
|||
{file = "google_api_python_client-1.12.8-py2.py3-none-any.whl", hash = "sha256:3c4c4ca46b5c21196bec7ee93453443e477d82cbfa79234d1ce0645f81170eaf"},
|
||||
]
|
||||
google-auth = [
|
||||
{file = "google-auth-1.29.0.tar.gz", hash = "sha256:010f011c4e27d3d5eb01106fba6aac39d164842dfcd8709955c4638f5b11ccf8"},
|
||||
{file = "google_auth-1.29.0-py2.py3-none-any.whl", hash = "sha256:f30a672a64d91cc2e3137765d088c5deec26416246f7a9e956eaf69a8d7ed49c"},
|
||||
{file = "google-auth-1.30.0.tar.gz", hash = "sha256:9ad25fba07f46a628ad4d0ca09f38dcb262830df2ac95b217f9b0129c9e42206"},
|
||||
{file = "google_auth-1.30.0-py2.py3-none-any.whl", hash = "sha256:588bdb03a41ecb4978472b847881e5518b5d9ec6153d3d679aa127a55e13b39f"},
|
||||
]
|
||||
google-auth-httplib2 = [
|
||||
{file = "google-auth-httplib2-0.1.0.tar.gz", hash = "sha256:a07c39fd632becacd3f07718dfd6021bf396978f03ad3ce4321d060015cc30ac"},
|
||||
|
|
@ -1732,8 +1738,8 @@ imagesize = [
|
|||
{file = "imagesize-1.2.0.tar.gz", hash = "sha256:b1f6b5a4eab1f73479a50fb79fcf729514a900c341d8503d62a62dbc4127a2b1"},
|
||||
]
|
||||
importlib-metadata = [
|
||||
{file = "importlib_metadata-4.0.0-py3-none-any.whl", hash = "sha256:19192b88d959336bfa6bdaaaef99aeafec179eca19c47c804e555703ee5f07ef"},
|
||||
{file = "importlib_metadata-4.0.0.tar.gz", hash = "sha256:2e881981c9748d7282b374b68e759c87745c25427b67ecf0cc67fb6637a1bff9"},
|
||||
{file = "importlib_metadata-4.0.1-py3-none-any.whl", hash = "sha256:d7eb1dea6d6a6086f8be21784cc9e3bcfa55872b52309bc5fad53a8ea444465d"},
|
||||
{file = "importlib_metadata-4.0.1.tar.gz", hash = "sha256:8c501196e49fb9df5df43833bdb1e4328f64847763ec8a50703148b73784d581"},
|
||||
]
|
||||
iniconfig = [
|
||||
{file = "iniconfig-1.1.1-py2.py3-none-any.whl", hash = "sha256:011e24c64b7f47f6ebd835bb12a743f2fbe9a26d4cecaa7f53bc4f35ee9da8b3"},
|
||||
|
|
@ -1948,26 +1954,29 @@ prefixed = [
|
|||
{file = "prefixed-0.3.2.tar.gz", hash = "sha256:ca48277ba5fa8346dd4b760847da930c7b84416387c39e93affef086add2c029"},
|
||||
]
|
||||
protobuf = [
|
||||
{file = "protobuf-3.15.8-cp27-cp27m-macosx_10_9_x86_64.whl", hash = "sha256:fad4f971ec38d8df7f4b632c819bf9bbf4f57cfd7312cf526c69ce17ef32436a"},
|
||||
{file = "protobuf-3.15.8-cp27-cp27mu-manylinux1_x86_64.whl", hash = "sha256:f17b352d7ce33c81773cf81d536ca70849de6f73c96413f17309f4b43ae7040b"},
|
||||
{file = "protobuf-3.15.8-cp35-cp35m-macosx_10_9_intel.whl", hash = "sha256:4a054b0b5900b7ea7014099e783fb8c4618e4209fffcd6050857517b3f156e18"},
|
||||
{file = "protobuf-3.15.8-cp35-cp35m-manylinux1_x86_64.whl", hash = "sha256:efa4c4d4fc9ba734e5e85eaced70e1b63fb3c8d08482d839eb838566346f1737"},
|
||||
{file = "protobuf-3.15.8-cp35-cp35m-win32.whl", hash = "sha256:07eec4e2ccbc74e95bb9b3afe7da67957947ee95bdac2b2e91b038b832dd71f0"},
|
||||
{file = "protobuf-3.15.8-cp35-cp35m-win_amd64.whl", hash = "sha256:f9cadaaa4065d5dd4d15245c3b68b967b3652a3108e77f292b58b8c35114b56c"},
|
||||
{file = "protobuf-3.15.8-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:2dc0e8a9e4962207bdc46a365b63a3f1aca6f9681a5082a326c5837ef8f4b745"},
|
||||
{file = "protobuf-3.15.8-cp36-cp36m-manylinux1_x86_64.whl", hash = "sha256:f80afc0a0ba13339bbab25ca0409e9e2836b12bb012364c06e97c2df250c3343"},
|
||||
{file = "protobuf-3.15.8-cp36-cp36m-win32.whl", hash = "sha256:c5566f956a26cda3abdfacc0ca2e21db6c9f3d18f47d8d4751f2209d6c1a5297"},
|
||||
{file = "protobuf-3.15.8-cp36-cp36m-win_amd64.whl", hash = "sha256:dab75b56a12b1ceb3e40808b5bd9dfdaef3a1330251956e6744e5b6ed8f8830b"},
|
||||
{file = "protobuf-3.15.8-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:3053f13207e7f13dc7be5e9071b59b02020172f09f648e85dc77e3fcb50d1044"},
|
||||
{file = "protobuf-3.15.8-cp37-cp37m-manylinux1_x86_64.whl", hash = "sha256:1f0b5d156c3df08cc54bc2c8b8b875648ea4cd7ebb2a9a130669f7547ec3488c"},
|
||||
{file = "protobuf-3.15.8-cp37-cp37m-win32.whl", hash = "sha256:90270fe5732c1f1ff664a3bd7123a16456d69b4e66a09a139a00443a32f210b8"},
|
||||
{file = "protobuf-3.15.8-cp37-cp37m-win_amd64.whl", hash = "sha256:f42c2f5fb67da5905bfc03733a311f72fa309252bcd77c32d1462a1ad519521e"},
|
||||
{file = "protobuf-3.15.8-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:f6077db37bfa16494dca58a4a02bfdacd87662247ad6bc1f7f8d13ff3f0013e1"},
|
||||
{file = "protobuf-3.15.8-cp38-cp38-manylinux1_x86_64.whl", hash = "sha256:510e66491f1a5ac5953c908aa8300ec47f793130097e4557482803b187a8ee05"},
|
||||
{file = "protobuf-3.15.8-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:5ff9fa0e67fcab442af9bc8d4ec3f82cb2ff3be0af62dba047ed4187f0088b7d"},
|
||||
{file = "protobuf-3.15.8-cp39-cp39-manylinux1_x86_64.whl", hash = "sha256:1c0e9e56202b9dccbc094353285a252e2b7940b74fdf75f1b4e1b137833fabd7"},
|
||||
{file = "protobuf-3.15.8-py2.py3-none-any.whl", hash = "sha256:a0a08c6b2e6d6c74a6eb5bf6184968eefb1569279e78714e239d33126e753403"},
|
||||
{file = "protobuf-3.15.8.tar.gz", hash = "sha256:0277f62b1e42210cafe79a71628c1d553348da81cbd553402a7f7549c50b11d0"},
|
||||
{file = "protobuf-3.17.0-cp27-cp27m-macosx_10_9_x86_64.whl", hash = "sha256:15351df904347da2081a2eebc42b192c29724eb57dbe56dae440be843f1e4779"},
|
||||
{file = "protobuf-3.17.0-cp27-cp27mu-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:5356981c1919782b8c2e3ea5c5d85ad5937b8178a025ac9edc2f2ca5b4a717ae"},
{file = "protobuf-3.17.0-cp35-cp35m-macosx_10_9_intel.whl", hash = "sha256:eac0a2a7ea99e17175f6e7b53cdc9004ed786c072fbdf933def0e454e14fd323"},
{file = "protobuf-3.17.0-cp35-cp35m-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:4c8d0997fdc0a4cf9de7950d598ce6974b22e8618bbcf1d15e9842010cf8420a"},
{file = "protobuf-3.17.0-cp35-cp35m-win32.whl", hash = "sha256:9ae321459d4890c3939c536382f75e232c9e91ce506310353c8a15ad5c379e0d"},
{file = "protobuf-3.17.0-cp35-cp35m-win_amd64.whl", hash = "sha256:295944ef0772498d7bf75f6aa5d4dfcfd02f5ce70f735b406e52e43ac3914d38"},
{file = "protobuf-3.17.0-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:850f429bd2399525d339d05bc809f090f16d3d88737bed637d355a5ee8d3b81a"},
{file = "protobuf-3.17.0-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:809a96d5a1a74538728710f9104f43ae77f5e48bde274ee321b10a324ba52e4f"},
{file = "protobuf-3.17.0-cp36-cp36m-win32.whl", hash = "sha256:8a3ac375539055164f31a330770f137875307e6f04c21e2647f2e7139c501295"},
{file = "protobuf-3.17.0-cp36-cp36m-win_amd64.whl", hash = "sha256:3d338910b10b88b18581cf6877b3938b2e262e8fdc2c1057f5a291787de63183"},
{file = "protobuf-3.17.0-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:1488f786bd1912f97796cf5def8cacf433735616896cf7ed9dc786cee693dfc8"},
{file = "protobuf-3.17.0-cp37-cp37m-manylinux2014_aarch64.whl", hash = "sha256:bcaff977db178f0bfde10bab0d23a5f5adf5964adba70c315e45922a1c55eb90"},
{file = "protobuf-3.17.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:939ce06846ddfec99c0bff510510b3ee45778e7a3aec6544d1f36526e5fecb67"},
{file = "protobuf-3.17.0-cp37-cp37m-win32.whl", hash = "sha256:3237acce5b666c7b0f45785cc2d0809796d4df3593bd68338aebf25408139188"},
{file = "protobuf-3.17.0-cp37-cp37m-win_amd64.whl", hash = "sha256:2f77afe33bb86c7d34221a86193256d69aa10818620fe4a7513d98211d67d672"},
{file = "protobuf-3.17.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:acc9f2091ace3de429eee424ab7ba0bc52a6aa9ffc9909e5c4de259a3f71db46"},
{file = "protobuf-3.17.0-cp38-cp38-manylinux2014_aarch64.whl", hash = "sha256:a29631f4f8bcf79b12a59e83d238d888de5034871461d788c74c68218ad75049"},
{file = "protobuf-3.17.0-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:05c304396e309661c45e3a97bd2d8da1fc2bab743ed2ca880bcb757271c40c0e"},
{file = "protobuf-3.17.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:baea44967071e6a51e705e4e88aebf35f530a14004cc69f60a185e5d7e13de7e"},
{file = "protobuf-3.17.0-cp39-cp39-manylinux2014_aarch64.whl", hash = "sha256:3b5c461af5a3cebd796c73370db929b7e24cbaba655eefdc044226bc8a843d6b"},
{file = "protobuf-3.17.0-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:44399393c3a8cc04a4cfbdc721dd7f2114497efda582e946a91b8c4290ae5ff5"},
{file = "protobuf-3.17.0-py2.py3-none-any.whl", hash = "sha256:e32ef0c9f4b548c80d94dfff8b4130ca2ff3d50caaf2455889e3f5b8a01e8038"},
{file = "protobuf-3.17.0.tar.gz", hash = "sha256:05dfe9319939a8473c21b469f34f6486646e54fb8542637cf7ed8e2fbfe21538"},
]
py = [
{file = "py-1.10.0-py2.py3-none-any.whl", hash = "sha256:3b80836aa6d1feeaa108e046da6423ab8f6ceda6468545ae8d02d9d58d18818a"},
@@ -2028,78 +2037,78 @@ pyflakes = [
{file = "pyflakes-2.3.1.tar.gz", hash = "sha256:f5bc8ecabc05bb9d291eb5203d6810b49040f6ff446a756326104746cc00c1db"},
]
pygments = [
{file = "Pygments-2.8.1-py3-none-any.whl", hash = "sha256:534ef71d539ae97d4c3a4cf7d6f110f214b0e687e92f9cb9d2a3b0d3101289c8"},
{file = "Pygments-2.8.1.tar.gz", hash = "sha256:2656e1a6edcdabf4275f9a3640db59fd5de107d88e8663c5d4e9a0fa62f77f94"},
{file = "Pygments-2.9.0-py3-none-any.whl", hash = "sha256:d66e804411278594d764fc69ec36ec13d9ae9147193a1740cd34d272ca383b8e"},
{file = "Pygments-2.9.0.tar.gz", hash = "sha256:a18f47b506a429f6f4b9df81bb02beab9ca21d0a5fee38ed15aef65f0545519f"},
]
pylint = [
{file = "pylint-2.7.4-py3-none-any.whl", hash = "sha256:209d712ec870a0182df034ae19f347e725c1e615b2269519ab58a35b3fcbbe7a"},
{file = "pylint-2.7.4.tar.gz", hash = "sha256:bd38914c7731cdc518634a8d3c5585951302b6e2b6de60fbb3f7a0220e21eeee"},
{file = "pylint-2.8.2-py3-none-any.whl", hash = "sha256:f7e2072654a6b6afdf5e2fb38147d3e2d2d43c89f648637baab63e026481279b"},
{file = "pylint-2.8.2.tar.gz", hash = "sha256:586d8fa9b1891f4b725f587ef267abe2a1bad89d6b184520c7f07a253dd6e217"},
]
pymongo = [
{file = "pymongo-3.11.3-cp27-cp27m-macosx_10_14_intel.whl", hash = "sha256:4d959e929cec805c2bf391418b1121590b4e7d5cb00af7b1ba521443d45a0918"},
{file = "pymongo-3.11.3-cp27-cp27m-manylinux1_i686.whl", hash = "sha256:9fbffc5bad4df99a509783cbd449ed0d24fcd5a450c28e7756c8f20eda3d2aa5"},
{file = "pymongo-3.11.3-cp27-cp27m-manylinux1_x86_64.whl", hash = "sha256:bd351ceb2decd23d523fc50bad631ee9ae6e97e7cdc355ce5600fe310484f96e"},
{file = "pymongo-3.11.3-cp27-cp27m-win32.whl", hash = "sha256:7d2ae2f7c50adec20fde46a73465de31a6a6fbb4903240f8b7304549752ca7a1"},
{file = "pymongo-3.11.3-cp27-cp27m-win_amd64.whl", hash = "sha256:b1aa62903a2c5768b0001632efdea2e8da6c80abdd520c2e8a16001cc9affb23"},
{file = "pymongo-3.11.3-cp27-cp27mu-manylinux1_i686.whl", hash = "sha256:180511abfef70feb022360b35f4863dd68e08334197089201d5c52208de9ca2e"},
{file = "pymongo-3.11.3-cp27-cp27mu-manylinux1_x86_64.whl", hash = "sha256:42f9ec9d77358f557fe17cc15e796c4d4d492ede1a30cba3664822cae66e97c5"},
{file = "pymongo-3.11.3-cp34-cp34m-macosx_10_6_intel.whl", hash = "sha256:3dbc67754882d740f17809342892f0b24398770bd99d48c5cb5ba89f5f5dee4e"},
{file = "pymongo-3.11.3-cp34-cp34m-manylinux1_i686.whl", hash = "sha256:733e1cfffc4cd99848230e2999c8a86e284c6af6746482f8ad2ad554dce14e39"},
{file = "pymongo-3.11.3-cp34-cp34m-manylinux1_x86_64.whl", hash = "sha256:622a5157ffcd793d305387c1c9fb94185f496c8c9fd66dafb59de0807bc14ad7"},
{file = "pymongo-3.11.3-cp34-cp34m-win32.whl", hash = "sha256:2aeb108da1ed8e066800fb447ba5ae89d560e6773d228398a87825ac3630452d"},
{file = "pymongo-3.11.3-cp34-cp34m-win_amd64.whl", hash = "sha256:7c77801620e5e75fb9c7abae235d3cc45d212a67efa98f4972eef63e736a8daa"},
{file = "pymongo-3.11.3-cp35-cp35m-macosx_10_6_intel.whl", hash = "sha256:29390c39ca873737689a0749c9c3257aad96b323439b11279fbc0ba8626ec9c5"},
{file = "pymongo-3.11.3-cp35-cp35m-manylinux1_i686.whl", hash = "sha256:a8b02e0119d6ee381a265d8d2450a38096f82916d895fed2dfd81d4c7a54d6e4"},
{file = "pymongo-3.11.3-cp35-cp35m-manylinux1_x86_64.whl", hash = "sha256:28633868be21a187702a8613913e13d1987d831529358c29fc6f6670413df040"},
{file = "pymongo-3.11.3-cp35-cp35m-manylinux2014_aarch64.whl", hash = "sha256:685b884fa41bd2913fd20af85866c4ff886b7cbb7e4833b918996aa5d45a04be"},
{file = "pymongo-3.11.3-cp35-cp35m-manylinux2014_i686.whl", hash = "sha256:7cd42c66d49ffb68dea065e1c8a4323e7ceab386e660fee9863d4fa227302ba9"},
{file = "pymongo-3.11.3-cp35-cp35m-manylinux2014_ppc64le.whl", hash = "sha256:950710f7370613a6bfa2ccd842b488c5b8072e83fb6b7d45d99110bf44651d06"},
{file = "pymongo-3.11.3-cp35-cp35m-manylinux2014_s390x.whl", hash = "sha256:c7fd18d4b7939408df9315fedbdb05e179760960a92b3752498e2fcd03f24c3d"},
{file = "pymongo-3.11.3-cp35-cp35m-manylinux2014_x86_64.whl", hash = "sha256:cc359e408712faf9ea775f4c0ec8f2bfc843afe47747a657808d9595edd34d71"},
{file = "pymongo-3.11.3-cp35-cp35m-win32.whl", hash = "sha256:7814b2cf23aad23464859973c5cd2066ca2fd99e0b934acefbb0b728ac2525bf"},
{file = "pymongo-3.11.3-cp35-cp35m-win_amd64.whl", hash = "sha256:e1414599a97554d451e441afb362dbee1505e4550852c0068370d843757a3fe2"},
{file = "pymongo-3.11.3-cp36-cp36m-macosx_10_6_intel.whl", hash = "sha256:0384d76b409278ddb34ac19cdc4664511685959bf719adbdc051875ded4689aa"},
{file = "pymongo-3.11.3-cp36-cp36m-manylinux1_i686.whl", hash = "sha256:22ee2c94fee1e391735be63aa1c9af4c69fdcb325ae9e5e4ddff770248ef60a6"},
{file = "pymongo-3.11.3-cp36-cp36m-manylinux1_x86_64.whl", hash = "sha256:db6fd53ef5f1914ad801830406440c3bfb701e38a607eda47c38adba267ba300"},
{file = "pymongo-3.11.3-cp36-cp36m-manylinux2014_aarch64.whl", hash = "sha256:66b688fc139c6742057795510e3b12c4acbf90d11af1eff9689a41d9c84478d6"},
{file = "pymongo-3.11.3-cp36-cp36m-manylinux2014_i686.whl", hash = "sha256:6a5834e392c97f19f36670e34bf9d346d733ad89ee0689a6419dd737dfa4308a"},
{file = "pymongo-3.11.3-cp36-cp36m-manylinux2014_ppc64le.whl", hash = "sha256:87981008d565f647142869d99915cc4760b7725858da3d39ecb2a606e23f36fd"},
{file = "pymongo-3.11.3-cp36-cp36m-manylinux2014_s390x.whl", hash = "sha256:413b18ac2222f5d961eb8d1c8dcca6c6ca176c8613636d8c13aa23abae7f7a21"},
{file = "pymongo-3.11.3-cp36-cp36m-manylinux2014_x86_64.whl", hash = "sha256:610d5cbbfd026e2f6d15665af51e048e49b68363fedece2ed318cc8fe080dd94"},
{file = "pymongo-3.11.3-cp36-cp36m-win32.whl", hash = "sha256:3873866534b6527e6863e742eb23ea2a539e3c7ee00ad3f9bec9da27dbaaff6f"},
{file = "pymongo-3.11.3-cp36-cp36m-win_amd64.whl", hash = "sha256:b17e627844d86031c77147c40bf992a6e1114025a460874deeda6500d0f34862"},
{file = "pymongo-3.11.3-cp37-cp37m-macosx_10_6_intel.whl", hash = "sha256:05e2bda928a3a6bc6ddff9e5a8579d41928b75d7417b18f9a67c82bb52150ac6"},
{file = "pymongo-3.11.3-cp37-cp37m-manylinux1_i686.whl", hash = "sha256:19d52c60dc37520385f538d6d1a4c40bc398e0885f4ed6a36ce10b631dab2852"},
{file = "pymongo-3.11.3-cp37-cp37m-manylinux1_x86_64.whl", hash = "sha256:2163d736d6f62b20753be5da3dc07a188420b355f057fcbb3075b05ee6227b2f"},
{file = "pymongo-3.11.3-cp37-cp37m-manylinux2014_aarch64.whl", hash = "sha256:b4535d98df83abebb572035754fb3d4ad09ce7449375fa09fa9ede2dbc87b62b"},
{file = "pymongo-3.11.3-cp37-cp37m-manylinux2014_i686.whl", hash = "sha256:cd8fc35d4c0c717cc29b0cb894871555cb7137a081e179877ecc537e2607f0b9"},
{file = "pymongo-3.11.3-cp37-cp37m-manylinux2014_ppc64le.whl", hash = "sha256:92e2376ce3ca0e3e443b3c5c2bb5d584c7e59221edfb0035313c6306049ba55a"},
{file = "pymongo-3.11.3-cp37-cp37m-manylinux2014_s390x.whl", hash = "sha256:4ca92e15fcf02e02e7c24b448a16599b98c9d0e6a46cd85cc50804450ebf7245"},
{file = "pymongo-3.11.3-cp37-cp37m-manylinux2014_x86_64.whl", hash = "sha256:5a03ae5ac85b04b2034a0689add9ff597b16d5e24066a87f6ab0e9fa67049156"},
{file = "pymongo-3.11.3-cp37-cp37m-win32.whl", hash = "sha256:bc2eb67387b8376120a2be6cba9d23f9d6a6c3828e00fb0a64c55ad7b54116d1"},
{file = "pymongo-3.11.3-cp37-cp37m-win_amd64.whl", hash = "sha256:5e1341276ce8b7752db9aeac6bbb0cbe82a3f6a6186866bf6b4906d8d328d50b"},
{file = "pymongo-3.11.3-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:4ac387ac1be71b798d1c372a924f9c30352f30e684e06f086091297352698ac0"},
{file = "pymongo-3.11.3-cp38-cp38-manylinux1_i686.whl", hash = "sha256:728313cc0d59d1a1a004f675607dcf5c711ced3f55e75d82b3f264fd758869f3"},
{file = "pymongo-3.11.3-cp38-cp38-manylinux1_x86_64.whl", hash = "sha256:daa44cefde19978af57ac1d50413cd86ebf2b497328e7a27832f5824bda47439"},
{file = "pymongo-3.11.3-cp38-cp38-manylinux2014_aarch64.whl", hash = "sha256:322f6cc7bf23a264151ebc5229a92600c4b55ac83c83c91c9bab1ec92c888a8d"},
{file = "pymongo-3.11.3-cp38-cp38-manylinux2014_i686.whl", hash = "sha256:6043d251fac27ca04ff22ed8deb5ff7a43dc18e8a4a15b4c442d2a20fa313162"},
{file = "pymongo-3.11.3-cp38-cp38-manylinux2014_ppc64le.whl", hash = "sha256:66573c8c7808cce4f3b56c23cb7cad6c3d7f4c464b9016d35f5344ad743896d7"},
{file = "pymongo-3.11.3-cp38-cp38-manylinux2014_s390x.whl", hash = "sha256:bf70097bd497089f1baabf9cbb3ec4f69c022dc7a70c41ba9c238fa4d0fff7ab"},
{file = "pymongo-3.11.3-cp38-cp38-manylinux2014_x86_64.whl", hash = "sha256:f23abcf6eca5859a2982beadfb5111f8c5e76e30ff99aaee3c1c327f814f9f10"},
{file = "pymongo-3.11.3-cp38-cp38-win32.whl", hash = "sha256:1d559a76ae87143ad96c2ecd6fdd38e691721e175df7ced3fcdc681b4638bca1"},
{file = "pymongo-3.11.3-cp38-cp38-win_amd64.whl", hash = "sha256:152e4ac3158b776135d8fce28d2ac06e682b885fcbe86690d66465f262ab244e"},
{file = "pymongo-3.11.3-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:34c15f5798f23488e509eae82fbf749c3d17db74379a88c07c869ece1aa806b9"},
{file = "pymongo-3.11.3-cp39-cp39-manylinux1_i686.whl", hash = "sha256:210ec4a058480b9c3869082e52b66d80c4a48eda9682d7a569a1a5a48100ea54"},
{file = "pymongo-3.11.3-cp39-cp39-manylinux1_x86_64.whl", hash = "sha256:b44fa04720bbfd617b6aef036989c8c30435f11450c0a59136291d7b41ed647f"},
{file = "pymongo-3.11.3-cp39-cp39-manylinux2014_aarch64.whl", hash = "sha256:b32e4eed2ef19a20dfb57698497a9bc54e74efb2e260c003e9056c145f130dc7"},
{file = "pymongo-3.11.3-cp39-cp39-manylinux2014_i686.whl", hash = "sha256:5091aacbdb667b418b751157f48f6daa17142c4f9063d58e5a64c90b2afbdf9a"},
{file = "pymongo-3.11.3-cp39-cp39-manylinux2014_ppc64le.whl", hash = "sha256:bb6a5777bf558f444cd4883d617546182cfeff8f2d4acd885253f11a16740534"},
{file = "pymongo-3.11.3-cp39-cp39-manylinux2014_s390x.whl", hash = "sha256:980527f4ccc6644855bb68056fe7835da6d06d37776a52df5bcc1882df57c3db"},
{file = "pymongo-3.11.3-cp39-cp39-manylinux2014_x86_64.whl", hash = "sha256:65b67637f0a25ac9d25efb13c1578eb065870220ffa82f132c5b2d8e43ac39c3"},
{file = "pymongo-3.11.3-cp39-cp39-win32.whl", hash = "sha256:f6748c447feeadda059719ef5ab1fb9d84bd370e205b20049a0e8b45ef4ad593"},
{file = "pymongo-3.11.3-cp39-cp39-win_amd64.whl", hash = "sha256:ee42a8f850143ae7c67ea09a183a6a4ad8d053e1dbd9a1134e21a7b5c1bc6c73"},
{file = "pymongo-3.11.3-py2.7-macosx-10.14-intel.egg", hash = "sha256:7edff02e44dd0badd749d7342e40705a398d98c5d8f7570f57cff9568c2351fa"},
{file = "pymongo-3.11.3.tar.gz", hash = "sha256:db5098587f58fbf8582d9bda2462762b367207246d3e19623782fb449c3c5fcc"},
{file = "pymongo-3.11.4-cp27-cp27m-macosx_10_14_intel.whl", hash = "sha256:b7efc7e7049ef366777cfd35437c18a4166bb50a5606a1c840ee3b9624b54fc9"},
{file = "pymongo-3.11.4-cp27-cp27m-manylinux1_i686.whl", hash = "sha256:517ba47ca04a55b1f50ee8df9fd97f6c37df5537d118fb2718952b8623860466"},
{file = "pymongo-3.11.4-cp27-cp27m-manylinux1_x86_64.whl", hash = "sha256:225c61e08fe517aede7912937939e09adf086c8e6f7e40d4c85ad678c2c2aea3"},
{file = "pymongo-3.11.4-cp27-cp27m-win32.whl", hash = "sha256:e4e9db78b71db2b1684ee4ecc3e32c4600f18cdf76e6b9ae03e338e52ee4b168"},
{file = "pymongo-3.11.4-cp27-cp27m-win_amd64.whl", hash = "sha256:8e0004b0393d72d76de94b4792a006cb960c1c65c7659930fbf9a81ce4341982"},
{file = "pymongo-3.11.4-cp27-cp27mu-manylinux1_i686.whl", hash = "sha256:fedf0dee7a412ca6d1d6d92c158fe9cbaa8ea0cae90d268f9ccc0744de7a97d0"},
{file = "pymongo-3.11.4-cp27-cp27mu-manylinux1_x86_64.whl", hash = "sha256:f947b359cc4769af8b49be7e37af01f05fcf15b401da2528021148e4a54426d1"},
{file = "pymongo-3.11.4-cp34-cp34m-macosx_10_6_intel.whl", hash = "sha256:3a3498a8326111221560e930f198b495ea6926937e249f475052ffc6893a6680"},
{file = "pymongo-3.11.4-cp34-cp34m-manylinux1_i686.whl", hash = "sha256:9a4f6e0b01df820ba9ed0b4e618ca83a1c089e48d4f268d0e00dcd49893d4549"},
{file = "pymongo-3.11.4-cp34-cp34m-manylinux1_x86_64.whl", hash = "sha256:d65bac5f6724d9ea6f0b5a0f0e4952fbbf209adcf6b5583b54c54bd2fcd74dc0"},
{file = "pymongo-3.11.4-cp34-cp34m-win32.whl", hash = "sha256:15b083d1b789b230e5ac284442d9ecb113c93f3785a6824f748befaab803b812"},
{file = "pymongo-3.11.4-cp34-cp34m-win_amd64.whl", hash = "sha256:f08665d3cc5abc2f770f472a9b5f720a9b3ab0b8b3bb97c7c1487515e5653d39"},
{file = "pymongo-3.11.4-cp35-cp35m-macosx_10_6_intel.whl", hash = "sha256:977b1d4f868986b4ba5d03c317fde4d3b66e687d74473130cd598e3103db34fa"},
{file = "pymongo-3.11.4-cp35-cp35m-manylinux1_i686.whl", hash = "sha256:510cd3bfabb63a07405b7b79fae63127e34c118b7531a2cbbafc7a24fd878594"},
{file = "pymongo-3.11.4-cp35-cp35m-manylinux1_x86_64.whl", hash = "sha256:071552b065e809d24c5653fcc14968cfd6fde4e279408640d5ac58e3353a3c5f"},
{file = "pymongo-3.11.4-cp35-cp35m-manylinux2014_aarch64.whl", hash = "sha256:f4ba58157e8ae33ee86fadf9062c506e535afd904f07f9be32731f4410a23b7f"},
{file = "pymongo-3.11.4-cp35-cp35m-manylinux2014_i686.whl", hash = "sha256:b413117210fa6d92664c3d860571e8e8727c3e8f2ff197276c5d0cb365abd3ad"},
{file = "pymongo-3.11.4-cp35-cp35m-manylinux2014_ppc64le.whl", hash = "sha256:08b8723248730599c9803ae4c97b8f3f76c55219104303c88cb962a31e3bb5ee"},
{file = "pymongo-3.11.4-cp35-cp35m-manylinux2014_s390x.whl", hash = "sha256:8a41fdc751dc4707a4fafb111c442411816a7c225ebb5cadb57599534b5d5372"},
{file = "pymongo-3.11.4-cp35-cp35m-manylinux2014_x86_64.whl", hash = "sha256:f664ed7613b8b18f0ce5696b146776266a038c19c5cd6efffa08ecc189b01b73"},
{file = "pymongo-3.11.4-cp35-cp35m-win32.whl", hash = "sha256:5c36428cc4f7fae56354db7f46677fd21222fc3cb1e8829549b851172033e043"},
{file = "pymongo-3.11.4-cp35-cp35m-win_amd64.whl", hash = "sha256:d0a70151d7de8a3194cdc906bcc1a42e14594787c64b0c1c9c975e5a2af3e251"},
{file = "pymongo-3.11.4-cp36-cp36m-macosx_10_6_intel.whl", hash = "sha256:9b9298964389c180a063a9e8bac8a80ed42de11d04166b20249bfa0a489e0e0f"},
{file = "pymongo-3.11.4-cp36-cp36m-manylinux1_i686.whl", hash = "sha256:b2f41261b648cf5dee425f37ff14f4ad151c2f24b827052b402637158fd056ef"},
{file = "pymongo-3.11.4-cp36-cp36m-manylinux1_x86_64.whl", hash = "sha256:e02beaab433fd1104b2804f909e694cfbdb6578020740a9051597adc1cd4e19f"},
{file = "pymongo-3.11.4-cp36-cp36m-manylinux2014_aarch64.whl", hash = "sha256:8898f6699f740ca93a0879ed07d8e6db02d68af889d0ebb3d13ab017e6b1af1e"},
{file = "pymongo-3.11.4-cp36-cp36m-manylinux2014_i686.whl", hash = "sha256:62c29bc36a6d9be68fe7b5aaf1e120b4aa66a958d1e146601fcd583eb12cae7b"},
{file = "pymongo-3.11.4-cp36-cp36m-manylinux2014_ppc64le.whl", hash = "sha256:424799c71ff435094e5fb823c40eebb4500f0e048133311e9c026467e8ccebac"},
{file = "pymongo-3.11.4-cp36-cp36m-manylinux2014_s390x.whl", hash = "sha256:3551912f5c34d8dd7c32c6bb00ae04192af47f7b9f653608f107d19c1a21a194"},
{file = "pymongo-3.11.4-cp36-cp36m-manylinux2014_x86_64.whl", hash = "sha256:5db59223ed1e634d842a053325f85f908359c6dac9c8ddce8ef145061fae7df8"},
{file = "pymongo-3.11.4-cp36-cp36m-win32.whl", hash = "sha256:fea5cb1c63efe1399f0812532c7cf65458d38fd011be350bc5021dfcac39fba8"},
{file = "pymongo-3.11.4-cp36-cp36m-win_amd64.whl", hash = "sha256:d4e62417e89b717a7bcd8576ac3108cd063225942cc91c5b37ff5465fdccd386"},
{file = "pymongo-3.11.4-cp37-cp37m-macosx_10_6_intel.whl", hash = "sha256:4c7e8c8e1e1918dcf6a652ac4b9d87164587c26fd2ce5dd81e73a5ab3b3d492f"},
{file = "pymongo-3.11.4-cp37-cp37m-manylinux1_i686.whl", hash = "sha256:38a7b5140a48fc91681cdb5cb95b7cd64640b43d19259fdd707fa9d5a715f2b2"},
{file = "pymongo-3.11.4-cp37-cp37m-manylinux1_x86_64.whl", hash = "sha256:aff3656af2add93f290731a6b8930b23b35c0c09569150130a58192b3ec6fc61"},
{file = "pymongo-3.11.4-cp37-cp37m-manylinux2014_aarch64.whl", hash = "sha256:03be7ad107d252bb7325d4af6309fdd2c025d08854d35f0e7abc8bf048f4245e"},
{file = "pymongo-3.11.4-cp37-cp37m-manylinux2014_i686.whl", hash = "sha256:6060794aac9f7b0644b299f46a9c6cbc0bc470bd01572f4134df140afd41ded6"},
{file = "pymongo-3.11.4-cp37-cp37m-manylinux2014_ppc64le.whl", hash = "sha256:73326b211e7410c8bd6a74500b1e3f392f39cf10862e243d00937e924f112c01"},
{file = "pymongo-3.11.4-cp37-cp37m-manylinux2014_s390x.whl", hash = "sha256:20d75ea11527331a2980ab04762a9d960bcfea9475c54bbeab777af880de61cd"},
{file = "pymongo-3.11.4-cp37-cp37m-manylinux2014_x86_64.whl", hash = "sha256:3135dd574ef1286189f3f04a36c8b7a256376914f8cbbce66b94f13125ded858"},
{file = "pymongo-3.11.4-cp37-cp37m-win32.whl", hash = "sha256:7c97554ea521f898753d9773891d0347ebfaddcc1dee2ad94850b163171bf1f1"},
{file = "pymongo-3.11.4-cp37-cp37m-win_amd64.whl", hash = "sha256:a08c8b322b671857c81f4c30cd3c8df2895fd3c0e9358714f39e0ef8fb327702"},
{file = "pymongo-3.11.4-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:f3d851af3852f16ad4adc7ee054fd9c90a7a5063de94d815b7f6a88477b9f4c6"},
{file = "pymongo-3.11.4-cp38-cp38-manylinux1_i686.whl", hash = "sha256:3bfc7689a1bacb9bcd2f2d5185d99507aa29f667a58dd8adaa43b5a348139e46"},
{file = "pymongo-3.11.4-cp38-cp38-manylinux1_x86_64.whl", hash = "sha256:b8f94acd52e530a38f25e4d5bf7ddfdd4bea9193e718f58419def0d4406b58d3"},
{file = "pymongo-3.11.4-cp38-cp38-manylinux2014_aarch64.whl", hash = "sha256:e4b631688dfbdd61b5610e20b64b99d25771c6d52d9da73349342d2a0f11c46a"},
{file = "pymongo-3.11.4-cp38-cp38-manylinux2014_i686.whl", hash = "sha256:474e21d0e07cd09679e357d1dac76e570dab86665e79a9d3354b10a279ac6fb3"},
{file = "pymongo-3.11.4-cp38-cp38-manylinux2014_ppc64le.whl", hash = "sha256:421d13523d11c57f57f257152bc4a6bb463aadf7a3918e9c96fefdd6be8dbfb8"},
{file = "pymongo-3.11.4-cp38-cp38-manylinux2014_s390x.whl", hash = "sha256:0cabfc297f4cf921f15bc789a8fbfd7115eb9f813d3f47a74b609894bc66ab0d"},
{file = "pymongo-3.11.4-cp38-cp38-manylinux2014_x86_64.whl", hash = "sha256:fe4189846448df013cd9df11bba38ddf78043f8c290a9f06430732a7a8601cce"},
{file = "pymongo-3.11.4-cp38-cp38-win32.whl", hash = "sha256:eb4d176394c37a76e8b0afe54b12d58614a67a60a7f8c0dd3a5afbb013c01092"},
{file = "pymongo-3.11.4-cp38-cp38-win_amd64.whl", hash = "sha256:fffff7bfb6799a763d3742c59c6ee7ffadda21abed557637bc44ed1080876484"},
{file = "pymongo-3.11.4-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:13acf6164ead81c9fc2afa0e1ea6d6134352973ce2bb35496834fee057063c04"},
{file = "pymongo-3.11.4-cp39-cp39-manylinux1_i686.whl", hash = "sha256:d360e5d5dd3d55bf5d1776964625018d85b937d1032bae1926dd52253decd0db"},
{file = "pymongo-3.11.4-cp39-cp39-manylinux1_x86_64.whl", hash = "sha256:0aaf4d44f1f819360f9432df538d54bbf850f18152f34e20337c01b828479171"},
{file = "pymongo-3.11.4-cp39-cp39-manylinux2014_aarch64.whl", hash = "sha256:08bda7b2c522ff9f1e554570da16298271ebb0c56ab9699446aacba249008988"},
{file = "pymongo-3.11.4-cp39-cp39-manylinux2014_i686.whl", hash = "sha256:1a994a42f49dab5b6287e499be7d3d2751776486229980d8857ad53b8333d469"},
{file = "pymongo-3.11.4-cp39-cp39-manylinux2014_ppc64le.whl", hash = "sha256:161fcd3281c42f644aa8dec7753cca2af03ce654e17d76da4f0dab34a12480ca"},
{file = "pymongo-3.11.4-cp39-cp39-manylinux2014_s390x.whl", hash = "sha256:78f07961f4f214ea8e80be63cffd5cc158eb06cd922ffbf6c7155b11728f28f9"},
{file = "pymongo-3.11.4-cp39-cp39-manylinux2014_x86_64.whl", hash = "sha256:ad31f184dcd3271de26ab1f9c51574afb99e1b0e484ab1da3641256b723e4994"},
{file = "pymongo-3.11.4-cp39-cp39-win32.whl", hash = "sha256:5e606846c049ed40940524057bfdf1105af6066688c0e6a1a3ce2038589bae70"},
{file = "pymongo-3.11.4-cp39-cp39-win_amd64.whl", hash = "sha256:3491c7de09e44eded16824cb58cf9b5cc1dc6f066a0bb7aa69929d02aa53b828"},
{file = "pymongo-3.11.4-py2.7-macosx-10.14-intel.egg", hash = "sha256:506a6dab4c7ffdcacdf0b8e70bd20eb2e77fa994519547c9d88d676400fcad58"},
{file = "pymongo-3.11.4.tar.gz", hash = "sha256:539d4cb1b16b57026999c53e5aab857fe706e70ae5310cc8c232479923f932e6"},
]
pynput = [
{file = "pynput-1.7.3-py2.py3-none-any.whl", hash = "sha256:fea5777454f896bd79d35393088cd29a089f3b2da166f0848a922b1d5a807d4f"},
@@ -2107,28 +2116,28 @@ pynput = [
{file = "pynput-1.7.3.tar.gz", hash = "sha256:4e50b1a0ab86847e87e58f6d1993688b9a44f9f4c88d4712315ea8eb552ef828"},
]
pyobjc-core = [
{file = "pyobjc-core-7.1.tar.gz", hash = "sha256:a0616d5d816b4471f8f782c3a9a8923d2cc85014d88ad4f7fec694be9e6ea349"},
{file = "pyobjc_core-7.1-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:f9fb45c9916f2a03ecd6b9ecde4c35d1d0f1a590ae2ea2372f9d9a360226ac1d"},
{file = "pyobjc_core-7.1-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:fff8e87358c6195a2937004f279050cce3d4c02cd77acd73c5ad367307def855"},
{file = "pyobjc_core-7.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:afb38efd3f2960eb49eb78552d465cfd025a9d6efa06cd4cd8694dafbe7c6e06"},
{file = "pyobjc_core-7.1-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:7cb329c4119044fe83bcb3c5d4794d636c706ff0cb7c1c77d36ef5c373100082"},
{file = "pyobjc_core-7.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:7913d7b20217c294900537faf58e5cc15942ed7af277bf05db25667d18255114"},
{file = "pyobjc-core-7.2.tar.gz", hash = "sha256:9e9ec482d80ea030cdb1613d05a247f31eedabe6666d884d42dd890cc5fb0e05"},
{file = "pyobjc_core-7.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:94b4d9de9d228db52dd35012096d63bdf8c1ace58ea3be1d5f6f39313cd502f2"},
{file = "pyobjc_core-7.2-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:971cbd7189ae1aa03ef0d16124aa5bcd053779e0e6b6011a41c3dbd5b4ea7e88"},
{file = "pyobjc_core-7.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:9d93b20394008373d6d2856d49aaff26f4b97ff42d924a14516c8a82313ec8c0"},
{file = "pyobjc_core-7.2-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:860183540d1be792c26426018139ac8ba75e85f675c59ba080ccdc52d8e74c7a"},
{file = "pyobjc_core-7.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:ffe61d3c2a404354daf2d895e34e38c5044453353581b3c396bf5365de26250c"},
]
pyobjc-framework-cocoa = [
{file = "pyobjc-framework-Cocoa-7.1.tar.gz", hash = "sha256:67966152b3d38a0225176fceca2e9f56d849c8e7445548da09a00cb13155ec3e"},
{file = "pyobjc_framework_Cocoa-7.1-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:bef77eafaac5eaf1d91d479d5483fd02216caa3edc27e8f5adc9af0b3fecdac3"},
{file = "pyobjc_framework_Cocoa-7.1-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:b2ea3582c456827dc20e648c905fdbcf8d3dfae89434f981e9b761cd07262049"},
{file = "pyobjc_framework_Cocoa-7.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:1a4050f2d776f40c2409a151c6f7896420e936934b3bdbfabedf91509637ed9b"},
{file = "pyobjc_framework_Cocoa-7.1-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:3f68f022f1f6d5985c418e10c6608c562fcf4bfe3714ec64fd10ce3dc6221bd4"},
{file = "pyobjc_framework_Cocoa-7.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:ecfefd4c48dae42275c18679c69f6f2fff970e711097515a0a8732fc10194018"},
{file = "pyobjc-framework-Cocoa-7.2.tar.gz", hash = "sha256:c8b23f03dc3f4436d36c0fd006a8a084835c4f6015187df7c3aa5de8ecd5c653"},
{file = "pyobjc_framework_Cocoa-7.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:8e5dd5daa0096755937ec24c345a4b07c3fa131a457f99e0fdeeb01979178ec7"},
{file = "pyobjc_framework_Cocoa-7.2-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:828d183947fc7746953fd0c9b1092cc423745ba0b49719e7b7d1e1614aaa20ec"},
{file = "pyobjc_framework_Cocoa-7.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:7e4c6d7baa0c2ab5ea5efb8836ad0b3b3976cffcfc6195c1f195e826c6eb5744"},
{file = "pyobjc_framework_Cocoa-7.2-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:c9a9d1d49cc5a810773c88d6de821e60c8cc41d01113cf1b9e7662938f5f7d66"},
{file = "pyobjc_framework_Cocoa-7.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:506c2cd09f421eac92b9008a0142174c3d1d70ecd4b0e3fa2b924767995fd14e"},
]
pyobjc-framework-quartz = [
{file = "pyobjc-framework-Quartz-7.1.tar.gz", hash = "sha256:73102c9f4dbfa13275621014785ab3b684cf03ce93a4b0b270500c795349bea9"},
{file = "pyobjc_framework_Quartz-7.1-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:7207a26244f02d4534ebb007fa55a9dc7c1b7fbb490d1e89e0d62cfd175e20f3"},
{file = "pyobjc_framework_Quartz-7.1-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:5bc7a4fb3ea80b5af6910cc27729a0774a96327a69583fcf28057cb2ffce33ac"},
{file = "pyobjc_framework_Quartz-7.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:c0469d60d4a79fc252f74adaa8177d2c680621d858c1b8ef19c411e903e2c892"},
{file = "pyobjc_framework_Quartz-7.1-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:04953c031fc35020682bd4613b9b5a9688bdb9eab7ed76fd8dcf028783568b4f"},
{file = "pyobjc_framework_Quartz-7.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:d8e0c086faf649f86386d0ed99194c6d0704b602576e2b258532b635b510b790"},
{file = "pyobjc-framework-Quartz-7.2.tar.gz", hash = "sha256:ea554e5697bc6747a4ce793c0b0036da16622b44ff75196d6124603008922afa"},
{file = "pyobjc_framework_Quartz-7.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:dc61fe61d26f797e4335f3ffc891bcef64624c728c2603e3307b3910580b2cb8"},
{file = "pyobjc_framework_Quartz-7.2-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:ad8103cc38923f2708904db11a0992ea960125ce6adf7b4c7a77d8fdafd412c4"},
{file = "pyobjc_framework_Quartz-7.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:4549d17ca41f0bf62792d5bc4b4293ba9a6cc560014b3e18ba22c65e4a5030d2"},
{file = "pyobjc_framework_Quartz-7.2-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:da16e4f1e13cb7b02e30fa538cbb3a356e4a694bbc2bb26d2bd100ca12a54ff6"},
{file = "pyobjc_framework_Quartz-7.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:c1f6471177a39535cd0358ae29b8f3d31fe778a21deb74105c448c4e726619d7"},
]
pyparsing = [
{file = "pyparsing-2.4.7-py2.py3-none-any.whl", hash = "sha256:ef9d7589ef3c200abe66653d3f1ab1033c3c419ae9b9bdb1240a85b024efc88b"},
@@ -2148,34 +2157,26 @@ pyqt5-qt5 = [
{file = "PyQt5_Qt5-5.15.2-py3-none-win_amd64.whl", hash = "sha256:750b78e4dba6bdf1607febedc08738e318ea09e9b10aea9ff0d73073f11f6962"},
]
pyqt5-sip = [
{file = "PyQt5_sip-12.8.1-cp35-cp35m-macosx_10_6_intel.whl", hash = "sha256:bb5a87b66fc1445915104ee97f7a20a69decb42f52803e3b0795fa17ff88226c"},
{file = "PyQt5_sip-12.8.1-cp35-cp35m-manylinux1_x86_64.whl", hash = "sha256:a29e2ac399429d3b7738f73e9081e50783e61ac5d29344e0802d0dcd6056c5a2"},
{file = "PyQt5_sip-12.8.1-cp35-cp35m-win32.whl", hash = "sha256:0304ca9114b9817a270f67f421355075b78ff9fc25ac58ffd72c2601109d2194"},
{file = "PyQt5_sip-12.8.1-cp35-cp35m-win_amd64.whl", hash = "sha256:84ba7746762bd223bed22428e8561aa267a229c28344c2d28c5d5d3f8970cffb"},
{file = "PyQt5_sip-12.8.1-cp36-cp36m-macosx_10_6_intel.whl", hash = "sha256:7b81382ce188d63890a0e35abe0f9bb946cabc873a31873b73583b0fc84ac115"},
{file = "PyQt5_sip-12.8.1-cp36-cp36m-manylinux1_x86_64.whl", hash = "sha256:b6d42250baec52a5f77de64e2951d001c5501c3a2df2179f625b241cbaec3369"},
{file = "PyQt5_sip-12.8.1-cp36-cp36m-win32.whl", hash = "sha256:6c1ebee60f1d2b3c70aff866b7933d8d8d7646011f7c32f9321ee88c290aa4f9"},
{file = "PyQt5_sip-12.8.1-cp36-cp36m-win_amd64.whl", hash = "sha256:34dcd29be47553d5f016ff86e89e24cbc5eebae92eb2f96fb32d2d7ba028c43c"},
{file = "PyQt5_sip-12.8.1-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:ed897c58acf4a3cdca61469daa31fe6e44c33c6c06a37c3f21fab31780b3b86a"},
{file = "PyQt5_sip-12.8.1-cp37-cp37m-manylinux1_x86_64.whl", hash = "sha256:a1b8ef013086e224b8e86c93f880f776d01b59195bdfa2a8e0b23f0480678fec"},
{file = "PyQt5_sip-12.8.1-cp37-cp37m-win32.whl", hash = "sha256:0cd969be528c27bbd4755bd323dff4a79a8fdda28215364e6ce3e069cb56c2a9"},
{file = "PyQt5_sip-12.8.1-cp37-cp37m-win_amd64.whl", hash = "sha256:c9800729badcb247765e4ffe2241549d02da1fa435b9db224845bc37c3e99cb0"},
{file = "PyQt5_sip-12.8.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:9312ec47cac4e33c11503bc1cbeeb0bdae619620472f38e2078c5a51020a930f"},
{file = "PyQt5_sip-12.8.1-cp38-cp38-manylinux1_x86_64.whl", hash = "sha256:2f35e82fd7ec1e1f6716e9154721c7594956a4f5bd4f826d8c6a6453833cc2f0"},
{file = "PyQt5_sip-12.8.1-cp38-cp38-win32.whl", hash = "sha256:da9c9f1e65b9d09e73bd75befc82961b6b61b5a3b9d0a7c832168e1415f163c6"},
{file = "PyQt5_sip-12.8.1-cp38-cp38-win_amd64.whl", hash = "sha256:832fd60a264de4134c2824d393320838f3ab648180c9c357ec58a74524d24507"},
|
||||
{file = "PyQt5_sip-12.8.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:c317ab1263e6417c498b81f5c970a9b1af7acefab1f80b4cc0f2f8e661f29fc5"},
|
||||
{file = "PyQt5_sip-12.8.1-cp39-cp39-manylinux1_x86_64.whl", hash = "sha256:c9d6d448c29dc6606bb7974696608f81f4316c8234f7c7216396ed110075e777"},
|
||||
{file = "PyQt5_sip-12.8.1-cp39-cp39-win32.whl", hash = "sha256:5a011aeff89660622a6d5c3388d55a9d76932f3b82c95e82fc31abd8b1d2990d"},
|
||||
{file = "PyQt5_sip-12.8.1-cp39-cp39-win_amd64.whl", hash = "sha256:f168f0a7f32b81bfeffdf003c36f25d81c97dee5eb67072a5183e761fe250f13"},
|
||||
{file = "PyQt5_sip-12.8.1.tar.gz", hash = "sha256:30e944db9abee9cc757aea16906d4198129558533eb7fadbe48c5da2bd18e0bd"},
|
||||
{file = "PyQt5_sip-12.9.0-cp36-cp36m-macosx_10_6_intel.whl", hash = "sha256:d85002238b5180bce4b245c13d6face848faa1a7a9e5c6e292025004f2fd619a"},
|
||||
{file = "PyQt5_sip-12.9.0-cp36-cp36m-manylinux1_x86_64.whl", hash = "sha256:83c3220b1ca36eb8623ba2eb3766637b19eb0ce9f42336ad8253656d32750c0a"},
|
||||
{file = "PyQt5_sip-12.9.0-cp36-cp36m-win_amd64.whl", hash = "sha256:69a3ad4259172e2b1aa9060de211efac39ddd734a517b1924d9c6c0cc4f55f96"},
|
||||
{file = "PyQt5_sip-12.9.0-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:42274a501ab4806d2c31659170db14c282b8313d2255458064666d9e70d96206"},
|
||||
{file = "PyQt5_sip-12.9.0-cp37-cp37m-manylinux1_x86_64.whl", hash = "sha256:6a8701892a01a5a2a4720872361197cc80fdd5f49c8482d488ddf38c9c84f055"},
|
||||
{file = "PyQt5_sip-12.9.0-cp37-cp37m-win_amd64.whl", hash = "sha256:4347bd81d30c8e3181e553b3734f91658cfbdd8f1a19f254777f906870974e6d"},
|
||||
{file = "PyQt5_sip-12.9.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:c446971c360a0a1030282a69375a08c78e8a61d568bfd6dab3dcc5cf8817f644"},
|
||||
{file = "PyQt5_sip-12.9.0-cp38-cp38-manylinux1_x86_64.whl", hash = "sha256:fc43f2d7c438517ee33e929e8ae77132749c15909afab6aeece5fcf4147ffdb5"},
|
||||
{file = "PyQt5_sip-12.9.0-cp38-cp38-win_amd64.whl", hash = "sha256:c5216403d4d8d857ec4a61f631d3945e44fa248aa2415e9ee9369ab7c8a4d0c7"},
|
||||
{file = "PyQt5_sip-12.9.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:a25b9843c7da6a1608f310879c38e6434331aab1dc2fe6cb65c14f1ecf33780e"},
|
||||
{file = "PyQt5_sip-12.9.0-cp39-cp39-manylinux1_x86_64.whl", hash = "sha256:dd05c768c2b55ffe56a9d49ce6cc77cdf3d53dbfad935258a9e347cbfd9a5850"},
|
||||
{file = "PyQt5_sip-12.9.0-cp39-cp39-win_amd64.whl", hash = "sha256:b09f4cd36a4831229fb77c424d89635fa937d97765ec90685e2f257e56a2685a"},
|
||||
{file = "PyQt5_sip-12.9.0.tar.gz", hash = "sha256:d3e4489d7c2b0ece9d203ae66e573939f7f60d4d29e089c9f11daa17cfeaae32"},
|
||||
]
|
||||
pyrsistent = [
|
||||
{file = "pyrsistent-0.17.3.tar.gz", hash = "sha256:2e636185d9eb976a18a8a8e96efce62f2905fea90041958d8cc2a189756ebf3e"},
|
||||
]
|
||||
pytest = [
|
||||
{file = "pytest-6.2.3-py3-none-any.whl", hash = "sha256:6ad9c7bdf517a808242b998ac20063c41532a570d088d77eec1ee12b0b5574bc"},
|
||||
{file = "pytest-6.2.3.tar.gz", hash = "sha256:671238a46e4df0f3498d1c3270e5deb9b32d25134c99b7d75370a68cfbe9b634"},
|
||||
{file = "pytest-6.2.4-py3-none-any.whl", hash = "sha256:91ef2131a9bd6be8f76f1f08eac5c5317221d6ad1e143ae03894b862e8976890"},
|
||||
{file = "pytest-6.2.4.tar.gz", hash = "sha256:50bcad0a0b9c5a72c8e4e7c9855a3ad496ca6a881a3641b4260605450772c54b"},
|
||||
]
|
||||
pytest-cov = [
|
||||
{file = "pytest-cov-2.11.1.tar.gz", hash = "sha256:359952d9d39b9f822d9d29324483e7ba04a3a17dd7d05aa6beb7ea01e359e5f7"},
|
||||
|
|
@ -2236,9 +2237,13 @@ secretstorage = [
|
|||
{file = "SecretStorage-3.3.1-py3-none-any.whl", hash = "sha256:422d82c36172d88d6a0ed5afdec956514b189ddbfb72fefab0c8a1cee4eaf71f"},
|
||||
{file = "SecretStorage-3.3.1.tar.gz", hash = "sha256:fd666c51a6bf200643495a04abb261f83229dcb6fd8472ec393df7ffc8b6f195"},
|
||||
]
|
||||
semver = [
|
||||
{file = "semver-2.13.0-py2.py3-none-any.whl", hash = "sha256:ced8b23dceb22134307c1b8abfa523da14198793d9787ac838e70e29e77458d4"},
|
||||
{file = "semver-2.13.0.tar.gz", hash = "sha256:fa0fe2722ee1c3f57eac478820c3a5ae2f624af8264cbdf9000c980ff7f75e3f"},
|
||||
]
|
||||
six = [
|
||||
{file = "six-1.15.0-py2.py3-none-any.whl", hash = "sha256:8b74bedcbbbaca38ff6d7491d76f2b06b3592611af620f8426e82dddb04a5ced"},
|
||||
{file = "six-1.15.0.tar.gz", hash = "sha256:30639c035cdb23534cd4aa2dd52c3bf48f06e5f4a941509c8bafd8ce11080259"},
|
||||
{file = "six-1.16.0-py2.py3-none-any.whl", hash = "sha256:8abb2f1d86890a2dfb989f9a77cfcfd3e47c2a354b01111771326f8aa26e0254"},
|
||||
{file = "six-1.16.0.tar.gz", hash = "sha256:1e61c37477a1626458e36f7b1d82aa5c9b094fa4802892072e49de9c60c4c926"},
|
||||
]
|
||||
snowballstemmer = [
|
||||
{file = "snowballstemmer-2.1.0-py2.py3-none-any.whl", hash = "sha256:b51b447bea85f9968c13b650126a888aabd4cb4463fca868ec596826325dedc2"},
|
||||
|
|
@ -2249,8 +2254,8 @@ speedcopy = [
|
|||
{file = "speedcopy-2.1.0.tar.gz", hash = "sha256:8bb1a6c735900b83901a7be84ba2175ed3887c13c6786f97dea48f2ea7d504c2"},
|
||||
]
|
||||
sphinx = [
|
||||
{file = "Sphinx-3.5.4-py3-none-any.whl", hash = "sha256:2320d4e994a191f4b4be27da514e46b3d6b420f2ff895d064f52415d342461e8"},
|
||||
{file = "Sphinx-3.5.4.tar.gz", hash = "sha256:19010b7b9fa0dc7756a6e105b2aacd3a80f798af3c25c273be64d7beeb482cb1"},
|
||||
{file = "Sphinx-4.0.1-py3-none-any.whl", hash = "sha256:b2566f5f339737a6ef37198c47d56de1f4a746c722bebdb2fe045c34bfd8b9d0"},
|
||||
{file = "Sphinx-4.0.1.tar.gz", hash = "sha256:cf5104777571b2b7f06fa88ee08fade24563f4a0594cf4bd17d31c47b8740b4c"},
|
||||
]
|
||||
sphinx-qt-documentation = [
|
||||
{file = "sphinx_qt_documentation-0.3-py3-none-any.whl", hash = "sha256:bee247cb9e4fc03fc496d07adfdb943100e1103320c3e5e820e0cfa7c790d9b6"},
|
||||
|
|
@ -2328,9 +2333,9 @@ typed-ast = [
|
|||
{file = "typed_ast-1.4.3.tar.gz", hash = "sha256:fb1bbeac803adea29cedd70781399c99138358c26d05fcbd23c13016b7f5ec65"},
|
||||
]
|
||||
typing-extensions = [
|
||||
{file = "typing_extensions-3.7.4.3-py2-none-any.whl", hash = "sha256:dafc7639cde7f1b6e1acc0f457842a83e722ccca8eef5270af2d74792619a89f"},
|
||||
{file = "typing_extensions-3.7.4.3-py3-none-any.whl", hash = "sha256:7cb407020f00f7bfc3cb3e7881628838e69d8f3fcab2f64742a5e76b2f841918"},
|
||||
{file = "typing_extensions-3.7.4.3.tar.gz", hash = "sha256:99d4073b617d30288f569d3f13d2bd7548c3a7e4c8de87db09a9d29bb3a4a60c"},
|
||||
{file = "typing_extensions-3.10.0.0-py2-none-any.whl", hash = "sha256:0ac0f89795dd19de6b97debb0c6af1c70987fd80a2d62d1958f7e56fcc31b497"},
|
||||
{file = "typing_extensions-3.10.0.0-py3-none-any.whl", hash = "sha256:779383f6086d90c99ae41cf0ff39aac8a7937a9283ce0a414e5dd782f4c94a84"},
|
||||
{file = "typing_extensions-3.10.0.0.tar.gz", hash = "sha256:50b6f157849174217d0656f99dc82fe932884fb250826c18350e159ec6cdf342"},
|
||||
]
|
||||
uritemplate = [
|
||||
{file = "uritemplate-3.0.1-py2.py3-none-any.whl", hash = "sha256:07620c3f3f8eed1f12600845892b0e036a2420acf513c53f7de0abd911a5894f"},
|
||||
|
|
@ -2345,8 +2350,8 @@ wcwidth = [
|
|||
{file = "wcwidth-0.2.5.tar.gz", hash = "sha256:c4d647b99872929fdb7bdcaa4fbe7f01413ed3d98077df798530e5b04f116c83"},
|
||||
]
|
||||
websocket-client = [
|
||||
{file = "websocket_client-0.58.0-py2.py3-none-any.whl", hash = "sha256:44b5df8f08c74c3d82d28100fdc81f4536809ce98a17f0757557813275fbb663"},
|
||||
{file = "websocket_client-0.58.0.tar.gz", hash = "sha256:63509b41d158ae5b7f67eb4ad20fecbb4eee99434e73e140354dc3ff8e09716f"},
|
||||
{file = "websocket-client-0.59.0.tar.gz", hash = "sha256:d376bd60eace9d437ab6d7ee16f4ab4e821c9dae591e1b783c58ebd8aaf80c5c"},
|
||||
{file = "websocket_client-0.59.0-py2.py3-none-any.whl", hash = "sha256:2e50d26ca593f70aba7b13a489435ef88b8fc3b5c5643c1ce8808ff9b40f0b32"},
|
||||
]
|
||||
wrapt = [
|
||||
{file = "wrapt-1.12.1.tar.gz", hash = "sha256:b62ffa81fb85f4332a4f609cab4ac40709470da05643a082ec1eb88e6d9b97d7"},
|
||||
|
|
|
|||
|
|
@ -1,6 +1,6 @@
|
|||
[tool.poetry]
name = "OpenPype"
version = "3.0.0-rc4"
version = "3.0.0-rc.5"
description = "Open VFX and Animation pipeline with support."
authors = ["OpenPype Team <info@openpype.io>"]
license = "MIT License"
@@ -36,6 +36,7 @@ pyqt5 = "^5.12.2" # ideally should be replaced with PySide2
"Qt.py" = "^1.3.3"
speedcopy = "^2.1"
six = "^1.15"
semver = "^2.13.0" # for version resolution
wsrpc_aiohttp = "^3.1.1" # websocket server
pywin32 = { version = "300", markers = "sys_platform == 'win32'" }
jinxed = [

setup.py
@@ -98,7 +98,7 @@ executables = [
setup(
    name="OpenPype",
    version=__version__,
    description="Ultimate pipeline",
    description="OpenPype",
    cmdclass={"build_sphinx": BuildDoc},
    options={
        "build_exe": build_exe_options,

start.py
@@ -360,7 +360,7 @@ def _determine_mongodb() -> str:

def _initialize_environment(openpype_version: OpenPypeVersion) -> None:
    version_path = openpype_version.path
    os.environ["OPENPYPE_VERSION"] = openpype_version.version
    os.environ["OPENPYPE_VERSION"] = str(openpype_version)
    # set OPENPYPE_REPOS_ROOT to point to currently used OpenPype version.
    os.environ["OPENPYPE_REPOS_ROOT"] = os.path.normpath(
        version_path.as_posix()
@@ -417,6 +417,26 @@ def _find_frozen_openpype(use_version: str = None,
    openpype_version = None
    openpype_versions = bootstrap.find_openpype(include_zips=True,
                                                staging=use_staging)
    # get local frozen version and add it to detected version so if it is
    # newer it will be used instead.
    local_version_str = bootstrap.get_version(
        Path(os.environ["OPENPYPE_ROOT"]))
    if local_version_str:
        local_version = OpenPypeVersion(
            version=local_version_str,
            path=Path(os.environ["OPENPYPE_ROOT"]))
        if local_version not in openpype_versions:
            openpype_versions.append(local_version)
            openpype_versions.sort()
        # if latest is currently running, ditch whole list
        # and run from current without installing it.
        if local_version == openpype_versions[-1]:
            os.environ["OPENPYPE_TRYOUT"] = "1"
            openpype_versions = []

    else:
        print("!!! Warning: cannot determine current running version.")

    if not os.getenv("OPENPYPE_TRYOUT"):
        try:
            # use latest one found (last in the list is latest)
@@ -464,12 +484,9 @@ def _find_frozen_openpype(use_version: str = None,
            use_version, openpype_versions)

    if not version_path:
        if use_version is not None:
        if not openpype_version:
            ...
        else:
            print(("!!! Specified version was not found, using "
                   "latest available"))
        if use_version is not None and openpype_version:
            print(("!!! Specified version was not found, using "
                   "latest available"))
        # specified version was not found so use latest detected.
        version_path = openpype_version.path
        print(f">>> Using version [ {openpype_version} ]")
@@ -492,7 +509,15 @@ def _find_frozen_openpype(use_version: str = None,

    if openpype_version.path.is_file():
        print(">>> Extracting zip file ...")
        version_path = bootstrap.extract_openpype(openpype_version)
        try:
            version_path = bootstrap.extract_openpype(openpype_version)
        except OSError as e:
            print("!!! failed: {}".format(str(e)))
            sys.exit(1)
        else:
            # cleanup zip after extraction
            os.unlink(openpype_version.path)

        openpype_version.path = version_path

    _initialize_environment(openpype_version)
@@ -5,72 +5,76 @@ import sys
from collections import namedtuple
from pathlib import Path
from zipfile import ZipFile
from uuid import uuid4

import appdirs
import pytest

from igniter.bootstrap_repos import BootstrapRepos
from igniter.bootstrap_repos import PypeVersion
from pype.lib import OpenPypeSettingsRegistry
from igniter.bootstrap_repos import OpenPypeVersion
from igniter.user_settings import OpenPypeSettingsRegistry


@pytest.fixture
def fix_bootstrap(tmp_path, pytestconfig):
    """This will fix BoostrapRepos with temp paths."""
    bs = BootstrapRepos()
    bs.live_repo_dir = pytestconfig.rootpath / 'repos'
    bs.data_dir = tmp_path
    return bs


def test_pype_version():
    v1 = PypeVersion(1, 2, 3)
def test_openpype_version(printer):
    """Test determination of OpenPype versions."""
    v1 = OpenPypeVersion(1, 2, 3)
    assert str(v1) == "1.2.3"

    v2 = PypeVersion(1, 2, 3, client="x")
    v2 = OpenPypeVersion(1, 2, 3, prerelease="x")
    assert str(v2) == "1.2.3-x"
    assert v1 < v2
    assert v1 > v2

    v3 = PypeVersion(1, 2, 3, variant="staging")
    assert str(v3) == "1.2.3-staging"
    v3 = OpenPypeVersion(1, 2, 3, staging=True)
    assert str(v3) == "1.2.3+staging"

    v4 = PypeVersion(1, 2, 3, variant="staging", client="client")
    assert str(v4) == "1.2.3-client-staging"
    assert v3 < v4
    assert v1 < v4
    v4 = OpenPypeVersion(1, 2, 3, staging="True", prerelease="rc.1")
    assert str(v4) == "1.2.3-rc.1+staging"
    assert v3 > v4
    assert v1 > v4
    assert v4 < OpenPypeVersion(1, 2, 3, prerelease="rc.1")

    v5 = PypeVersion(1, 2, 3, variant="foo", client="x")
    assert str(v5) == "1.2.3-x"
    v5 = OpenPypeVersion(1, 2, 3, build="foo", prerelease="x")
    assert str(v5) == "1.2.3-x+foo"
    assert v4 < v5

    v6 = PypeVersion(1, 2, 3, variant="foo")
    assert str(v6) == "1.2.3"
    v6 = OpenPypeVersion(1, 2, 3, prerelease="foo")
    assert str(v6) == "1.2.3-foo"

    v7 = PypeVersion(2, 0, 0)
    v7 = OpenPypeVersion(2, 0, 0)
    assert v1 < v7

    v8 = PypeVersion(0, 1, 5)
    v8 = OpenPypeVersion(0, 1, 5)
    assert v8 < v7

    v9 = PypeVersion(1, 2, 4)
    v9 = OpenPypeVersion(1, 2, 4)
    assert v9 > v1

    v10 = PypeVersion(1, 2, 2)
    v10 = OpenPypeVersion(1, 2, 2)
    assert v10 < v1

    v11 = PypeVersion(1, 2, 3, path=Path("/foo/bar"))
    v11 = OpenPypeVersion(1, 2, 3, path=Path("/foo/bar"))
    assert v10 < v11

    assert v5 == v2

    sort_versions = [
        PypeVersion(3, 2, 1),
        PypeVersion(1, 2, 3),
        PypeVersion(0, 0, 1),
        PypeVersion(4, 8, 10),
        PypeVersion(4, 8, 20),
        PypeVersion(4, 8, 9),
        PypeVersion(1, 2, 3, variant="staging"),
        PypeVersion(1, 2, 3, client="client")
        OpenPypeVersion(3, 2, 1),
        OpenPypeVersion(1, 2, 3),
        OpenPypeVersion(0, 0, 1),
        OpenPypeVersion(4, 8, 10),
        OpenPypeVersion(4, 8, 20),
        OpenPypeVersion(4, 8, 9),
        OpenPypeVersion(1, 2, 3, staging=True),
        OpenPypeVersion(1, 2, 3, build="foo")
    ]
    res = sorted(sort_versions)
@@ -81,57 +85,51 @@ def test_pype_version():

    str_versions = [
        "5.5.1",
        "5.5.2-client",
        "5.5.3-client-strange",
        "5.5.4-staging",
        "5.5.5-staging-client",
        "5.5.2-foo",
        "5.5.3-foo+strange",
        "5.5.4+staging",
        "5.5.5+staging-client",
        "5.6.3",
        "5.6.3-staging"
        "5.6.3+staging"
    ]
    res_versions = []
    for v in str_versions:
        res_versions.append(PypeVersion(version=v))

    res_versions = [OpenPypeVersion(version=v) for v in str_versions]
    sorted_res_versions = sorted(res_versions)

    assert str(sorted_res_versions[0]) == str_versions[0]
    assert str(sorted_res_versions[-1]) == str_versions[5]

    with pytest.raises(ValueError):
        _ = PypeVersion()
    with pytest.raises(TypeError):
        _ = OpenPypeVersion()

    with pytest.raises(ValueError):
        _ = PypeVersion(major=1)
        _ = OpenPypeVersion(version="booobaa")

    with pytest.raises(ValueError):
        _ = PypeVersion(version="booobaa")

    v11 = PypeVersion(version="4.6.7-client-staging")
    v11 = OpenPypeVersion(version="4.6.7-foo+staging")
    assert v11.major == 4
    assert v11.minor == 6
    assert v11.subversion == 7
    assert v11.variant == "staging"
    assert v11.client == "client"
    assert v11.patch == 7
    assert v11.staging is True
    assert v11.prerelease == "foo"


def test_get_main_version():
    ver = PypeVersion(1, 2, 3, variant="staging", client="foo")
    ver = OpenPypeVersion(1, 2, 3, staging=True, prerelease="foo")
    assert ver.get_main_version() == "1.2.3"


def test_get_version_path_from_list():
    versions = [
        PypeVersion(1, 2, 3, path=Path('/foo/bar')),
        PypeVersion(3, 4, 5, variant="staging", path=Path("/bar/baz")),
        PypeVersion(6, 7, 8, client="x", path=Path("boo/goo"))
        OpenPypeVersion(1, 2, 3, path=Path('/foo/bar')),
        OpenPypeVersion(3, 4, 5, staging=True, path=Path("/bar/baz")),
        OpenPypeVersion(6, 7, 8, prerelease="x", path=Path("boo/goo"))
    ]
    path = BootstrapRepos.get_version_path_from_list(
        "3.4.5-staging", versions)
        "3.4.5+staging", versions)

    assert path == Path("/bar/baz")


def test_search_string_for_pype_version(printer):
def test_search_string_for_openpype_version(printer):
    strings = [
        ("3.0.1", True),
        ("foo-3.0", False),
@@ -142,106 +140,112 @@ def test_search_string_for_pype_version(printer):
    ]
    for ver_string in strings:
        printer(f"testing {ver_string[0]} should be {ver_string[1]}")
        assert PypeVersion.version_in_str(ver_string[0])[0] == ver_string[1]
        assert OpenPypeVersion.version_in_str(ver_string[0])[0] == \
            ver_string[1]


@pytest.mark.slow
def test_install_live_repos(fix_bootstrap, printer):
    pype_version = fix_bootstrap.create_version_from_live_code()
def test_install_live_repos(fix_bootstrap, printer, monkeypatch, pytestconfig):
    monkeypatch.setenv("OPENPYPE_ROOT", pytestconfig.rootpath.as_posix())
    monkeypatch.setenv("OPENPYPE_DATABASE_NAME", str(uuid4()))
    openpype_version = fix_bootstrap.create_version_from_live_code()
    sep = os.path.sep
    expected_paths = [
        f"{pype_version.path}{sep}repos{sep}avalon-core",
        f"{pype_version.path}{sep}repos{sep}avalon-unreal-integration",
        f"{pype_version.path}"
        f"{openpype_version.path}{sep}repos{sep}avalon-core",
        f"{openpype_version.path}{sep}repos{sep}avalon-unreal-integration",
        f"{openpype_version.path}"
    ]
    printer("testing zip creation")
    assert os.path.exists(pype_version.path), "zip archive was not created"
    fix_bootstrap.add_paths_from_archive(pype_version.path)
    assert os.path.exists(openpype_version.path), "zip archive was not created"
    fix_bootstrap.add_paths_from_archive(openpype_version.path)
    for ep in expected_paths:
        assert ep in sys.path, f"{ep} not set correctly"

    printer("testing pype imported")
    del sys.modules["pype"]
    import pype # noqa: F401
    printer("testing openpype imported")
    try:
        del sys.modules["openpype"]
    except KeyError:
        # wasn't imported before
        pass
    import openpype # noqa: F401

    # test if pype is imported from specific location in zip
    assert "pype" in sys.modules.keys(), "Pype not imported"
    assert sys.modules["pype"].__file__ == \
        f"{pype_version.path}{sep}pype{sep}__init__.py"
    # test if openpype is imported from specific location in zip
    assert "openpype" in sys.modules.keys(), "OpenPype not imported"
    assert sys.modules["openpype"].__file__ == \
        f"{openpype_version.path}{sep}openpype{sep}__init__.py"


def test_find_pype(fix_bootstrap, tmp_path_factory, monkeypatch, printer):

    test_pype = namedtuple("Pype", "prefix version suffix type valid")
def test_find_openpype(fix_bootstrap, tmp_path_factory, monkeypatch, printer):
    test_openpype = namedtuple("OpenPype", "prefix version suffix type valid")

    test_versions_1 = [
        test_pype(prefix="foo-v", version="5.5.1",
                  suffix=".zip", type="zip", valid=False),
        test_pype(prefix="bar-v", version="5.5.2-client",
                  suffix=".zip", type="zip", valid=True),
        test_pype(prefix="baz-v", version="5.5.3-client-strange",
                  suffix=".zip", type="zip", valid=True),
        test_pype(prefix="bum-v", version="5.5.4-staging",
                  suffix=".zip", type="zip", valid=True),
        test_pype(prefix="zum-v", version="5.5.5-client-staging",
                  suffix=".zip", type="zip", valid=True),
        test_pype(prefix="fam-v", version="5.6.3",
                  suffix=".zip", type="zip", valid=True),
        test_pype(prefix="foo-v", version="5.6.3-staging",
                  suffix=".zip", type="zip", valid=True),
        test_pype(prefix="fim-v", version="5.6.3",
                  suffix=".zip", type="zip", valid=False),
        test_pype(prefix="foo-v", version="5.6.4",
                  suffix=".txt", type="txt", valid=False),
        test_pype(prefix="foo-v", version="5.7.1",
                  suffix="", type="dir", valid=False),
        test_openpype(prefix="foo-v", version="5.5.1",
                      suffix=".zip", type="zip", valid=False),
        test_openpype(prefix="bar-v", version="5.5.2-rc.1",
                      suffix=".zip", type="zip", valid=True),
        test_openpype(prefix="baz-v", version="5.5.3-foo-strange",
                      suffix=".zip", type="zip", valid=True),
        test_openpype(prefix="bum-v", version="5.5.4+staging",
                      suffix=".zip", type="zip", valid=True),
        test_openpype(prefix="zum-v", version="5.5.5-foo+staging",
                      suffix=".zip", type="zip", valid=True),
        test_openpype(prefix="fam-v", version="5.6.3",
                      suffix=".zip", type="zip", valid=True),
        test_openpype(prefix="foo-v", version="5.6.3+staging",
                      suffix=".zip", type="zip", valid=True),
        test_openpype(prefix="fim-v", version="5.6.3",
                      suffix=".zip", type="zip", valid=False),
        test_openpype(prefix="foo-v", version="5.6.4",
                      suffix=".txt", type="txt", valid=False),
        test_openpype(prefix="foo-v", version="5.7.1",
                      suffix="", type="dir", valid=False),
    ]

    test_versions_2 = [
        test_pype(prefix="foo-v", version="10.0.0",
                  suffix=".txt", type="txt", valid=False),
        test_pype(prefix="lom-v", version="7.2.6",
                  suffix=".zip", type="zip", valid=True),
        test_pype(prefix="bom-v", version="7.2.7-client",
                  suffix=".zip", type="zip", valid=True),
        test_pype(prefix="woo-v", version="7.2.8-client-strange",
                  suffix=".zip", type="zip", valid=True),
        test_pype(prefix="loo-v", version="7.2.10-client-staging",
                  suffix=".zip", type="zip", valid=True),
        test_pype(prefix="kok-v", version="7.0.1",
                  suffix=".zip", type="zip", valid=True)
        test_openpype(prefix="foo-v", version="10.0.0",
                      suffix=".txt", type="txt", valid=False),
        test_openpype(prefix="lom-v", version="7.2.6",
                      suffix=".zip", type="zip", valid=True),
        test_openpype(prefix="bom-v", version="7.2.7-rc.3",
                      suffix=".zip", type="zip", valid=True),
        test_openpype(prefix="woo-v", version="7.2.8-foo-strange",
                      suffix=".zip", type="zip", valid=True),
        test_openpype(prefix="loo-v", version="7.2.10-foo+staging",
                      suffix=".zip", type="zip", valid=True),
        test_openpype(prefix="kok-v", version="7.0.1",
                      suffix=".zip", type="zip", valid=True)
    ]

    test_versions_3 = [
        test_pype(prefix="foo-v", version="3.0.0",
                  suffix=".zip", type="zip", valid=True),
        test_pype(prefix="goo-v", version="3.0.1",
                  suffix=".zip", type="zip", valid=True),
        test_pype(prefix="hoo-v", version="4.1.0",
                  suffix=".zip", type="zip", valid=True),
        test_pype(prefix="foo-v", version="4.1.2",
                  suffix=".zip", type="zip", valid=True),
        test_pype(prefix="foo-v", version="3.0.1-client",
                  suffix=".zip", type="zip", valid=True),
        test_pype(prefix="foo-v", version="3.0.1-client-strange",
                  suffix=".zip", type="zip", valid=True),
        test_pype(prefix="foo-v", version="3.0.1-staging",
                  suffix=".zip", type="zip", valid=True),
        test_pype(prefix="foo-v", version="3.0.1-client-staging",
                  suffix=".zip", type="zip", valid=True),
        test_pype(prefix="foo-v", version="3.2.0",
                  suffix=".zip", type="zip", valid=True)
        test_openpype(prefix="foo-v", version="3.0.0",
                      suffix=".zip", type="zip", valid=True),
        test_openpype(prefix="goo-v", version="3.0.1",
                      suffix=".zip", type="zip", valid=True),
        test_openpype(prefix="hoo-v", version="4.1.0",
                      suffix=".zip", type="zip", valid=True),
        test_openpype(prefix="foo-v", version="4.1.2",
                      suffix=".zip", type="zip", valid=True),
        test_openpype(prefix="foo-v", version="3.0.1-foo",
                      suffix=".zip", type="zip", valid=True),
        test_openpype(prefix="foo-v", version="3.0.1-foo-strange",
                      suffix=".zip", type="zip", valid=True),
        test_openpype(prefix="foo-v", version="3.0.1+staging",
                      suffix=".zip", type="zip", valid=True),
        test_openpype(prefix="foo-v", version="3.0.1-foo+staging",
                      suffix=".zip", type="zip", valid=True),
        test_openpype(prefix="foo-v", version="3.2.0",
                      suffix=".zip", type="zip", valid=True)
    ]

    test_versions_4 = [
        test_pype(prefix="foo-v", version="10.0.0",
                  suffix="", type="dir", valid=True),
        test_pype(prefix="lom-v", version="11.2.6",
                  suffix=".zip", type="dir", valid=False),
        test_pype(prefix="bom-v", version="7.2.7-client",
                  suffix=".zip", type="zip", valid=True),
        test_pype(prefix="woo-v", version="7.2.8-client-strange",
                  suffix=".zip", type="txt", valid=False)
        test_openpype(prefix="foo-v", version="10.0.0",
                      suffix="", type="dir", valid=True),
        test_openpype(prefix="lom-v", version="11.2.6",
                      suffix=".zip", type="dir", valid=False),
        test_openpype(prefix="bom-v", version="7.2.7-foo",
                      suffix=".zip", type="zip", valid=True),
        test_openpype(prefix="woo-v", version="7.2.8-foo-strange",
                      suffix=".zip", type="txt", valid=False)
    ]

    def _create_invalid_zip(path: Path):
@@ -251,7 +255,7 @@ def test_find_pype(fix_bootstrap, tmp_path_factory, monkeypatch, printer):
    def _create_valid_zip(path: Path, version: str):
        with ZipFile(path, "w") as zf:
            zf.writestr(
                "pype/version.py", f"__version__ = '{version}'\n\n")
                "openpype/version.py", f"__version__ = '{version}'\n\n")

    def _create_invalid_dir(path: Path):
        path.mkdir(parents=True, exist_ok=True)
@@ -259,9 +263,9 @@ def test_find_pype(fix_bootstrap, tmp_path_factory, monkeypatch, printer):
            fp.write("invalid")

    def _create_valid_dir(path: Path, version: str):
        pype_path = path / "pype"
        version_path = pype_path / "version.py"
        pype_path.mkdir(parents=True, exist_ok=True)
        openpype_path = path / "openpype"
        version_path = openpype_path / "version.py"
        openpype_path.mkdir(parents=True, exist_ok=True)
        with open(version_path, "w") as fp:
            fp.write(f"__version__ = '{version}'\n\n")

@@ -283,15 +287,15 @@ def test_find_pype(fix_bootstrap, tmp_path_factory, monkeypatch, printer):
        with open(test_path, "w") as fp:
            fp.write("foo")

    # in PYPE_PATH
    # in OPENPYPE_PATH
    e_path = tmp_path_factory.mktemp("environ")

    # create files and directories for test
    for test_file in test_versions_1:
        _build_test_item(e_path, test_file)

    # in pypePath registry
    p_path = tmp_path_factory.mktemp("pypePath")
    # in openPypePath registry
    p_path = tmp_path_factory.mktemp("openPypePath")
    for test_file in test_versions_2:
        _build_test_item(p_path, test_file)

@@ -310,10 +314,10 @@ def test_find_pype(fix_bootstrap, tmp_path_factory, monkeypatch, printer):
    for test_file in test_versions_4:
        _build_test_item(dir_path, test_file)

    printer("testing finding Pype in given path ...")
    result = fix_bootstrap.find_pype(g_path, include_zips=True)
    printer("testing finding OpenPype in given path ...")
    result = fix_bootstrap.find_openpype(g_path, include_zips=True)
    # we should have results as file were created
    assert result is not None, "no Pype version found"
    assert result is not None, "no OpenPype version found"
    # latest item in `result` should be latest version found.
    expected_path = Path(
        g_path / "{}{}{}".format(
@@ -323,13 +327,14 @@ def test_find_pype(fix_bootstrap, tmp_path_factory, monkeypatch, printer):
        )
    )
    assert result, "nothing found"
    assert result[-1].path == expected_path, "not a latest version of Pype 3"
    assert result[-1].path == expected_path, ("not a latest version of "
                                              "OpenPype 3")

    monkeypatch.setenv("PYPE_PATH", e_path.as_posix())
    monkeypatch.setenv("OPENPYPE_PATH", e_path.as_posix())

    result = fix_bootstrap.find_pype(include_zips=True)
    result = fix_bootstrap.find_openpype(include_zips=True)
    # we should have results as file were created
    assert result is not None, "no Pype version found"
    assert result is not None, "no OpenPype version found"
    # latest item in `result` should be latest version found.
    expected_path = Path(
        e_path / "{}{}{}".format(
@@ -339,21 +344,23 @@ def test_find_pype(fix_bootstrap, tmp_path_factory, monkeypatch, printer):
        )
    )
    assert result, "nothing found"
    assert result[-1].path == expected_path, "not a latest version of Pype 1"
    assert result[-1].path == expected_path, ("not a latest version of "
                                              "OpenPype 1")

    monkeypatch.delenv("PYPE_PATH", raising=False)
    monkeypatch.delenv("OPENPYPE_PATH", raising=False)

    # mock appdirs user_data_dir
    def mock_user_data_dir(*args, **kwargs):
        """Mock local app data dir."""
        return d_path.as_posix()

    monkeypatch.setattr(appdirs, "user_data_dir", mock_user_data_dir)
    fix_bootstrap.registry = OpenPypeSettingsRegistry()
    fix_bootstrap.registry.set_item("pypePath", d_path.as_posix())
    fix_bootstrap.registry.set_item("openPypePath", d_path.as_posix())

    result = fix_bootstrap.find_pype(include_zips=True)
    result = fix_bootstrap.find_openpype(include_zips=True)
    # we should have results as file were created
    assert result is not None, "no Pype version found"
    assert result is not None, "no OpenPype version found"
    # latest item in `result` should be latest version found.
    expected_path = Path(
        d_path / "{}{}{}".format(
@@ -363,10 +370,11 @@ def test_find_pype(fix_bootstrap, tmp_path_factory, monkeypatch, printer):
        )
    )
    assert result, "nothing found"
    assert result[-1].path == expected_path, "not a latest version of Pype 2"
    assert result[-1].path == expected_path, ("not a latest version of "
                                              "OpenPype 2")

    result = fix_bootstrap.find_pype(e_path, include_zips=True)
    assert result is not None, "no Pype version found"
    result = fix_bootstrap.find_openpype(e_path, include_zips=True)
    assert result is not None, "no OpenPype version found"
    expected_path = Path(
        e_path / "{}{}{}".format(
            test_versions_1[5].prefix,
@ -374,10 +382,11 @@ def test_find_pype(fix_bootstrap, tmp_path_factory, monkeypatch, printer):
|
|||
test_versions_1[5].suffix
|
||||
)
|
||||
)
|
||||
assert result[-1].path == expected_path, "not a latest version of Pype 1"
|
||||
assert result[-1].path == expected_path, ("not a latest version of "
|
||||
"OpenPype 1")
|
||||
|
||||
result = fix_bootstrap.find_pype(dir_path, include_zips=True)
|
||||
assert result is not None, "no Pype versions found"
|
||||
result = fix_bootstrap.find_openpype(dir_path, include_zips=True)
|
||||
assert result is not None, "no OpenPype versions found"
|
||||
expected_path = Path(
|
||||
dir_path / "{}{}{}".format(
|
||||
test_versions_4[0].prefix,
|
||||
|
|
@ -385,4 +394,5 @@ def test_find_pype(fix_bootstrap, tmp_path_factory, monkeypatch, printer):
|
|||
test_versions_4[0].suffix
|
||||
)
|
||||
)
|
||||
assert result[-1].path == expected_path, "not a latest version of Pype 4"
|
||||
assert result[-1].path == expected_path, ("not a latest version of "
|
||||
"OpenPype 4")
|
||||
|
|
|
|||
|
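The assertions above rely on `find_openpype` returning versions sorted ascending, so `result[-1]` is always the newest one found. That sort-then-take-last pattern can be sketched standalone (illustrative only — `find_versions` and the name scheme are assumptions, not the bootstrap implementation):

```python
import re
from pathlib import Path


def find_versions(root: Path) -> list:
    """Collect version-named entries under root and sort them ascending.

    Assumes names like 'openpype-v3.9.1' or 'openpype-v3.10.0.zip'.
    A naive string sort would put 3.10 before 3.9, so we sort on the
    parsed integer tuple instead and let the caller take the last item.
    """
    pattern = re.compile(r"v(\d+)\.(\d+)\.(\d+)")
    found = []
    for item in root.iterdir():
        match = pattern.search(item.name)
        if match:
            found.append((tuple(int(g) for g in match.groups()), item))
    # tuples of ints compare element-wise, giving proper version order
    found.sort(key=lambda pair: pair[0])
    return [item for _, item in found]
```

With such a helper, `find_versions(root)[-1]` plays the role of `result[-1]` in the tests above.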
|
@@ -1,5 +1,7 @@
 # -*- coding: utf-8 -*-
 """Test suite for User Settings."""
 import pytest
-from pype.lib import (
+from igniter.user_settings import (
     IniSettingRegistry,
     JSONSettingRegistry,
     OpenPypeSecureRegistry
@@ -9,9 +11,9 @@ import configparser

 @pytest.fixture
-def secure_registry(tmpdir):
+def secure_registry():
     name = "pypetest_{}".format(str(uuid4()))
-    r = OpenPypeSecureRegistry(name, tmpdir)
+    r = OpenPypeSecureRegistry(name)
     yield r

@@ -14,6 +14,12 @@ PS> .\build.ps1

 #>

+$arguments=$ARGS
+$disable_submodule_update=""
+if($arguments -eq "--no-submodule-update") {
+    $disable_submodule_update=$true
+}
+
 function Start-Progress {
     param([ScriptBlock]$code)
     $scroll = "/-\|/-\|"
@@ -134,10 +140,14 @@ catch {
     Write-Host $_.Exception.Message
     Exit-WithCode 1
 }

-Write-Host ">>> " -NoNewLine -ForegroundColor green
-Write-Host "Making sure submodules are up-to-date ..."
-git submodule update --init --recursive
+if (-not $disable_submodule_update) {
+    Write-Host ">>> " -NoNewLine -ForegroundColor green
+    Write-Host "Making sure submodules are up-to-date ..."
+    git submodule update --init --recursive
+} else {
+    Write-Host "*** " -NoNewLine -ForegroundColor yellow
+    Write-Host "Not updating submodules ..."
+}

 Write-Host ">>> " -NoNewline -ForegroundColor green
 Write-Host "OpenPype [ " -NoNewline -ForegroundColor white
@@ -164,6 +174,7 @@ Write-Host "OK" -ForegroundColor green

 Write-Host ">>> " -NoNewline -ForegroundColor green
 Write-Host "Building OpenPype ..."
+$startTime = [int][double]::Parse((Get-Date -UFormat %s))

 $out = & poetry run python setup.py build 2>&1
 if ($LASTEXITCODE -ne 0)
@@ -182,7 +193,8 @@ Write-Host ">>> " -NoNewline -ForegroundColor green
 Write-Host "restoring current directory"
 Set-Location -Path $current_dir

+$endTime = [int][double]::Parse((Get-Date -UFormat %s))
 Write-Host "*** " -NoNewline -ForegroundColor Cyan
-Write-Host "All done. You will find OpenPype and build log in " -NoNewLine
+Write-Host "All done in $($endTime - $startTime) secs. You will find OpenPype and build log in " -NoNewLine
 Write-Host "'.\build'" -NoNewline -ForegroundColor Green
 Write-Host " directory."

@@ -57,6 +57,26 @@ BIPurple='\033[1;95m' # Purple
 BICyan='\033[1;96m'   # Cyan
 BIWhite='\033[1;97m'  # White

+args=$@
+disable_submodule_update=0
+while :; do
+  case $1 in
+    --no-submodule-update)
+      disable_submodule_update=1
+      ;;
+    --)
+      shift
+      break
+      ;;
+    *)
+      break
+  esac
+
+  shift
+done
+
+
 ##############################################################################
 # Detect required version of python
@@ -172,9 +192,12 @@ main () {
     . "$openpype_root/tools/create_env.sh" || { echo -e "${BIRed}!!!${RST} Poetry installation failed"; return; }
   fi

+  if [ "$disable_submodule_update" == 1 ]; then
     echo -e "${BIGreen}>>>${RST} Making sure submodules are up-to-date ..."
     git submodule update --init --recursive
-
+  else
+    echo -e "${BIYellow}***${RST} Not updating submodules ..."
+  fi
   echo -e "${BIGreen}>>>${RST} Building ..."
   if [[ "$OSTYPE" == "linux-gnu"* ]]; then
     poetry run python "$openpype_root/setup.py" build > "$openpype_root/build/build.log" || { echo -e "${BIRed}!!!${RST} Build failed, see the build log."; return; }

@@ -26,10 +26,12 @@ import platform
 from pathlib import Path
 import shutil
 import blessed
+import enlighten
 import time


 term = blessed.Terminal()
+manager = enlighten.get_manager()


 def _print(msg: str, type: int = 0) -> None:
@@ -52,6 +54,24 @@ def _print(msg: str, type: int = 0) -> None:
     print("{}{}".format(header, msg))


+def count_folders(path: Path) -> int:
+    """Recursively count items inside given Path.
+
+    Args:
+        path (Path): Path to count.
+
+    Returns:
+        int: number of items.
+
+    """
+    cnt = 0
+    for child in path.iterdir():
+        if child.is_dir():
+            cnt += 1
+            cnt += count_folders(child)
+    return cnt
+
+
 _print("Starting dependency cleanup ...")
+start_time = time.time_ns()

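The `count_folders` helper added above counts directories recursively so the progress bar has a total to work against. The same logic can be exercised standalone against a throwaway tree (illustrative sketch mirroring the hunk, not imported from it):

```python
import tempfile
from pathlib import Path


def count_folders(path: Path) -> int:
    """Recursively count directories inside the given path (mirrors the hunk)."""
    cnt = 0
    for child in path.iterdir():
        if child.is_dir():
            cnt += 1
            cnt += count_folders(child)
    return cnt


# Build a throwaway tree: a/, a/b/, c/ plus one plain file -> 3 directories.
with tempfile.TemporaryDirectory() as tmp:
    root = Path(tmp)
    (root / "a" / "b").mkdir(parents=True)
    (root / "c").mkdir()
    (root / "file.txt").touch()
    print(count_folders(root))  # -> 3
```

Note that plain files are skipped entirely; only directories contribute to the count, which matches how `copytree` invokes its `ignore` callback once per directory.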
@@ -96,30 +116,55 @@ deps_dir = build_dir / "dependencies"

 # copy all files
 _print("Copying dependencies ...")
-shutil.copytree(site_pkg.as_posix(), deps_dir.as_posix())
+
+total_files = count_folders(site_pkg)
+progress_bar = enlighten.Counter(
+    total=total_files, desc="Processing Dependencies",
+    units="%", color="green")
+
+
+def _progress(_base, _names):
+    progress_bar.update()
+    return []
+
+
+shutil.copytree(site_pkg.as_posix(),
+                deps_dir.as_posix(),
+                ignore=_progress)
+progress_bar.close()

 # iterate over frozen libs and create list to delete
 libs_dir = build_dir / "lib"

 to_delete = []
-_print("Finding duplicates ...")
+# _print("Finding duplicates ...")
 deps_items = list(deps_dir.iterdir())
+item_count = len(list(libs_dir.iterdir()))
+find_progress_bar = enlighten.Counter(
+    total=item_count, desc="Finding duplicates", units="%", color="yellow")
+
 for d in libs_dir.iterdir():
     if (deps_dir / d.name) in deps_items:
         to_delete.append(d)
-        _print(f"found {d}", 3)
+        # _print(f"found {d}", 3)
+    find_progress_bar.update()
+
+find_progress_bar.close()

 # add openpype and igniter in libs too
 to_delete.append(libs_dir / "openpype")
 to_delete.append(libs_dir / "igniter")

 # delete duplicates
-_print(f"Deleting {len(to_delete)} duplicates ...")
+# _print(f"Deleting {len(to_delete)} duplicates ...")
+delete_progress_bar = enlighten.Counter(
+    total=len(to_delete), desc="Deleting duplicates", units="%", color="red")
+
 for d in to_delete:
     if d.is_dir():
         shutil.rmtree(d)
     else:
         d.unlink()
+    delete_progress_bar.update()
+
+delete_progress_bar.close()
+
+end_time = time.time_ns()
+total_time = (end_time - start_time) / 1000000000

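The hunk above drives its progress bar by reusing `shutil.copytree`'s `ignore` callback: the callback is invoked once per visited directory and must return the names to skip, so returning `[]` skips nothing while still ticking the counter. A minimal self-contained sketch of the same trick, counting invocations instead of drawing a bar (illustrative only; names are assumptions):

```python
import shutil
import tempfile
from pathlib import Path

visited = []


def _progress(base, names):
    """copytree calls this once per directory; returning [] skips nothing."""
    visited.append(base)
    return []


with tempfile.TemporaryDirectory() as tmp:
    src = Path(tmp) / "src"
    (src / "pkg").mkdir(parents=True)
    (src / "pkg" / "mod.py").touch()
    dst = Path(tmp) / "dst"
    shutil.copytree(src.as_posix(), dst.as_posix(), ignore=_progress)
    # One callback for `src` itself and one for each subdirectory.
    print(len(visited))  # -> 2
```

This is why the total fed to `enlighten.Counter` comes from `count_folders` (directories only): files never trigger the callback, so counting them would leave the bar short of 100%.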
tools/run_project_manager.ps1 (new file, 18 lines)
@@ -0,0 +1,18 @@
+<#
+.SYNOPSIS
+  Helper script OpenPype Tray.
+
+.DESCRIPTION
+
+.EXAMPLE
+
+PS> .\run_tray.ps1
+
+#>
+$current_dir = Get-Location
+$script_dir = Split-Path -Path $MyInvocation.MyCommand.Definition -Parent
+$openpype_root = (Get-Item $script_dir).parent.FullName
+Set-Location -Path $openpype_root
+& poetry run python "$($openpype_root)\start.py" projectmanager
+Set-Location -Path $current_dir

@@ -122,5 +122,4 @@ main () {
   PYTHONPATH=$original_pythonpath
 }
-

 main

website/README.md (new file, 28 lines)
@@ -0,0 +1,28 @@
+When developing on Windows make sure `start.sh` has the correct line endings (`LF`).
+
+Start via yarn:
+---------------
+Clone the repository.
+
+Install yarn if not already installed (https://classic.yarnpkg.com/en/docs/install),
+for example via npm (but it could be installed differently too):
+
+```npm install --global yarn```
+
+Then go to the `website` folder:
+
+```yarn install``` (takes a while)
+
+To start the local test server:
+
+```yarn start```
+
+The server is accessible by default on http://localhost:3000
+
+Start via docker:
+-----------------
+Setting for the docker container:
+```bash
+docker build . -t pype-docs
+docker run --rm -p 3000:3000 -v /c/Users/admin/openpype.io:/app pype-docs
+```

website/docs/admin_hosts_aftereffects.md (new file, 39 lines)
@@ -0,0 +1,39 @@
+---
+id: admin_hosts_aftereffects
+title: AfterEffects Settings
+sidebar_label: AfterEffects
+---
+
+import Tabs from '@theme/Tabs';
+import TabItem from '@theme/TabItem';
+
+## AfterEffects settings
+
+There are a couple of settings that configure the publishing process for **AfterEffects**.
+All of them are project based, i.e. each project could have a different configuration.
+
+Location: Settings > Project > AfterEffects
+
+![AfterEffects settings](assets/admin_hosts_aftereffects_settings.png)
+
+## Publish plugins
+
+### Validate Scene Settings
+
+#### Skip Resolution Check for Tasks
+
+Set regex pattern(s) to look for in a Task name to skip the resolution check against values from DB.
+
+#### Skip Timeline Check for Tasks
+
+Set regex pattern(s) to look for in a Task name to skip the `frameStart`, `frameEnd` check against values from DB.
+
+### AfterEffects Submit to Deadline
+
+* `Use Published scene` - set to True (green) when Deadline should take the published scene as a source instead of the uploaded local one.
+* `Priority` - priority of the job on the farm
+* `Primary Pool` - a list of pools fetched from the server you can select from.
+* `Secondary Pool`
+* `Frames Per Task` - number of sequence divisions between individual tasks (chunks)
+  making one job on the farm.

website/docs/admin_hosts_harmony.md (new file, 51 lines)
@@ -0,0 +1,51 @@
+---
+id: admin_hosts_harmony
+title: ToonBoom Harmony Settings
+sidebar_label: ToonBoom Harmony
+---
+
+import Tabs from '@theme/Tabs';
+import TabItem from '@theme/TabItem';
+
+## ToonBoom Harmony settings
+
+There are a couple of settings that configure the publishing process for **ToonBoom Harmony**.
+All of them are project based, i.e. each project could have a different configuration.
+
+Location: Settings > Project > Harmony
+
+
+## Publish plugins
+
+### Collect Palettes
+
+#### Allowed tasks
+
+Set regex pattern(s) for the task names for which publishing of Palettes should occur.
+
+Use ".*" to publish Palettes for ALL tasks.
+
+### Validate Scene Settings
+
+#### Skip Frame check for Assets with
+
+Set regex pattern(s) for filtering Asset names that should skip validation of the `frameEnd` value from DB.
+
+#### Skip Resolution Check for Tasks
+
+Set regex pattern(s) for filtering Asset names that should skip validation of the `Resolution` value from DB.
+
+#### Skip Timeline Check for Tasks
+
+Set regex pattern(s) for filtering Task names that should skip the `frameStart`, `frameEnd` validation against values from DB.
+
+### Harmony Submit to Deadline
+
+* `Use Published scene` - set to True (green) when Deadline should take the published scene as a source instead of the uploaded local one.
+* `Priority` - priority of the job on the farm
+* `Primary Pool` - a list of pools fetched from the server you can select from.
+* `Secondary Pool`
+* `Frames Per Task` - number of sequence divisions between individual tasks (chunks)
+  making one job on the farm.

website/docs/admin_hosts_maya.md (new file, 52 lines)
@@ -0,0 +1,52 @@
+---
+id: admin_hosts_maya
+title: Maya
+sidebar_label: Maya
+---
+
+## Maya
+
+### Publish Plugins
+
+#### Render Settings Validator (`ValidateRenderSettings`)
+
+Render Settings Validator is here to make sure artists will submit renders
+with correct settings. Some of these settings are needed by OpenPype but some
+can be defined by a TD using the [OpenPype Settings UI](admin_settings).
+
+OpenPype enforced settings include:
+
+- animation must be enabled in the output
+- the render prefix must start with `maya/<scene>` to make sure renders are in the
+  correct directory
+- there must be `<renderlayer>` (or its equivalent in different renderers) in the
+  file prefix
+- if multiple cameras are to be rendered, the `<camera>` token must be in the file prefix
+
+For **Vray**:
+- the AOV separator must be set to `_` (underscore)
+
+For **Redshift**:
+- all AOVs must follow the `<BeautyPath>/<BeautyFile>_<RenderPass>` image file prefix
+- the AOV image format must be the same as the one set in the Output settings
+
+For **Renderman**:
+- both image and directory prefixes must comply with `<layer>_<aov>.<f4>.<ext>` and `<ws>/renders/maya/<scene>/<layer>` respectively
+
+For **Arnold**:
+- there shouldn't be a `<renderpass>` token when the merge AOVs option is turned on
+
+
+Additional checks can be added via Settings - **Project Settings > Maya > Publish plugin > ValidateRenderSettings**.
+You can add as many options as you want for every supported renderer. In the first field put the node type and attribute,
+and in the second the required value.
+
+
+In this example we've put `aiOptions.AA_samples` in the first one and `6` in the second to enforce
+Arnold's Camera (AA) samples to 6.
+
+Note that `aiOptions` is not the name of the node but rather its type. For renderers there is usually
+just one instance of this node type, but if that is not so, the validator will go through all its
+instances and check the value there. The node type for **VRay** settings is `VRaySettingsNode`, for **Renderman**
+it is `rmanGlobals`, and for **Redshift** it is `RedshiftOptions`.

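The per-renderer checks described above boil down to comparing attributes on settings nodes against required values, keyed by `nodeType.attribute`. A schematic sketch of that pattern in plain Python (hypothetical data structures; the real validator queries node types and attributes through `maya.cmds`):

```python
# Required values keyed by "nodeType.attribute", as in the Settings UI example.
REQUIRED = {
    "aiOptions.AA_samples": 6,
}

# Stand-in for the scene: node type -> list of per-instance attribute dicts.
SCENE = {
    "aiOptions": [
        {"AA_samples": 6, "GI_diffuse_depth": 2},
    ],
}


def invalid_settings(scene, required):
    """Return (node_type, attribute, found, expected) for every mismatch.

    Every instance of the node type is checked, mirroring the validator's
    behaviour when more than one settings node of the same type exists.
    """
    issues = []
    for key, expected in required.items():
        node_type, attribute = key.split(".", 1)
        for instance in scene.get(node_type, []):
            found = instance.get(attribute)
            if found != expected:
                issues.append((node_type, attribute, found, expected))
    return issues


print(invalid_settings(SCENE, REQUIRED))  # -> [] (all settings valid)
```

An empty result means the scene passes; any tuples returned would be reported to the artist with the found versus expected values.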
@@ -142,6 +142,22 @@ You can set group of selected subsets with shortcut `Ctrl + G`.
 You'll set the group in Avalon database so your changes will take effect for all users.
 :::

+### Site Sync support
+
+If **Site Sync** is enabled, an additional widget is shown in the bottom right corner.
+It contains a list of all representations of the selected version(s). It also shows the availability of representation files
+on a particular site (*active* - mine, *remote* - theirs).
+
+
+In this picture you can see that representation files are available only on the remote site (could be GDrive or other).
+If an artist wants to work with the file(s), they need to be downloaded first. That can be done by right-clicking a
+particular representation (or multiselecting all) and selecting *Download*.
+
+This will mark the representation to be downloaded, which happens in the background if OpenPype Tray is running.
+
+For more details on progress, state or possible errors the artist should open the **[Sync Queue](#Sync-Queue)** item in the Tray app.
+
+Work in progress...
+
 ## Library Loader

@@ -412,3 +428,35 @@ It might also happen that user deletes underlying host item(for example layer in
 This could result in phantom issues during publishing. Use Subset Manager to purge workfile from abandoned items.

 Please check behaviour in host of your choice.
+
+## Sync Queue
+
+### Details
+
+If **Site Sync** is configured for a project, each asset is marked to be synchronized to a remote site during publishing.
+Each artist's OpenPype Tray application handles synchronization in the background; it looks for all representations which
+are marked with the site of the user (a unique site name per artist) and the remote site.
+
+Artists can then see the progress of synchronization via the **Sync Queue** link in the Tray application.
+
+Artists can see all synced representations in this dialog with helpful information such as when a representation was created, when it was synced,
+the status of synchronization (OK or Fail) etc.
+
+### Usage
+
+With this app artists can modify synchronized representations, for example mark a failed representation for re-sync etc.
+
+
+Actions accessible via context menu on a single (or multiple) representation(s):
+- *Open in Explorer* - if the site is locally accessible, open its folder with the OS file explorer
+- *Re-sync Active Site* - mark the artist's own side for re-download (the representation must be accessible on the remote side)
+- *Re-sync Remote Site* - mark the representation for re-upload
+- *Completely remove from local* - remove the synchronization tag to the artist's local site and remove files from disk (available only for personal sites)
+- *Change priority* - mark representations with a higher priority for a faster synchronization run
+
+Double-clicking any representation opens a Detail dialog with information about all files of that representation.
+In this dialog error details can be accessed via the context menu.
+
+Artists can also pause the whole server or a specific project for synchronization. In that state no download/upload is run.
+This might be helpful if the artist is not interested in a particular project for a while or wants to save bandwidth data limit for a bit.

BIN  website/docs/assets/admin_hosts_aftereffects_settings.png  (new binary file, 31 KiB; binary file not shown)

Some files were not shown because too many files have changed in this diff.