Merge branch 'develop' into bugfix/OP-3654_Flame-re-timing-produces-frame-range-discrepancy-

This commit is contained in:
Jakub Jezek 2022-08-19 17:41:49 +02:00
commit 44770cb6c2
No known key found for this signature in database
GPG key ID: 730D7C02726179A7
114 changed files with 8911 additions and 1464 deletions

.gitmodules
View file

@ -4,7 +4,4 @@
[submodule "tools/modules/powershell/PSWriteColor"]
path = tools/modules/powershell/PSWriteColor
url = https://github.com/EvotecIT/PSWriteColor.git
[submodule "vendor/configs/OpenColorIO-Configs"]
path = vendor/configs/OpenColorIO-Configs
url = https://github.com/imageworks/OpenColorIO-Configs

View file

@ -1,21 +1,40 @@
# Changelog
## [3.13.1-nightly.3](https://github.com/pypeclub/OpenPype/tree/HEAD)
## [3.14.0](https://github.com/pypeclub/OpenPype/tree/3.14.0) (2022-08-18)
[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.13.0...HEAD)
[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.13.0...3.14.0)
**🆕 New features**
- Maya: Build workfile by template [\#3578](https://github.com/pypeclub/OpenPype/pull/3578)
**🚀 Enhancements**
- Ftrack: Additional component metadata [\#3685](https://github.com/pypeclub/OpenPype/pull/3685)
- Ftrack: Set task status on farm publishing [\#3680](https://github.com/pypeclub/OpenPype/pull/3680)
- Ftrack: Set task status on task creation in integrate hierarchy [\#3675](https://github.com/pypeclub/OpenPype/pull/3675)
- Maya: Disable rendering of all lights for render instances submitted through Deadline. [\#3661](https://github.com/pypeclub/OpenPype/pull/3661)
- General: Optimized OCIO configs [\#3650](https://github.com/pypeclub/OpenPype/pull/3650)
**🐛 Bug fixes**
- General: Switch from hero version to versioned works [\#3691](https://github.com/pypeclub/OpenPype/pull/3691)
- General: Fix finding of last version [\#3656](https://github.com/pypeclub/OpenPype/pull/3656)
- General: Extract Review can scale with pixel aspect ratio [\#3644](https://github.com/pypeclub/OpenPype/pull/3644)
- Maya: Refactor moved usage of CreateRender settings [\#3643](https://github.com/pypeclub/OpenPype/pull/3643)
- General: Hero version representations have full context [\#3638](https://github.com/pypeclub/OpenPype/pull/3638)
- Nuke: color settings for render write node is working now [\#3632](https://github.com/pypeclub/OpenPype/pull/3632)
- Maya: FBX support for update in reference loader [\#3631](https://github.com/pypeclub/OpenPype/pull/3631)
- Integrator: Don't force to have dot before frame [\#3611](https://github.com/pypeclub/OpenPype/pull/3611)
**🔀 Refactored code**
- General: Use client projects getter [\#3673](https://github.com/pypeclub/OpenPype/pull/3673)
- Resolve: Match folder structure to other hosts [\#3653](https://github.com/pypeclub/OpenPype/pull/3653)
- Maya: Hosts as modules [\#3647](https://github.com/pypeclub/OpenPype/pull/3647)
- TimersManager: Plugins are in timers manager module [\#3639](https://github.com/pypeclub/OpenPype/pull/3639)
- General: Move workfiles functions into pipeline [\#3637](https://github.com/pypeclub/OpenPype/pull/3637)
- General: Workfiles builder using query functions [\#3598](https://github.com/pypeclub/OpenPype/pull/3598)
**Merged pull requests:**
@ -51,7 +70,6 @@
- Ftrack: Sync hierarchical attributes can handle new created entities [\#3621](https://github.com/pypeclub/OpenPype/pull/3621)
- General: Extract review aspect ratio scale is calculated by ffmpeg [\#3620](https://github.com/pypeclub/OpenPype/pull/3620)
- Maya: Fix types of default settings [\#3617](https://github.com/pypeclub/OpenPype/pull/3617)
- Integrator: Don't force to have dot before frame [\#3611](https://github.com/pypeclub/OpenPype/pull/3611)
- AfterEffects: refactored integrate doesn't work for multi frame publishes [\#3610](https://github.com/pypeclub/OpenPype/pull/3610)
- Maya look data contents fails with custom attribute on group [\#3607](https://github.com/pypeclub/OpenPype/pull/3607)
- TrayPublisher: Fix wrong conflict merge [\#3600](https://github.com/pypeclub/OpenPype/pull/3600)
@ -89,7 +107,6 @@
**🚀 Enhancements**
- General: Global thumbnail extractor is ready for more cases [\#3561](https://github.com/pypeclub/OpenPype/pull/3561)
- Maya: add additional validators to Settings [\#3540](https://github.com/pypeclub/OpenPype/pull/3540)
**🐛 Bug fixes**
@ -100,16 +117,10 @@
- General: Remove hosts filter on integrator plugins [\#3556](https://github.com/pypeclub/OpenPype/pull/3556)
- Settings: Clean default values of environments [\#3550](https://github.com/pypeclub/OpenPype/pull/3550)
- Module interfaces: Fix import error [\#3547](https://github.com/pypeclub/OpenPype/pull/3547)
- Workfiles tool: Show of tool and its flags [\#3539](https://github.com/pypeclub/OpenPype/pull/3539)
- General: Create workfile documents works again [\#3538](https://github.com/pypeclub/OpenPype/pull/3538)
**🔀 Refactored code**
- General: Use query functions in integrator [\#3563](https://github.com/pypeclub/OpenPype/pull/3563)
- General: Mongo core connection moved to client [\#3531](https://github.com/pypeclub/OpenPype/pull/3531)
- Refactor Integrate Asset [\#3530](https://github.com/pypeclub/OpenPype/pull/3530)
- General: Client docstrings cleanup [\#3529](https://github.com/pypeclub/OpenPype/pull/3529)
- General: Move load related functions into pipeline [\#3527](https://github.com/pypeclub/OpenPype/pull/3527)
**Merged pull requests:**

View file

@ -381,7 +381,7 @@ class OpenPypeVersion(semver.VersionInfo):
@classmethod
def get_local_versions(
cls, production: bool = None,
staging: bool = None, compatible_with: OpenPypeVersion = None
staging: bool = None
) -> List:
"""Get all versions available on this machine.
@ -391,8 +391,10 @@ class OpenPypeVersion(semver.VersionInfo):
Args:
production (bool): Return production versions.
staging (bool): Return staging versions.
compatible_with (OpenPypeVersion): Return only those compatible
with specified version.
Returns:
list: of compatible versions available on the machine.
"""
# Return all local versions if arguments are set to None
if production is None and staging is None:
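A minimal standalone sketch (not OpenPype's implementation) of the production/staging defaulting shown in this hunk, with a hypothetical filter_versions helper and the illustrative assumption that staging builds carry a "staging" tag in the version string:

def filter_versions(versions, production=None, staging=None):
    # mirror the hunk above: both flags left as None means "return everything"
    if production is None and staging is None:
        production = True
        staging = True
    output = []
    for version in versions:
        is_staging = "staging" in version  # illustrative staging marker
        if (staging and is_staging) or (production and not is_staging):
            output.append(version)
    return output

print(filter_versions(["3.14.0", "3.14.0-staging.1"]))                    # both
print(filter_versions(["3.14.0", "3.14.0-staging.1"], production=True))  # ['3.14.0']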
@ -411,16 +413,7 @@ class OpenPypeVersion(semver.VersionInfo):
# DEPRECATED: backwards compatible way to look for versions in root
dir_to_search = Path(user_data_dir("openpype", "pypeclub"))
versions = OpenPypeVersion.get_versions_from_directory(
dir_to_search, compatible_with=compatible_with
)
if compatible_with:
dir_to_search = Path(
user_data_dir("openpype", "pypeclub")) / f"{compatible_with.major}.{compatible_with.minor}" # noqa
versions += OpenPypeVersion.get_versions_from_directory(
dir_to_search, compatible_with=compatible_with
)
versions = OpenPypeVersion.get_versions_from_directory(dir_to_search)
filtered_versions = []
for version in versions:
@ -434,7 +427,7 @@ class OpenPypeVersion(semver.VersionInfo):
@classmethod
def get_remote_versions(
cls, production: bool = None,
staging: bool = None, compatible_with: OpenPypeVersion = None
staging: bool = None
) -> List:
"""Get all versions available in OpenPype Path.
@ -444,8 +437,7 @@ class OpenPypeVersion(semver.VersionInfo):
Args:
production (bool): Return production versions.
staging (bool): Return staging versions.
compatible_with (OpenPypeVersion): Return only those compatible
with specified version.
"""
# Return all local versions if arguments are set to None
if production is None and staging is None:
@ -479,13 +471,7 @@ class OpenPypeVersion(semver.VersionInfo):
if not dir_to_search:
return []
# DEPRECATED: look for version in root directory
versions = cls.get_versions_from_directory(
dir_to_search, compatible_with=compatible_with)
if compatible_with:
dir_to_search = dir_to_search / f"{compatible_with.major}.{compatible_with.minor}" # noqa
versions += cls.get_versions_from_directory(
dir_to_search, compatible_with=compatible_with)
versions = cls.get_versions_from_directory(dir_to_search)
filtered_versions = []
for version in versions:
@ -498,14 +484,11 @@ class OpenPypeVersion(semver.VersionInfo):
@staticmethod
def get_versions_from_directory(
openpype_dir: Path,
compatible_with: OpenPypeVersion = None) -> List:
openpype_dir: Path) -> List:
"""Get all detected OpenPype versions in directory.
Args:
openpype_dir (Path): Directory to scan.
compatible_with (OpenPypeVersion): Return only versions compatible
with build version specified as OpenPypeVersion.
Returns:
list of OpenPypeVersion
@ -514,15 +497,22 @@ class OpenPypeVersion(semver.VersionInfo):
ValueError: if invalid path is specified.
"""
_openpype_versions = []
openpype_versions = []
if not openpype_dir.exists() and not openpype_dir.is_dir():
return _openpype_versions
return openpype_versions
# iterate over directory in first level and find all that might
# contain OpenPype.
for item in openpype_dir.iterdir():
# if the item is directory with major.minor version, dive deeper
# if file, strip extension, in case of dir not.
if item.is_dir() and re.match(r"^\d+\.\d+$", item.name):
_versions = OpenPypeVersion.get_versions_from_directory(
item)
if _versions:
openpype_versions += _versions
# if file exists, strip extension, in case of dir don't.
name = item.name if item.is_dir() else item.stem
result = OpenPypeVersion.version_in_str(name)
@ -540,14 +530,10 @@ class OpenPypeVersion(semver.VersionInfo):
)[0]:
continue
if compatible_with and not detected_version.is_compatible(
compatible_with):
continue
detected_version.path = item
_openpype_versions.append(detected_version)
openpype_versions.append(detected_version)
return sorted(_openpype_versions)
return sorted(openpype_versions)
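A hedged, self-contained sketch of the recursion this hunk introduces: version folders may now be grouped under "major.minor" sub-directories, so the scan dives one level deeper when it meets a directory named like "3.14". The version parsing here is a simplified stand-in for version_in_str:

import re
from pathlib import Path

def scan_versions(root: Path) -> list:
    found = []
    if not root.is_dir():
        return found
    for item in root.iterdir():
        # dive into "major.minor" grouping folders, e.g. "3.14"
        if item.is_dir() and re.match(r"^\d+\.\d+$", item.name):
            found += scan_versions(item)
            continue
        # files are matched by stem, e.g. "openpype-3.14.0.zip"
        name = item.name if item.is_dir() else item.stem
        match = re.search(r"\d+\.\d+\.\d+\S*", name)
        if match:
            found.append(match.group())
    return sorted(found)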
@staticmethod
def get_installed_version_str() -> str:
@ -575,15 +561,14 @@ class OpenPypeVersion(semver.VersionInfo):
def get_latest_version(
staging: bool = False,
local: bool = None,
remote: bool = None,
compatible_with: OpenPypeVersion = None
remote: bool = None
) -> Union[OpenPypeVersion, None]:
"""Get latest available version.
"""Get the latest available version.
The version does not contain information about path and source.
This is utility version to get latest version from all found. Build
version is not listed if staging is enabled.
This is a utility method to get the latest version from all found.
Build version is not listed if staging is enabled.
Arguments 'local' and 'remote' define if local and remote repository
versions are used. All versions are used if both are not set (or set
@ -595,8 +580,9 @@ class OpenPypeVersion(semver.VersionInfo):
staging (bool, optional): List staging versions if True.
local (bool, optional): List local versions if True.
remote (bool, optional): List remote versions if True.
compatible_with (OpenPypeVersion, optional) Return only version
compatible with compatible_with.
Returns:
Latest OpenPypeVersion or None
"""
if local is None and remote is None:
@ -628,12 +614,7 @@ class OpenPypeVersion(semver.VersionInfo):
return None
all_versions.sort()
latest_version: OpenPypeVersion
latest_version = all_versions[-1]
if compatible_with and not latest_version.is_compatible(
compatible_with):
return None
return latest_version
return all_versions[-1]
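The "sort and take the last element" pattern above leans on OpenPypeVersion subclassing semver.VersionInfo; a small illustration with the semver package directly (version strings are hypothetical):

import semver

versions = [
    semver.VersionInfo.parse("3.13.0"),
    semver.VersionInfo.parse("3.14.0-nightly.3"),  # prerelease sorts below release
    semver.VersionInfo.parse("3.14.0"),
]
versions.sort()
print(versions[-1])  # 3.14.0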
@classmethod
def get_expected_studio_version(cls, staging=False, global_settings=None):
@ -764,9 +745,9 @@ class BootstrapRepos:
self, repo_dir: Path = None) -> Union[OpenPypeVersion, None]:
"""Copy zip created from OpenPype repositories to user data dir.
This detect OpenPype version either in local "live" OpenPype
This detects OpenPype version either in local "live" OpenPype
repository or in user provided path. Then it will zip it in temporary
directory and finally it will move it to destination which is user
directory, and finally it will move it to destination which is user
data directory. Existing files will be replaced.
Args:
@ -777,7 +758,7 @@ class BootstrapRepos:
"""
# if repo dir is not set, we detect local "live" OpenPype repository
# version and use it as a source. Otherwise repo_dir is user
# version and use it as a source. Otherwise, repo_dir is user
# entered location.
if repo_dir:
version = self.get_version(repo_dir)
@ -1141,28 +1122,27 @@ class BootstrapRepos:
@staticmethod
def find_openpype_version(
version: Union[str, OpenPypeVersion],
staging: bool,
compatible_with: OpenPypeVersion = None
staging: bool
) -> Union[OpenPypeVersion, None]:
"""Find location of specified OpenPype version.
Args:
version (Union[str, OpenPypeVersion): Version to find.
staging (bool): Filter staging versions.
compatible_with (OpenPypeVersion, optional): Find only
versions compatible with specified one.
Returns:
requested OpenPypeVersion.
"""
installed_version = OpenPypeVersion.get_installed_version()
if isinstance(version, str):
version = OpenPypeVersion(version=version)
installed_version = OpenPypeVersion.get_installed_version()
if installed_version == version:
return installed_version
local_versions = OpenPypeVersion.get_local_versions(
staging=staging, production=not staging,
compatible_with=compatible_with
staging=staging, production=not staging
)
zip_version = None
for local_version in local_versions:
@ -1176,8 +1156,7 @@ class BootstrapRepos:
return zip_version
remote_versions = OpenPypeVersion.get_remote_versions(
staging=staging, production=not staging,
compatible_with=compatible_with
staging=staging, production=not staging
)
for remote_version in remote_versions:
if remote_version == version:
@ -1186,13 +1165,23 @@ class BootstrapRepos:
@staticmethod
def find_latest_openpype_version(
staging, compatible_with: OpenPypeVersion = None):
staging: bool
) -> Union[OpenPypeVersion, None]:
"""Find the latest available OpenPype version in all location.
Args:
staging (bool): True to look for staging versions.
Returns:
Latest OpenPype version or None if nothing was found.
"""
installed_version = OpenPypeVersion.get_installed_version()
local_versions = OpenPypeVersion.get_local_versions(
staging=staging, compatible_with=compatible_with
staging=staging
)
remote_versions = OpenPypeVersion.get_remote_versions(
staging=staging, compatible_with=compatible_with
staging=staging
)
all_versions = local_versions + remote_versions
if not staging:
@ -1217,8 +1206,7 @@ class BootstrapRepos:
self,
openpype_path: Union[Path, str] = None,
staging: bool = False,
include_zips: bool = False,
compatible_with: OpenPypeVersion = None
include_zips: bool = False
) -> Union[List[OpenPypeVersion], None]:
"""Get ordered dict of detected OpenPype version.
@ -1235,8 +1223,6 @@ class BootstrapRepos:
otherwise.
include_zips (bool, optional): If set True it will try to find
OpenPype in zip files in given directory.
compatible_with (OpenPypeVersion, optional): Find only those
versions compatible with the one specified.
Returns:
dict of Path: Dictionary of detected OpenPype version.
@ -1255,52 +1241,34 @@ class BootstrapRepos:
("Finding OpenPype in non-filesystem locations is"
" not implemented yet."))
version_dir = ""
if compatible_with:
version_dir = f"{compatible_with.major}.{compatible_with.minor}"
# if checks below for OPENPYPE_PATH and registry fail, use data_dir
# DEPRECATED: lookup in root of this folder is deprecated in favour
# of major.minor sub-folders.
dirs_to_search = [
self.data_dir
]
if compatible_with:
dirs_to_search.append(self.data_dir / version_dir)
dirs_to_search = [self.data_dir]
if openpype_path:
dirs_to_search = [openpype_path]
if compatible_with:
dirs_to_search.append(openpype_path / version_dir)
else:
elif os.getenv("OPENPYPE_PATH") \
and Path(os.getenv("OPENPYPE_PATH")).exists():
# first try OPENPYPE_PATH and if that is not available,
# try registry.
if os.getenv("OPENPYPE_PATH") \
and Path(os.getenv("OPENPYPE_PATH")).exists():
dirs_to_search = [Path(os.getenv("OPENPYPE_PATH"))]
dirs_to_search = [Path(os.getenv("OPENPYPE_PATH"))]
else:
try:
registry_dir = Path(
str(self.registry.get_item("openPypePath")))
if registry_dir.exists():
dirs_to_search = [registry_dir]
if compatible_with:
dirs_to_search.append(
Path(os.getenv("OPENPYPE_PATH")) / version_dir)
else:
try:
registry_dir = Path(
str(self.registry.get_item("openPypePath")))
if registry_dir.exists():
dirs_to_search = [registry_dir]
if compatible_with:
dirs_to_search.append(registry_dir / version_dir)
except ValueError:
# nothing found in registry, we'll use data dir
pass
except ValueError:
# nothing found in registry, we'll use data dir
pass
openpype_versions = []
for dir_to_search in dirs_to_search:
try:
openpype_versions += self.get_openpype_versions(
dir_to_search, staging, compatible_with=compatible_with)
dir_to_search, staging)
except ValueError:
# location is invalid, skip it
pass
@ -1668,15 +1636,12 @@ class BootstrapRepos:
def get_openpype_versions(
self,
openpype_dir: Path,
staging: bool = False,
compatible_with: OpenPypeVersion = None) -> list:
staging: bool = False) -> list:
"""Get all detected OpenPype versions in directory.
Args:
openpype_dir (Path): Directory to scan.
staging (bool, optional): Find staging versions if True.
compatible_with (OpenPypeVersion, optional): Get only versions
compatible with the one specified.
Returns:
list of OpenPypeVersion
@ -1688,12 +1653,18 @@ class BootstrapRepos:
if not openpype_dir.exists() and not openpype_dir.is_dir():
raise ValueError(f"specified directory {openpype_dir} is invalid")
_openpype_versions = []
openpype_versions = []
# iterate over directory in first level and find all that might
# contain OpenPype.
for item in openpype_dir.iterdir():
# if the item is directory with major.minor version, dive deeper
if item.is_dir() and re.match(r"^\d+\.\d+$", item.name):
_versions = self.get_openpype_versions(
item, staging=staging)
if _versions:
openpype_versions += _versions
# if file, strip extension, in case of dir not.
# if it is file, strip extension, in case of dir don't.
name = item.name if item.is_dir() else item.stem
result = OpenPypeVersion.version_in_str(name)
@ -1711,18 +1682,14 @@ class BootstrapRepos:
):
continue
if compatible_with and \
not detected_version.is_compatible(compatible_with):
continue
detected_version.path = item
if staging and detected_version.is_staging():
_openpype_versions.append(detected_version)
openpype_versions.append(detected_version)
if not staging and not detected_version.is_staging():
_openpype_versions.append(detected_version)
openpype_versions.append(detected_version)
return sorted(_openpype_versions)
return sorted(openpype_versions)
class OpenPypeVersionExists(Exception):

View file

@ -62,7 +62,7 @@ class InstallThread(QThread):
progress_callback=self.set_progress, message=self.message)
local_version = OpenPypeVersion.get_installed_version_str()
# if user did entered nothing, we install OpenPype from local version.
# if the user entered nothing, we install OpenPype from the local version.
# zip content of `repos`, copy it to user data dir and append
# version to it.
if not self._path:
@ -93,6 +93,23 @@ class InstallThread(QThread):
detected = bs.find_openpype(include_zips=True)
if detected:
if not OpenPypeVersion.get_installed_version().is_compatible(
detected[-1]):
self.message.emit((
f"Latest detected version {detected[-1]} "
"is not compatible with the currently running "
f"{local_version}"
), True)
self.message.emit((
"Filtering detected versions to compatible ones..."
), False)
detected = [
version for version in detected
if version.is_compatible(
OpenPypeVersion.get_installed_version())
]
if OpenPypeVersion(
version=local_version, path=Path()) < detected[-1]:
self.message.emit((

View file

@ -6,6 +6,7 @@ that has project name as a context (e.g. on 'ProjectEntity'?).
+ We will need more specific functions doing very specific queries really fast.
"""
import re
import collections
import six
@ -1009,17 +1010,70 @@ def get_representation_by_name(
return conn.find_one(query_filter, _prepare_fields(fields))
def _flatten_dict(data):
flatten_queue = collections.deque()
flatten_queue.append(data)
output = {}
while flatten_queue:
item = flatten_queue.popleft()
for key, value in item.items():
if not isinstance(value, dict):
output[key] = value
continue
tmp = {}
for subkey, subvalue in value.items():
new_key = "{}.{}".format(key, subkey)
tmp[new_key] = subvalue
flatten_queue.append(tmp)
return output
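A usage example for _flatten_dict above, assuming the function is in scope: nested dictionaries collapse into dotted, Mongo-style key paths.

data = {"context": {"asset": "hero", "task": {"name": "modeling"}}}
print(_flatten_dict(data))
# {'context.asset': 'hero', 'context.task.name': 'modeling'}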
def _regex_filters(filters):
output = []
for key, value in filters.items():
regexes = []
a_values = []
if isinstance(value, re.Pattern):
regexes.append(value)
elif isinstance(value, (list, tuple, set)):
for item in value:
if isinstance(item, re.Pattern):
regexes.append(item)
else:
a_values.append(item)
else:
a_values.append(value)
key_filters = []
if len(a_values) == 1:
key_filters.append({key: a_values[0]})
elif a_values:
key_filters.append({key: {"$in": a_values}})
for regex in regexes:
key_filters.append({key: {"$regex": regex}})
if len(key_filters) == 1:
output.append(key_filters[0])
else:
output.append({"$or": key_filters})
return output
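A usage sketch for _regex_filters above: plain values become equality or "$in" filters, compiled patterns become "$regex" filters, and a mixed list for one key is joined with "$or". The filter values here are hypothetical.

import re

filters = {
    "context.family": ["model"],                       # single value -> equality
    "context.subset": [re.compile("model.*"), "rig"],  # mixed -> $or
}
print(_regex_filters(filters))
# [{'context.family': 'model'},
#  {'$or': [{'context.subset': 'rig'},
#           {'context.subset': {'$regex': re.compile('model.*')}}]}]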
def _get_representations(
project_name,
representation_ids,
representation_names,
version_ids,
extensions,
context_filters,
names_by_version_ids,
standard,
archived,
fields
):
default_output = []
repre_types = []
if standard:
repre_types.append("representation")
@ -1027,7 +1081,7 @@ def _get_representations(
repre_types.append("archived_representation")
if not repre_types:
return []
return default_output
if len(repre_types) == 1:
query_filter = {"type": repre_types[0]}
@ -1037,25 +1091,21 @@ def _get_representations(
if representation_ids is not None:
representation_ids = _convert_ids(representation_ids)
if not representation_ids:
return []
return default_output
query_filter["_id"] = {"$in": representation_ids}
if representation_names is not None:
if not representation_names:
return []
return default_output
query_filter["name"] = {"$in": list(representation_names)}
if version_ids is not None:
version_ids = _convert_ids(version_ids)
if not version_ids:
return []
return default_output
query_filter["parent"] = {"$in": version_ids}
if extensions is not None:
if not extensions:
return []
query_filter["context.ext"] = {"$in": list(extensions)}
or_queries = []
if names_by_version_ids is not None:
or_query = []
for version_id, names in names_by_version_ids.items():
@ -1065,8 +1115,36 @@ def _get_representations(
"name": {"$in": list(names)}
})
if not or_query:
return default_output
or_queries.append(or_query)
if context_filters is not None:
if not context_filters:
return []
query_filter["$or"] = or_query
_flatten_filters = _flatten_dict(context_filters)
flatten_filters = {}
for key, value in _flatten_filters.items():
if not key.startswith("context"):
key = "context.{}".format(key)
flatten_filters[key] = value
for item in _regex_filters(flatten_filters):
for key, value in item.items():
if key != "$or":
query_filter[key] = value
elif value:
or_queries.append(value)
if len(or_queries) == 1:
query_filter["$or"] = or_queries[0]
elif or_queries:
and_query = []
for or_query in or_queries:
if isinstance(or_query, list):
or_query = {"$or": or_query}
and_query.append(or_query)
query_filter["$and"] = and_query
conn = get_project_connection(project_name)
@ -1078,7 +1156,7 @@ def get_representations(
representation_ids=None,
representation_names=None,
version_ids=None,
extensions=None,
context_filters=None,
names_by_version_ids=None,
archived=False,
standard=True,
@ -1096,8 +1174,8 @@ def get_representations(
as filter. Filter ignored if 'None' is passed.
version_ids (Iterable[str]): Subset ids used as parent filter. Filter
ignored if 'None' is passed.
extensions (Iterable[str]): Filter by extension of main representation
file (without dot).
context_filters (Dict[str, List[str, re.Pattern]]): Filter by
representation context fields.
names_by_version_ids (dict[ObjectId, list[str]]): Complex filtering
using version ids and list of names under the version.
archived (bool): Output will also contain archived representations.
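A hedged usage sketch for the new context_filters argument documented above; the project name and filter values are hypothetical:

import re
from openpype.client import get_representations

repre_docs = get_representations(
    "my_project",
    context_filters={
        "asset": ["hero"],                  # exact value match
        "subset": [re.compile("model.*")],  # regex match
        "family": ["model"],
    },
)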
@ -1113,7 +1191,7 @@ def get_representations(
representation_ids=representation_ids,
representation_names=representation_names,
version_ids=version_ids,
extensions=extensions,
context_filters=context_filters,
names_by_version_ids=names_by_version_ids,
standard=True,
archived=archived,
@ -1126,7 +1204,7 @@ def get_archived_representations(
representation_ids=None,
representation_names=None,
version_ids=None,
extensions=None,
context_filters=None,
names_by_version_ids=None,
fields=None
):
@ -1142,8 +1220,8 @@ def get_archived_representations(
as filter. Filter ignored if 'None' is passed.
version_ids (Iterable[str]): Subset ids used as parent filter. Filter
ignored if 'None' is passed.
extensions (Iterable[str]): Filter by extension of main representation
file (without dot).
context_filters (Dict[str, List[str, re.Pattern]]): Filter by
representation context fields.
names_by_version_ids (dict[ObjectId, List[str]]): Complex filtering
using version ids and list of names under the version.
fields (Iterable[str]): Fields that should be returned. All fields are
@ -1158,7 +1236,7 @@ def get_archived_representations(
representation_ids=representation_ids,
representation_names=representation_names,
version_ids=version_ids,
extensions=extensions,
context_filters=context_filters,
names_by_version_ids=names_by_version_ids,
standard=False,
archived=True,
@ -1181,58 +1259,64 @@ def get_representations_parents(project_name, representations):
dict[ObjectId, tuple]: Parents by representation id.
"""
repres_by_version_id = collections.defaultdict(list)
versions_by_version_id = {}
versions_by_subset_id = collections.defaultdict(list)
subsets_by_subset_id = {}
subsets_by_asset_id = collections.defaultdict(list)
repre_docs_by_version_id = collections.defaultdict(list)
version_docs_by_version_id = {}
version_docs_by_subset_id = collections.defaultdict(list)
subset_docs_by_subset_id = {}
subset_docs_by_asset_id = collections.defaultdict(list)
output = {}
for representation in representations:
repre_id = representation["_id"]
for repre_doc in representations:
repre_id = repre_doc["_id"]
version_id = repre_doc["parent"]
output[repre_id] = (None, None, None, None)
version_id = representation["parent"]
repres_by_version_id[version_id].append(representation)
repre_docs_by_version_id[version_id].append(repre_doc)
versions = get_versions(
project_name, version_ids=repres_by_version_id.keys()
version_docs = get_versions(
project_name,
version_ids=repre_docs_by_version_id.keys(),
hero=True
)
for version in versions:
version_id = version["_id"]
subset_id = version["parent"]
versions_by_version_id[version_id] = version
versions_by_subset_id[subset_id].append(version)
for version_doc in version_docs:
version_id = version_doc["_id"]
subset_id = version_doc["parent"]
version_docs_by_version_id[version_id] = version_doc
version_docs_by_subset_id[subset_id].append(version_doc)
subsets = get_subsets(
project_name, subset_ids=versions_by_subset_id.keys()
subset_docs = get_subsets(
project_name, subset_ids=version_docs_by_subset_id.keys()
)
for subset in subsets:
subset_id = subset["_id"]
asset_id = subset["parent"]
subsets_by_subset_id[subset_id] = subset
subsets_by_asset_id[asset_id].append(subset)
for subset_doc in subset_docs:
subset_id = subset_doc["_id"]
asset_id = subset_doc["parent"]
subset_docs_by_subset_id[subset_id] = subset_doc
subset_docs_by_asset_id[asset_id].append(subset_doc)
assets = get_assets(project_name, asset_ids=subsets_by_asset_id.keys())
assets_by_id = {
asset["_id"]: asset
for asset in assets
asset_docs = get_assets(
project_name, asset_ids=subset_docs_by_asset_id.keys()
)
asset_docs_by_id = {
asset_doc["_id"]: asset_doc
for asset_doc in asset_docs
}
project = get_project(project_name)
project_doc = get_project(project_name)
for version_id, representations in repres_by_version_id.items():
asset = None
subset = None
version = versions_by_version_id.get(version_id)
if version:
subset_id = version["parent"]
subset = subsets_by_subset_id.get(subset_id)
if subset:
asset_id = subset["parent"]
asset = assets_by_id.get(asset_id)
for version_id, repre_docs in repre_docs_by_version_id.items():
asset_doc = None
subset_doc = None
version_doc = version_docs_by_version_id.get(version_id)
if version_doc:
subset_id = version_doc["parent"]
subset_doc = subset_docs_by_subset_id.get(subset_id)
if subset_doc:
asset_id = subset_doc["parent"]
asset_doc = asset_docs_by_id.get(asset_id)
for representation in representations:
repre_id = representation["_id"]
output[repre_id] = (version, subset, asset, project)
for repre_doc in repre_docs:
repre_id = repre_doc["_id"]
output[repre_id] = (
version_doc, subset_doc, asset_doc, project_doc
)
return output

View file

@ -19,8 +19,15 @@ class MissingMethodsError(ValueError):
joined_missing = ", ".join(
['"{}"'.format(item) for item in missing_methods]
)
if isinstance(host, HostBase):
host_name = host.name
else:
try:
host_name = host.__file__.replace("\\", "/").split("/")[-3]
except Exception:
host_name = str(host)
message = (
"Host \"{}\" miss methods {}".format(host.name, joined_missing)
"Host \"{}\" miss methods {}".format(host_name, joined_missing)
)
super(MissingMethodsError, self).__init__(message)
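A small demonstration of the host-name fallback above: for a module-based host whose __file__ is the package's __init__.py, the third path component from the end is the host name (the path is hypothetical).

path = "/repo/openpype/hosts/maya/api/__init__.py"
print(path.replace("\\", "/").split("/")[-3])  # maya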

View file

@ -1,27 +1,6 @@
import os
from .module import OpenPypeMaya
def add_implementation_envs(env, _app):
# Add requirements to PYTHONPATH
pype_root = os.environ["OPENPYPE_REPOS_ROOT"]
new_python_paths = [
os.path.join(pype_root, "openpype", "hosts", "maya", "startup")
]
old_python_path = env.get("PYTHONPATH") or ""
for path in old_python_path.split(os.pathsep):
if not path:
continue
norm_path = os.path.normpath(path)
if norm_path not in new_python_paths:
new_python_paths.append(norm_path)
env["PYTHONPATH"] = os.pathsep.join(new_python_paths)
# Set default values if are not already set via settings
defaults = {
"OPENPYPE_LOG_NO_COLORS": "Yes"
}
for key, value in defaults.items():
if not env.get(key):
env[key] = value
__all__ = (
"OpenPypeMaya",
)

View file

@ -0,0 +1,253 @@
import json
from collections import OrderedDict
import maya.cmds as cmds
import qargparse
from openpype.tools.utils.widgets import OptionDialog
from .lib import get_main_window, imprint
# To change as enum
build_types = ["context_asset", "linked_asset", "all_assets"]
def get_placeholder_attributes(node):
return {
attr: cmds.getAttr("{}.{}".format(node, attr))
for attr in cmds.listAttr(node, userDefined=True)}
def delete_placeholder_attributes(node):
'''
function to delete all extra placeholder attributes
'''
extra_attributes = get_placeholder_attributes(node)
for attribute in extra_attributes:
cmds.deleteAttr(node + '.' + attribute)
def create_placeholder():
args = placeholder_window()
if not args:
return # operation canceled, no locator created
# custom arg parsing to force an empty data query and still imprint
# the args on the placeholder, and to get items when an arg is of
# type Enumerator
options = create_options(args)
# create placeholder name dynamically from args and options
placeholder_name = create_placeholder_name(args, options)
selection = cmds.ls(selection=True)
if not selection:
raise ValueError("Nothing is selected")
placeholder = cmds.spaceLocator(name=placeholder_name)[0]
# get the long name of the placeholder (with the groups)
placeholder_full_name = cmds.ls(selection[0], long=True)[
0] + '|' + placeholder.replace('|', '')
if selection:
cmds.parent(placeholder, selection[0])
imprint(placeholder_full_name, options)
# Some tweaks because imprint forces enums to their default value, so we
# read the args back and force them onto the attributes
imprint_enum(placeholder_full_name, args)
# Add helper attributes to keep placeholder info
cmds.addAttr(
placeholder_full_name,
longName="parent",
hidden=True,
dataType="string"
)
cmds.addAttr(
placeholder_full_name,
longName="index",
hidden=True,
attributeType="short",
defaultValue=-1
)
cmds.setAttr(placeholder_full_name + '.parent', "", type="string")
def create_placeholder_name(args, options):
placeholder_builder_type = [
arg.read() for arg in args if 'builder_type' in str(arg)
][0]
placeholder_family = options['family']
placeholder_name = placeholder_builder_type.split('_')
# add family if any
if placeholder_family:
placeholder_name.insert(1, placeholder_family)
# add loader arguments if any
if options['loader_args']:
pos = 2
loader_args = options['loader_args'].replace('\'', '\"')
loader_args = json.loads(loader_args)
values = [v for v in loader_args.values()]
for i in range(len(values)):
placeholder_name.insert(i + pos, values[i])
placeholder_name = '_'.join(placeholder_name)
return placeholder_name.capitalize()
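A worked trace of the name assembly above, with hypothetical inputs: builder type "context_asset", family "model" and loader args '{"camera": "persp"}'.

name_parts = "context_asset".split("_")   # ['context', 'asset']
name_parts.insert(1, "model")             # family goes second
name_parts.insert(2, "persp")             # loader-arg values from position 2
print("_".join(name_parts).capitalize())  # Context_model_persp_asset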
def update_placeholder():
placeholder = cmds.ls(selection=True)
if len(placeholder) == 0:
raise ValueError("No node selected")
if len(placeholder) > 1:
raise ValueError("Too many selected nodes")
placeholder = placeholder[0]
args = placeholder_window(get_placeholder_attributes(placeholder))
if not args:
return # operation canceled
# delete placeholder attributes
delete_placeholder_attributes(placeholder)
options = create_options(args)
imprint(placeholder, options)
imprint_enum(placeholder, args)
cmds.addAttr(
placeholder,
longName="parent",
hidden=True,
dataType="string"
)
cmds.addAttr(
placeholder,
longName="index",
hidden=True,
attributeType="short",
defaultValue=-1
)
cmds.setAttr(placeholder + '.parent', '', type="string")
def create_options(args):
options = OrderedDict()
for arg in args:
if not type(arg) == qargparse.Separator:
options[str(arg)] = arg._data.get("items") or arg.read()
return options
def imprint_enum(placeholder, args):
"""
Imprint method doesn't act properly with enums.
Replacing the functionality with this for now
"""
enum_values = {str(arg): arg.read()
for arg in args if arg._data.get("items")}
string_to_value_enum_table = {
build: i for i, build
in enumerate(build_types)}
for key, value in enum_values.items():
cmds.setAttr(
placeholder + "." + key,
string_to_value_enum_table[value])
def placeholder_window(options=None):
options = options or dict()
dialog = OptionDialog(parent=get_main_window())
dialog.setWindowTitle("Create Placeholder")
args = [
qargparse.Separator("Main attributes"),
qargparse.Enum(
"builder_type",
label="Asset Builder Type",
default=options.get("builder_type", 0),
items=build_types,
help="""Asset Builder Type
Builder type describes what the template loader will look for.
context_asset : Template loader will look for subsets of
current context asset (Asset bob will find asset)
linked_asset : Template loader will look for assets linked
to current context asset.
Linked assets are looked up in the avalon database under the field "inputLinks"
"""
),
qargparse.String(
"family",
default=options.get("family", ""),
label="OpenPype Family",
placeholder="ex: model, look ..."),
qargparse.String(
"representation",
default=options.get("representation", ""),
label="OpenPype Representation",
placeholder="ex: ma, abc ..."),
qargparse.String(
"loader",
default=options.get("loader", ""),
label="Loader",
placeholder="ex: ReferenceLoader, LightLoader ...",
help="""Loader
Defines what openpype loader will be used to load assets.
Usable loaders depend on the current host's loader list.
Field is case sensitive.
"""),
qargparse.String(
"loader_args",
default=options.get("loader_args", ""),
label="Loader Arguments",
placeholder='ex: {"camera":"persp", "lights":True}',
help="""Loader
Defines a dictionary of arguments used to load assets.
Usable arguments depend on the current placeholder Loader.
Field should be a valid python dict. Anything else will be ignored.
"""),
qargparse.Integer(
"order",
default=options.get("order", 0),
min=0,
max=999,
label="Order",
placeholder="ex: 0, 100 ... (smallest order loaded first)",
help="""Order
Order defines asset loading priority (0 to 999)
Priority rule is : "lowest is first to load"."""),
qargparse.Separator(
"Optional attributes"),
qargparse.String(
"asset",
default=options.get("asset", ""),
label="Asset filter",
placeholder="regex filtering by asset name",
help="Filtering assets by matching field regex to asset's name"),
qargparse.String(
"subset",
default=options.get("subset", ""),
label="Subset filter",
placeholder="regex filtering by subset name",
help="Filtering assets by matching field regex to subset's name"),
qargparse.String(
"hierarchy",
default=options.get("hierarchy", ""),
label="Hierarchy filter",
placeholder="regex filtering by asset's hierarchy",
help="Filtering assets by matching field asset's hierarchy")
]
dialog.create(args)
if not dialog.exec_():
return None
return args

View file

@ -9,10 +9,15 @@ import maya.cmds as cmds
from openpype.settings import get_project_settings
from openpype.pipeline import legacy_io
from openpype.pipeline.workfile import BuildWorkfile
from openpype.pipeline.workfile.build_template import (
build_workfile_template,
update_workfile_template
)
from openpype.tools.utils import host_tools
from openpype.hosts.maya.api import lib, lib_rendersettings
from .lib import get_main_window, IS_HEADLESS
from .commands import reset_frame_range
from .lib_template_builder import create_placeholder, update_placeholder
log = logging.getLogger(__name__)
@ -147,6 +152,34 @@ def install():
parent_widget
)
)
builder_menu = cmds.menuItem(
"Template Builder",
subMenu=True,
tearOff=True,
parent=MENU_NAME
)
cmds.menuItem(
"Create Placeholder",
parent=builder_menu,
command=lambda *args: create_placeholder()
)
cmds.menuItem(
"Update Placeholder",
parent=builder_menu,
command=lambda *args: update_placeholder()
)
cmds.menuItem(
"Build Workfile from template",
parent=builder_menu,
command=build_workfile_template
)
cmds.menuItem(
"Update Workfile from template",
parent=builder_menu,
command=update_workfile_template
)
cmds.setParent(MENU_NAME, menu=True)
def add_scripts_menu():

View file

@ -0,0 +1,252 @@
import re
from maya import cmds
from openpype.client import get_representations
from openpype.pipeline import legacy_io
from openpype.pipeline.workfile.abstract_template_loader import (
AbstractPlaceholder,
AbstractTemplateLoader
)
from openpype.pipeline.workfile.build_template_exceptions import (
TemplateAlreadyImported
)
PLACEHOLDER_SET = 'PLACEHOLDERS_SET'
class MayaTemplateLoader(AbstractTemplateLoader):
"""Concrete implementation of AbstractTemplateLoader for maya
"""
def import_template(self, path):
"""Import template into current scene.
Block if a template is already loaded.
Args:
path (str): A path to current template (usually given by
get_template_path implementation)
Returns:
bool: Whether the template was successfully imported or not
"""
if cmds.objExists(PLACEHOLDER_SET):
raise TemplateAlreadyImported(
"Build template already loaded\n"
"Clean scene if needed (File > New Scene)")
cmds.sets(name=PLACEHOLDER_SET, empty=True)
self.new_nodes = cmds.file(path, i=True, returnNewNodes=True)
cmds.setAttr(PLACEHOLDER_SET + '.hiddenInOutliner', True)
for set in cmds.listSets(allSets=True):
if (cmds.objExists(set) and
cmds.attributeQuery('id', node=set, exists=True) and
cmds.getAttr(set + '.id') == 'pyblish.avalon.instance'):
if cmds.attributeQuery('asset', node=set, exists=True):
cmds.setAttr(
set + '.asset',
legacy_io.Session['AVALON_ASSET'], type='string'
)
return True
def template_already_imported(self, err_msg):
clearButton = "Clear scene and build"
updateButton = "Update template"
abortButton = "Abort"
title = "Scene already builded"
message = (
"It's seems a template was already build for this scene.\n"
"Error message reveived :\n\n\"{}\"".format(err_msg))
buttons = [clearButton, updateButton, abortButton]
defaultButton = clearButton
cancelButton = abortButton
dismissString = abortButton
answer = cmds.confirmDialog(
t=title,
m=message,
b=buttons,
db=defaultButton,
cb=cancelButton,
ds=dismissString)
if answer == clearButton:
cmds.file(newFile=True, force=True)
self.import_template(self.template_path)
self.populate_template()
elif answer == updateButton:
self.update_missing_containers()
elif answer == abortButton:
return
@staticmethod
def get_template_nodes():
attributes = cmds.ls('*.builder_type', long=True)
return [attribute.rpartition('.')[0] for attribute in attributes]
def get_loaded_containers_by_id(self):
try:
containers = cmds.sets("AVALON_CONTAINERS", q=True)
except ValueError:
return None
return [
cmds.getAttr(container + '.representation')
for container in containers]
class MayaPlaceholder(AbstractPlaceholder):
"""Concrete implementation of AbstractPlaceholder for maya
"""
optional_keys = {'asset', 'subset', 'hierarchy'}
def get_data(self, node):
user_data = dict()
for attr in self.required_keys.union(self.optional_keys):
attribute_name = '{}.{}'.format(node, attr)
if not cmds.attributeQuery(attr, node=node, exists=True):
print("{} not found".format(attribute_name))
continue
user_data[attr] = cmds.getAttr(
attribute_name,
asString=True)
user_data['parent'] = (
cmds.getAttr(node + '.parent', asString=True)
or node.rpartition('|')[0]
or ""
)
user_data['node'] = node
if user_data['parent']:
siblings = cmds.listRelatives(user_data['parent'], children=True)
else:
siblings = cmds.ls(assemblies=True)
node_shortname = user_data['node'].rpartition('|')[2]
current_index = cmds.getAttr(node + '.index', asString=True)
user_data['index'] = (
current_index if current_index >= 0
else siblings.index(node_shortname))
self.data = user_data
def parent_in_hierarchy(self, containers):
"""Parent loaded container to placeholder's parent
ie : Set loaded content as placeholder's sibling
Args:
containers (String): Placeholder loaded containers
"""
if not containers:
return
roots = cmds.sets(containers, q=True)
nodes_to_parent = []
for root in roots:
if root.endswith("_RN"):
refRoot = cmds.referenceQuery(root, n=True)[0]
refRoot = cmds.listRelatives(refRoot, parent=True) or [refRoot]
nodes_to_parent.extend(refRoot)
elif root in cmds.listSets(allSets=True):
if not cmds.sets(root, q=True):
return
else:
continue
else:
nodes_to_parent.append(root)
if self.data['parent']:
cmds.parent(nodes_to_parent, self.data['parent'])
# Move loaded nodes to correct index in outliner hierarchy
placeholder_node = self.data['node']
placeholder_form = cmds.xform(
placeholder_node,
q=True,
matrix=True,
worldSpace=True
)
for node in set(nodes_to_parent):
cmds.reorder(node, front=True)
cmds.reorder(node, relative=self.data['index'])
cmds.xform(node, matrix=placeholder_form, ws=True)
holding_sets = cmds.listSets(object=placeholder_node)
if not holding_sets:
return
for holding_set in holding_sets:
cmds.sets(roots, forceElement=holding_set)
def clean(self):
"""Hide placeholder, parent them to root
add them to placeholder set and register placeholder's parent
to keep placeholder info available for future use
"""
node = self.data['node']
if self.data['parent']:
cmds.setAttr(node + '.parent', self.data['parent'], type='string')
if cmds.getAttr(node + '.index') < 0:
cmds.setAttr(node + '.index', self.data['index'])
holding_sets = cmds.listSets(object=node)
if holding_sets:
for set in holding_sets:
cmds.sets(node, remove=set)
if cmds.listRelatives(node, p=True):
node = cmds.parent(node, world=True)[0]
cmds.sets(node, addElement=PLACEHOLDER_SET)
cmds.hide(node)
cmds.setAttr(node + '.hiddenInOutliner', True)
def get_representations(self, current_asset_doc, linked_asset_docs):
project_name = legacy_io.active_project()
builder_type = self.data["builder_type"]
if builder_type == "context_asset":
context_filters = {
"asset": [current_asset_doc["name"]],
"subset": [re.compile(self.data["subset"])],
"hierarchy": [re.compile(self.data["hierarchy"])],
"representations": [self.data["representation"]],
"family": [self.data["family"]]
}
elif builder_type != "linked_asset":
context_filters = {
"asset": [re.compile(self.data["asset"])],
"subset": [re.compile(self.data["subset"])],
"hierarchy": [re.compile(self.data["hierarchy"])],
"representation": [self.data["representation"]],
"family": [self.data["family"]]
}
else:
asset_regex = re.compile(self.data["asset"])
linked_asset_names = []
for asset_doc in linked_asset_docs:
asset_name = asset_doc["name"]
if asset_regex.match(asset_name):
linked_asset_names.append(asset_name)
context_filters = {
"asset": linked_asset_names,
"subset": [re.compile(self.data["subset"])],
"hierarchy": [re.compile(self.data["hierarchy"])],
"representation": [self.data["representation"]],
"family": [self.data["family"]],
}
return list(get_representations(
project_name,
context_filters=context_filters
))
def err_message(self):
return (
"Error while trying to load a representation.\n"
"Either the subset wasn't published or the template is malformed."
"\n\n"
"Builder was looking for :\n{attributes}".format(
attributes="\n".join([
"{}: {}".format(key.title(), value)
for key, value in self.data.items()]
)
)
)

View file

@ -2,11 +2,9 @@
import os
from maya import cmds
from openpype.pipeline import HOST_WORKFILE_EXTENSIONS
def file_extensions():
return HOST_WORKFILE_EXTENSIONS["maya"]
return [".ma", ".mb"]
def has_unsaved_changes():

View file

@ -0,0 +1,47 @@
import os
from openpype.modules import OpenPypeModule
from openpype.modules.interfaces import IHostModule
MAYA_ROOT_DIR = os.path.dirname(os.path.abspath(__file__))
class OpenPypeMaya(OpenPypeModule, IHostModule):
name = "openpype_maya"
host_name = "maya"
def initialize(self, module_settings):
self.enabled = True
def add_implementation_envs(self, env, _app):
# Add requirements to PYTHONPATH
new_python_paths = [
os.path.join(MAYA_ROOT_DIR, "startup")
]
old_python_path = env.get("PYTHONPATH") or ""
for path in old_python_path.split(os.pathsep):
if not path:
continue
norm_path = os.path.normpath(path)
if norm_path not in new_python_paths:
new_python_paths.append(norm_path)
env["PYTHONPATH"] = os.pathsep.join(new_python_paths)
# Set default values if are not already set via settings
defaults = {
"OPENPYPE_LOG_NO_COLORS": "Yes"
}
for key, value in defaults.items():
if not env.get(key):
env[key] = value
def get_launch_hook_paths(self, app):
if app.host_name != self.host_name:
return []
return [
os.path.join(MAYA_ROOT_DIR, "hooks")
]
def get_workfile_extensions(self):
return [".ma", ".mb"]

View file

@ -71,7 +71,6 @@ class CreateRender(plugin.Creator):
label = "Render"
family = "rendering"
icon = "eye"
_token = None
_user = None
_password = None
@ -220,6 +219,12 @@ class CreateRender(plugin.Creator):
self.data["tilesY"] = 2
self.data["convertToScanline"] = False
self.data["useReferencedAovs"] = False
self.data["renderSetupIncludeLights"] = (
self._project_settings.get(
"maya", {}).get(
"RenderSettings", {}).get(
"enable_all_lights", False)
)
# Disable for now as this feature is not working yet
# self.data["assScene"] = False

View file

@ -311,7 +311,10 @@ class CollectMayaRender(pyblish.api.ContextPlugin):
"useReferencedAovs": render_instance.data.get(
"useReferencedAovs") or render_instance.data.get(
"vrayUseReferencedAovs") or False,
"aovSeparator": layer_render_products.layer_data.aov_separator # noqa: E501
"aovSeparator": layer_render_products.layer_data.aov_separator, # noqa: E501
"renderSetupIncludeLights": render_instance.data.get(
"renderSetupIncludeLights"
)
}
# Collect Deadline url if Deadline module is enabled
@ -354,6 +357,7 @@ class CollectMayaRender(pyblish.api.ContextPlugin):
instance = context.create_instance(expected_layer_name)
instance.data["label"] = label
instance.data["farm"] = True
instance.data.update(data)
self.log.debug("data: {}".format(json.dumps(data, indent=4)))

View file

@ -40,11 +40,13 @@ def get_ocio_config_path(profile_folder):
Returns:
str: Path to vendorized config file.
"""
return os.path.join(
os.environ["OPENPYPE_ROOT"],
"vendor",
"configs",
"OpenColorIO-Configs",
"bin",
"ocioconfig",
"OpenColorIOConfigs",
profile_folder,
"config.ocio"
)
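Hypothetical usage of the relocated lookup above, assuming the added lines point the path at vendor/bin/ocioconfig/OpenColorIOConfigs and that an "aces_1.2" profile folder exists:

import os

os.environ.setdefault("OPENPYPE_ROOT", "/opt/openpype")
print(get_ocio_config_path("aces_1.2"))
# /opt/openpype/vendor/bin/ocioconfig/OpenColorIOConfigs/aces_1.2/config.ocio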

View file

@ -242,6 +242,14 @@ class ValidateRenderSettings(pyblish.api.InstancePlugin):
instance.context.data["project_settings"]["maya"]["publish"]["ValidateRenderSettings"].get( # noqa: E501
"{}_render_attributes".format(renderer)) or []
)
settings_lights_flag = instance.context.data["project_settings"].get(
"maya", {}).get(
"RenderSettings", {}).get(
"enable_all_lights", False)
instance_lights_flag = instance.data.get("renderSetupIncludeLights")
if settings_lights_flag != instance_lights_flag:
cls.log.warning('Instance flag for "Render Setup Include Lights" is set to {0} and Settings flag is set to {1}'.format(instance_lights_flag, settings_lights_flag)) # noqa
# go through definitions and test if such node.attribute exists.
# if so, compare its value from the one required.

View file

@ -162,7 +162,15 @@ class LoadClip(plugin.NukeLoader):
data_imprint = {}
for k in add_keys:
if k == 'version':
data_imprint[k] = context["version"]['name']
version_doc = context["version"]
if version_doc["type"] == "hero_version":
version = "hero"
else:
version = version_doc.get("name")
if version:
data_imprint[k] = version
elif k == 'colorspace':
colorspace = repre["data"].get(k)
colorspace = colorspace or version_data.get(k)
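A minimal sketch of the hero-version labelling introduced above: hero versions imprint the literal "hero", regular versions imprint their numeric name.

def version_label(version_doc):
    if version_doc["type"] == "hero_version":
        return "hero"
    return version_doc.get("name")

print(version_label({"type": "hero_version"}))        # hero
print(version_label({"type": "version", "name": 3}))  # 3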

View file

@ -1,129 +0,0 @@
from .api.utils import (
setup,
get_resolve_module
)
from .api.pipeline import (
install,
uninstall,
ls,
containerise,
update_container,
publish,
launch_workfiles_app,
maintained_selection,
remove_instance,
list_instances
)
from .api.lib import (
maintain_current_timeline,
publish_clip_color,
get_project_manager,
get_current_project,
get_current_timeline,
create_bin,
get_media_pool_item,
create_media_pool_item,
create_timeline_item,
get_timeline_item,
get_video_track_names,
get_current_timeline_items,
get_pype_timeline_item_by_name,
get_timeline_item_pype_tag,
set_timeline_item_pype_tag,
imprint,
set_publish_attribute,
get_publish_attribute,
create_compound_clip,
swap_clips,
get_pype_clip_metadata,
set_project_manager_to_folder_name,
get_otio_clip_instance_data,
get_reformated_path
)
from .api.menu import launch_pype_menu
from .api.plugin import (
ClipLoader,
TimelineItemLoader,
Creator,
PublishClip
)
from .api.workio import (
open_file,
save_file,
current_file,
has_unsaved_changes,
file_extensions,
work_root
)
from .api.testing_utils import TestGUI
__all__ = [
# pipeline
"install",
"uninstall",
"ls",
"containerise",
"update_container",
"reload_pipeline",
"publish",
"launch_workfiles_app",
"maintained_selection",
"remove_instance",
"list_instances",
# utils
"setup",
"get_resolve_module",
# lib
"maintain_current_timeline",
"publish_clip_color",
"get_project_manager",
"get_current_project",
"get_current_timeline",
"create_bin",
"get_media_pool_item",
"create_media_pool_item",
"create_timeline_item",
"get_timeline_item",
"get_video_track_names",
"get_current_timeline_items",
"get_pype_timeline_item_by_name",
"get_timeline_item_pype_tag",
"set_timeline_item_pype_tag",
"imprint",
"set_publish_attribute",
"get_publish_attribute",
"create_compound_clip",
"swap_clips",
"get_pype_clip_metadata",
"set_project_manager_to_folder_name",
"get_otio_clip_instance_data",
"get_reformated_path",
# menu
"launch_pype_menu",
# plugin
"ClipLoader",
"TimelineItemLoader",
"Creator",
"PublishClip",
# workio
"open_file",
"save_file",
"current_file",
"has_unsaved_changes",
"file_extensions",
"work_root",
"TestGUI"
]

View file

@ -1,11 +1,137 @@
"""
resolve api
"""
import os
bmdvr = None
bmdvf = None
API_DIR = os.path.dirname(os.path.abspath(__file__))
HOST_DIR = os.path.dirname(API_DIR)
PLUGINS_DIR = os.path.join(HOST_DIR, "plugins")
from .utils import (
get_resolve_module
)
from .pipeline import (
install,
uninstall,
ls,
containerise,
update_container,
publish,
launch_workfiles_app,
maintained_selection,
remove_instance,
list_instances
)
from .lib import (
maintain_current_timeline,
publish_clip_color,
get_project_manager,
get_current_project,
get_current_timeline,
create_bin,
get_media_pool_item,
create_media_pool_item,
create_timeline_item,
get_timeline_item,
get_video_track_names,
get_current_timeline_items,
get_pype_timeline_item_by_name,
get_timeline_item_pype_tag,
set_timeline_item_pype_tag,
imprint,
set_publish_attribute,
get_publish_attribute,
create_compound_clip,
swap_clips,
get_pype_clip_metadata,
set_project_manager_to_folder_name,
get_otio_clip_instance_data,
get_reformated_path
)
from .menu import launch_pype_menu
from .plugin import (
ClipLoader,
TimelineItemLoader,
Creator,
PublishClip
)
from .workio import (
open_file,
save_file,
current_file,
has_unsaved_changes,
file_extensions,
work_root
)
from .testing_utils import TestGUI
__all__ = [
"bmdvr",
"bmdvf",
# pipeline
"install",
"uninstall",
"ls",
"containerise",
"update_container",
"reload_pipeline",
"publish",
"launch_workfiles_app",
"maintained_selection",
"remove_instance",
"list_instances",
# utils
"get_resolve_module",
# lib
"maintain_current_timeline",
"publish_clip_color",
"get_project_manager",
"get_current_project",
"get_current_timeline",
"create_bin",
"get_media_pool_item",
"create_media_pool_item",
"create_timeline_item",
"get_timeline_item",
"get_video_track_names",
"get_current_timeline_items",
"get_pype_timeline_item_by_name",
"get_timeline_item_pype_tag",
"set_timeline_item_pype_tag",
"imprint",
"set_publish_attribute",
"get_publish_attribute",
"create_compound_clip",
"swap_clips",
"get_pype_clip_metadata",
"set_project_manager_to_folder_name",
"get_otio_clip_instance_data",
"get_reformated_path",
# menu
"launch_pype_menu",
# plugin
"ClipLoader",
"TimelineItemLoader",
"Creator",
"PublishClip",
# workio
"open_file",
"save_file",
"current_file",
"has_unsaved_changes",
"file_extensions",
"work_root",
"TestGUI"
]

View file

@ -4,7 +4,7 @@ from __future__ import absolute_import
import pyblish.api
from ...action import get_errored_instances_from_context
from openpype.action import get_errored_instances_from_context
class SelectInvalidAction(pyblish.api.Action):

View file

@ -4,13 +4,13 @@ import re
import os
import contextlib
from opentimelineio import opentime
from openpype.lib import Logger
from openpype.pipeline.editorial import is_overlapping_otio_ranges
from ..otio import davinci_export as otio_export
from openpype.api import Logger
log = Logger().get_logger(__name__)
log = Logger.get_logger(__name__)
self = sys.modules[__name__]
self.project_manager = None

View file

@ -3,13 +3,13 @@ import sys
from Qt import QtWidgets, QtCore
from openpype.tools.utils import host_tools
from .pipeline import (
publish,
launch_workfiles_app
)
from openpype.tools.utils import host_tools
def load_stylesheet():
path = os.path.join(os.path.dirname(__file__), "menu_style.qss")

View file

@ -7,7 +7,7 @@ from collections import OrderedDict
from pyblish import api as pyblish
from openpype.api import Logger
from openpype.lib import Logger
from openpype.pipeline import (
schema,
register_loader_plugin_path,
@ -16,11 +16,15 @@ from openpype.pipeline import (
deregister_creator_plugin_path,
AVALON_CONTAINER_ID,
)
from . import lib
from . import PLUGINS_DIR
from openpype.tools.utils import host_tools
log = Logger().get_logger(__name__)
from . import lib
from .utils import get_resolve_module
log = Logger.get_logger(__name__)
HOST_DIR = os.path.dirname(os.path.abspath(os.path.dirname(__file__)))
PLUGINS_DIR = os.path.join(HOST_DIR, "plugins")
PUBLISH_PATH = os.path.join(PLUGINS_DIR, "publish")
LOAD_PATH = os.path.join(PLUGINS_DIR, "load")
CREATE_PATH = os.path.join(PLUGINS_DIR, "create")
@ -39,7 +43,6 @@ def install():
See the Maya equivalent for inspiration on how to implement this.
"""
from .. import get_resolve_module
log.info("openpype.hosts.resolve installed")

View file

@ -1,9 +1,9 @@
#!/usr/bin/env python
import time
from openpype.hosts.resolve.utils import get_resolve_module
from openpype.api import Logger
from openpype.lib import Logger
log = Logger().get_logger(__name__)
log = Logger.get_logger(__name__)
wait_delay = 2.5
wait = 0.00

View file

@ -4,21 +4,21 @@
Resolve's tools for setting environment
"""
import sys
import os
import shutil
from . import HOST_DIR
from openpype.api import Logger
log = Logger().get_logger(__name__)
import sys
from openpype.lib import Logger
log = Logger.get_logger(__name__)
def get_resolve_module():
from openpype.hosts import resolve
from openpype.hosts.resolve import api
# don't run if already loaded
if resolve.api.bmdvr:
if api.bmdvr:
log.info(("resolve module is assigned to "
f"`pype.hosts.resolve.api.bmdvr`: {resolve.api.bmdvr}"))
return resolve.api.bmdvr
f"`pype.hosts.resolve.api.bmdvr`: {api.bmdvr}"))
return api.bmdvr
try:
"""
The PYTHONPATH needs to be set correctly for this import
@ -71,79 +71,9 @@ def get_resolve_module():
# assign global var and return
bmdvr = bmd.scriptapp("Resolve")
bmdvf = bmd.scriptapp("Fusion")
resolve.api.bmdvr = bmdvr
resolve.api.bmdvf = bmdvf
api.bmdvr = bmdvr
api.bmdvf = bmdvf
log.info(("Assigning resolve module to "
f"`pype.hosts.resolve.api.bmdvr`: {resolve.api.bmdvr}"))
f"`pype.hosts.resolve.api.bmdvr`: {api.bmdvr}"))
log.info(("Assigning resolve module to "
f"`pype.hosts.resolve.api.bmdvf`: {resolve.api.bmdvf}"))
def _sync_utility_scripts(env=None):
""" Synchronizing basic utlility scripts for resolve.
To be able to run scripts from inside `Resolve/Workspace/Scripts` menu
all scripts has to be accessible from defined folder.
"""
if not env:
env = os.environ
# initiate inputs
scripts = {}
us_env = env.get("RESOLVE_UTILITY_SCRIPTS_SOURCE_DIR")
us_dir = env.get("RESOLVE_UTILITY_SCRIPTS_DIR", "")
us_paths = [os.path.join(
HOST_DIR,
"utility_scripts"
)]
# collect script dirs
if us_env:
log.info(f"Utility Scripts Env: `{us_env}`")
us_paths = us_env.split(
os.pathsep) + us_paths
# collect scripts from dirs
for path in us_paths:
scripts.update({path: os.listdir(path)})
log.info(f"Utility Scripts Dir: `{us_paths}`")
log.info(f"Utility Scripts: `{scripts}`")
# make sure no script file is in folder
if next((s for s in os.listdir(us_dir)), None):
for s in os.listdir(us_dir):
path = os.path.join(us_dir, s)
log.info(f"Removing `{path}`...")
if os.path.isdir(path):
shutil.rmtree(path, onerror=None)
else:
os.remove(path)
# copy scripts into Resolve's utility scripts dir
for d, sl in scripts.items():
# directory and scripts list
for s in sl:
# script in script list
src = os.path.join(d, s)
dst = os.path.join(us_dir, s)
log.info(f"Copying `{src}` to `{dst}`...")
if os.path.isdir(src):
shutil.copytree(
src, dst, symlinks=False,
ignore=None, ignore_dangling_symlinks=False
)
else:
shutil.copy2(src, dst)
def setup(env=None):
""" Wrapper installer started from pype.hooks.resolve.ResolvePrelaunch()
"""
if not env:
env = os.environ
# synchronize resolve utility scripts
_sync_utility_scripts(env)
log.info("Resolve OpenPype wrapper has been installed")
f"`pype.hosts.resolve.api.bmdvf`: {api.bmdvf}"))

View file

@ -2,14 +2,14 @@
import os
from openpype.api import Logger
from .. import (
from .lib import (
get_project_manager,
get_current_project,
set_project_manager_to_folder_name
)
log = Logger().get_logger(__name__)
log = Logger.get_logger(__name__)
exported_projet_ext = ".drp"
@ -60,7 +60,7 @@ def open_file(filepath):
# load project from input path
project = pm.LoadProject(fname)
log.info(f"Project {project.GetName()} opened...")
return True
except AttributeError:
log.warning((f"Project with name `{fname}` does not exist! It will "
f"be imported from {filepath} and then loaded..."))
@ -69,9 +69,8 @@ def open_file(filepath):
project = pm.LoadProject(fname)
log.info(f"Project imported/loaded {project.GetName()}...")
return True
else:
return False
return False
return True
def current_file():
pm = get_project_manager()
@ -80,13 +79,9 @@ def current_file():
name = project.GetName()
fname = name + exported_projet_ext
current_file = os.path.join(current_dir, fname)
normalised = os.path.normpath(current_file)
# Unsaved current file
if normalised == "":
if not current_file:
return None
return normalised
return os.path.normpath(current_file)
def work_root(session):

View file

@ -1,7 +1,7 @@
import os
import importlib
from openpype.lib import PreLaunchHook
from openpype.hosts.resolve.api import utils
from openpype.hosts.resolve.utils import setup
class ResolvePrelaunch(PreLaunchHook):
@ -43,18 +43,6 @@ class ResolvePrelaunch(PreLaunchHook):
self.launch_context.env.get("PRE_PYTHON_SCRIPT", ""))
self.launch_context.env["PRE_PYTHON_SCRIPT"] = pre_py_sc
self.log.debug(f"-- pre_py_sc: `{pre_py_sc}`...")
try:
__import__("openpype.hosts.resolve")
__import__("pyblish")
except ImportError:
self.log.warning(
"pyblish: Could not load Resolve integration.",
exc_info=True
)
else:
# Resolve Setup integration
importlib.reload(utils)
self.log.debug(f"-- utils.__file__: `{utils.__file__}`")
utils.setup(self.launch_context.env)
# Resolve Setup integration
setup(self.launch_context.env)

View file

@ -1,9 +1,12 @@
# from pprint import pformat
from openpype.hosts import resolve
from openpype.hosts.resolve.api import lib
from openpype.hosts.resolve.api import plugin, lib
from openpype.hosts.resolve.api.lib import (
get_video_track_names,
create_bin,
)
class CreateShotClip(resolve.Creator):
class CreateShotClip(plugin.Creator):
"""Publishable clip"""
label = "Create Publishable Clip"
@ -11,7 +14,7 @@ class CreateShotClip(resolve.Creator):
icon = "film"
defaults = ["Main"]
gui_tracks = resolve.get_video_track_names()
gui_tracks = get_video_track_names()
gui_name = "OpenPype publish attributes creator"
gui_info = "Define sequential rename and fill hierarchy data."
gui_inputs = {
@ -250,7 +253,7 @@ class CreateShotClip(resolve.Creator):
sq_markers = self.timeline.GetMarkers()
# create media bin for compound clips (trackItems)
mp_folder = resolve.create_bin(self.timeline.GetName())
mp_folder = create_bin(self.timeline.GetName())
kwargs = {
"ui_inputs": widget.result,
@ -264,6 +267,6 @@ class CreateShotClip(resolve.Creator):
self.rename_index = i
self.log.info(track_item_data)
# convert track item to timeline media pool item
track_item = resolve.PublishClip(
track_item = plugin.PublishClip(
self, track_item_data, **kwargs).convert()
track_item.SetClipColor(lib.publish_clip_color)

View file

@ -1,21 +1,22 @@
from copy import deepcopy
from importlib import reload
from openpype.client import (
get_version_by_id,
get_last_version_by_subset_id,
)
from openpype.hosts import resolve
# from openpype.hosts import resolve
from openpype.pipeline import (
get_representation_path,
legacy_io,
)
from openpype.hosts.resolve.api import lib, plugin
reload(plugin)
reload(lib)
from openpype.hosts.resolve.api.pipeline import (
containerise,
update_container,
)
class LoadClip(resolve.TimelineItemLoader):
class LoadClip(plugin.TimelineItemLoader):
"""Load a subset to timeline as clip
Place clip to timeline on its asset origin timings collected
@ -46,7 +47,7 @@ class LoadClip(resolve.TimelineItemLoader):
})
# load clip to timeline and get main variables
timeline_item = resolve.ClipLoader(
timeline_item = plugin.ClipLoader(
self, context, **options).load()
namespace = namespace or timeline_item.GetName()
version = context['version']
@ -80,7 +81,7 @@ class LoadClip(resolve.TimelineItemLoader):
self.log.info("Loader done: `{}`".format(name))
return resolve.containerise(
return containerise(
timeline_item,
name, namespace, context,
self.__class__.__name__,
@ -98,7 +99,7 @@ class LoadClip(resolve.TimelineItemLoader):
context.update({"representation": representation})
name = container['name']
namespace = container['namespace']
timeline_item_data = resolve.get_pype_timeline_item_by_name(namespace)
timeline_item_data = lib.get_pype_timeline_item_by_name(namespace)
timeline_item = timeline_item_data["clip"]["item"]
project_name = legacy_io.active_project()
version = get_version_by_id(project_name, representation["parent"])
@ -109,7 +110,7 @@ class LoadClip(resolve.TimelineItemLoader):
self.fname = get_representation_path(representation)
context["version"] = {"data": version_data}
loader = resolve.ClipLoader(self, context)
loader = plugin.ClipLoader(self, context)
timeline_item = loader.update(timeline_item)
# add additional metadata from the version to imprint Avalon knob
@ -136,7 +137,7 @@ class LoadClip(resolve.TimelineItemLoader):
# update color of clip regarding the version order
self.set_item_color(timeline_item, version)
return resolve.update_container(timeline_item, data_imprint)
return update_container(timeline_item, data_imprint)
@classmethod
def set_item_color(cls, timeline_item, version):

View file

@ -1,7 +1,7 @@
import os
import pyblish.api
import openpype.api
from openpype.hosts import resolve
from openpype.hosts.resolve.api.lib import get_project_manager
class ExtractWorkfile(openpype.api.Extractor):
@ -29,7 +29,7 @@ class ExtractWorkfile(openpype.api.Extractor):
os.path.join(staging_dir, drp_file_name))
# write out the drp workfile
resolve.get_project_manager().ExportProject(
get_project_manager().ExportProject(
project.GetName(), drp_file_path)
# create drp workfile representation

View file

@ -1,9 +1,15 @@
import pyblish
from openpype.hosts import resolve
# # developer reload modules
from pprint import pformat
import pyblish
from openpype.hosts.resolve.api.lib import (
get_current_timeline_items,
get_timeline_item_pype_tag,
publish_clip_color,
get_publish_attribute,
get_otio_clip_instance_data,
)
class PrecollectInstances(pyblish.api.ContextPlugin):
"""Collect all Track items selection."""
@ -14,8 +20,8 @@ class PrecollectInstances(pyblish.api.ContextPlugin):
def process(self, context):
otio_timeline = context.data["otioTimeline"]
selected_timeline_items = resolve.get_current_timeline_items(
filter=True, selecting_color=resolve.publish_clip_color)
selected_timeline_items = get_current_timeline_items(
filter=True, selecting_color=publish_clip_color)
self.log.info(
"Processing enabled track items: {}".format(
@ -27,7 +33,7 @@ class PrecollectInstances(pyblish.api.ContextPlugin):
timeline_item = timeline_item_data["clip"]["item"]
# get pype tag data
tag_data = resolve.get_timeline_item_pype_tag(timeline_item)
tag_data = get_timeline_item_pype_tag(timeline_item)
self.log.debug(f"__ tag_data: {pformat(tag_data)}")
if not tag_data:
@ -67,7 +73,7 @@ class PrecollectInstances(pyblish.api.ContextPlugin):
"asset": asset,
"item": timeline_item,
"families": families,
"publish": resolve.get_publish_attribute(timeline_item),
"publish": get_publish_attribute(timeline_item),
"fps": context.data["fps"],
"handleStart": handle_start,
"handleEnd": handle_end,
@ -75,7 +81,7 @@ class PrecollectInstances(pyblish.api.ContextPlugin):
})
# otio clip data
otio_data = resolve.get_otio_clip_instance_data(
otio_data = get_otio_clip_instance_data(
otio_timeline, timeline_item_data) or {}
data.update(otio_data)
@ -134,7 +140,7 @@ class PrecollectInstances(pyblish.api.ContextPlugin):
"asset": asset,
"family": family,
"families": [],
"publish": resolve.get_publish_attribute(timeline_item)
"publish": get_publish_attribute(timeline_item)
})
context.create_instance(**data)

View file

@ -1,11 +1,9 @@
import pyblish.api
from pprint import pformat
from importlib import reload
from openpype.hosts import resolve
from openpype.hosts.resolve import api as rapi
from openpype.pipeline import legacy_io
from openpype.hosts.resolve.otio import davinci_export
reload(davinci_export)
class PrecollectWorkfile(pyblish.api.ContextPlugin):
@ -18,9 +16,9 @@ class PrecollectWorkfile(pyblish.api.ContextPlugin):
asset = legacy_io.Session["AVALON_ASSET"]
subset = "workfile"
project = resolve.get_current_project()
project = rapi.get_current_project()
fps = project.GetSetting("timelineFrameRate")
video_tracks = resolve.get_video_track_names()
video_tracks = rapi.get_video_track_names()
# adding otio timeline to context
otio_timeline = davinci_export.create_otio_timeline(project)

View file

@ -6,10 +6,11 @@ from openpype.pipeline import install_host
def main(env):
import openpype.hosts.resolve as bmdvr
from openpype.hosts.resolve.utils import setup
import openpype.hosts.resolve.api as bmdvr
# Registers openpype's Global pyblish plugins
install_host(bmdvr)
bmdvr.setup(env)
setup(env)
if __name__ == "__main__":

View file

@ -2,13 +2,13 @@ import os
import sys
from openpype.pipeline import install_host
from openpype.api import Logger
from openpype.lib import Logger
log = Logger().get_logger(__name__)
log = Logger.get_logger(__name__)
def main(env):
import openpype.hosts.resolve as bmdvr
import openpype.hosts.resolve.api as bmdvr
# activate resolve from openpype
install_host(bmdvr)

View file

@ -6,8 +6,8 @@ import opentimelineio as otio
from openpype.pipeline import install_host
from openpype.hosts.resolve import TestGUI
import openpype.hosts.resolve as bmdvr
import openpype.hosts.resolve.api as bmdvr
from openpype.hosts.resolve.api.testing_utils import TestGUI
from openpype.hosts.resolve.otio import davinci_export as otio_export

View file

@ -2,11 +2,16 @@
import os
import sys
from openpype.pipeline import install_host
from openpype.hosts.resolve import TestGUI
import openpype.hosts.resolve as bmdvr
import clique
from openpype.pipeline import install_host
from openpype.hosts.resolve.api.testing_utils import TestGUI
import openpype.hosts.resolve.api as bmdvr
from openpype.hosts.resolve.api.lib import (
create_media_pool_item,
create_timeline_item,
)
class ThisTestGUI(TestGUI):
extensions = [".exr", ".jpg", ".mov", ".png", ".mp4", ".ari", ".arx"]
@ -55,10 +60,10 @@ class ThisTestGUI(TestGUI):
# skip if unwanted extension
if ext not in self.extensions:
return
media_pool_item = bmdvr.create_media_pool_item(fpath)
media_pool_item = create_media_pool_item(fpath)
print(media_pool_item)
track_item = bmdvr.create_timeline_item(media_pool_item)
track_item = create_timeline_item(media_pool_item)
print(track_item)

View file

@ -1,13 +1,17 @@
#! python3
from openpype.pipeline import install_host
import openpype.hosts.resolve as bmdvr
from openpype.hosts.resolve import api as bmdvr
from openpype.hosts.resolve.api.lib import (
create_media_pool_item,
create_timeline_item,
)
def file_processing(fpath):
media_pool_item = bmdvr.create_media_pool_item(fpath)
media_pool_item = create_media_pool_item(fpath)
print(media_pool_item)
track_item = bmdvr.create_timeline_item(media_pool_item)
track_item = create_timeline_item(media_pool_item)
print(track_item)

View file

@ -0,0 +1,54 @@
import os
import shutil
from openpype.lib import Logger
RESOLVE_ROOT_DIR = os.path.dirname(os.path.abspath(__file__))
def setup(env):
log = Logger.get_logger("ResolveSetup")
scripts = {}
us_env = env.get("RESOLVE_UTILITY_SCRIPTS_SOURCE_DIR")
us_dir = env.get("RESOLVE_UTILITY_SCRIPTS_DIR", "")
us_paths = [os.path.join(
RESOLVE_ROOT_DIR,
"utility_scripts"
)]
# collect script dirs
if us_env:
log.info(f"Utility Scripts Env: `{us_env}`")
us_paths = us_env.split(
os.pathsep) + us_paths
# collect scripts from dirs
for path in us_paths:
scripts.update({path: os.listdir(path)})
log.info(f"Utility Scripts Dir: `{us_paths}`")
log.info(f"Utility Scripts: `{scripts}`")
# make sure no script file is in folder
for s in os.listdir(us_dir):
path = os.path.join(us_dir, s)
log.info(f"Removing `{path}`...")
if os.path.isdir(path):
shutil.rmtree(path, onerror=None)
else:
os.remove(path)
# copy scripts into Resolve's utility scripts dir
for d, sl in scripts.items():
# directory and scripts list
for s in sl:
# script in script list
src = os.path.join(d, s)
dst = os.path.join(us_dir, s)
log.info(f"Copying `{src}` to `{dst}`...")
if os.path.isdir(src):
shutil.copytree(
src, dst, symlinks=False,
ignore=None, ignore_dangling_symlinks=False
)
else:
shutil.copy2(src, dst)

View file
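
The `_sync_utility_scripts` logic removed above reappears here as `openpype.hosts.resolve.utils.setup`. A hedged usage sketch; the environment variable value below is illustrative, since the real one is prepared by the prelaunch hook:

```python
import os
from openpype.hosts.resolve.utils import setup

env = dict(os.environ)
# Illustrative target; Resolve reads utility scripts from its own Scripts dir
env["RESOLVE_UTILITY_SCRIPTS_DIR"] = "/path/to/Resolve/Fusion/Scripts/Comp"

# Clears the target directory, then copies the bundled utility scripts into it
setup(env)
```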

@ -957,32 +957,24 @@ class ApplicationLaunchContext:
# TODO load additional studio paths from settings
import openpype
pype_dir = os.path.dirname(os.path.abspath(openpype.__file__))
openpype_dir = os.path.dirname(os.path.abspath(openpype.__file__))
# --- START: Backwards compatibility ---
hooks_dir = os.path.join(pype_dir, "hooks")
global_hooks_dir = os.path.join(openpype_dir, "hooks")
subfolder_names = ["global"]
if self.host_name:
subfolder_names.append(self.host_name)
for subfolder_name in subfolder_names:
path = os.path.join(hooks_dir, subfolder_name)
if (
os.path.exists(path)
and os.path.isdir(path)
and path not in paths
):
paths.append(path)
# --- END: Backwards compatibility ---
subfolders_list = [
["hooks"]
hooks_dirs = [
global_hooks_dir
]
if self.host_name:
subfolders_list.append(["hosts", self.host_name, "hooks"])
# If the host requires launch hooks and is a module, the launch hooks
# should be collected using 'collect_launch_hook_paths'
# - the module has to implement 'get_launch_hook_paths'
host_module = self.modules_manager.get_host_module(self.host_name)
if not host_module:
hooks_dirs.append(os.path.join(
openpype_dir, "hosts", self.host_name, "hooks"
))
for subfolders in subfolders_list:
path = os.path.join(pype_dir, *subfolders)
for path in hooks_dirs:
if (
os.path.exists(path)
and os.path.isdir(path)
@ -991,7 +983,9 @@ class ApplicationLaunchContext:
paths.append(path)
# Load modules paths
paths.extend(self.modules_manager.collect_launch_hook_paths())
paths.extend(
self.modules_manager.collect_launch_hook_paths(self.application)
)
return paths
@ -1304,6 +1298,7 @@ def get_app_environments_for_context(
dict: Environments for passed context and application.
"""
from openpype.modules import ModulesManager
from openpype.pipeline import AvalonMongoDB, Anatomy
# Avalon database connection
@ -1316,8 +1311,6 @@ def get_app_environments_for_context(
asset_doc = get_asset_by_name(project_name, asset_name)
if modules_manager is None:
from openpype.modules import ModulesManager
modules_manager = ModulesManager()
# Prepare app object which can be obtained only from ApplicationManager
@ -1344,7 +1337,7 @@ def get_app_environments_for_context(
})
prepare_app_environments(data, env_group, modules_manager)
prepare_context_environments(data, env_group)
prepare_context_environments(data, env_group, modules_manager)
# Discard avalon connection
dbcon.uninstall()
@ -1503,8 +1496,10 @@ def prepare_app_environments(
final_env = None
# Add host specific environments
if app.host_name and implementation_envs:
module = __import__("openpype.hosts", fromlist=[app.host_name])
host_module = getattr(module, app.host_name, None)
host_module = modules_manager.get_host_module(app.host_name)
if not host_module:
module = __import__("openpype.hosts", fromlist=[app.host_name])
host_module = getattr(module, app.host_name, None)
add_implementation_envs = None
if host_module:
add_implementation_envs = getattr(
@ -1563,7 +1558,7 @@ def apply_project_environments_value(
return env
def prepare_context_environments(data, env_group=None):
def prepare_context_environments(data, env_group=None, modules_manager=None):
"""Modify launch environments with context data for launched host.
Args:
@ -1658,10 +1653,10 @@ def prepare_context_environments(data, env_group=None):
data["env"]["AVALON_APP"] = app.host_name
data["env"]["AVALON_WORKDIR"] = workdir
_prepare_last_workfile(data, workdir)
_prepare_last_workfile(data, workdir, modules_manager)
def _prepare_last_workfile(data, workdir):
def _prepare_last_workfile(data, workdir, modules_manager):
"""last workfile workflow preparation.
Function check if should care about last workfile workflow and tries
@ -1676,8 +1671,13 @@ def _prepare_last_workfile(data, workdir):
result will be stored.
workdir (str): Path to folder where workfiles should be stored.
"""
from openpype.modules import ModulesManager
from openpype.pipeline import HOST_WORKFILE_EXTENSIONS
if not modules_manager:
modules_manager = ModulesManager()
log = data["log"]
_workdir_data = data.get("workdir_data")
@ -1725,7 +1725,12 @@ def _prepare_last_workfile(data, workdir):
# Last workfile path
last_workfile_path = data.get("last_workfile_path") or ""
if not last_workfile_path:
extensions = HOST_WORKFILE_EXTENSIONS.get(app.host_name)
host_module = modules_manager.get_host_module(app.host_name)
if host_module:
extensions = host_module.get_workfile_extensions()
else:
extensions = HOST_WORKFILE_EXTENSIONS.get(app.host_name)
if extensions:
from openpype.pipeline.workfile import (
get_workfile_template_key,

View file

@ -1,11 +1,9 @@
"""Should be used only inside of hosts."""
import os
import json
import re
import copy
import platform
import logging
import collections
import functools
import warnings
@ -13,13 +11,9 @@ from openpype.client import (
get_project,
get_assets,
get_asset_by_name,
get_subsets,
get_last_versions,
get_last_version_by_subset_name,
get_representations,
get_workfile_info,
)
from openpype.settings import get_project_settings
from .profiles_filtering import filter_profiles
from .events import emit_event
from .path_templates import StringTemplate

View file

@ -140,7 +140,7 @@ class _LoadCache:
def get_default_modules_dir():
"""Path to default OpenPype modules."""
current_dir = os.path.abspath(os.path.dirname(__file__))
current_dir = os.path.dirname(os.path.abspath(__file__))
output = []
for folder_name in ("default_modules", ):
@ -298,6 +298,8 @@ def _load_modules():
# Add current directory at first place
# - has small differences in import logic
current_dir = os.path.abspath(os.path.dirname(__file__))
hosts_dir = os.path.join(os.path.dirname(current_dir), "hosts")
module_dirs.insert(0, hosts_dir)
module_dirs.insert(0, current_dir)
processed_paths = set()
@ -314,6 +316,7 @@ def _load_modules():
continue
is_in_current_dir = dirpath == current_dir
is_in_host_dir = dirpath == hosts_dir
for filename in os.listdir(dirpath):
# Ignore filenames
if filename in IGNORED_FILENAMES:
@ -353,6 +356,24 @@ def _load_modules():
sys.modules[new_import_str] = default_module
setattr(openpype_modules, basename, default_module)
elif is_in_host_dir:
import_str = "openpype.hosts.{}".format(basename)
new_import_str = "{}.{}".format(modules_key, basename)
# Until all hosts are converted to be usable as modules,
# this error check is needed
try:
default_module = __import__(
import_str, fromlist=("", )
)
sys.modules[new_import_str] = default_module
setattr(openpype_modules, basename, default_module)
except Exception:
log.warning(
"Failed to import host folder {}".format(basename),
exc_info=True
)
elif os.path.isdir(fullpath):
import_module_from_dirpath(dirpath, filename, modules_key)
@ -768,24 +789,50 @@ class ModulesManager:
output.extend(paths)
return output
def collect_launch_hook_paths(self):
"""Helper to collect hooks from modules inherited ILaunchHookPaths.
def collect_launch_hook_paths(self, app):
"""Helper to collect application launch hooks.
Collection used to be based on 'ILaunchHookPaths', which is not the case
anymore. A module just has to implement the 'get_launch_hook_paths' method.
Args:
app (Application): Application object which can be used for
filtering of which launch hook paths are returned.
Returns:
list: Paths to launch hook directories.
"""
from openpype_interfaces import ILaunchHookPaths
str_type = type("")
expected_types = (list, tuple, set)
output = []
for module in self.get_enabled_modules():
# Skip module that do not inherit from `ILaunchHookPaths`
if not isinstance(module, ILaunchHookPaths):
# Skip module if it does not implement 'get_launch_hook_paths'
func = getattr(module, "get_launch_hook_paths", None)
if func is None:
continue
func = module.get_launch_hook_paths
if hasattr(inspect, "signature"):
sig = inspect.signature(func)
expect_args = len(sig.parameters) > 0
else:
expect_args = len(inspect.getargspec(func)[0]) > 0
# Pass application argument if method expect it.
try:
if expect_args:
hook_paths = func(app)
else:
hook_paths = func()
except Exception:
self.log.warning(
"Failed to call 'get_launch_hook_paths'",
exc_info=True
)
continue
hook_paths = module.get_launch_hook_paths()
if not hook_paths:
continue
@ -804,6 +851,45 @@ class ModulesManager:
output.extend(hook_paths)
return output
def get_host_module(self, host_name):
"""Find host module by host name.
Args:
host_name (str): Host name for which the host module is searched.
Returns:
OpenPypeModule: Found host module by name.
None: No enabled module inheriting IHostModule has its host name
set to the passed 'host_name'.
"""
from openpype_interfaces import IHostModule
for module in self.get_enabled_modules():
if (
isinstance(module, IHostModule)
and module.host_name == host_name
):
return module
return None
def get_host_names(self):
"""List of available host names based on host modules.
Returns:
Iterable[str]: All available host names based on enabled modules
inheriting 'IHostModule'.
"""
from openpype_interfaces import IHostModule
host_names = {
module.host_name
for module in self.get_enabled_modules()
if isinstance(module, IHostModule)
}
return host_names
def print_report(self):
"""Print out report of time spent on modules initialization parts.

View file
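
For illustration, a sketch of a module compatible with the reworked `collect_launch_hook_paths`; the class is hypothetical, but the base class import matches the one used elsewhere in this diff. Because the manager inspects the function signature, both zero-argument and one-argument variants of `get_launch_hook_paths` work:

```python
import os
from openpype.modules import OpenPypeModule


class ExampleHooksModule(OpenPypeModule):
    """Hypothetical module providing launch hooks."""
    name = "example_hooks"

    def initialize(self, module_settings):
        self.enabled = True

    def get_launch_hook_paths(self, app):
        # 'app' is passed only because the signature expects it; a
        # zero-argument variant would be called without it. Assumes the
        # Application object exposes 'host_name', as the filtering use
        # case in the docstring suggests.
        if app.host_name != "resolve":
            return []
        return [os.path.join(os.path.dirname(os.path.abspath(__file__)), "hooks")]
```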

@ -62,6 +62,7 @@ payload_skeleton_template = {
"RenderLayer": None, # Render only this layer
"Renderer": None,
"ProjectPath": None, # Resolve relative references
"RenderSetupIncludeLights": None, # Include all lights flag.
},
"AuxFiles": [] # Mandatory for Deadline, may be empty
}
@ -504,6 +505,7 @@ class MayaSubmitDeadline(pyblish.api.InstancePlugin):
self.payload_skeleton["JobInfo"]["Comment"] = comment
self.payload_skeleton["PluginInfo"]["RenderLayer"] = renderlayer
self.payload_skeleton["PluginInfo"]["RenderSetupIncludeLights"] = instance.data.get("renderSetupIncludeLights") # noqa
# Adding file dependencies.
dependencies = instance.context.data["fileDependencies"]
dependencies.append(filepath)

View file

@ -697,13 +697,22 @@ class SyncToAvalonEvent(BaseEvent):
continue
auto_sync = changes[CUST_ATTR_AUTO_SYNC]["new"]
if auto_sync == "1":
turned_on = auto_sync == "1"
ft_project = self.cur_project
username = self._get_username(session, event)
message = (
"Auto sync was turned {} for project \"{}\" by \"{}\"."
).format(
"on" if turned_on else "off",
ft_project["full_name"],
username
)
if turned_on:
message += " Triggering syncToAvalon action."
self.log.debug(message)
if turned_on:
# Trigger sync to avalon action if auto sync was turned on
ft_project = self.cur_project
self.log.debug((
"Auto sync was turned on for project <{}>."
" Triggering syncToAvalon action."
).format(ft_project["full_name"]))
selection = [{
"entityId": ft_project["id"],
"entityType": "show"
@ -851,6 +860,26 @@ class SyncToAvalonEvent(BaseEvent):
self.report()
return True
def _get_username(self, session, event):
username = "Unknown"
event_source = event.get("source")
if not event_source:
return username
user_info = event_source.get("user")
if not user_info:
return username
user_id = user_info.get("id")
if not user_id:
return username
user_entity = session.query(
"User where id is {}".format(user_id)
).first()
if user_entity:
username = user_entity["username"] or username
return username
def process_removed(self):
"""
Handles removed entities (not removed tasks - handle separately).

View file

@ -105,11 +105,17 @@ class CollectFtrackApi(pyblish.api.ContextPlugin):
context.data["ftrackEntity"] = asset_entity
context.data["ftrackTask"] = task_entity
self.per_instance_process(context, asset_name, task_name)
self.per_instance_process(context, asset_entity, task_entity)
def per_instance_process(
self, context, context_asset_name, context_task_name
self, context, context_asset_entity, context_task_entity
):
context_task_name = None
context_asset_name = None
if context_asset_entity:
context_asset_name = context_asset_entity["name"]
if context_task_entity:
context_task_name = context_task_entity["name"]
instance_by_asset_and_task = {}
for instance in context:
self.log.debug(
@ -120,6 +126,8 @@ class CollectFtrackApi(pyblish.api.ContextPlugin):
if not instance_asset_name and not instance_task_name:
self.log.debug("Instance does not have set context keys.")
instance.data["ftrackEntity"] = context_asset_entity
instance.data["ftrackTask"] = context_task_entity
continue
elif instance_asset_name and instance_task_name:
@ -131,6 +139,8 @@ class CollectFtrackApi(pyblish.api.ContextPlugin):
"Instance's context is same as in publish context."
" Asset: {} | Task: {}"
).format(context_asset_name, context_task_name))
instance.data["ftrackEntity"] = context_asset_entity
instance.data["ftrackTask"] = context_task_entity
continue
asset_name = instance_asset_name
task_name = instance_task_name
@ -141,6 +151,8 @@ class CollectFtrackApi(pyblish.api.ContextPlugin):
"Instance's context task is same as in publish"
" context. Task: {}"
).format(context_task_name))
instance.data["ftrackEntity"] = context_asset_entity
instance.data["ftrackTask"] = context_task_entity
continue
asset_name = context_asset_name
@ -152,6 +164,8 @@ class CollectFtrackApi(pyblish.api.ContextPlugin):
"Instance's context asset is same as in publish"
" context. Asset: {}"
).format(context_asset_name))
instance.data["ftrackEntity"] = context_asset_entity
instance.data["ftrackTask"] = context_task_entity
continue
# Do not use context's task name

View file

@ -0,0 +1,150 @@
import pyblish.api
from openpype.lib import filter_profiles
class IntegrateFtrackFarmStatus(pyblish.api.ContextPlugin):
"""Change task status when should be published on farm.
Instance which has set "farm" key in data to 'True' is considered as will
be rendered on farm thus it's status should be changed.
"""
order = pyblish.api.IntegratorOrder + 0.48
label = "Integrate Ftrack Farm Status"
farm_status_profiles = []
def process(self, context):
# Quick end
if not self.farm_status_profiles:
project_name = context.data["projectName"]
self.log.info((
"Status profiles are not filled for project \"{}\". Skipping"
).format(project_name))
return
filtered_instances = self.filter_instances(context)
instances_with_status_names = self.get_instances_with_status_names(
context, filtered_instances
)
if instances_with_status_names:
self.fill_statuses(context, instances_with_status_names)
def filter_instances(self, context):
filtered_instances = []
for instance in context:
# Skip disabled instances
if instance.data.get("publish") is False:
continue
subset_name = instance.data["subset"]
msg_start = "Skipping instance {}.".format(subset_name)
if not instance.data.get("farm"):
self.log.debug(
"{} Won't be rendered on farm.".format(msg_start)
)
continue
task_entity = instance.data.get("ftrackTask")
if not task_entity:
self.log.debug(
"{} Does not have filled task".format(msg_start)
)
continue
filtered_instances.append(instance)
return filtered_instances
def get_instances_with_status_names(self, context, instances):
instances_with_status_names = []
for instance in instances:
family = instance.data["family"]
subset_name = instance.data["subset"]
task_entity = instance.data["ftrackTask"]
host_name = context.data["hostName"]
task_name = task_entity["name"]
task_type = task_entity["type"]["name"]
status_profile = filter_profiles(
self.farm_status_profiles,
{
"hosts": host_name,
"task_types": task_type,
"task_names": task_name,
"families": family,
"subsets": subset_name,
},
logger=self.log
)
if not status_profile:
# There already is log in 'filter_profiles'
continue
status_name = status_profile["status_name"]
if status_name:
instances_with_status_names.append((instance, status_name))
return instances_with_status_names
def fill_statuses(self, context, instances_with_status_names):
# Prepare available task statuses on the project
project_name = context.data["projectName"]
session = context.data["ftrackSession"]
project_entity = session.query((
"select project_schema from Project where full_name is \"{}\""
).format(project_name)).one()
project_schema = project_entity["project_schema"]
task_type_ids = set()
for item in instances_with_status_names:
instance, _ = item
task_entity = instance.data["ftrackTask"]
task_type_ids.add(task_entity["type"]["id"])
task_statuses_by_type_id = {
task_type_id: project_schema.get_statuses("Task", task_type_id)
for task_type_id in task_type_ids
}
# Keep track if anything has changed
skipped_status_names = set()
status_changed = False
for item in instances_with_status_names:
instance, status_name = item
task_entity = instance.data["ftrackTask"]
task_statuses = task_statuses_by_type_id[task_entity["type"]["id"]]
status_name_low = status_name.lower()
status_id = None
matched_status_name = None
# Find the status by case-insensitive name comparison
for status in task_statuses:
if status["name"].lower() == status_name_low:
status_id = status["id"]
status_name = status["name"]
break
if status_id is None:
if status_name_low not in skipped_status_names:
skipped_status_names.add(status_name_low)
joined_status_names = ", ".join({
'"{}"'.format(status["name"])
for status in task_statuses
})
self.log.warning((
"Status \"{}\" is not available on project \"{}\"."
" Available statuses are {}"
).format(status_name, project_name, joined_status_names))
continue
# Change task status id
if status_id != task_entity["status_id"]:
task_entity["status_id"] = status_id
status_changed = True
path = "/".join([
item["name"]
for item in task_entity["link"]
])
self.log.debug("Set status \"{}\" to \"{}\"".format(
status_name, path
))
if status_changed:
session.commit()

View file
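
What a `farm_status_profiles` entry could look like is implied by the keys this plugin passes to `filter_profiles`; the values below are invented and the exact settings schema is an assumption:

```python
# Hypothetical profile: mark Maya lighting render tasks as "Rendering"
# when their instances are submitted to the farm
farm_status_profiles = [
    {
        "hosts": ["maya"],
        "task_types": ["Lighting"],
        "task_names": [],          # empty filters typically mean "any"
        "families": ["renderlayer"],
        "subsets": [],
        "status_name": "Rendering",
    },
]
```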

@ -3,6 +3,7 @@ import json
import copy
import pyblish.api
from openpype.lib.openpype_version import get_openpype_version
from openpype.lib.transcoding import (
get_ffprobe_streams,
convert_ffprobe_fps_to_float,
@ -20,6 +21,17 @@ class IntegrateFtrackInstance(pyblish.api.InstancePlugin):
label = "Integrate Ftrack Component"
families = ["ftrack"]
metadata_keys_to_label = {
"openpype_version": "OpenPype version",
"frame_start": "Frame start",
"frame_end": "Frame end",
"duration": "Duration",
"width": "Resolution width",
"height": "Resolution height",
"fps": "FPS",
"codec": "Codec"
}
family_mapping = {
"camera": "cam",
"look": "look",
@ -43,6 +55,7 @@ class IntegrateFtrackInstance(pyblish.api.InstancePlugin):
}
keep_first_subset_name_for_review = True
asset_versions_status_profiles = {}
additional_metadata_keys = []
def process(self, instance):
self.log.debug("instance {}".format(instance))
@ -105,7 +118,8 @@ class IntegrateFtrackInstance(pyblish.api.InstancePlugin):
"component_data": None,
"component_path": None,
"component_location": None,
"component_location_name": None
"component_location_name": None,
"additional_data": {}
}
# Filter types of representations
@ -152,6 +166,7 @@ class IntegrateFtrackInstance(pyblish.api.InstancePlugin):
"name": "thumbnail"
}
thumbnail_item["thumbnail"] = True
# Create copy of item before setting location
src_components_to_add.append(copy.deepcopy(thumbnail_item))
# Create copy of first thumbnail
@ -248,19 +263,15 @@ class IntegrateFtrackInstance(pyblish.api.InstancePlugin):
first_thumbnail_component[
"asset_data"]["name"] = extended_asset_name
component_meta = self._prepare_component_metadata(
instance, repre, repre_path, True
)
# Change location
review_item["component_path"] = repre_path
# Change component data
review_item["component_data"] = {
# Default component name is "main".
"name": "ftrackreview-mp4",
"metadata": {
"ftr_meta": json.dumps(component_meta)
}
"metadata": self._prepare_component_metadata(
instance, repre, repre_path, True
)
}
if is_first_review_repre:
@ -302,13 +313,9 @@ class IntegrateFtrackInstance(pyblish.api.InstancePlugin):
component_data = copy_src_item["component_data"]
component_name = component_data["name"]
component_data["name"] = component_name + "_src"
component_meta = self._prepare_component_metadata(
component_data["metadata"] = self._prepare_component_metadata(
instance, repre, copy_src_item["component_path"], False
)
if component_meta:
component_data["metadata"] = {
"ftr_meta": json.dumps(component_meta)
}
component_list.append(copy_src_item)
# Add others representations as component
@ -326,16 +333,12 @@ class IntegrateFtrackInstance(pyblish.api.InstancePlugin):
):
other_item["asset_data"]["name"] = extended_asset_name
component_meta = self._prepare_component_metadata(
instance, repre, published_path, False
)
component_data = {
"name": repre["name"]
"name": repre["name"],
"metadata": self._prepare_component_metadata(
instance, repre, published_path, False
)
}
if component_meta:
component_data["metadata"] = {
"ftr_meta": json.dumps(component_meta)
}
other_item["component_data"] = component_data
other_item["component_location_name"] = unmanaged_location_name
other_item["component_path"] = published_path
@ -354,6 +357,9 @@ class IntegrateFtrackInstance(pyblish.api.InstancePlugin):
))
instance.data["ftrackComponentsList"] = component_list
def _collect_additional_metadata(self, streams):
pass
def _get_repre_path(self, instance, repre, only_published):
"""Get representation path that can be used for integration.
@ -423,6 +429,11 @@ class IntegrateFtrackInstance(pyblish.api.InstancePlugin):
def _prepare_component_metadata(
self, instance, repre, component_path, is_review
):
metadata = {}
if "openpype_version" in self.additional_metadata_keys:
label = self.metadata_keys_to_label["openpype_version"]
metadata[label] = get_openpype_version()
extension = os.path.splitext(component_path)[-1]
streams = []
try:
@ -442,13 +453,23 @@ class IntegrateFtrackInstance(pyblish.api.InstancePlugin):
# - exr is a special case which can have issues with reading through
# ffmpeg, but we want to set fps for it
if not video_streams and extension not in [".exr"]:
return {}
return metadata
stream_width = None
stream_height = None
stream_fps = None
frame_out = None
codec_label = None
for video_stream in video_streams:
codec_label = video_stream.get("codec_long_name")
if not codec_label:
codec_label = video_stream.get("codec")
if codec_label:
pix_fmt = video_stream.get("pix_fmt")
if pix_fmt:
codec_label += " ({})".format(pix_fmt)
tmp_width = video_stream.get("width")
tmp_height = video_stream.get("height")
if tmp_width and tmp_height:
@ -456,8 +477,8 @@ class IntegrateFtrackInstance(pyblish.api.InstancePlugin):
stream_height = tmp_height
input_framerate = video_stream.get("r_frame_rate")
duration = video_stream.get("duration")
if input_framerate is None or duration is None:
stream_duration = video_stream.get("duration")
if input_framerate is None or stream_duration is None:
continue
try:
stream_fps = convert_ffprobe_fps_to_float(
@ -473,9 +494,9 @@ class IntegrateFtrackInstance(pyblish.api.InstancePlugin):
stream_height = tmp_height
self.log.debug("FPS from stream is {} and duration is {}".format(
input_framerate, duration
input_framerate, stream_duration
))
frame_out = float(duration) * stream_fps
frame_out = float(stream_duration) * stream_fps
break
# Prepare FPS
@ -483,43 +504,58 @@ class IntegrateFtrackInstance(pyblish.api.InstancePlugin):
if instance_fps is None:
instance_fps = instance.context.data["fps"]
if not is_review:
output = {}
fps = stream_fps or instance_fps
if fps:
output["frameRate"] = fps
if stream_width and stream_height:
output["width"] = int(stream_width)
output["height"] = int(stream_height)
return output
frame_start = repre.get("frameStartFtrack")
frame_end = repre.get("frameEndFtrack")
if frame_start is None or frame_end is None:
frame_start = instance.data["frameStart"]
frame_end = instance.data["frameEnd"]
fps = None
repre_fps = repre.get("fps")
if repre_fps is not None:
repre_fps = float(repre_fps)
fps = stream_fps or repre_fps or instance_fps
# Prepare frame ranges
frame_start = repre.get("frameStartFtrack")
frame_end = repre.get("frameEndFtrack")
if frame_start is None or frame_end is None:
frame_start = instance.data["frameStart"]
frame_end = instance.data["frameEnd"]
duration = (frame_end - frame_start) + 1
for key, value in [
("fps", fps),
("frame_start", frame_start),
("frame_end", frame_end),
("duration", duration),
("width", stream_width),
("height", stream_height),
("fps", fps),
("codec", codec_label)
]:
if not value or key not in self.additional_metadata_keys:
continue
label = self.metadata_keys_to_label[key]
metadata[label] = value
if not is_review:
ftr_meta = {}
if fps:
ftr_meta["frameRate"] = fps
if stream_width and stream_height:
ftr_meta["width"] = int(stream_width)
ftr_meta["height"] = int(stream_height)
metadata["ftr_meta"] = json.dumps(ftr_meta)
return metadata
# Frame end of uploaded video file should be duration in frames
# - frame start is always 0
# - frame end is duration in frames
if not frame_out:
frame_out = frame_end - frame_start + 1
frame_out = duration
# Ftrack documentation says that it is required to have
# 'width' and 'height' in review component. But with those values
# review video does not play.
component_meta = {
metadata["ftr_meta"] = json.dumps({
"frameIn": 0,
"frameOut": frame_out,
"frameRate": float(fps)
}
return component_meta
})
return metadata

View file
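
For orientation, the shape of the metadata dict that `_prepare_component_metadata` now returns for a review component, using the labels from `metadata_keys_to_label`; all values below are invented:

```python
import json

# Sketch of the resulting dict, assuming the corresponding keys were
# enabled in 'additional_metadata_keys'
metadata = {
    "Frame start": 1001,
    "Frame end": 1100,
    "Duration": 100,
    "FPS": 25.0,
    "ftr_meta": json.dumps({
        "frameIn": 0,
        "frameOut": 100,      # duration in frames for the uploaded review
        "frameRate": 25.0,
    }),
}
```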

@ -1,4 +1,4 @@
from abc import abstractmethod
from abc import abstractmethod, abstractproperty
from openpype import resources
@ -50,12 +50,32 @@ class IPluginPaths(OpenPypeInterface):
class ILaunchHookPaths(OpenPypeInterface):
"""Module has launch hook paths to return.
Modules do not have to inherit from this interface (changed 8.11.2022).
A module just has to implement 'get_launch_hook_paths' to take
advantage of it.
Expected result is a list of paths.
["path/to/launch_hooks_dir"]
"""
@abstractmethod
def get_launch_hook_paths(self):
def get_launch_hook_paths(self, app):
"""Paths to directory with application launch hooks.
The method can also be defined without arguments.
```python
def get_launch_hook_paths(self):
return []
```
Args:
app (Application): Application object which can be used for
filtering of which launch hook paths are returned.
Returns:
Iterable[str]: Paths to directories where launch hooks can be found.
"""
pass
@ -66,6 +86,7 @@ class ITrayModule(OpenPypeInterface):
The module still must be usable if is not used in tray even if
would do nothing.
"""
tray_initialized = False
_tray_manager = None
@ -78,16 +99,19 @@ class ITrayModule(OpenPypeInterface):
This is where GUIs should be loaded or tray specific parts should be
prepared.
"""
pass
@abstractmethod
def tray_menu(self, tray_menu):
"""Add module's action to tray menu."""
pass
@abstractmethod
def tray_start(self):
"""Start procedure in Pype tray."""
pass
@abstractmethod
@ -96,6 +120,7 @@ class ITrayModule(OpenPypeInterface):
This is place where all threads should be shut.
"""
pass
def execute_in_main_thread(self, callback):
@ -104,6 +129,7 @@ class ITrayModule(OpenPypeInterface):
Some callbacks need to be processed on main thread (menu actions
must be added on main thread or they won't get triggered etc.)
"""
if not self.tray_initialized:
# TODO Called without initialized tray, still main thread needed
try:
@ -128,6 +154,7 @@ class ITrayModule(OpenPypeInterface):
msecs (int): Duration of message visibility in miliseconds.
Default is 10000 msecs, may differ by Qt version.
"""
if self._tray_manager:
self._tray_manager.show_tray_message(title, message, icon, msecs)
@ -280,16 +307,19 @@ class ITrayService(ITrayModule):
def set_service_running_icon(self):
"""Change icon of an QAction to green circle."""
if self.menu_action:
self.menu_action.setIcon(self.get_icon_running())
def set_service_failed_icon(self):
"""Change icon of an QAction to red circle."""
if self.menu_action:
self.menu_action.setIcon(self.get_icon_failed())
def set_service_idle_icon(self):
"""Change icon of an QAction to orange circle."""
if self.menu_action:
self.menu_action.setIcon(self.get_icon_idle())
@ -303,6 +333,7 @@ class ISettingsChangeListener(OpenPypeInterface):
"publish": ["path/to/publish_plugins"]
}
"""
@abstractmethod
def on_system_settings_save(
self, old_value, new_value, changes, new_value_metadata
@ -320,3 +351,24 @@ class ISettingsChangeListener(OpenPypeInterface):
self, old_value, new_value, changes, project_name, new_value_metadata
):
pass
class IHostModule(OpenPypeInterface):
"""Module which also contain a host implementation."""
@abstractproperty
def host_name(self):
"""Name of host which module represents."""
pass
def get_workfile_extensions(self):
"""Define workfile extensions for host.
Not all hosts support workfiles, thus this is an optional implementation.
Returns:
List[str]: Extensions used for workfiles with dot.
"""
return []

View file
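
A skeletal host module satisfying the new `IHostModule` interface; the class is hypothetical, while the `.drp` extension is taken from the Resolve workfile code earlier in this diff:

```python
from openpype.modules import OpenPypeModule
from openpype_interfaces import IHostModule


class ResolveHostModule(OpenPypeModule, IHostModule):
    """Hypothetical module that get_host_module("resolve") would find."""
    name = "resolve"
    host_name = "resolve"  # satisfies the abstract property

    def initialize(self, module_settings):
        self.enabled = True

    def get_workfile_extensions(self):
        return [".drp"]
```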

@ -6,7 +6,11 @@ from typing import List
import gazu
from pymongo import UpdateOne
from openpype.client import get_project, get_assets
from openpype.client import (
get_projects,
get_project,
get_assets,
)
from openpype.pipeline import AvalonMongoDB
from openpype.api import get_project_settings
from openpype.modules.kitsu.utils.credentials import validate_credentials
@ -37,7 +41,7 @@ def sync_zou(login: str, password: str):
dbcon = AvalonMongoDB()
dbcon.install()
op_projects = [p for p in dbcon.projects()]
op_projects = list(get_projects())
for project_doc in op_projects:
sync_zou_from_op_project(project_doc["name"], dbcon, project_doc)

View file
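
Both this hunk and the sync server one below swap direct Mongo access for the client getter. A minimal usage sketch of `get_projects` as the diff uses it:

```python
from openpype.client import get_projects

# Iterates project documents; 'fields' trims the returned payload
for project_doc in get_projects(fields=["name"]):
    print(project_doc["name"])
```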

@ -6,7 +6,7 @@ import platform
import copy
from collections import deque, defaultdict
from openpype.client import get_projects
from openpype.modules import OpenPypeModule
from openpype_interfaces import ITrayModule
from openpype.settings import (
@ -913,7 +913,7 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
enabled_projects = []
if self.enabled:
for project in self.connection.projects(projection={"name": 1}):
for project in get_projects(fields=["name"]):
project_name = project["name"]
if self.is_project_enabled(project_name):
enabled_projects.append(project_name)
@ -1242,10 +1242,7 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
def _prepare_sync_project_settings(self, exclude_locals):
sync_project_settings = {}
system_sites = self.get_all_site_configs()
project_docs = self.connection.projects(
projection={"name": 1},
only_active=True
)
project_docs = get_projects(fields=["name"])
for project_doc in project_docs:
project_name = project_doc["name"]
sites = copy.deepcopy(system_sites) # get all configured sites

View file

@ -16,6 +16,7 @@ from .utils import (
switch_container,
get_loader_identifier,
get_loaders_by_name,
get_representation_path_from_context,
get_representation_path,
@ -61,6 +62,7 @@ __all__ = (
"switch_container",
"get_loader_identifier",
"get_loaders_by_name",
"get_representation_path_from_context",
"get_representation_path",

View file

@ -222,13 +222,20 @@ def get_representation_context(representation):
project_name, representation
)
if not representation:
raise AssertionError("Representation was not found in database")
version, subset, asset, project = get_representation_parents(
project_name, representation
)
assert all([representation, version, subset, asset, project]), (
"This is a bug"
)
if not version:
raise AssertionError("Version was not found in database")
if not subset:
raise AssertionError("Subset was not found in database")
if not asset:
raise AssertionError("Asset was not found in database")
if not project:
raise AssertionError("Project was not found in database")
context = {
"project": {
@ -369,6 +376,20 @@ def get_loader_identifier(loader):
return loader.__name__
def get_loaders_by_name():
from .plugins import discover_loader_plugins
loaders_by_name = {}
for loader in discover_loader_plugins():
loader_name = loader.__name__
if loader_name in loaders_by_name:
raise KeyError(
"Duplicated loader name {} !".format(loader_name)
)
loaders_by_name[loader_name] = loader
return loaders_by_name
def _get_container_loader(container):
"""Return the Loader corresponding to the container"""
from .plugins import discover_loader_plugins

View file
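
A short usage sketch of the new helper; `LoadClip` is the Resolve loader shown earlier in this diff:

```python
from openpype.pipeline.load import get_loaders_by_name

loaders_by_name = get_loaders_by_name()
# Look up a loader class by its name, as the template loader does
clip_loader = loaders_by_name.get("LoadClip")
```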

@ -0,0 +1,526 @@
import os
from abc import ABCMeta, abstractmethod
import six
import logging
from functools import reduce
from openpype.client import get_asset_by_name
from openpype.settings import get_project_settings
from openpype.lib import (
StringTemplate,
Logger,
filter_profiles,
get_linked_assets,
)
from openpype.pipeline import legacy_io, Anatomy
from openpype.pipeline.load import (
get_loaders_by_name,
get_representation_context,
load_with_repre_context,
)
from .build_template_exceptions import (
TemplateAlreadyImported,
TemplateLoadingFailed,
TemplateProfileNotFound,
TemplateNotFound
)
log = logging.getLogger(__name__)
def update_representations(entities, entity):
if entity['context']['subset'] not in entities:
entities[entity['context']['subset']] = entity
else:
current = entities[entity['context']['subset']]
incoming = entity
entities[entity['context']['subset']] = max(
current, incoming,
key=lambda entity: entity["context"].get("version", -1))
return entities
def parse_loader_args(loader_args):
if not loader_args:
return dict()
try:
parsed_args = eval(loader_args)
if not isinstance(parsed_args, dict):
return dict()
else:
return parsed_args
except Exception as err:
print(
"Error while parsing loader arguments '{}'.\n{}: {}\n\n"
"Continuing with default arguments. . .".format(
loader_args,
err.__class__.__name__,
err))
return dict()
@six.add_metaclass(ABCMeta)
class AbstractTemplateLoader:
"""
Abstraction of Template Loader.
Properties:
template_path : property to get current template path
Methods:
import_template : Abstract Method. Used to load template,
depending on current host
get_template_nodes : Abstract Method. Used to query nodes acting
as placeholders. Depending on current host
"""
_log = None
def __init__(self, placeholder_class):
# TODO template loader should expect host as an argument
# - host have all responsibility for most of code (also provide
# placeholder class)
# - also have responsibility for current context
# - this won't work in DCCs where multiple workfiles with
# different contexts can be opened at the same time
# - template loader should have ability to change context
project_name = legacy_io.active_project()
asset_name = legacy_io.Session["AVALON_ASSET"]
self.loaders_by_name = get_loaders_by_name()
self.current_asset = asset_name
self.project_name = project_name
self.host_name = legacy_io.Session["AVALON_APP"]
self.task_name = legacy_io.Session["AVALON_TASK"]
self.placeholder_class = placeholder_class
self.current_asset_doc = get_asset_by_name(project_name, asset_name)
self.task_type = (
self.current_asset_doc
.get("data", {})
.get("tasks", {})
.get(self.task_name, {})
.get("type")
)
self.log.info(
"BUILDING ASSET FROM TEMPLATE :\n"
"Starting templated build for {asset} in {project}\n\n"
"Asset : {asset}\n"
"Task : {task_name} ({task_type})\n"
"Host : {host}\n"
"Project : {project}\n".format(
asset=self.current_asset,
host=self.host_name,
project=self.project_name,
task_name=self.task_name,
task_type=self.task_type
))
# Skip if there is no loader
if not self.loaders_by_name:
self.log.warning(
"There is no registered loaders. No assets will be loaded")
return
@property
def log(self):
if self._log is None:
self._log = Logger.get_logger(self.__class__.__name__)
return self._log
def template_already_imported(self, err_msg):
"""In case template was already loaded.
Raise the error as a default action.
Override this method in your template loader implementation
to manage this case."""
self.log.error("{}: {}".format(
err_msg.__class__.__name__,
err_msg))
raise TemplateAlreadyImported(err_msg)
def template_loading_failed(self, err_msg):
"""In case template loading failed
Raise the error as a default action.
Override this method in your template loader implementation
to manage this case.
"""
self.log.error("{}: {}".format(
err_msg.__class__.__name__,
err_msg))
raise TemplateLoadingFailed(err_msg)
@property
def template_path(self):
"""
Property returning the template path. No setter is provided.
Gets the template path from OpenPype settings based on the current avalon
session and solves the path variables if needed.
Returns:
str: Solved template path
Raises:
TemplateProfileNotFound: No profile found from settings for
current avalon session
KeyError: Could not solve path because a key does not exist
in avalon context
TemplateNotFound: Solved path does not exist on current filesystem
"""
project_name = self.project_name
host_name = self.host_name
task_name = self.task_name
task_type = self.task_type
anatomy = Anatomy(project_name)
project_settings = get_project_settings(project_name)
build_info = project_settings[host_name]["templated_workfile_build"]
profile = filter_profiles(
build_info["profiles"],
{
"task_types": task_type,
"tasks": task_name
}
)
if not profile:
raise TemplateProfileNotFound(
"No matching profile found for task '{}' of type '{}' "
"with host '{}'".format(task_name, task_type, host_name)
)
path = profile["path"]
if not path:
raise TemplateLoadingFailed(
"Template path is not set.\n"
"Path need to be set in {}\\Template Workfile Build "
"Settings\\Profiles".format(host_name.title()))
# Try fill path with environments and anatomy roots
fill_data = {
key: value
for key, value in os.environ.items()
}
fill_data["root"] = anatomy.roots
result = StringTemplate.format_template(path, fill_data)
if result.solved:
path = result.normalized()
if path and os.path.exists(path):
self.log.info("Found template at: '{}'".format(path))
return path
solved_path = None
while True:
try:
solved_path = anatomy.path_remapper(path)
except KeyError as missing_key:
raise KeyError(
"Could not solve key '{}' in template path '{}'".format(
missing_key, path))
if solved_path is None:
solved_path = path
if solved_path == path:
break
path = solved_path
solved_path = os.path.normpath(solved_path)
if not os.path.exists(solved_path):
raise TemplateNotFound(
"Template found in openPype settings for task '{}' with host "
"'{}' does not exists. (Not found : {})".format(
task_name, host_name, solved_path))
self.log.info("Found template at: '{}'".format(solved_path))
return solved_path
def populate_template(self, ignored_ids=None):
"""
Use template placeholders to load assets and parent them in hierarchy
Arguments:
ignored_ids (list): Representation ids to skip (e.g. of containers
that are already loaded).
Returns:
None
"""
loaders_by_name = self.loaders_by_name
current_asset_doc = self.current_asset_doc
linked_assets = get_linked_assets(current_asset_doc)
ignored_ids = ignored_ids or []
placeholders = self.get_placeholders()
self.log.debug("Placeholders found in template: {}".format(
[placeholder.name for placeholder in placeholders]
))
for placeholder in placeholders:
self.log.debug("Start to processing placeholder {}".format(
placeholder.name
))
placeholder_representations = self.get_placeholder_representations(
placeholder,
current_asset_doc,
linked_assets
)
if not placeholder_representations:
self.log.info(
"There's no representation for this placeholder: "
"{}".format(placeholder.name)
)
continue
for representation in placeholder_representations:
self.preload(placeholder, loaders_by_name, representation)
if self.load_data_is_incorrect(
placeholder,
representation,
ignored_ids):
continue
self.log.info(
"Loading {}_{} with loader {}\n"
"Loader arguments used : {}".format(
representation['context']['asset'],
representation['context']['subset'],
placeholder.loader_name,
placeholder.loader_args))
try:
container = self.load(
placeholder, loaders_by_name, representation)
except Exception:
self.load_failed(placeholder, representation)
else:
self.load_succeed(placeholder, container)
finally:
self.postload(placeholder)
def get_placeholder_representations(
self, placeholder, current_asset_doc, linked_asset_docs
):
placeholder_representations = placeholder.get_representations(
current_asset_doc,
linked_asset_docs
)
for repre_doc in reduce(
update_representations,
placeholder_representations,
dict()
).values():
yield repre_doc
def load_data_is_incorrect(
self, placeholder, last_representation, ignored_ids):
if not last_representation:
self.log.warning(placeholder.err_message())
return True
if (str(last_representation['_id']) in ignored_ids):
print("Ignoring : ", last_representation['_id'])
return True
return False
def preload(self, placeholder, loaders_by_name, last_representation):
pass
def load(self, placeholder, loaders_by_name, last_representation):
repre = get_representation_context(last_representation)
return load_with_repre_context(
loaders_by_name[placeholder.loader_name],
repre,
options=parse_loader_args(placeholder.loader_args))
def load_succeed(self, placeholder, container):
placeholder.parent_in_hierarchy(container)
def load_failed(self, placeholder, last_representation):
self.log.warning(
"Got error trying to load {}:{} with {}".format(
last_representation['context']['asset'],
last_representation['context']['subset'],
placeholder.loader_name
),
exc_info=True
)
def postload(self, placeholder):
placeholder.clean()
def update_missing_containers(self):
loaded_containers_ids = self.get_loaded_containers_by_id()
self.populate_template(ignored_ids=loaded_containers_ids)
def get_placeholders(self):
placeholders = map(self.placeholder_class, self.get_template_nodes())
valid_placeholders = filter(
lambda i: i.is_valid,
placeholders
)
sorted_placeholders = list(sorted(
valid_placeholders,
key=lambda i: i.order
))
return sorted_placeholders
@abstractmethod
def get_loaded_containers_by_id(self):
"""
Collect already loaded containers for updating the scene.
Return:
dict (string, node): A dictionary with container id as key
and container as value.
"""
pass
@abstractmethod
def import_template(self, template_path):
"""
Import template in the current host.
Args:
template_path (str): Full path to the template file for the
current task and host.
Return:
None
"""
pass
@abstractmethod
def get_template_nodes(self):
"""
Return a list of nodes acting as host placeholders for
templating. The data representation is up to the host implementation.
AbstractTemplateLoader (and its concrete implementations) won't directly
manipulate the nodes.
Args:
None
Returns:
list(AnyNode): Nodes acting as placeholders.
"""
pass
@six.add_metaclass(ABCMeta)
class AbstractPlaceholder:
"""Abstraction of placeholders logic.
Properties:
required_keys: A list of mandatory keys to describe the placeholder
and assets to load.
optional_keys: A list of optional keys to describe the
placeholder and assets to load.
loader_name: Name of linked loader to use while loading assets
Args:
identifier (str): Placeholder identifier. Should be possible to be
used as identifier in "a scene" (e.g. unique node name).
"""
required_keys = {
"builder_type",
"family",
"representation",
"order",
"loader",
"loader_args"
}
optional_keys = {}
def __init__(self, identifier):
self._log = None
self._name = identifier
self.get_data(identifier)
@property
def log(self):
if self._log is None:
self._log = Logger.get_logger(repr(self))
return self._log
def __repr__(self):
return "< {} {} >".format(self.__class__.__name__, self.name)
@property
def name(self):
return self._name
@property
def loader_args(self):
return self.data["loader_args"]
@property
def builder_type(self):
return self.data["builder_type"]
@property
def order(self):
return self.data["order"]
@property
def loader_name(self):
"""Return placeholder loader name.
Returns:
str: Loader name that will be used to load placeholder
representations.
"""
return self.data["loader"]
@property
def is_valid(self):
"""Test validity of placeholder.
i.e.: every required key exists in placeholder data
Returns:
bool: True if every key is in data
"""
if set(self.required_keys).issubset(self.data.keys()):
self.log.debug("Valid placeholder: {}".format(self.name))
return True
self.log.info("Placeholder is not valid: {}".format(self.name))
return False
@abstractmethod
def parent_in_hierarchy(self, container):
"""Place loaded container in correct hierarchy given by placeholder
Args:
container (Dict[str, Any]): Loaded container created by loader.
"""
pass
@abstractmethod
def clean(self):
"""Clean placeholder from hierarchy after loading assets."""
pass
@abstractmethod
def get_representations(self, current_asset_doc, linked_asset_docs):
"""Query representations based on placeholder data.
Args:
current_asset_doc (Dict[str, Any]): Document of current
context asset.
linked_asset_docs (List[Dict[str, Any]]): Documents of assets
linked to current context asset.
Returns:
Iterable[Dict[str, Any]]: Representations that are matching
placeholder filters.
"""
pass
@abstractmethod
def get_data(self, identifier):
"""Collect information about placeholder by identifier.
Args:
identifier (str): A unique placeholder identifier defined by
implementation.
"""
pass
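# A minimal sketch of how a host integration could subclass the interface
# above. Only the abstract methods shown in this file are implemented, and
# every body is an illustrative assumption, not part of this commit.
class _ExampleTemplateLoader(AbstractTemplateLoader):
    def get_loaded_containers_by_id(self):
        # Host specific: map container id -> loaded container node.
        return {}

    def import_template(self, template_path):
        # Host specific: import the template file into the current scene.
        pass

    def get_template_nodes(self):
        # Host specific: collect the nodes tagged as placeholders.
        return []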


@@ -0,0 +1,68 @@
from importlib import import_module
from openpype.lib import classes_from_module
from openpype.host import HostBase
from openpype.pipeline import registered_host
from .abstract_template_loader import (
AbstractPlaceholder,
AbstractTemplateLoader)
from .build_template_exceptions import (
TemplateLoadingFailed,
TemplateAlreadyImported,
MissingHostTemplateModule,
MissingTemplatePlaceholderClass,
MissingTemplateLoaderClass
)
_module_path_format = 'openpype.hosts.{host}.api.template_loader'
def build_workfile_template(*args):
template_loader = build_template_loader()
try:
template_loader.import_template(template_loader.template_path)
except TemplateAlreadyImported as err:
template_loader.template_already_imported(err)
except TemplateLoadingFailed as err:
template_loader.template_loading_failed(err)
else:
template_loader.populate_template()
def update_workfile_template(*args):
template_loader = build_template_loader()
template_loader.update_missing_containers()
def build_template_loader():
# TODO: refactor to take advantage of 'HostBase' and avoid dynamic imports
# - hosts should expose methods that return their template builders
host = registered_host()
if isinstance(host, HostBase):
host_name = host.name
else:
host_name = host.__name__.partition('.')[2]
module_path = _module_path_format.format(host=host_name)
try:
    module = import_module(module_path)
except ImportError:
    # 'import_module' raises instead of returning None on failure
    raise MissingHostTemplateModule(
        "No template loader found for host {}".format(host_name))
template_loader_class = classes_from_module(
AbstractTemplateLoader,
module
)
template_placeholder_class = classes_from_module(
AbstractPlaceholder,
module
)
if not template_loader_class:
raise MissingTemplateLoaderClass()
template_loader_class = template_loader_class[0]
if not template_placeholder_class:
raise MissingTemplatePlaceholderClass()
template_placeholder_class = template_placeholder_class[0]
return template_loader_class(template_placeholder_class)
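# Usage sketch (illustrative, not part of this commit): inside a running
# host session with a registered host, the two public entry points above
# drive the whole workflow, e.g. from a menu callback.
def _example_menu_callback():
    build_workfile_template()     # first build: import template, populate it
    update_workfile_template()    # later runs: load only missing containers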


@@ -0,0 +1,35 @@
class MissingHostTemplateModule(Exception):
"""Error raised when an expected module does not exist"""
pass
class MissingTemplatePlaceholderClass(Exception):
"""Error raised when module doesn't implement a placeholder class"""
pass
class MissingTemplateLoaderClass(Exception):
"""Error raised when module doesn't implement a template loader class"""
pass
class TemplateNotFound(Exception):
"""Exception raised when template does not exist."""
pass
class TemplateProfileNotFound(Exception):
"""Exception raised when current profile
doesn't match any template profile"""
pass
class TemplateAlreadyImported(Exception):
"""Error raised when Template was already imported by host for
this session"""
pass
class TemplateLoadingFailed(Exception):
"""Error raised when the template loader was unable to load the template"""
pass


@@ -1459,6 +1459,8 @@ class ExtractReview(pyblish.api.InstancePlugin):
output = -1
regexes = self.compile_list_of_regexes(in_list)
for regex in regexes:
if not value:
continue
if re.match(regex, value):
output = 1
break
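# Why the new guard matters: 're.match' raises TypeError when 'value' is
# None, so empty values must be skipped instead of matched. Since 'value'
# does not change per iteration, the check could also sit before the loop.
#
#     >>> import re
#     >>> re.match("foo", None)
#     Traceback (most recent call last):
#       ...
#     TypeError: expected string or bytes-like object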


@@ -93,6 +93,6 @@ class IntegrateSubsetGroup(pyblish.api.InstancePlugin):
return {
"families": anatomy_data["family"],
"tasks": task.get("name"),
"hosts": anatomy_data["app"],
"hosts": instance.context.data["hostName"],
"task_types": task.get("type")
}


@@ -491,7 +491,11 @@
"usd": "usd"
},
"keep_first_subset_name_for_review": true,
"asset_versions_status_profiles": []
"asset_versions_status_profiles": [],
"additional_metadata_keys": []
},
"IntegrateFtrackFarmStatus": {
"farm_status_profiles": []
}
}
}


@@ -34,6 +34,7 @@
"RenderSettings": {
"apply_render_settings": true,
"default_render_image_folder": "renders",
"enable_all_lights": false,
"aov_separator": "underscore",
"reset_current_frame": false,
"arnold_renderer": {
@@ -967,6 +968,9 @@
}
]
},
"templated_workfile_build": {
"profiles": []
},
"filters": {
"preset 1": {
"ValidateNoAnimation": false,
@@ -976,4 +980,4 @@
"ValidateNoAnimation": false
}
}
}
}


@@ -19,4 +19,4 @@
"step": "step"
}
}
}
}


@@ -1039,6 +1039,82 @@
}
]
}
},
{
"key": "additional_metadata_keys",
"label": "Additional metadata keys on components",
"type": "enum",
"multiselection": true,
"enum_items": [
{"openpype_version": "OpenPype version"},
{"frame_start": "Frame start"},
{"frame_end": "Frame end"},
{"duration": "Duration"},
{"width": "Resolution width"},
{"height": "Resolution height"},
{"fps": "FPS"},
{"code": "Codec"}
]
}
]
},
{
"type": "dict",
"key": "IntegrateFtrackFarmStatus",
"label": "Integrate Ftrack Farm Status",
"children": [
{
"type": "label",
"label": "Change status of task when its subset is submitted to farm"
},
{
"type": "list",
"collapsible": true,
"key": "farm_status_profiles",
"label": "Farm status profiles",
"use_label_wrap": true,
"object_type": {
"type": "dict",
"children": [
{
"key": "hosts",
"label": "Host names",
"type": "hosts-enum",
"multiselection": true
},
{
"key": "task_types",
"label": "Task types",
"type": "task-types-enum"
},
{
"key": "task_names",
"label": "Task names",
"type": "list",
"object_type": "text"
},
{
"key": "families",
"label": "Families",
"type": "list",
"object_type": "text"
},
{
"key": "subsets",
"label": "Subset names",
"type": "list",
"object_type": "text"
},
{
"type": "separator"
},
{
"key": "status_name",
"label": "Status name",
"type": "text"
}
]
}
}
]
}
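For reference, a single "farm_status_profiles" entry defined by the schema
above deserializes to a structure like the following (all values are
illustrative examples, not shipped defaults):

    {
        "hosts": ["maya"],
        "task_types": ["Animation"],
        "task_names": [],
        "families": ["render"],
        "subsets": [],
        "status_name": "In Progress"
    }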


@@ -77,6 +77,10 @@
"type": "schema",
"name": "schema_workfile_build"
},
{
"type": "schema",
"name": "schema_templated_workfile_build"
},
{
"type": "schema",
"name": "schema_publish_gui_filter"


@@ -14,6 +14,11 @@
"key": "default_render_image_folder",
"label": "Default render image folder"
},
{
"type": "boolean",
"key": "enable_all_lights",
"label": "Include all lights in Render Setup Layers by default"
},
{
"key": "aov_separator",
"label": "AOV Separator character",


@@ -0,0 +1,35 @@
{
"type": "dict",
"collapsible": true,
"key": "templated_workfile_build",
"label": "Templated Workfile Build Settings",
"children": [
{
"type": "list",
"key": "profiles",
"label": "Profiles",
"object_type": {
"type": "dict",
"children": [
{
"key": "task_types",
"label": "Task types",
"type": "task-types-enum"
},
{
"key": "tasks",
"label": "Task names",
"type": "list",
"object_type": "text"
},
{
"key": "path",
"label": "Path to template",
"type": "text",
"object_type": "text"
}
]
}
}
]
}
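A matching "profiles" entry for this schema could look like the following
(the task type and template path are illustrative examples only):

    {
        "task_types": ["Modeling"],
        "tasks": ["modeling"],
        "path": "/studio/templates/modeling_template.ma"
    }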


@@ -10,6 +10,7 @@ from Qt import QtCore, QtGui
import qtawesome
from openpype.client import (
get_projects,
get_project,
get_assets,
)
@@ -527,7 +528,7 @@ class LauncherModel(QtCore.QObject):
current_project = self.project_name
project_names = set()
project_docs_by_name = {}
for project_doc in self._dbcon.projects(only_active=True):
for project_doc in get_projects():
project_name = project_doc["name"]
project_names.add(project_name)
project_docs_by_name[project_name] = project_doc


@@ -3,7 +3,7 @@ import sys
from Qt import QtWidgets, QtCore, QtGui
from openpype import style
from openpype.client import get_project
from openpype.client import get_projects, get_project
from openpype.pipeline import AvalonMongoDB
from openpype.tools.utils import lib as tools_lib
from openpype.tools.loader.widgets import (
@@ -239,7 +239,7 @@ class LibraryLoaderWindow(QtWidgets.QDialog):
def get_filtered_projects(self):
projects = list()
for project in self.dbcon.projects():
for project in get_projects(fields=["name", "data.library_project"]):
is_library = project.get("data", {}).get("library_project", False)
if (
(is_library and self.show_libraries) or


@@ -1547,6 +1547,11 @@ def _load_representations_by_loader(loader, repre_contexts,
return
for repre_context in repre_contexts.values():
version_doc = repre_context["version"]
if version_doc["type"] == "hero_version":
version_name = "Hero"
else:
version_name = version_doc.get("name")
try:
if data_by_repre_id:
_id = repre_context["representation"]["_id"]
@@ -1564,7 +1569,7 @@
None,
repre_context["representation"]["name"],
repre_context["subset"]["name"],
repre_context["version"]["name"]
version_name
))
except Exception as exc:
@@ -1577,7 +1582,7 @@
formatted_traceback,
repre_context["representation"]["name"],
repre_context["subset"]["name"],
repre_context["version"]["name"]
version_name
))
return error_info


@@ -8,6 +8,7 @@ from pymongo import UpdateOne, DeleteOne
from Qt import QtCore, QtGui
from openpype.client import (
get_projects,
get_project,
get_assets,
get_asset_ids_with_subsets,
@@ -54,12 +55,8 @@ class ProjectModel(QtGui.QStandardItemModel):
self._items_by_name[None] = none_project
new_project_items.append(none_project)
project_docs = self.dbcon.projects(
projection={"name": 1},
only_active=True
)
project_names = set()
for project_doc in project_docs:
for project_doc in get_projects(fields=["name"]):
project_name = project_doc.get("name")
if not project_name:
continue


@@ -4,8 +4,9 @@ import sys
from Qt import QtWidgets, QtCore
import qtawesome
from openpype.pipeline import legacy_io
from openpype import style
from openpype.client import get_projects
from openpype.pipeline import legacy_io
from openpype.tools.utils.delegates import VersionDelegate
from openpype.tools.utils.lib import (
qt_app_context,
@@ -195,8 +196,7 @@ def show(root=None, debug=False, parent=None, items=None):
if not os.environ.get("AVALON_PROJECT"):
any_project = next(
project for project in legacy_io.projects()
if project.get("active", True) is not False
project for project in get_projects()
)
project_name = any_project["name"]


@@ -3,6 +3,7 @@ import uuid
from Qt import QtWidgets, QtCore, QtGui
import qtawesome
from openpype.client import get_projects
from openpype.pipeline import AvalonMongoDB
from openpype.style import get_objected_colors
from openpype.tools.utils.widgets import ImageButton
@@ -783,8 +784,6 @@ class ProjectModel(QtGui.QStandardItemModel):
self.setColumnCount(2)
self.dbcon = None
self._only_active = only_active
self._default_item = None
self._items_by_name = {}
@@ -828,9 +827,6 @@
index = self.index(index.row(), 0, index.parent())
return super(ProjectModel, self).flags(index)
def set_dbcon(self, dbcon):
self.dbcon = dbcon
def refresh(self):
# Change id of versions refresh
self._version_refresh_id = uuid.uuid4()
@@ -846,31 +842,30 @@
self._default_item.setData("", PROJECT_VERSION_ROLE)
project_names = set()
if self.dbcon is not None:
for project_doc in self.dbcon.projects(
projection={"name": 1, "data.active": 1},
only_active=self._only_active
):
project_name = project_doc["name"]
project_names.add(project_name)
if project_name in self._items_by_name:
item = self._items_by_name[project_name]
else:
item = QtGui.QStandardItem(project_name)
for project_doc in get_projects(
inactive=not self._only_active,
fields=["name", "data.active"]
):
project_name = project_doc["name"]
project_names.add(project_name)
if project_name in self._items_by_name:
item = self._items_by_name[project_name]
else:
item = QtGui.QStandardItem(project_name)
self._items_by_name[project_name] = item
new_items.append(item)
self._items_by_name[project_name] = item
new_items.append(item)
is_active = project_doc.get("data", {}).get("active", True)
item.setData(project_name, PROJECT_NAME_ROLE)
item.setData(is_active, PROJECT_IS_ACTIVE_ROLE)
item.setData("", PROJECT_VERSION_ROLE)
item.setData(False, PROJECT_IS_SELECTED_ROLE)
is_active = project_doc.get("data", {}).get("active", True)
item.setData(project_name, PROJECT_NAME_ROLE)
item.setData(is_active, PROJECT_IS_ACTIVE_ROLE)
item.setData("", PROJECT_VERSION_ROLE)
item.setData(False, PROJECT_IS_SELECTED_ROLE)
if not is_active:
font = item.font()
font.setItalic(True)
item.setFont(font)
if not is_active:
font = item.font()
font.setItalic(True)
item.setFont(font)
root_item = self.invisibleRootItem()
for project_name in tuple(self._items_by_name.keys()):
@@ -1067,8 +1062,6 @@ class ProjectListWidget(QtWidgets.QWidget):
self.project_model = project_model
self.inactive_chk = inactive_chk
self.dbcon = None
def set_entity(self, entity):
self._entity = entity
@@ -1211,15 +1204,6 @@
selected_project = index.data(PROJECT_NAME_ROLE)
break
if not self.dbcon:
try:
self.dbcon = AvalonMongoDB()
self.dbcon.install()
except Exception:
self.dbcon = None
self.current_project = None
self.project_model.set_dbcon(self.dbcon)
self.project_model.refresh()
self.project_proxy.sort(0)


@@ -3,6 +3,7 @@ from Qt import QtWidgets, QtCore
import qtawesome
from openpype.client import (
get_projects,
get_project,
get_asset_by_id,
)
@@ -291,9 +292,7 @@ class AssetWidget(QtWidgets.QWidget):
def _set_projects(self):
project_names = list()
for doc in self.dbcon.projects(projection={"name": 1},
only_active=True):
for doc in get_projects(fields=["name"]):
project_name = doc.get("name")
if project_name:
project_names.append(project_name)
@@ -320,8 +319,7 @@
def on_project_change(self):
projects = list()
for project in self.dbcon.projects(projection={"name": 1},
only_active=True):
for project in get_projects(fields=["name"]):
projects.append(project['name'])
project_name = self.combo_projects.currentText()
if project_name in projects:


@@ -10,19 +10,19 @@ from Qt import QtCore, QtGui, QtWidgets
import openpype.version
from openpype.api import (
Logger,
resources,
get_system_settings
)
from openpype.lib import (
get_openpype_execute_args,
from openpype.lib import get_openpype_execute_args, Logger
from openpype.lib.openpype_version import (
op_version_control_available,
get_expected_version,
get_installed_version,
is_current_version_studio_latest,
is_current_version_higher_than_expected,
is_running_from_build,
is_running_staging,
get_expected_version,
get_openpype_version
get_openpype_version,
)
from openpype.modules import TrayModulesManager
from openpype import style
@@ -329,6 +329,25 @@ class TrayManager:
self._version_dialog.close()
return
installed_version = get_installed_version()
expected_version = get_expected_version()
# Request a new build if needed
if (
# Backwards compatibility
not hasattr(expected_version, "is_compatible")
or not expected_version.is_compatible(installed_version)
):
if (
self._version_dialog is not None
and self._version_dialog.isVisible()
):
self._version_dialog.close()
dialog = BuildVersionDialog()
dialog.exec_()
return
if self._version_dialog is None:
self._version_dialog = VersionUpdateDialog()
self._version_dialog.restart_requested.connect(
@@ -338,7 +357,6 @@
self._outdated_version_ignored
)
expected_version = get_expected_version()
current_version = get_openpype_version()
current_is_higher = is_current_version_higher_than_expected()


@@ -443,10 +443,6 @@ class FamilyConfigCache:
if profiles:
# Make sure connection is installed
# - accessing attribute which does not have auto-install
self.dbcon.install()
database = getattr(self.dbcon, "database", None)
if database is None:
database = self.dbcon._database
asset_doc = get_asset_by_name(
project_name, asset_name, fields=["data.tasks"]
) or {}


@@ -3,6 +3,7 @@ import logging
import Qt
from Qt import QtCore, QtGui
from openpype.client import get_projects
from .constants import (
PROJECT_IS_ACTIVE_ROLE,
PROJECT_NAME_ROLE,
@@ -296,29 +297,29 @@ class ProjectModel(QtGui.QStandardItemModel):
self._default_item = item
project_names = set()
if self.dbcon is not None:
for project_doc in self.dbcon.projects(
projection={"name": 1, "data.active": 1},
only_active=self._only_active
):
project_name = project_doc["name"]
project_names.add(project_name)
if project_name in self._items_by_name:
item = self._items_by_name[project_name]
else:
item = QtGui.QStandardItem(project_name)
project_docs = get_projects(
inactive=not self._only_active,
fields=["name", "data.active"]
)
for project_doc in project_docs:
project_name = project_doc["name"]
project_names.add(project_name)
if project_name in self._items_by_name:
item = self._items_by_name[project_name]
else:
item = QtGui.QStandardItem(project_name)
self._items_by_name[project_name] = item
new_items.append(item)
self._items_by_name[project_name] = item
new_items.append(item)
is_active = project_doc.get("data", {}).get("active", True)
item.setData(project_name, PROJECT_NAME_ROLE)
item.setData(is_active, PROJECT_IS_ACTIVE_ROLE)
is_active = project_doc.get("data", {}).get("active", True)
item.setData(project_name, PROJECT_NAME_ROLE)
item.setData(is_active, PROJECT_IS_ACTIVE_ROLE)
if not is_active:
font = item.font()
font.setItalic(True)
item.setFont(font)
if not is_active:
font = item.font()
font.setItalic(True)
item.setFont(font)
root_item = self.invisibleRootItem()
for project_name in tuple(self._items_by_name.keys()):


@@ -0,0 +1,80 @@
# SPDX-License-Identifier: MIT
from __future__ import absolute_import, division, print_function
import sys
from functools import partial
from . import converters, exceptions, filters, setters, validators
from ._cmp import cmp_using
from ._config import get_run_validators, set_run_validators
from ._funcs import asdict, assoc, astuple, evolve, has, resolve_types
from ._make import (
NOTHING,
Attribute,
Factory,
attrib,
attrs,
fields,
fields_dict,
make_class,
validate,
)
from ._version_info import VersionInfo
__version__ = "21.4.0"
__version_info__ = VersionInfo._from_version_string(__version__)
__title__ = "attrs"
__description__ = "Classes Without Boilerplate"
__url__ = "https://www.attrs.org/"
__uri__ = __url__
__doc__ = __description__ + " <" + __uri__ + ">"
__author__ = "Hynek Schlawack"
__email__ = "hs@ox.cx"
__license__ = "MIT"
__copyright__ = "Copyright (c) 2015 Hynek Schlawack"
s = attributes = attrs
ib = attr = attrib
dataclass = partial(attrs, auto_attribs=True) # happy Easter ;)
__all__ = [
"Attribute",
"Factory",
"NOTHING",
"asdict",
"assoc",
"astuple",
"attr",
"attrib",
"attributes",
"attrs",
"cmp_using",
"converters",
"evolve",
"exceptions",
"fields",
"fields_dict",
"filters",
"get_run_validators",
"has",
"ib",
"make_class",
"resolve_types",
"s",
"set_run_validators",
"setters",
"validate",
"validators",
]
if sys.version_info[:2] >= (3, 6):
from ._next_gen import define, field, frozen, mutable # noqa: F401
__all__.extend(("define", "field", "frozen", "mutable"))


@@ -0,0 +1,484 @@
import sys
from typing import (
Any,
Callable,
Dict,
Generic,
List,
Mapping,
Optional,
Sequence,
Tuple,
Type,
TypeVar,
Union,
overload,
)
# `import X as X` is required to make these public
from . import converters as converters
from . import exceptions as exceptions
from . import filters as filters
from . import setters as setters
from . import validators as validators
from ._version_info import VersionInfo
__version__: str
__version_info__: VersionInfo
__title__: str
__description__: str
__url__: str
__uri__: str
__author__: str
__email__: str
__license__: str
__copyright__: str
_T = TypeVar("_T")
_C = TypeVar("_C", bound=type)
_EqOrderType = Union[bool, Callable[[Any], Any]]
_ValidatorType = Callable[[Any, Attribute[_T], _T], Any]
_ConverterType = Callable[[Any], Any]
_FilterType = Callable[[Attribute[_T], _T], bool]
_ReprType = Callable[[Any], str]
_ReprArgType = Union[bool, _ReprType]
_OnSetAttrType = Callable[[Any, Attribute[Any], Any], Any]
_OnSetAttrArgType = Union[
_OnSetAttrType, List[_OnSetAttrType], setters._NoOpType
]
_FieldTransformer = Callable[
[type, List[Attribute[Any]]], List[Attribute[Any]]
]
_CompareWithType = Callable[[Any, Any], bool]
# FIXME: in reality, if multiple validators are passed they must be in a list
# or tuple, but those are invariant and so would prevent subtypes of
# _ValidatorType from working when passed in a list or tuple.
_ValidatorArgType = Union[_ValidatorType[_T], Sequence[_ValidatorType[_T]]]
# _make --
NOTHING: object
# NOTE: Factory lies about its return type to make this possible:
# `x: List[int] # = Factory(list)`
# Work around mypy issue #4554 in the common case by using an overload.
if sys.version_info >= (3, 8):
from typing import Literal
@overload
def Factory(factory: Callable[[], _T]) -> _T: ...
@overload
def Factory(
factory: Callable[[Any], _T],
takes_self: Literal[True],
) -> _T: ...
@overload
def Factory(
factory: Callable[[], _T],
takes_self: Literal[False],
) -> _T: ...
else:
@overload
def Factory(factory: Callable[[], _T]) -> _T: ...
@overload
def Factory(
factory: Union[Callable[[Any], _T], Callable[[], _T]],
takes_self: bool = ...,
) -> _T: ...
# Static type inference support via __dataclass_transform__ implemented as per:
# https://github.com/microsoft/pyright/blob/1.1.135/specs/dataclass_transforms.md
# This annotation must be applied to all overloads of "define" and "attrs"
#
# NOTE: This is a typing construct and does not exist at runtime. Extensions
# wrapping attrs decorators should declare a separate __dataclass_transform__
# signature in the extension module using the specification linked above to
# provide pyright support.
def __dataclass_transform__(
*,
eq_default: bool = True,
order_default: bool = False,
kw_only_default: bool = False,
field_descriptors: Tuple[Union[type, Callable[..., Any]], ...] = (()),
) -> Callable[[_T], _T]: ...
class Attribute(Generic[_T]):
name: str
default: Optional[_T]
validator: Optional[_ValidatorType[_T]]
repr: _ReprArgType
cmp: _EqOrderType
eq: _EqOrderType
order: _EqOrderType
hash: Optional[bool]
init: bool
converter: Optional[_ConverterType]
metadata: Dict[Any, Any]
type: Optional[Type[_T]]
kw_only: bool
on_setattr: _OnSetAttrType
def evolve(self, **changes: Any) -> "Attribute[Any]": ...
# NOTE: We had several choices for the annotation to use for type arg:
# 1) Type[_T]
# - Pros: Handles simple cases correctly
# - Cons: Might produce less informative errors in the case of conflicting
# TypeVars e.g. `attr.ib(default='bad', type=int)`
# 2) Callable[..., _T]
# - Pros: Better error messages than #1 for conflicting TypeVars
# - Cons: Terrible error messages for validator checks.
# e.g. attr.ib(type=int, validator=validate_str)
# -> error: Cannot infer function type argument
# 3) type (and do all of the work in the mypy plugin)
# - Pros: Simple here, and we could customize the plugin with our own errors.
# - Cons: Would need to write mypy plugin code to handle all the cases.
# We chose option #1.
# `attr` lies about its return type to make the following possible:
# attr() -> Any
# attr(8) -> int
# attr(validator=<some callable>) -> Whatever the callable expects.
# This makes this type of assignments possible:
# x: int = attr(8)
#
# This form catches explicit None or no default but with no other arguments
# returns Any.
@overload
def attrib(
default: None = ...,
validator: None = ...,
repr: _ReprArgType = ...,
cmp: Optional[_EqOrderType] = ...,
hash: Optional[bool] = ...,
init: bool = ...,
metadata: Optional[Mapping[Any, Any]] = ...,
type: None = ...,
converter: None = ...,
factory: None = ...,
kw_only: bool = ...,
eq: Optional[_EqOrderType] = ...,
order: Optional[_EqOrderType] = ...,
on_setattr: Optional[_OnSetAttrArgType] = ...,
) -> Any: ...
# This form catches an explicit None or no default and infers the type from the
# other arguments.
@overload
def attrib(
default: None = ...,
validator: Optional[_ValidatorArgType[_T]] = ...,
repr: _ReprArgType = ...,
cmp: Optional[_EqOrderType] = ...,
hash: Optional[bool] = ...,
init: bool = ...,
metadata: Optional[Mapping[Any, Any]] = ...,
type: Optional[Type[_T]] = ...,
converter: Optional[_ConverterType] = ...,
factory: Optional[Callable[[], _T]] = ...,
kw_only: bool = ...,
eq: Optional[_EqOrderType] = ...,
order: Optional[_EqOrderType] = ...,
on_setattr: Optional[_OnSetAttrArgType] = ...,
) -> _T: ...
# This form catches an explicit default argument.
@overload
def attrib(
default: _T,
validator: Optional[_ValidatorArgType[_T]] = ...,
repr: _ReprArgType = ...,
cmp: Optional[_EqOrderType] = ...,
hash: Optional[bool] = ...,
init: bool = ...,
metadata: Optional[Mapping[Any, Any]] = ...,
type: Optional[Type[_T]] = ...,
converter: Optional[_ConverterType] = ...,
factory: Optional[Callable[[], _T]] = ...,
kw_only: bool = ...,
eq: Optional[_EqOrderType] = ...,
order: Optional[_EqOrderType] = ...,
on_setattr: Optional[_OnSetAttrArgType] = ...,
) -> _T: ...
# This form covers type=non-Type: e.g. forward references (str), Any
@overload
def attrib(
default: Optional[_T] = ...,
validator: Optional[_ValidatorArgType[_T]] = ...,
repr: _ReprArgType = ...,
cmp: Optional[_EqOrderType] = ...,
hash: Optional[bool] = ...,
init: bool = ...,
metadata: Optional[Mapping[Any, Any]] = ...,
type: object = ...,
converter: Optional[_ConverterType] = ...,
factory: Optional[Callable[[], _T]] = ...,
kw_only: bool = ...,
eq: Optional[_EqOrderType] = ...,
order: Optional[_EqOrderType] = ...,
on_setattr: Optional[_OnSetAttrArgType] = ...,
) -> Any: ...
@overload
def field(
*,
default: None = ...,
validator: None = ...,
repr: _ReprArgType = ...,
hash: Optional[bool] = ...,
init: bool = ...,
metadata: Optional[Mapping[Any, Any]] = ...,
converter: None = ...,
factory: None = ...,
kw_only: bool = ...,
eq: Optional[bool] = ...,
order: Optional[bool] = ...,
on_setattr: Optional[_OnSetAttrArgType] = ...,
) -> Any: ...
# This form catches an explicit None or no default and infers the type from the
# other arguments.
@overload
def field(
*,
default: None = ...,
validator: Optional[_ValidatorArgType[_T]] = ...,
repr: _ReprArgType = ...,
hash: Optional[bool] = ...,
init: bool = ...,
metadata: Optional[Mapping[Any, Any]] = ...,
converter: Optional[_ConverterType] = ...,
factory: Optional[Callable[[], _T]] = ...,
kw_only: bool = ...,
eq: Optional[_EqOrderType] = ...,
order: Optional[_EqOrderType] = ...,
on_setattr: Optional[_OnSetAttrArgType] = ...,
) -> _T: ...
# This form catches an explicit default argument.
@overload
def field(
*,
default: _T,
validator: Optional[_ValidatorArgType[_T]] = ...,
repr: _ReprArgType = ...,
hash: Optional[bool] = ...,
init: bool = ...,
metadata: Optional[Mapping[Any, Any]] = ...,
converter: Optional[_ConverterType] = ...,
factory: Optional[Callable[[], _T]] = ...,
kw_only: bool = ...,
eq: Optional[_EqOrderType] = ...,
order: Optional[_EqOrderType] = ...,
on_setattr: Optional[_OnSetAttrArgType] = ...,
) -> _T: ...
# This form covers type=non-Type: e.g. forward references (str), Any
@overload
def field(
*,
default: Optional[_T] = ...,
validator: Optional[_ValidatorArgType[_T]] = ...,
repr: _ReprArgType = ...,
hash: Optional[bool] = ...,
init: bool = ...,
metadata: Optional[Mapping[Any, Any]] = ...,
converter: Optional[_ConverterType] = ...,
factory: Optional[Callable[[], _T]] = ...,
kw_only: bool = ...,
eq: Optional[_EqOrderType] = ...,
order: Optional[_EqOrderType] = ...,
on_setattr: Optional[_OnSetAttrArgType] = ...,
) -> Any: ...
@overload
@__dataclass_transform__(order_default=True, field_descriptors=(attrib, field))
def attrs(
maybe_cls: _C,
these: Optional[Dict[str, Any]] = ...,
repr_ns: Optional[str] = ...,
repr: bool = ...,
cmp: Optional[_EqOrderType] = ...,
hash: Optional[bool] = ...,
init: bool = ...,
slots: bool = ...,
frozen: bool = ...,
weakref_slot: bool = ...,
str: bool = ...,
auto_attribs: bool = ...,
kw_only: bool = ...,
cache_hash: bool = ...,
auto_exc: bool = ...,
eq: Optional[_EqOrderType] = ...,
order: Optional[_EqOrderType] = ...,
auto_detect: bool = ...,
collect_by_mro: bool = ...,
getstate_setstate: Optional[bool] = ...,
on_setattr: Optional[_OnSetAttrArgType] = ...,
field_transformer: Optional[_FieldTransformer] = ...,
match_args: bool = ...,
) -> _C: ...
@overload
@__dataclass_transform__(order_default=True, field_descriptors=(attrib, field))
def attrs(
maybe_cls: None = ...,
these: Optional[Dict[str, Any]] = ...,
repr_ns: Optional[str] = ...,
repr: bool = ...,
cmp: Optional[_EqOrderType] = ...,
hash: Optional[bool] = ...,
init: bool = ...,
slots: bool = ...,
frozen: bool = ...,
weakref_slot: bool = ...,
str: bool = ...,
auto_attribs: bool = ...,
kw_only: bool = ...,
cache_hash: bool = ...,
auto_exc: bool = ...,
eq: Optional[_EqOrderType] = ...,
order: Optional[_EqOrderType] = ...,
auto_detect: bool = ...,
collect_by_mro: bool = ...,
getstate_setstate: Optional[bool] = ...,
on_setattr: Optional[_OnSetAttrArgType] = ...,
field_transformer: Optional[_FieldTransformer] = ...,
match_args: bool = ...,
) -> Callable[[_C], _C]: ...
@overload
@__dataclass_transform__(field_descriptors=(attrib, field))
def define(
maybe_cls: _C,
*,
these: Optional[Dict[str, Any]] = ...,
repr: bool = ...,
hash: Optional[bool] = ...,
init: bool = ...,
slots: bool = ...,
frozen: bool = ...,
weakref_slot: bool = ...,
str: bool = ...,
auto_attribs: bool = ...,
kw_only: bool = ...,
cache_hash: bool = ...,
auto_exc: bool = ...,
eq: Optional[bool] = ...,
order: Optional[bool] = ...,
auto_detect: bool = ...,
getstate_setstate: Optional[bool] = ...,
on_setattr: Optional[_OnSetAttrArgType] = ...,
field_transformer: Optional[_FieldTransformer] = ...,
match_args: bool = ...,
) -> _C: ...
@overload
@__dataclass_transform__(field_descriptors=(attrib, field))
def define(
maybe_cls: None = ...,
*,
these: Optional[Dict[str, Any]] = ...,
repr: bool = ...,
hash: Optional[bool] = ...,
init: bool = ...,
slots: bool = ...,
frozen: bool = ...,
weakref_slot: bool = ...,
str: bool = ...,
auto_attribs: bool = ...,
kw_only: bool = ...,
cache_hash: bool = ...,
auto_exc: bool = ...,
eq: Optional[bool] = ...,
order: Optional[bool] = ...,
auto_detect: bool = ...,
getstate_setstate: Optional[bool] = ...,
on_setattr: Optional[_OnSetAttrArgType] = ...,
field_transformer: Optional[_FieldTransformer] = ...,
match_args: bool = ...,
) -> Callable[[_C], _C]: ...
mutable = define
frozen = define # they differ only in their defaults
# TODO: add support for returning NamedTuple from the mypy plugin
class _Fields(Tuple[Attribute[Any], ...]):
def __getattr__(self, name: str) -> Attribute[Any]: ...
def fields(cls: type) -> _Fields: ...
def fields_dict(cls: type) -> Dict[str, Attribute[Any]]: ...
def validate(inst: Any) -> None: ...
def resolve_types(
cls: _C,
globalns: Optional[Dict[str, Any]] = ...,
localns: Optional[Dict[str, Any]] = ...,
attribs: Optional[List[Attribute[Any]]] = ...,
) -> _C: ...
# TODO: add support for returning a proper attrs class from the mypy plugin
# we use Any instead of _CountingAttr so that e.g. `make_class('Foo',
# [attr.ib()])` is valid
def make_class(
name: str,
attrs: Union[List[str], Tuple[str, ...], Dict[str, Any]],
bases: Tuple[type, ...] = ...,
repr_ns: Optional[str] = ...,
repr: bool = ...,
cmp: Optional[_EqOrderType] = ...,
hash: Optional[bool] = ...,
init: bool = ...,
slots: bool = ...,
frozen: bool = ...,
weakref_slot: bool = ...,
str: bool = ...,
auto_attribs: bool = ...,
kw_only: bool = ...,
cache_hash: bool = ...,
auto_exc: bool = ...,
eq: Optional[_EqOrderType] = ...,
order: Optional[_EqOrderType] = ...,
collect_by_mro: bool = ...,
on_setattr: Optional[_OnSetAttrArgType] = ...,
field_transformer: Optional[_FieldTransformer] = ...,
) -> type: ...
# _funcs --
# TODO: add support for returning TypedDict from the mypy plugin
# FIXME: asdict/astuple do not honor their factory args. Waiting on one of
# these:
# https://github.com/python/mypy/issues/4236
# https://github.com/python/typing/issues/253
# XXX: remember to fix attrs.asdict/astuple too!
def asdict(
inst: Any,
recurse: bool = ...,
filter: Optional[_FilterType[Any]] = ...,
dict_factory: Type[Mapping[Any, Any]] = ...,
retain_collection_types: bool = ...,
value_serializer: Optional[
Callable[[type, Attribute[Any], Any], Any]
] = ...,
tuple_keys: Optional[bool] = ...,
) -> Dict[str, Any]: ...
# TODO: add support for returning NamedTuple from the mypy plugin
def astuple(
inst: Any,
recurse: bool = ...,
filter: Optional[_FilterType[Any]] = ...,
tuple_factory: Type[Sequence[Any]] = ...,
retain_collection_types: bool = ...,
) -> Tuple[Any, ...]: ...
def has(cls: type) -> bool: ...
def assoc(inst: _T, **changes: Any) -> _T: ...
def evolve(inst: _T, **changes: Any) -> _T: ...
# _config --
def set_run_validators(run: bool) -> None: ...
def get_run_validators() -> bool: ...
# aliases --
s = attributes = attrs
ib = attr = attrib
dataclass = attrs # Technically, partial(attrs, auto_attribs=True) ;)


@@ -0,0 +1,154 @@
# SPDX-License-Identifier: MIT
from __future__ import absolute_import, division, print_function
import functools
from ._compat import new_class
from ._make import _make_ne
_operation_names = {"eq": "==", "lt": "<", "le": "<=", "gt": ">", "ge": ">="}
def cmp_using(
eq=None,
lt=None,
le=None,
gt=None,
ge=None,
require_same_type=True,
class_name="Comparable",
):
"""
Create a class that can be passed into `attr.ib`'s ``eq``, ``order``, and
``cmp`` arguments to customize field comparison.
The resulting class will have a full set of ordering methods if
at least one of ``{lt, le, gt, ge}`` and ``eq`` are provided.
:param Optional[callable] eq: `callable` used to evaluate equality
of two objects.
:param Optional[callable] lt: `callable` used to evaluate whether
one object is less than another object.
:param Optional[callable] le: `callable` used to evaluate whether
one object is less than or equal to another object.
:param Optional[callable] gt: `callable` used to evaluate whether
one object is greater than another object.
:param Optional[callable] ge: `callable` used to evaluate whether
one object is greater than or equal to another object.
:param bool require_same_type: When `True`, equality and ordering methods
will return `NotImplemented` if objects are not of the same type.
:param Optional[str] class_name: Name of class. Defaults to 'Comparable'.
See `comparison` for more details.
.. versionadded:: 21.1.0
"""
body = {
"__slots__": ["value"],
"__init__": _make_init(),
"_requirements": [],
"_is_comparable_to": _is_comparable_to,
}
# Add operations.
num_order_functions = 0
has_eq_function = False
if eq is not None:
has_eq_function = True
body["__eq__"] = _make_operator("eq", eq)
body["__ne__"] = _make_ne()
if lt is not None:
num_order_functions += 1
body["__lt__"] = _make_operator("lt", lt)
if le is not None:
num_order_functions += 1
body["__le__"] = _make_operator("le", le)
if gt is not None:
num_order_functions += 1
body["__gt__"] = _make_operator("gt", gt)
if ge is not None:
num_order_functions += 1
body["__ge__"] = _make_operator("ge", ge)
type_ = new_class(class_name, (object,), {}, lambda ns: ns.update(body))
# Add same type requirement.
if require_same_type:
type_._requirements.append(_check_same_type)
# Add total ordering if at least one operation was defined.
if 0 < num_order_functions < 4:
if not has_eq_function:
# functools.total_ordering requires __eq__ to be defined,
# so raise early error here to keep a nice stack.
raise ValueError(
"eq must be defined in order to complete ordering from "
"lt, le, gt, ge."
)
type_ = functools.total_ordering(type_)
return type_
def _make_init():
"""
Create __init__ method.
"""
def __init__(self, value):
"""
Initialize object with *value*.
"""
self.value = value
return __init__
def _make_operator(name, func):
"""
Create operator method.
"""
def method(self, other):
if not self._is_comparable_to(other):
return NotImplemented
result = func(self.value, other.value)
if result is NotImplemented:
return NotImplemented
return result
method.__name__ = "__%s__" % (name,)
method.__doc__ = "Return a %s b. Computed by attrs." % (
_operation_names[name],
)
return method
def _is_comparable_to(self, other):
"""
Check whether `other` is comparable to `self`.
"""
for func in self._requirements:
if not func(self, other):
return False
return True
def _check_same_type(self, other):
"""
Return True if *self* and *other* are of the same type, False otherwise.
"""
return other.value.__class__ is self.value.__class__
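# Usage sketch for 'cmp_using' above (exported publicly as attr.cmp_using):
# compare a float field with a tolerance instead of strict equality.
# The class and values are examples only, not part of the vendored sources.
import attr

@attr.s
class _Position(object):
    x = attr.ib(eq=cmp_using(eq=lambda a, b: abs(a - b) < 1e-9))

assert _Position(0.1 + 0.2) == _Position(0.3)  # True despite float error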


@@ -0,0 +1,13 @@
from typing import Optional, Type
from . import _CompareWithType
def cmp_using(
eq: Optional[_CompareWithType],
lt: Optional[_CompareWithType],
le: Optional[_CompareWithType],
gt: Optional[_CompareWithType],
ge: Optional[_CompareWithType],
require_same_type: bool,
class_name: str,
) -> Type: ...


@@ -0,0 +1,261 @@
# SPDX-License-Identifier: MIT
from __future__ import absolute_import, division, print_function
import platform
import sys
import threading
import types
import warnings
PY2 = sys.version_info[0] == 2
PYPY = platform.python_implementation() == "PyPy"
PY36 = sys.version_info[:2] >= (3, 6)
HAS_F_STRINGS = PY36
PY310 = sys.version_info[:2] >= (3, 10)
if PYPY or PY36:
ordered_dict = dict
else:
from collections import OrderedDict
ordered_dict = OrderedDict
if PY2:
from collections import Mapping, Sequence
from UserDict import IterableUserDict
# We 'bundle' isclass instead of using inspect as importing inspect is
# fairly expensive (order of 10-15 ms for a modern machine in 2016)
def isclass(klass):
return isinstance(klass, (type, types.ClassType))
def new_class(name, bases, kwds, exec_body):
"""
A minimal stub of types.new_class that we need for make_class.
"""
ns = {}
exec_body(ns)
return type(name, bases, ns)
# TYPE is used in exceptions, repr(int) is different on Python 2 and 3.
TYPE = "type"
def iteritems(d):
return d.iteritems()
# Python 2 is bereft of a read-only dict proxy, so we make one!
class ReadOnlyDict(IterableUserDict):
"""
Best-effort read-only dict wrapper.
"""
def __setitem__(self, key, val):
# We gently pretend we're a Python 3 mappingproxy.
raise TypeError(
"'mappingproxy' object does not support item assignment"
)
def update(self, _):
# We gently pretend we're a Python 3 mappingproxy.
raise AttributeError(
"'mappingproxy' object has no attribute 'update'"
)
def __delitem__(self, _):
# We gently pretend we're a Python 3 mappingproxy.
raise TypeError(
"'mappingproxy' object does not support item deletion"
)
def clear(self):
# We gently pretend we're a Python 3 mappingproxy.
raise AttributeError(
"'mappingproxy' object has no attribute 'clear'"
)
def pop(self, key, default=None):
# We gently pretend we're a Python 3 mappingproxy.
raise AttributeError(
"'mappingproxy' object has no attribute 'pop'"
)
def popitem(self):
# We gently pretend we're a Python 3 mappingproxy.
raise AttributeError(
"'mappingproxy' object has no attribute 'popitem'"
)
def setdefault(self, key, default=None):
# We gently pretend we're a Python 3 mappingproxy.
raise AttributeError(
"'mappingproxy' object has no attribute 'setdefault'"
)
def __repr__(self):
# Override to be identical to the Python 3 version.
return "mappingproxy(" + repr(self.data) + ")"
def metadata_proxy(d):
res = ReadOnlyDict()
res.data.update(d) # We blocked update, so we have to do it like this.
return res
def just_warn(*args, **kw): # pragma: no cover
"""
We only warn on Python 3 because we are not aware of any concrete
consequences of not setting the cell on Python 2.
"""
else: # Python 3 and later.
from collections.abc import Mapping, Sequence # noqa
def just_warn(*args, **kw):
"""
We only warn on Python 3 because we are not aware of any concrete
consequences of not setting the cell on Python 2.
"""
warnings.warn(
"Running interpreter doesn't sufficiently support code object "
"introspection. Some features like bare super() or accessing "
"__class__ will not work with slotted classes.",
RuntimeWarning,
stacklevel=2,
)
def isclass(klass):
return isinstance(klass, type)
TYPE = "class"
def iteritems(d):
return d.items()
new_class = types.new_class
def metadata_proxy(d):
return types.MappingProxyType(dict(d))
def make_set_closure_cell():
"""Return a function of two arguments (cell, value) which sets
the value stored in the closure cell `cell` to `value`.
"""
# pypy makes this easy. (It also supports the logic below, but
# why not do the easy/fast thing?)
if PYPY:
def set_closure_cell(cell, value):
cell.__setstate__((value,))
return set_closure_cell
# Otherwise gotta do it the hard way.
# Create a function that will set its first cellvar to `value`.
def set_first_cellvar_to(value):
x = value
return
# This function will be eliminated as dead code, but
# not before its reference to `x` forces `x` to be
# represented as a closure cell rather than a local.
def force_x_to_be_a_cell(): # pragma: no cover
return x
try:
# Extract the code object and make sure our assumptions about
# the closure behavior are correct.
if PY2:
co = set_first_cellvar_to.func_code
else:
co = set_first_cellvar_to.__code__
if co.co_cellvars != ("x",) or co.co_freevars != ():
raise AssertionError # pragma: no cover
# Convert this code object to a code object that sets the
# function's first _freevar_ (not cellvar) to the argument.
if sys.version_info >= (3, 8):
# CPython 3.8+ has an incompatible CodeType signature
# (added a posonlyargcount argument) but also added
# CodeType.replace() to do this without counting parameters.
set_first_freevar_code = co.replace(
co_cellvars=co.co_freevars, co_freevars=co.co_cellvars
)
else:
args = [co.co_argcount]
if not PY2:
args.append(co.co_kwonlyargcount)
args.extend(
[
co.co_nlocals,
co.co_stacksize,
co.co_flags,
co.co_code,
co.co_consts,
co.co_names,
co.co_varnames,
co.co_filename,
co.co_name,
co.co_firstlineno,
co.co_lnotab,
# These two arguments are reversed:
co.co_cellvars,
co.co_freevars,
]
)
set_first_freevar_code = types.CodeType(*args)
def set_closure_cell(cell, value):
# Create a function using the set_first_freevar_code,
# whose first closure cell is `cell`. Calling it will
# change the value of that cell.
setter = types.FunctionType(
set_first_freevar_code, {}, "setter", (), (cell,)
)
# And call it to set the cell.
setter(value)
# Make sure it works on this interpreter:
def make_func_with_cell():
x = None
def func():
return x # pragma: no cover
return func
if PY2:
cell = make_func_with_cell().func_closure[0]
else:
cell = make_func_with_cell().__closure__[0]
set_closure_cell(cell, 100)
if cell.cell_contents != 100:
raise AssertionError # pragma: no cover
except Exception:
return just_warn
else:
return set_closure_cell
set_closure_cell = make_set_closure_cell()
# Thread-local global to track attrs instances which are already being repr'd.
# This is needed because there is no other (thread-safe) way to pass info
# about the instances that are already being repr'd through the call stack
# in order to ensure we don't perform infinite recursion.
#
# For instance, if an instance contains a dict which contains that instance,
# we need to know that we're already repr'ing the outside instance from within
# the dict's repr() call.
#
# This lives here rather than in _make.py so that the functions in _make.py
# don't have a direct reference to the thread-local in their globals dict.
# If they have such a reference, it breaks cloudpickle.
repr_context = threading.local()


@@ -0,0 +1,33 @@
# SPDX-License-Identifier: MIT
from __future__ import absolute_import, division, print_function
__all__ = ["set_run_validators", "get_run_validators"]
_run_validators = True
def set_run_validators(run):
"""
Set whether or not validators are run. By default, they are run.
.. deprecated:: 21.3.0 It will not be removed, but it also will not be
moved to new ``attrs`` namespace. Use `attrs.validators.set_disabled()`
instead.
"""
if not isinstance(run, bool):
raise TypeError("'run' must be bool.")
global _run_validators
_run_validators = run
def get_run_validators():
"""
Return whether or not validators are run.
.. deprecated:: 21.3.0 It will not be removed, but it also will not be
moved to new ``attrs`` namespace. Use `attrs.validators.get_disabled()`
instead.
"""
return _run_validators
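# Usage sketch for the toggles above: temporarily disabling validators,
# e.g. to speed up bulk loading of already-validated data. The example
# class is an assumption, not part of the vendored sources.
import attr

@attr.s
class _Config(object):
    port = attr.ib(validator=attr.validators.instance_of(int))

set_run_validators(False)
_Config(port="8080")        # accepted while validators are disabled
set_run_validators(True)    # re-enable for normal operation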


@@ -0,0 +1,422 @@
# SPDX-License-Identifier: MIT
from __future__ import absolute_import, division, print_function
import copy
from ._compat import iteritems
from ._make import NOTHING, _obj_setattr, fields
from .exceptions import AttrsAttributeNotFoundError
def asdict(
inst,
recurse=True,
filter=None,
dict_factory=dict,
retain_collection_types=False,
value_serializer=None,
):
"""
Return the ``attrs`` attribute values of *inst* as a dict.
Optionally recurse into other ``attrs``-decorated classes.
:param inst: Instance of an ``attrs``-decorated class.
:param bool recurse: Recurse into classes that are also
``attrs``-decorated.
:param callable filter: A callable whose return code determines whether an
attribute or element is included (``True``) or dropped (``False``). Is
called with the `attrs.Attribute` as the first argument and the
value as the second argument.
:param callable dict_factory: A callable to produce dictionaries from. For
example, to produce ordered dictionaries instead of normal Python
dictionaries, pass in ``collections.OrderedDict``.
:param bool retain_collection_types: Do not convert to ``list`` when
encountering an attribute whose type is ``tuple`` or ``set``. Only
meaningful if ``recurse`` is ``True``.
:param Optional[callable] value_serializer: A hook that is called for every
attribute or dict key/value. It receives the current instance, field
and value and must return the (updated) value. The hook is run *after*
the optional *filter* has been applied.
:rtype: return type of *dict_factory*
:raise attr.exceptions.NotAnAttrsClassError: If *cls* is not an ``attrs``
class.
.. versionadded:: 16.0.0 *dict_factory*
.. versionadded:: 16.1.0 *retain_collection_types*
.. versionadded:: 20.3.0 *value_serializer*
.. versionadded:: 21.3.0 If a dict has a collection for a key, it is
serialized as a tuple.
"""
attrs = fields(inst.__class__)
rv = dict_factory()
for a in attrs:
v = getattr(inst, a.name)
if filter is not None and not filter(a, v):
continue
if value_serializer is not None:
v = value_serializer(inst, a, v)
if recurse is True:
if has(v.__class__):
rv[a.name] = asdict(
v,
recurse=True,
filter=filter,
dict_factory=dict_factory,
retain_collection_types=retain_collection_types,
value_serializer=value_serializer,
)
elif isinstance(v, (tuple, list, set, frozenset)):
cf = v.__class__ if retain_collection_types is True else list
rv[a.name] = cf(
[
_asdict_anything(
i,
is_key=False,
filter=filter,
dict_factory=dict_factory,
retain_collection_types=retain_collection_types,
value_serializer=value_serializer,
)
for i in v
]
)
elif isinstance(v, dict):
df = dict_factory
rv[a.name] = df(
(
_asdict_anything(
kk,
is_key=True,
filter=filter,
dict_factory=df,
retain_collection_types=retain_collection_types,
value_serializer=value_serializer,
),
_asdict_anything(
vv,
is_key=False,
filter=filter,
dict_factory=df,
retain_collection_types=retain_collection_types,
value_serializer=value_serializer,
),
)
for kk, vv in iteritems(v)
)
else:
rv[a.name] = v
else:
rv[a.name] = v
return rv
def _asdict_anything(
val,
is_key,
filter,
dict_factory,
retain_collection_types,
value_serializer,
):
"""
``asdict`` only works on attrs instances, this works on anything.
"""
if getattr(val.__class__, "__attrs_attrs__", None) is not None:
# Attrs class.
rv = asdict(
val,
recurse=True,
filter=filter,
dict_factory=dict_factory,
retain_collection_types=retain_collection_types,
value_serializer=value_serializer,
)
elif isinstance(val, (tuple, list, set, frozenset)):
if retain_collection_types is True:
cf = val.__class__
elif is_key:
cf = tuple
else:
cf = list
rv = cf(
[
_asdict_anything(
i,
is_key=False,
filter=filter,
dict_factory=dict_factory,
retain_collection_types=retain_collection_types,
value_serializer=value_serializer,
)
for i in val
]
)
elif isinstance(val, dict):
df = dict_factory
rv = df(
(
_asdict_anything(
kk,
is_key=True,
filter=filter,
dict_factory=df,
retain_collection_types=retain_collection_types,
value_serializer=value_serializer,
),
_asdict_anything(
vv,
is_key=False,
filter=filter,
dict_factory=df,
retain_collection_types=retain_collection_types,
value_serializer=value_serializer,
),
)
for kk, vv in iteritems(val)
)
else:
rv = val
if value_serializer is not None:
rv = value_serializer(None, None, rv)
return rv
def astuple(
inst,
recurse=True,
filter=None,
tuple_factory=tuple,
retain_collection_types=False,
):
"""
Return the ``attrs`` attribute values of *inst* as a tuple.
Optionally recurse into other ``attrs``-decorated classes.
:param inst: Instance of an ``attrs``-decorated class.
:param bool recurse: Recurse into classes that are also
``attrs``-decorated.
:param callable filter: A callable whose return code determines whether an
attribute or element is included (``True``) or dropped (``False``). Is
called with the `attrs.Attribute` as the first argument and the
value as the second argument.
:param callable tuple_factory: A callable to produce tuples from. For
example, to produce lists instead of tuples.
:param bool retain_collection_types: Do not convert to ``list``
or ``dict`` when encountering an attribute which type is
``tuple``, ``dict`` or ``set``. Only meaningful if ``recurse`` is
``True``.
:rtype: return type of *tuple_factory*
:raise attr.exceptions.NotAnAttrsClassError: If *cls* is not an ``attrs``
class.
.. versionadded:: 16.2.0
"""
attrs = fields(inst.__class__)
rv = []
retain = retain_collection_types # Very long. :/
for a in attrs:
v = getattr(inst, a.name)
if filter is not None and not filter(a, v):
continue
if recurse is True:
if has(v.__class__):
rv.append(
astuple(
v,
recurse=True,
filter=filter,
tuple_factory=tuple_factory,
retain_collection_types=retain,
)
)
elif isinstance(v, (tuple, list, set, frozenset)):
cf = v.__class__ if retain is True else list
rv.append(
cf(
[
astuple(
j,
recurse=True,
filter=filter,
tuple_factory=tuple_factory,
retain_collection_types=retain,
)
if has(j.__class__)
else j
for j in v
]
)
)
elif isinstance(v, dict):
df = v.__class__ if retain is True else dict
rv.append(
df(
(
astuple(
kk,
tuple_factory=tuple_factory,
retain_collection_types=retain,
)
if has(kk.__class__)
else kk,
astuple(
vv,
tuple_factory=tuple_factory,
retain_collection_types=retain,
)
if has(vv.__class__)
else vv,
)
for kk, vv in iteritems(v)
)
)
else:
rv.append(v)
else:
rv.append(v)
return rv if tuple_factory is list else tuple_factory(rv)
def has(cls):
"""
Check whether *cls* is a class with ``attrs`` attributes.
:param type cls: Class to introspect.
:raise TypeError: If *cls* is not a class.
:rtype: bool
"""
return getattr(cls, "__attrs_attrs__", None) is not None
def assoc(inst, **changes):
"""
Copy *inst* and apply *changes*.
:param inst: Instance of a class with ``attrs`` attributes.
:param changes: Keyword changes in the new copy.
:return: A copy of inst with *changes* incorporated.
:raise attr.exceptions.AttrsAttributeNotFoundError: If *attr_name* couldn't
be found on *cls*.
:raise attr.exceptions.NotAnAttrsClassError: If *cls* is not an ``attrs``
class.
.. deprecated:: 17.1.0
Use `attrs.evolve` instead if you can.
This function will not be removed due to the slightly different approach
compared to `attrs.evolve`.
"""
import warnings
warnings.warn(
"assoc is deprecated and will be removed after 2018/01.",
DeprecationWarning,
stacklevel=2,
)
new = copy.copy(inst)
attrs = fields(inst.__class__)
for k, v in iteritems(changes):
a = getattr(attrs, k, NOTHING)
if a is NOTHING:
raise AttrsAttributeNotFoundError(
"{k} is not an attrs attribute on {cl}.".format(
k=k, cl=new.__class__
)
)
_obj_setattr(new, k, v)
return new
def evolve(inst, **changes):
"""
Create a new instance, based on *inst* with *changes* applied.
:param inst: Instance of a class with ``attrs`` attributes.
:param changes: Keyword changes in the new copy.
:return: A copy of inst with *changes* incorporated.
:raise TypeError: If *attr_name* couldn't be found in the class
``__init__``.
:raise attr.exceptions.NotAnAttrsClassError: If *cls* is not an ``attrs``
class.
.. versionadded:: 17.1.0
"""
cls = inst.__class__
attrs = fields(cls)
for a in attrs:
if not a.init:
continue
attr_name = a.name # To deal with private attributes.
init_name = attr_name if attr_name[0] != "_" else attr_name[1:]
if init_name not in changes:
changes[init_name] = getattr(inst, attr_name)
return cls(**changes)
def resolve_types(cls, globalns=None, localns=None, attribs=None):
"""
Resolve any strings and forward annotations in type annotations.
This is only required if you need concrete types in `Attribute`'s *type*
field. In other words, you don't need to resolve your types if you only
use them for static type checking.
With no arguments, names will be looked up in the module in which the class
was created. If this is not what you want, e.g. if the name only exists
inside a method, you may pass *globalns* or *localns* to specify other
dictionaries in which to look up these names. See the docs of
`typing.get_type_hints` for more details.
:param type cls: Class to resolve.
:param Optional[dict] globalns: Dictionary containing global variables.
:param Optional[dict] localns: Dictionary containing local variables.
:param Optional[list] attribs: List of attribs for the given class.
This is necessary when calling from inside a ``field_transformer``
since *cls* is not an ``attrs`` class yet.
:raise TypeError: If *cls* is not a class.
:raise attr.exceptions.NotAnAttrsClassError: If *cls* is not an ``attrs``
class and you didn't pass any attribs.
:raise NameError: If types cannot be resolved because of missing variables.
:returns: *cls* so you can use this function also as a class decorator.
Please note that you have to apply it **after** `attrs.define`. That
means the decorator has to come in the line **before** `attrs.define`.
.. versionadded:: 20.1.0
.. versionadded:: 21.1.0 *attribs*
"""
# Since calling get_type_hints is expensive we cache whether we've
# done it already.
if getattr(cls, "__attrs_types_resolved__", None) != cls:
import typing
hints = typing.get_type_hints(cls, globalns=globalns, localns=localns)
for field in fields(cls) if attribs is None else attribs:
if field.name in hints:
# Since fields have been frozen we must work around it.
_obj_setattr(field, "type", hints[field.name])
# We store the class we resolved so that subclasses know they haven't
# been resolved.
cls.__attrs_types_resolved__ = cls
# Return the class so you can use it as a decorator too.
return cls
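# Usage sketch for the helpers above; the example class and values are
# illustrative only.
import attr

@attr.s
class _Point(object):
    x = attr.ib()
    y = attr.ib()

_p = _Point(1, 2)
assert asdict(_p) == {"x": 1, "y": 2}
assert astuple(_p) == (1, 2)
assert evolve(_p, y=3) == _Point(1, 3)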

File diff suppressed because it is too large.


@@ -0,0 +1,216 @@
# SPDX-License-Identifier: MIT
"""
These are Python 3.6+-only and keyword-only APIs that call `attr.s` and
`attr.ib` with different default values.
"""
from functools import partial
from . import setters
from ._funcs import asdict as _asdict
from ._funcs import astuple as _astuple
from ._make import (
NOTHING,
_frozen_setattrs,
_ng_default_on_setattr,
attrib,
attrs,
)
from .exceptions import UnannotatedAttributeError
def define(
maybe_cls=None,
*,
these=None,
repr=None,
hash=None,
init=None,
slots=True,
frozen=False,
weakref_slot=True,
str=False,
auto_attribs=None,
kw_only=False,
cache_hash=False,
auto_exc=True,
eq=None,
order=False,
auto_detect=True,
getstate_setstate=None,
on_setattr=None,
field_transformer=None,
match_args=True,
):
r"""
Define an ``attrs`` class.
Differences to the classic `attr.s` that it uses underneath:
- Automatically detect whether or not *auto_attribs* should be `True`
(c.f. *auto_attribs* parameter).
- If *frozen* is `False`, run converters and validators when setting an
attribute by default.
- *slots=True* (see :term:`slotted classes` for potentially surprising
behaviors)
- *auto_exc=True*
- *auto_detect=True*
- *order=False*
- *match_args=True*
- Some options that were only relevant on Python 2 or were kept around for
backwards-compatibility have been removed.
Please note that these are all defaults and you can change them as you
wish.
:param Optional[bool] auto_attribs: If set to `True` or `False`, it behaves
exactly like `attr.s`. If left `None`, `attr.s` will try to guess:
1. If any attributes are annotated and no unannotated `attrs.fields`\ s
are found, it assumes *auto_attribs=True*.
2. Otherwise it assumes *auto_attribs=False* and tries to collect
`attrs.fields`\ s.
For now, please refer to `attr.s` for the rest of the parameters.
.. versionadded:: 20.1.0
.. versionchanged:: 21.3.0 Converters are also run ``on_setattr``.
"""
def do_it(cls, auto_attribs):
return attrs(
maybe_cls=cls,
these=these,
repr=repr,
hash=hash,
init=init,
slots=slots,
frozen=frozen,
weakref_slot=weakref_slot,
str=str,
auto_attribs=auto_attribs,
kw_only=kw_only,
cache_hash=cache_hash,
auto_exc=auto_exc,
eq=eq,
order=order,
auto_detect=auto_detect,
collect_by_mro=True,
getstate_setstate=getstate_setstate,
on_setattr=on_setattr,
field_transformer=field_transformer,
match_args=match_args,
)
def wrap(cls):
"""
Making this a wrapper ensures this code runs during class creation.
We also ensure that frozen-ness of classes is inherited.
"""
nonlocal frozen, on_setattr
had_on_setattr = on_setattr not in (None, setters.NO_OP)
# By default, mutable classes convert & validate on setattr.
if frozen is False and on_setattr is None:
on_setattr = _ng_default_on_setattr
# However, if we subclass a frozen class, we inherit the immutability
# and disable on_setattr.
for base_cls in cls.__bases__:
if base_cls.__setattr__ is _frozen_setattrs:
if had_on_setattr:
raise ValueError(
"Frozen classes can't use on_setattr "
"(frozen-ness was inherited)."
)
on_setattr = setters.NO_OP
break
if auto_attribs is not None:
return do_it(cls, auto_attribs)
try:
return do_it(cls, True)
except UnannotatedAttributeError:
return do_it(cls, False)
# maybe_cls's type depends on the usage of the decorator. It's a class
# if it's used as `@attrs` but ``None`` if used as `@attrs()`.
if maybe_cls is None:
return wrap
else:
return wrap(maybe_cls)
mutable = define
frozen = partial(define, frozen=True, on_setattr=None)
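# Illustrative aside, not part of the vendored file: a hedged sketch of the
# defaults described above.  Annotated attributes switch auto_attribs on,
# instances are slotted, and mutable classes run converters/validators on
# assignment; the class names here are made up.
import attr

@attr.define
class Connection:
    host: str
    port: int = 5432

conn = Connection("db.local")
conn.port = 5433  # allowed: frozen=False, so setattr still works

@attr.frozen  # shorthand for define(frozen=True, on_setattr=None)
class Token:
    value: str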
def field(
*,
default=NOTHING,
validator=None,
repr=True,
hash=None,
init=True,
metadata=None,
converter=None,
factory=None,
kw_only=False,
eq=None,
order=None,
on_setattr=None,
):
"""
Identical to `attr.ib`, except keyword-only and with some arguments
removed.
.. versionadded:: 20.1.0
"""
return attrib(
default=default,
validator=validator,
repr=repr,
hash=hash,
init=init,
metadata=metadata,
converter=converter,
factory=factory,
kw_only=kw_only,
eq=eq,
order=order,
on_setattr=on_setattr,
)
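# Illustrative aside, not part of the vendored file: the keyword-only
# field() helper in use.  The ge() validator is assumed from
# attr.validators (added in 21.3.0).
import attr

@attr.define
class Job:
    name: str
    retries: int = attr.field(default=3, validator=attr.validators.ge(0))

job = Job("render")
assert job.retries == 3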
def asdict(inst, *, recurse=True, filter=None, value_serializer=None):
"""
Same as `attr.asdict`, except that collection types are always retained
and ``dict`` is always used as *dict_factory*.
.. versionadded:: 21.3.0
"""
return _asdict(
inst=inst,
recurse=recurse,
filter=filter,
value_serializer=value_serializer,
retain_collection_types=True,
)
def astuple(inst, *, recurse=True, filter=None):
"""
Same as `attr.astuple`, except that collection types are always retained
and `tuple` is always used as the *tuple_factory*.
.. versionadded:: 21.3.0
"""
return _astuple(
inst=inst, recurse=recurse, filter=filter, retain_collection_types=True
)
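# Illustrative aside, not part of the vendored file: these next-gen
# asdict/astuple variants are re-exported by the ``attrs`` package (see the
# attrs/__init__.py added later in this diff).  Collection types survive,
# so the tuple stays a tuple instead of degrading to a list.
import attrs

@attrs.define
class Shot:
    name: str
    frames: tuple

shot = Shot("sh010", (1001, 1100))
assert attrs.asdict(shot) == {"name": "sh010", "frames": (1001, 1100)}
assert attrs.astuple(shot) == ("sh010", (1001, 1100))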

View file

@@ -0,0 +1,87 @@
# SPDX-License-Identifier: MIT
from __future__ import absolute_import, division, print_function
from functools import total_ordering
from ._funcs import astuple
from ._make import attrib, attrs
@total_ordering
@attrs(eq=False, order=False, slots=True, frozen=True)
class VersionInfo(object):
"""
A version object that can be compared to tuples of length 1--4:
>>> attr.VersionInfo(19, 1, 0, "final") <= (19, 2)
True
>>> attr.VersionInfo(19, 1, 0, "final") < (19, 1, 1)
True
>>> vi = attr.VersionInfo(19, 2, 0, "final")
>>> vi < (19, 1, 1)
False
>>> vi < (19,)
False
>>> vi == (19, 2,)
True
>>> vi == (19, 2, 1)
False
.. versionadded:: 19.2
"""
year = attrib(type=int)
minor = attrib(type=int)
micro = attrib(type=int)
releaselevel = attrib(type=str)
@classmethod
def _from_version_string(cls, s):
"""
Parse *s* and return a `VersionInfo`.
"""
v = s.split(".")
if len(v) == 3:
v.append("final")
return cls(
year=int(v[0]), minor=int(v[1]), micro=int(v[2]), releaselevel=v[3]
)
def _ensure_tuple(self, other):
"""
Ensure *other* is a tuple of a valid length.
Returns a possibly transformed *other* and ourselves as a tuple of
the same length as *other*.
"""
if self.__class__ is other.__class__:
other = astuple(other)
if not isinstance(other, tuple):
raise NotImplementedError
if not (1 <= len(other) <= 4):
raise NotImplementedError
return astuple(self)[: len(other)], other
def __eq__(self, other):
try:
us, them = self._ensure_tuple(other)
except NotImplementedError:
return NotImplemented
return us == them
def __lt__(self, other):
try:
us, them = self._ensure_tuple(other)
except NotImplementedError:
return NotImplemented
# Since alphabetically "dev0" < "final" < "post1" < "post2", we don't
# have to do anything special with releaselevel for now.
return us < them

View file

@@ -0,0 +1,9 @@
class VersionInfo:
@property
def year(self) -> int: ...
@property
def minor(self) -> int: ...
@property
def micro(self) -> int: ...
@property
def releaselevel(self) -> str: ...

View file

@@ -0,0 +1,155 @@
# SPDX-License-Identifier: MIT
"""
Commonly useful converters.
"""
from __future__ import absolute_import, division, print_function
from ._compat import PY2
from ._make import NOTHING, Factory, pipe
if not PY2:
import inspect
import typing
__all__ = [
"default_if_none",
"optional",
"pipe",
"to_bool",
]
def optional(converter):
"""
A converter that allows an attribute to be optional. An optional attribute
is one which can be set to ``None``.
Type annotations will be inferred from the wrapped converter's, if it
has any.
:param callable converter: the converter that is used for non-``None``
values.
.. versionadded:: 17.1.0
"""
def optional_converter(val):
if val is None:
return None
return converter(val)
if not PY2:
sig = None
try:
sig = inspect.signature(converter)
except (ValueError, TypeError): # inspect failed
pass
if sig:
params = list(sig.parameters.values())
if params and params[0].annotation is not inspect.Parameter.empty:
optional_converter.__annotations__["val"] = typing.Optional[
params[0].annotation
]
if sig.return_annotation is not inspect.Signature.empty:
optional_converter.__annotations__["return"] = typing.Optional[
sig.return_annotation
]
return optional_converter
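# Illustrative aside, not part of the vendored file: optional() lets None
# pass through while the wrapped converter handles everything else.
from typing import Optional

import attr
from attr import converters

@attr.define
class Settings:
    timeout: Optional[int] = attr.field(
        converter=converters.optional(int), default=None
    )

assert Settings().timeout is None
assert Settings("30").timeout == 30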
def default_if_none(default=NOTHING, factory=None):
"""
A converter that replaces ``None`` values with *default* or the
result of *factory*.
:param default: Value to be used if ``None`` is passed. Passing an instance
of `attrs.Factory` is supported, however the ``takes_self`` option
is *not*.
:param callable factory: A callable that takes no parameters whose result
is used if ``None`` is passed.
:raises TypeError: If **neither** *default* nor *factory* is passed.
:raises TypeError: If **both** *default* and *factory* are passed.
:raises ValueError: If an instance of `attrs.Factory` is passed with
``takes_self=True``.
.. versionadded:: 18.2.0
"""
if default is NOTHING and factory is None:
raise TypeError("Must pass either `default` or `factory`.")
if default is not NOTHING and factory is not None:
raise TypeError(
"Must pass either `default` or `factory` but not both."
)
if factory is not None:
default = Factory(factory)
if isinstance(default, Factory):
if default.takes_self:
raise ValueError(
"`takes_self` is not supported by default_if_none."
)
def default_if_none_converter(val):
if val is not None:
return val
return default.factory()
else:
def default_if_none_converter(val):
if val is not None:
return val
return default
return default_if_none_converter
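# Illustrative aside, not part of the vendored file: default_if_none()
# swaps incoming None for a fresh default, which is handy for mutable
# defaults such as lists.
import attr
from attr import converters

@attr.define
class Render:
    layers: list = attr.field(
        converter=converters.default_if_none(factory=list), default=None
    )

assert Render().layers == []
assert Render(["beauty"]).layers == ["beauty"]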
def to_bool(val):
"""
Convert "boolean" strings (e.g., from env. vars.) to real booleans.
Values mapping to :code:`True`:
- :code:`True`
- :code:`"true"` / :code:`"t"`
- :code:`"yes"` / :code:`"y"`
- :code:`"on"`
- :code:`"1"`
- :code:`1`
Values mapping to :code:`False`:
- :code:`False`
- :code:`"false"` / :code:`"f"`
- :code:`"no"` / :code:`"n"`
- :code:`"off"`
- :code:`"0"`
- :code:`0`
:raises ValueError: for any other value.
.. versionadded:: 21.3.0
"""
if isinstance(val, str):
val = val.lower()
truthy = {True, "true", "t", "yes", "y", "on", "1", 1}
falsy = {False, "false", "f", "no", "n", "off", "0", 0}
try:
if val in truthy:
return True
if val in falsy:
return False
except TypeError:
# Raised when "val" is not hashable (e.g., lists)
pass
raise ValueError("Cannot convert value to bool: {}".format(val))
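# Illustrative aside, not part of the vendored file: the accepted spellings
# in practice; anything unrecognized raises instead of being guessed.
from attr.converters import to_bool

assert to_bool("YES") is True  # string matching is case-insensitive
assert to_bool(0) is False
try:
    to_bool("maybe")
except ValueError:
    pass  # unrecognized values are rejected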

View file

@@ -0,0 +1,13 @@
from typing import Callable, Optional, TypeVar, overload
from . import _ConverterType
_T = TypeVar("_T")
def pipe(*converters: _ConverterType) -> _ConverterType: ...
def optional(converter: _ConverterType) -> _ConverterType: ...
@overload
def default_if_none(default: _T) -> _ConverterType: ...
@overload
def default_if_none(*, factory: Callable[[], _T]) -> _ConverterType: ...
def to_bool(val: str) -> bool: ...

View file

@@ -0,0 +1,94 @@
# SPDX-License-Identifier: MIT
from __future__ import absolute_import, division, print_function
class FrozenError(AttributeError):
"""
An attempt has been made to modify a frozen/immutable instance or
attribute.
It mirrors the behavior of ``namedtuples`` by using the same error message
and subclassing `AttributeError`.
.. versionadded:: 20.1.0
"""
msg = "can't set attribute"
args = [msg]
class FrozenInstanceError(FrozenError):
"""
An attempt has been made to modify a frozen instance.
.. versionadded:: 16.1.0
"""
class FrozenAttributeError(FrozenError):
"""
An attempt has been made to modify a frozen attribute.
.. versionadded:: 20.1.0
"""
class AttrsAttributeNotFoundError(ValueError):
"""
An ``attrs`` function couldn't find an attribute that the user asked for.
.. versionadded:: 16.2.0
"""
class NotAnAttrsClassError(ValueError):
"""
A non-``attrs`` class has been passed into an ``attrs`` function.
.. versionadded:: 16.2.0
"""
class DefaultAlreadySetError(RuntimeError):
"""
A default has been set using ``attr.ib()`` and an attempt was made to
reset it using the decorator.
.. versionadded:: 17.1.0
"""
class UnannotatedAttributeError(RuntimeError):
"""
A class with ``auto_attribs=True`` has an ``attr.ib()`` without a type
annotation.
.. versionadded:: 17.3.0
"""
class PythonTooOldError(RuntimeError):
"""
An ``attrs`` feature that requires a newer Python version was used.
.. versionadded:: 18.2.0
"""
class NotCallableError(TypeError):
"""
An ``attr.ib()`` requiring a callable has been set with a value
that is not callable.
.. versionadded:: 19.2.0
"""
def __init__(self, msg, value):
super(TypeError, self).__init__(msg, value)
self.msg = msg
self.value = value
def __str__(self):
return str(self.msg)

View file

@@ -0,0 +1,17 @@
from typing import Any
class FrozenError(AttributeError):
msg: str = ...
class FrozenInstanceError(FrozenError): ...
class FrozenAttributeError(FrozenError): ...
class AttrsAttributeNotFoundError(ValueError): ...
class NotAnAttrsClassError(ValueError): ...
class DefaultAlreadySetError(RuntimeError): ...
class UnannotatedAttributeError(RuntimeError): ...
class PythonTooOldError(RuntimeError): ...
class NotCallableError(TypeError):
msg: str = ...
value: Any = ...
def __init__(self, msg: str, value: Any) -> None: ...

View file

@@ -0,0 +1,54 @@
# SPDX-License-Identifier: MIT
"""
Commonly useful filters for `attr.asdict`.
"""
from __future__ import absolute_import, division, print_function
from ._compat import isclass
from ._make import Attribute
def _split_what(what):
"""
Returns a tuple of `frozenset`s of classes and attributes.
"""
return (
frozenset(cls for cls in what if isclass(cls)),
frozenset(cls for cls in what if isinstance(cls, Attribute)),
)
def include(*what):
"""
Include *what*.
:param what: What to include.
:type what: `list` of `type` or `attrs.Attribute`\\ s
:rtype: `callable`
"""
cls, attrs = _split_what(what)
def include_(attribute, value):
return value.__class__ in cls or attribute in attrs
return include_
def exclude(*what):
"""
Exclude *what*.
:param what: What to exclude.
:type what: `list` of classes or `attrs.Attribute`\\ s.
:rtype: `callable`
"""
cls, attrs = _split_what(what)
def exclude_(attribute, value):
return value.__class__ not in cls and attribute not in attrs
return exclude_
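# Illustrative aside, not part of the vendored file: pairing these filters
# with attr.asdict to prune sensitive fields; the class is made up.
import attr
from attr import filters

@attr.define
class Credentials:
    user: str
    password: str

creds = Credentials("jakub", "hunter2")
public = attr.asdict(
    creds, filter=filters.exclude(attr.fields(Credentials).password)
)
assert public == {"user": "jakub"}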

View file

@@ -0,0 +1,6 @@
from typing import Any, Union
from . import Attribute, _FilterType
def include(*what: Union[type, Attribute[Any]]) -> _FilterType[Any]: ...
def exclude(*what: Union[type, Attribute[Any]]) -> _FilterType[Any]: ...

View file

View file

@@ -0,0 +1,79 @@
# SPDX-License-Identifier: MIT
"""
Commonly used hooks for on_setattr.
"""
from __future__ import absolute_import, division, print_function
from . import _config
from .exceptions import FrozenAttributeError
def pipe(*setters):
"""
Run all *setters* and return the return value of the last one.
.. versionadded:: 20.1.0
"""
def wrapped_pipe(instance, attrib, new_value):
rv = new_value
for setter in setters:
rv = setter(instance, attrib, rv)
return rv
return wrapped_pipe
def frozen(_, __, ___):
"""
Prevent an attribute from being modified.
.. versionadded:: 20.1.0
"""
raise FrozenAttributeError()
def validate(instance, attrib, new_value):
"""
Run *attrib*'s validator on *new_value* if it has one.
.. versionadded:: 20.1.0
"""
if _config._run_validators is False:
return new_value
v = attrib.validator
if not v:
return new_value
v(instance, attrib, new_value)
return new_value
def convert(instance, attrib, new_value):
"""
Run *attrib*'s converter -- if it has one -- on *new_value* and return the
result.
.. versionadded:: 20.1.0
"""
c = attrib.converter
if c:
return c(new_value)
return new_value
NO_OP = object()
"""
Sentinel for disabling class-wide *on_setattr* hooks for certain attributes.
Does not work in `pipe` or within lists.
.. versionadded:: 20.1.0
"""

View file

@@ -0,0 +1,19 @@
from typing import Any, NewType, NoReturn, TypeVar, cast
from . import Attribute, _OnSetAttrType
_T = TypeVar("_T")
def frozen(
instance: Any, attribute: Attribute[Any], new_value: Any
) -> NoReturn: ...
def pipe(*setters: _OnSetAttrType) -> _OnSetAttrType: ...
def validate(instance: Any, attribute: Attribute[_T], new_value: _T) -> _T: ...
# convert is allowed to return Any, because converters can be chained using pipe.
def convert(
instance: Any, attribute: Attribute[Any], new_value: Any
) -> Any: ...
_NoOpType = NewType("_NoOpType", object)
NO_OP: _NoOpType

View file

@@ -0,0 +1,561 @@
# SPDX-License-Identifier: MIT
"""
Commonly useful validators.
"""
from __future__ import absolute_import, division, print_function
import operator
import re
from contextlib import contextmanager
from ._config import get_run_validators, set_run_validators
from ._make import _AndValidator, and_, attrib, attrs
from .exceptions import NotCallableError
try:
Pattern = re.Pattern
except AttributeError: # Python <3.7 lacks a Pattern type.
Pattern = type(re.compile(""))
__all__ = [
"and_",
"deep_iterable",
"deep_mapping",
"disabled",
"ge",
"get_disabled",
"gt",
"in_",
"instance_of",
"is_callable",
"le",
"lt",
"matches_re",
"max_len",
"optional",
"provides",
"set_disabled",
]
def set_disabled(disabled):
"""
Globally disable or enable running validators.
By default, they are run.
:param disabled: If ``True``, disable running all validators.
:type disabled: bool
.. warning::
This function is not thread-safe!
.. versionadded:: 21.3.0
"""
set_run_validators(not disabled)
def get_disabled():
"""
Return a bool indicating whether validators are currently disabled or not.
:return: ``True`` if validators are currently disabled.
:rtype: bool
.. versionadded:: 21.3.0
"""
return not get_run_validators()
@contextmanager
def disabled():
"""
Context manager that disables running validators within its context.
.. warning::
This context manager is not thread-safe!
.. versionadded:: 21.3.0
"""
set_run_validators(False)
try:
yield
finally:
set_run_validators(True)
@attrs(repr=False, slots=True, hash=True)
class _InstanceOfValidator(object):
type = attrib()
def __call__(self, inst, attr, value):
"""
We use a callable class to be able to change the ``__repr__``.
"""
if not isinstance(value, self.type):
raise TypeError(
"'{name}' must be {type!r} (got {value!r} that is a "
"{actual!r}).".format(
name=attr.name,
type=self.type,
actual=value.__class__,
value=value,
),
attr,
self.type,
value,
)
def __repr__(self):
return "<instance_of validator for type {type!r}>".format(
type=self.type
)
def instance_of(type):
"""
A validator that raises a `TypeError` if the initializer is called
with a wrong type for this particular attribute (checks are performed
using `isinstance`, therefore it's also valid to pass a tuple of types).
:param type: The type to check for.
:type type: type or tuple of types
:raises TypeError: With a human readable error message, the attribute
(of type `attrs.Attribute`), the expected type, and the value it
got.
"""
return _InstanceOfValidator(type)
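# Illustrative aside, not part of the vendored file: instance_of() accepts
# a tuple of types because the check is plain isinstance().
import attr
from attr import validators

@attr.define
class Clip:
    duration: float = attr.field(validator=validators.instance_of((int, float)))

Clip(24.0)  # passes
try:
    Clip("24")
except TypeError:
    pass  # wrong type raises with a readable message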
@attrs(repr=False, frozen=True, slots=True)
class _MatchesReValidator(object):
pattern = attrib()
match_func = attrib()
def __call__(self, inst, attr, value):
"""
We use a callable class to be able to change the ``__repr__``.
"""
if not self.match_func(value):
raise ValueError(
"'{name}' must match regex {pattern!r}"
" ({value!r} doesn't)".format(
name=attr.name, pattern=self.pattern.pattern, value=value
),
attr,
self.pattern,
value,
)
def __repr__(self):
return "<matches_re validator for pattern {pattern!r}>".format(
pattern=self.pattern
)
def matches_re(regex, flags=0, func=None):
r"""
A validator that raises `ValueError` if the initializer is called
with a string that doesn't match *regex*.
:param regex: a regex string or precompiled pattern to match against
:param int flags: flags that will be passed to the underlying re function
(default 0)
:param callable func: which underlying `re` function to call (options
are `re.fullmatch`, `re.search`, and `re.match`; the default
``None`` means `re.fullmatch`, or an emulation of it on Python 2).
For performance reasons, the function is not called directly; its
equivalent method on a pre-`re.compile`\ ed pattern is used instead.
.. versionadded:: 19.2.0
.. versionchanged:: 21.3.0 *regex* can be a pre-compiled pattern.
"""
fullmatch = getattr(re, "fullmatch", None)
valid_funcs = (fullmatch, None, re.search, re.match)
if func not in valid_funcs:
raise ValueError(
"'func' must be one of {}.".format(
", ".join(
sorted(
e and e.__name__ or "None" for e in set(valid_funcs)
)
)
)
)
if isinstance(regex, Pattern):
if flags:
raise TypeError(
"'flags' can only be used with a string pattern; "
"pass flags to re.compile() instead"
)
pattern = regex
else:
pattern = re.compile(regex, flags)
if func is re.match:
match_func = pattern.match
elif func is re.search:
match_func = pattern.search
elif fullmatch:
match_func = pattern.fullmatch
else: # Python 2 fullmatch emulation (https://bugs.python.org/issue16203)
pattern = re.compile(
r"(?:{})\Z".format(pattern.pattern), pattern.flags
)
match_func = pattern.match
return _MatchesReValidator(pattern, match_func)
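# Illustrative aside, not part of the vendored file: matches_re() with a
# pre-compiled pattern (accepted since 21.3.0); the default func is
# fullmatch, so partial matches fail.
import re

import attr
from attr import validators

SHOT_CODE = re.compile(r"sh\d{3}")

@attr.define
class ShotRef:
    code: str = attr.field(validator=validators.matches_re(SHOT_CODE))

ShotRef("sh010")  # full match, passes
try:
    ShotRef("shot010")
except ValueError:
    pass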
@attrs(repr=False, slots=True, hash=True)
class _ProvidesValidator(object):
interface = attrib()
def __call__(self, inst, attr, value):
"""
We use a callable class to be able to change the ``__repr__``.
"""
if not self.interface.providedBy(value):
raise TypeError(
"'{name}' must provide {interface!r} which {value!r} "
"doesn't.".format(
name=attr.name, interface=self.interface, value=value
),
attr,
self.interface,
value,
)
def __repr__(self):
return "<provides validator for interface {interface!r}>".format(
interface=self.interface
)
def provides(interface):
"""
A validator that raises a `TypeError` if the initializer is called
with an object that does not provide the requested *interface* (checks are
performed using ``interface.providedBy(value)``; see `zope.interface
<https://zopeinterface.readthedocs.io/en/latest/>`_).
:param interface: The interface to check for.
:type interface: ``zope.interface.Interface``
:raises TypeError: With a human readable error message, the attribute
(of type `attrs.Attribute`), the expected interface, and the
value it got.
"""
return _ProvidesValidator(interface)
@attrs(repr=False, slots=True, hash=True)
class _OptionalValidator(object):
validator = attrib()
def __call__(self, inst, attr, value):
if value is None:
return
self.validator(inst, attr, value)
def __repr__(self):
return "<optional validator for {what} or None>".format(
what=repr(self.validator)
)
def optional(validator):
"""
A validator that makes an attribute optional. An optional attribute is one
which can be set to ``None`` in addition to satisfying the requirements of
the sub-validator.
:param validator: A validator (or a list of validators) that is used for
non-``None`` values.
:type validator: callable or `list` of callables.
.. versionadded:: 15.1.0
.. versionchanged:: 17.1.0 *validator* can be a list of validators.
"""
if isinstance(validator, list):
return _OptionalValidator(_AndValidator(validator))
return _OptionalValidator(validator)
@attrs(repr=False, slots=True, hash=True)
class _InValidator(object):
options = attrib()
def __call__(self, inst, attr, value):
try:
in_options = value in self.options
except TypeError: # e.g. `1 in "abc"`
in_options = False
if not in_options:
raise ValueError(
"'{name}' must be in {options!r} (got {value!r})".format(
name=attr.name, options=self.options, value=value
)
)
def __repr__(self):
return "<in_ validator with options {options!r}>".format(
options=self.options
)
def in_(options):
"""
A validator that raises a `ValueError` if the initializer is called
with a value that does not belong in the options provided. The check is
performed using ``value in options``.
:param options: Allowed options.
:type options: list, tuple, `enum.Enum`, ...
:raises ValueError: With a human readable error message, the attribute (of
type `attrs.Attribute`), the expected options, and the value it
got.
.. versionadded:: 17.1.0
"""
return _InValidator(options)
@attrs(repr=False, slots=False, hash=True)
class _IsCallableValidator(object):
def __call__(self, inst, attr, value):
"""
We use a callable class to be able to change the ``__repr__``.
"""
if not callable(value):
message = (
"'{name}' must be callable "
"(got {value!r} that is a {actual!r})."
)
raise NotCallableError(
msg=message.format(
name=attr.name, value=value, actual=value.__class__
),
value=value,
)
def __repr__(self):
return "<is_callable validator>"
def is_callable():
"""
A validator that raises an `attr.exceptions.NotCallableError` if the
initializer is called with a value for this particular attribute
that is not callable.
.. versionadded:: 19.1.0
:raises `attr.exceptions.NotCallableError`: With a human readable error
message containing the attribute (`attrs.Attribute`) name,
and the value it got.
"""
return _IsCallableValidator()
@attrs(repr=False, slots=True, hash=True)
class _DeepIterable(object):
member_validator = attrib(validator=is_callable())
iterable_validator = attrib(
default=None, validator=optional(is_callable())
)
def __call__(self, inst, attr, value):
"""
We use a callable class to be able to change the ``__repr__``.
"""
if self.iterable_validator is not None:
self.iterable_validator(inst, attr, value)
for member in value:
self.member_validator(inst, attr, member)
def __repr__(self):
iterable_identifier = (
""
if self.iterable_validator is None
else " {iterable!r}".format(iterable=self.iterable_validator)
)
return (
"<deep_iterable validator for{iterable_identifier}"
" iterables of {member!r}>"
).format(
iterable_identifier=iterable_identifier,
member=self.member_validator,
)
def deep_iterable(member_validator, iterable_validator=None):
"""
A validator that performs deep validation of an iterable.
:param member_validator: Validator to apply to iterable members
:param iterable_validator: Validator to apply to iterable itself
(optional)
.. versionadded:: 19.1.0
:raises TypeError: if any sub-validators fail
"""
return _DeepIterable(member_validator, iterable_validator)
@attrs(repr=False, slots=True, hash=True)
class _DeepMapping(object):
key_validator = attrib(validator=is_callable())
value_validator = attrib(validator=is_callable())
mapping_validator = attrib(default=None, validator=optional(is_callable()))
def __call__(self, inst, attr, value):
"""
We use a callable class to be able to change the ``__repr__``.
"""
if self.mapping_validator is not None:
self.mapping_validator(inst, attr, value)
for key in value:
self.key_validator(inst, attr, key)
self.value_validator(inst, attr, value[key])
def __repr__(self):
return (
"<deep_mapping validator for objects mapping {key!r} to {value!r}>"
).format(key=self.key_validator, value=self.value_validator)
def deep_mapping(key_validator, value_validator, mapping_validator=None):
"""
A validator that performs deep validation of a dictionary.
:param key_validator: Validator to apply to dictionary keys
:param value_validator: Validator to apply to dictionary values
:param mapping_validator: Validator to apply to top-level mapping
attribute (optional)
.. versionadded:: 19.1.0
:raises TypeError: if any sub-validators fail
"""
return _DeepMapping(key_validator, value_validator, mapping_validator)
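# Illustrative aside, not part of the vendored file: the two deep
# validators combined; member/key/value checks reuse instance_of().
import attr
from attr import validators

@attr.define
class Manifest:
    tags: list = attr.field(
        validator=validators.deep_iterable(
            member_validator=validators.instance_of(str),
            iterable_validator=validators.instance_of(list),
        )
    )
    frames: dict = attr.field(
        validator=validators.deep_mapping(
            key_validator=validators.instance_of(str),
            value_validator=validators.instance_of(int),
        )
    )

Manifest(tags=["comp", "final"], frames={"sh010": 1001})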
@attrs(repr=False, frozen=True, slots=True)
class _NumberValidator(object):
bound = attrib()
compare_op = attrib()
compare_func = attrib()
def __call__(self, inst, attr, value):
"""
We use a callable class to be able to change the ``__repr__``.
"""
if not self.compare_func(value, self.bound):
raise ValueError(
"'{name}' must be {op} {bound}: {value}".format(
name=attr.name,
op=self.compare_op,
bound=self.bound,
value=value,
)
)
def __repr__(self):
return "<Validator for x {op} {bound}>".format(
op=self.compare_op, bound=self.bound
)
def lt(val):
"""
A validator that raises `ValueError` if the initializer is called
with a number larger than or equal to *val*.
:param val: Exclusive upper bound for values
.. versionadded:: 21.3.0
"""
return _NumberValidator(val, "<", operator.lt)
def le(val):
"""
A validator that raises `ValueError` if the initializer is called
with a number greater than *val*.
:param val: Inclusive upper bound for values
.. versionadded:: 21.3.0
"""
return _NumberValidator(val, "<=", operator.le)
def ge(val):
"""
A validator that raises `ValueError` if the initializer is called
with a number smaller than *val*.
:param val: Inclusive lower bound for values
.. versionadded:: 21.3.0
"""
return _NumberValidator(val, ">=", operator.ge)
def gt(val):
"""
A validator that raises `ValueError` if the initializer is called
with a number smaller than or equal to *val*.
:param val: Exclusive lower bound for values
.. versionadded:: 21.3.0
"""
return _NumberValidator(val, ">", operator.gt)
@attrs(repr=False, frozen=True, slots=True)
class _MaxLengthValidator(object):
max_length = attrib()
def __call__(self, inst, attr, value):
"""
We use a callable class to be able to change the ``__repr__``.
"""
if len(value) > self.max_length:
raise ValueError(
"Length of '{name}' must be <= {max}: {len}".format(
name=attr.name, max=self.max_length, len=len(value)
)
)
def __repr__(self):
return "<max_len validator for {max}>".format(max=self.max_length)
def max_len(length):
"""
A validator that raises `ValueError` if the initializer is called
with a string or iterable that is longer than *length*.
:param int length: Maximum length of the string or iterable
.. versionadded:: 21.3.0
"""
return _MaxLengthValidator(length)
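# Illustrative aside, not part of the vendored file: the numeric-bound and
# length validators above; a list of validators is AND-ed together.
import attr
from attr import validators

@attr.define
class Review:
    rating: int = attr.field(validator=[validators.ge(1), validators.le(5)])
    note: str = attr.field(default="", validator=validators.max_len(140))

Review(rating=5, note="approved")
try:
    Review(rating=6)
except ValueError:
    pass  # 6 violates the <= 5 bound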

View file

@@ -0,0 +1,78 @@
from typing import (
Any,
AnyStr,
Callable,
Container,
ContextManager,
Iterable,
List,
Mapping,
Match,
Optional,
Pattern,
Tuple,
Type,
TypeVar,
Union,
overload,
)
from . import _ValidatorType
_T = TypeVar("_T")
_T1 = TypeVar("_T1")
_T2 = TypeVar("_T2")
_T3 = TypeVar("_T3")
_I = TypeVar("_I", bound=Iterable)
_K = TypeVar("_K")
_V = TypeVar("_V")
_M = TypeVar("_M", bound=Mapping)
def set_disabled(run: bool) -> None: ...
def get_disabled() -> bool: ...
def disabled() -> ContextManager[None]: ...
# To be more precise on instance_of, we use some overloads.
# If there are more than 3 items in the tuple then we fall back to Any
@overload
def instance_of(type: Type[_T]) -> _ValidatorType[_T]: ...
@overload
def instance_of(type: Tuple[Type[_T]]) -> _ValidatorType[_T]: ...
@overload
def instance_of(
type: Tuple[Type[_T1], Type[_T2]]
) -> _ValidatorType[Union[_T1, _T2]]: ...
@overload
def instance_of(
type: Tuple[Type[_T1], Type[_T2], Type[_T3]]
) -> _ValidatorType[Union[_T1, _T2, _T3]]: ...
@overload
def instance_of(type: Tuple[type, ...]) -> _ValidatorType[Any]: ...
def provides(interface: Any) -> _ValidatorType[Any]: ...
def optional(
validator: Union[_ValidatorType[_T], List[_ValidatorType[_T]]]
) -> _ValidatorType[Optional[_T]]: ...
def in_(options: Container[_T]) -> _ValidatorType[_T]: ...
def and_(*validators: _ValidatorType[_T]) -> _ValidatorType[_T]: ...
def matches_re(
regex: Union[Pattern[AnyStr], AnyStr],
flags: int = ...,
func: Optional[
Callable[[AnyStr, AnyStr, int], Optional[Match[AnyStr]]]
] = ...,
) -> _ValidatorType[AnyStr]: ...
def deep_iterable(
member_validator: _ValidatorType[_T],
iterable_validator: Optional[_ValidatorType[_I]] = ...,
) -> _ValidatorType[_I]: ...
def deep_mapping(
key_validator: _ValidatorType[_K],
value_validator: _ValidatorType[_V],
mapping_validator: Optional[_ValidatorType[_M]] = ...,
) -> _ValidatorType[_M]: ...
def is_callable() -> _ValidatorType[_T]: ...
def lt(val: _T) -> _ValidatorType[_T]: ...
def le(val: _T) -> _ValidatorType[_T]: ...
def ge(val: _T) -> _ValidatorType[_T]: ...
def gt(val: _T) -> _ValidatorType[_T]: ...
def max_len(length: int) -> _ValidatorType[_T]: ...

View file

@@ -0,0 +1,70 @@
# SPDX-License-Identifier: MIT
from attr import (
NOTHING,
Attribute,
Factory,
__author__,
__copyright__,
__description__,
__doc__,
__email__,
__license__,
__title__,
__url__,
__version__,
__version_info__,
assoc,
cmp_using,
define,
evolve,
field,
fields,
fields_dict,
frozen,
has,
make_class,
mutable,
resolve_types,
validate,
)
from attr._next_gen import asdict, astuple
from . import converters, exceptions, filters, setters, validators
__all__ = [
"__author__",
"__copyright__",
"__description__",
"__doc__",
"__email__",
"__license__",
"__title__",
"__url__",
"__version__",
"__version_info__",
"asdict",
"assoc",
"astuple",
"Attribute",
"cmp_using",
"converters",
"define",
"evolve",
"exceptions",
"Factory",
"field",
"fields_dict",
"fields",
"filters",
"frozen",
"has",
"make_class",
"mutable",
"NOTHING",
"resolve_types",
"setters",
"validate",
"validators",
]
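# Illustrative aside, not part of the vendored file: the ``attrs`` package
# simply re-exports the attr API under the modern names, so both import
# styles address the same implementation.
import attrs

@attrs.define
class Asset:
    name: str

assert attrs.has(Asset)
assert attrs.asdict(Asset("bunny")) == {"name": "bunny"}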

View file

@@ -0,0 +1,63 @@
from typing import (
Any,
Callable,
Dict,
Mapping,
Optional,
Sequence,
Tuple,
Type,
)
# Because we need to type our own stuff, we have to make everything from
# attr explicitly public too.
from attr import __author__ as __author__
from attr import __copyright__ as __copyright__
from attr import __description__ as __description__
from attr import __email__ as __email__
from attr import __license__ as __license__
from attr import __title__ as __title__
from attr import __url__ as __url__
from attr import __version__ as __version__
from attr import __version_info__ as __version_info__
from attr import _FilterType
from attr import assoc as assoc
from attr import Attribute as Attribute
from attr import define as define
from attr import evolve as evolve
from attr import Factory as Factory
from attr import exceptions as exceptions
from attr import field as field
from attr import fields as fields
from attr import fields_dict as fields_dict
from attr import frozen as frozen
from attr import has as has
from attr import make_class as make_class
from attr import mutable as mutable
from attr import NOTHING as NOTHING
from attr import resolve_types as resolve_types
from attr import setters as setters
from attr import validate as validate
from attr import validators as validators
# TODO: see definition of attr.asdict/astuple
def asdict(
inst: Any,
recurse: bool = ...,
filter: Optional[_FilterType[Any]] = ...,
dict_factory: Type[Mapping[Any, Any]] = ...,
retain_collection_types: bool = ...,
value_serializer: Optional[
Callable[[type, Attribute[Any], Any], Any]
] = ...,
tuple_keys: bool = ...,
) -> Dict[str, Any]: ...
# TODO: add support for returning NamedTuple from the mypy plugin
def astuple(
inst: Any,
recurse: bool = ...,
filter: Optional[_FilterType[Any]] = ...,
tuple_factory: Type[Sequence[Any]] = ...,
retain_collection_types: bool = ...,
) -> Tuple[Any, ...]: ...

View file

@@ -0,0 +1,3 @@
# SPDX-License-Identifier: MIT
from attr.converters import * # noqa

View file

@@ -0,0 +1,3 @@
# SPDX-License-Identifier: MIT
from attr.exceptions import * # noqa

View file

@@ -0,0 +1,3 @@
# SPDX-License-Identifier: MIT
from attr.filters import * # noqa

Some files were not shown because too many files have changed in this diff