diff --git a/.gitmodules b/.gitmodules index bac3132b77..fe93791c4e 100644 --- a/.gitmodules +++ b/.gitmodules @@ -4,7 +4,4 @@ [submodule "tools/modules/powershell/PSWriteColor"] path = tools/modules/powershell/PSWriteColor - url = https://github.com/EvotecIT/PSWriteColor.git -[submodule "vendor/configs/OpenColorIO-Configs"] - path = vendor/configs/OpenColorIO-Configs - url = https://github.com/imageworks/OpenColorIO-Configs + url = https://github.com/EvotecIT/PSWriteColor.git \ No newline at end of file diff --git a/CHANGELOG.md b/CHANGELOG.md index 2adb4ac154..65a3cb27e6 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -1,22 +1,58 @@ # Changelog -## [3.13.1-nightly.2](https://github.com/pypeclub/OpenPype/tree/HEAD) +## [3.14.1-nightly.1](https://github.com/pypeclub/OpenPype/tree/HEAD) -[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.13.0...HEAD) +[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.14.0...HEAD) + +**🚀 Enhancements** + +- Ftrack: More logs related to auto sync value change [\#3671](https://github.com/pypeclub/OpenPype/pull/3671) **🐛 Bug fixes** +- RoyalRender: handle host name that is not set [\#3695](https://github.com/pypeclub/OpenPype/pull/3695) + +## [3.14.0](https://github.com/pypeclub/OpenPype/tree/3.14.0) (2022-08-18) + +[Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.14.0-nightly.1...3.14.0) + +**🆕 New features** + +- Maya: Build workfile by template [\#3578](https://github.com/pypeclub/OpenPype/pull/3578) + +**🚀 Enhancements** + +- Ftrack: Addiotional component metadata [\#3685](https://github.com/pypeclub/OpenPype/pull/3685) +- Ftrack: Set task status on farm publishing [\#3680](https://github.com/pypeclub/OpenPype/pull/3680) +- Ftrack: Set task status on task creation in integrate hierarchy [\#3675](https://github.com/pypeclub/OpenPype/pull/3675) +- Maya: Disable rendering of all lights for render instances submitted through Deadline. 
[\#3661](https://github.com/pypeclub/OpenPype/pull/3661) +- General: Optimized OCIO configs [\#3650](https://github.com/pypeclub/OpenPype/pull/3650) + +**🐛 Bug fixes** + +- General: Switch from hero version to versioned works [\#3691](https://github.com/pypeclub/OpenPype/pull/3691) +- General: Fix finding of last version [\#3656](https://github.com/pypeclub/OpenPype/pull/3656) +- General: Extract Review can scale with pixel aspect ratio [\#3644](https://github.com/pypeclub/OpenPype/pull/3644) +- Maya: Refactor moved usage of CreateRender settings [\#3643](https://github.com/pypeclub/OpenPype/pull/3643) - General: Hero version representations have full context [\#3638](https://github.com/pypeclub/OpenPype/pull/3638) +- Nuke: color settings for render write node is working now [\#3632](https://github.com/pypeclub/OpenPype/pull/3632) - Maya: FBX support for update in reference loader [\#3631](https://github.com/pypeclub/OpenPype/pull/3631) **🔀 Refactored code** +- General: Use client projects getter [\#3673](https://github.com/pypeclub/OpenPype/pull/3673) +- Resolve: Match folder structure to other hosts [\#3653](https://github.com/pypeclub/OpenPype/pull/3653) +- Maya: Hosts as modules [\#3647](https://github.com/pypeclub/OpenPype/pull/3647) - TimersManager: Plugins are in timers manager module [\#3639](https://github.com/pypeclub/OpenPype/pull/3639) - General: Move workfiles functions into pipeline [\#3637](https://github.com/pypeclub/OpenPype/pull/3637) +- General: Workfiles builder using query functions [\#3598](https://github.com/pypeclub/OpenPype/pull/3598) **Merged pull requests:** +- Deadline: Global job pre load is not Pype 2 compatible [\#3666](https://github.com/pypeclub/OpenPype/pull/3666) +- Maya: Remove unused get current renderer logic [\#3645](https://github.com/pypeclub/OpenPype/pull/3645) - Kitsu|Fix: Movie project type fails & first loop children names [\#3636](https://github.com/pypeclub/OpenPype/pull/3636) +- fix the bug of failing to extract look when UDIMs format used in AiImage [\#3628](https://github.com/pypeclub/OpenPype/pull/3628) ## [3.13.0](https://github.com/pypeclub/OpenPype/tree/3.13.0) (2022-08-09) @@ -55,7 +91,6 @@ - General: Update imports in start script [\#3579](https://github.com/pypeclub/OpenPype/pull/3579) - Nuke: render family integration consistency [\#3576](https://github.com/pypeclub/OpenPype/pull/3576) - Ftrack: Handle missing published path in integrator [\#3570](https://github.com/pypeclub/OpenPype/pull/3570) -- Maya: fix Review image plane attribute [\#3569](https://github.com/pypeclub/OpenPype/pull/3569) - Nuke: publish existing frames with slate with correct range [\#3555](https://github.com/pypeclub/OpenPype/pull/3555) **🔀 Refactored code** @@ -84,35 +119,20 @@ **🚀 Enhancements** - General: Global thumbnail extractor is ready for more cases [\#3561](https://github.com/pypeclub/OpenPype/pull/3561) -- Maya: add additional validators to Settings [\#3540](https://github.com/pypeclub/OpenPype/pull/3540) -- General: Interactive console in cli [\#3526](https://github.com/pypeclub/OpenPype/pull/3526) -- Ftrack: Automatic daily review session creation can define trigger hour [\#3516](https://github.com/pypeclub/OpenPype/pull/3516) **🐛 Bug fixes** +- Maya: fix Review image plane attribute [\#3569](https://github.com/pypeclub/OpenPype/pull/3569) - Maya: Fix animated attributes \(ie. overscan\) on loaded cameras breaking review publishing. 
[\#3562](https://github.com/pypeclub/OpenPype/pull/3562) - NewPublisher: Python 2 compatible html escape [\#3559](https://github.com/pypeclub/OpenPype/pull/3559) - Remove invalid submodules from `/vendor` [\#3557](https://github.com/pypeclub/OpenPype/pull/3557) - General: Remove hosts filter on integrator plugins [\#3556](https://github.com/pypeclub/OpenPype/pull/3556) - Settings: Clean default values of environments [\#3550](https://github.com/pypeclub/OpenPype/pull/3550) - Module interfaces: Fix import error [\#3547](https://github.com/pypeclub/OpenPype/pull/3547) -- Workfiles tool: Show of tool and it's flags [\#3539](https://github.com/pypeclub/OpenPype/pull/3539) -- General: Create workfile documents works again [\#3538](https://github.com/pypeclub/OpenPype/pull/3538) -- Additional fixes for powershell scripts [\#3525](https://github.com/pypeclub/OpenPype/pull/3525) -- Maya: Added wrapper around cmds.setAttr [\#3523](https://github.com/pypeclub/OpenPype/pull/3523) -- Nuke: double slate [\#3521](https://github.com/pypeclub/OpenPype/pull/3521) -- General: Fix hash of centos oiio archive [\#3519](https://github.com/pypeclub/OpenPype/pull/3519) -- Maya: Renderman display output fix [\#3514](https://github.com/pypeclub/OpenPype/pull/3514) -- TrayPublisher: Simple creation enhancements and fixes [\#3513](https://github.com/pypeclub/OpenPype/pull/3513) **🔀 Refactored code** - General: Use query functions in integrator [\#3563](https://github.com/pypeclub/OpenPype/pull/3563) -- General: Mongo core connection moved to client [\#3531](https://github.com/pypeclub/OpenPype/pull/3531) -- Refactor Integrate Asset [\#3530](https://github.com/pypeclub/OpenPype/pull/3530) -- General: Client docstrings cleanup [\#3529](https://github.com/pypeclub/OpenPype/pull/3529) -- General: Move load related functions into pipeline [\#3527](https://github.com/pypeclub/OpenPype/pull/3527) -- General: Get current context document functions [\#3522](https://github.com/pypeclub/OpenPype/pull/3522) **Merged pull requests:** diff --git a/igniter/bootstrap_repos.py b/igniter/bootstrap_repos.py index 750b2f1bf7..c5003b062e 100644 --- a/igniter/bootstrap_repos.py +++ b/igniter/bootstrap_repos.py @@ -381,7 +381,7 @@ class OpenPypeVersion(semver.VersionInfo): @classmethod def get_local_versions( cls, production: bool = None, - staging: bool = None, compatible_with: OpenPypeVersion = None + staging: bool = None ) -> List: """Get all versions available on this machine. @@ -391,8 +391,10 @@ class OpenPypeVersion(semver.VersionInfo): Args: production (bool): Return production versions. staging (bool): Return staging versions. - compatible_with (OpenPypeVersion): Return only those compatible - with specified version. + + Returns: + list: of compatible versions available on the machine. 
+ """ # Return all local versions if arguments are set to None if production is None and staging is None: @@ -411,16 +413,7 @@ class OpenPypeVersion(semver.VersionInfo): # DEPRECATED: backwards compatible way to look for versions in root dir_to_search = Path(user_data_dir("openpype", "pypeclub")) - versions = OpenPypeVersion.get_versions_from_directory( - dir_to_search, compatible_with=compatible_with - ) - if compatible_with: - dir_to_search = Path( - user_data_dir("openpype", "pypeclub")) / f"{compatible_with.major}.{compatible_with.minor}" # noqa - versions += OpenPypeVersion.get_versions_from_directory( - dir_to_search, compatible_with=compatible_with - ) - + versions = OpenPypeVersion.get_versions_from_directory(dir_to_search) filtered_versions = [] for version in versions: @@ -434,7 +427,7 @@ class OpenPypeVersion(semver.VersionInfo): @classmethod def get_remote_versions( cls, production: bool = None, - staging: bool = None, compatible_with: OpenPypeVersion = None + staging: bool = None ) -> List: """Get all versions available in OpenPype Path. @@ -444,8 +437,7 @@ class OpenPypeVersion(semver.VersionInfo): Args: production (bool): Return production versions. staging (bool): Return staging versions. - compatible_with (OpenPypeVersion): Return only those compatible - with specified version. + """ # Return all local versions if arguments are set to None if production is None and staging is None: @@ -479,13 +471,7 @@ class OpenPypeVersion(semver.VersionInfo): if not dir_to_search: return [] - # DEPRECATED: look for version in root directory - versions = cls.get_versions_from_directory( - dir_to_search, compatible_with=compatible_with) - if compatible_with: - dir_to_search = dir_to_search / f"{compatible_with.major}.{compatible_with.minor}" # noqa - versions += cls.get_versions_from_directory( - dir_to_search, compatible_with=compatible_with) + versions = cls.get_versions_from_directory(dir_to_search) filtered_versions = [] for version in versions: @@ -498,14 +484,11 @@ class OpenPypeVersion(semver.VersionInfo): @staticmethod def get_versions_from_directory( - openpype_dir: Path, - compatible_with: OpenPypeVersion = None) -> List: + openpype_dir: Path) -> List: """Get all detected OpenPype versions in directory. Args: openpype_dir (Path): Directory to scan. - compatible_with (OpenPypeVersion): Return only versions compatible - with build version specified as OpenPypeVersion. Returns: list of OpenPypeVersion @@ -514,15 +497,22 @@ class OpenPypeVersion(semver.VersionInfo): ValueError: if invalid path is specified. """ - _openpype_versions = [] + openpype_versions = [] if not openpype_dir.exists() and not openpype_dir.is_dir(): - return _openpype_versions + return openpype_versions # iterate over directory in first level and find all that might # contain OpenPype. for item in openpype_dir.iterdir(): + # if the item is directory with major.minor version, dive deeper - # if file, strip extension, in case of dir not. + if item.is_dir() and re.match(r"^\d+\.\d+$", item.name): + _versions = OpenPypeVersion.get_versions_from_directory( + item) + if _versions: + openpype_versions += _versions + + # if file exists, strip extension, in case of dir don't. 
name = item.name if item.is_dir() else item.stem result = OpenPypeVersion.version_in_str(name) @@ -540,14 +530,10 @@ class OpenPypeVersion(semver.VersionInfo): )[0]: continue - if compatible_with and not detected_version.is_compatible( - compatible_with): - continue - detected_version.path = item - _openpype_versions.append(detected_version) + openpype_versions.append(detected_version) - return sorted(_openpype_versions) + return sorted(openpype_versions) @staticmethod def get_installed_version_str() -> str: @@ -575,15 +561,14 @@ class OpenPypeVersion(semver.VersionInfo): def get_latest_version( staging: bool = False, local: bool = None, - remote: bool = None, - compatible_with: OpenPypeVersion = None + remote: bool = None ) -> Union[OpenPypeVersion, None]: - """Get latest available version. + """Get the latest available version. The version does not contain information about path and source. - This is utility version to get latest version from all found. Build - version is not listed if staging is enabled. + This is utility version to get the latest version from all found. + Build version is not listed if staging is enabled. Arguments 'local' and 'remote' define if local and remote repository versions are used. All versions are used if both are not set (or set @@ -595,8 +580,9 @@ class OpenPypeVersion(semver.VersionInfo): staging (bool, optional): List staging versions if True. local (bool, optional): List local versions if True. remote (bool, optional): List remote versions if True. - compatible_with (OpenPypeVersion, optional) Return only version - compatible with compatible_with. + + Returns: + Latest OpenPypeVersion or None """ if local is None and remote is None: @@ -628,12 +614,7 @@ class OpenPypeVersion(semver.VersionInfo): return None all_versions.sort() - latest_version: OpenPypeVersion - latest_version = all_versions[-1] - if compatible_with and not latest_version.is_compatible( - compatible_with): - return None - return latest_version + return all_versions[-1] @classmethod def get_expected_studio_version(cls, staging=False, global_settings=None): @@ -764,9 +745,9 @@ class BootstrapRepos: self, repo_dir: Path = None) -> Union[OpenPypeVersion, None]: """Copy zip created from OpenPype repositories to user data dir. - This detect OpenPype version either in local "live" OpenPype + This detects OpenPype version either in local "live" OpenPype repository or in user provided path. Then it will zip it in temporary - directory and finally it will move it to destination which is user + directory, and finally it will move it to destination which is user data directory. Existing files will be replaced. Args: @@ -777,7 +758,7 @@ class BootstrapRepos: """ # if repo dir is not set, we detect local "live" OpenPype repository - # version and use it as a source. Otherwise repo_dir is user + # version and use it as a source. Otherwise, repo_dir is user # entered location. if repo_dir: version = self.get_version(repo_dir) @@ -1141,28 +1122,27 @@ class BootstrapRepos: @staticmethod def find_openpype_version( version: Union[str, OpenPypeVersion], - staging: bool, - compatible_with: OpenPypeVersion = None + staging: bool ) -> Union[OpenPypeVersion, None]: """Find location of specified OpenPype version. Args: version (Union[str, OpenPypeVersion): Version to find. staging (bool): Filter staging versions. - compatible_with (OpenPypeVersion, optional): Find only - versions compatible with specified one. + + Returns: + requested OpenPypeVersion. 
""" + installed_version = OpenPypeVersion.get_installed_version() if isinstance(version, str): version = OpenPypeVersion(version=version) - installed_version = OpenPypeVersion.get_installed_version() if installed_version == version: return installed_version local_versions = OpenPypeVersion.get_local_versions( - staging=staging, production=not staging, - compatible_with=compatible_with + staging=staging, production=not staging ) zip_version = None for local_version in local_versions: @@ -1176,8 +1156,7 @@ class BootstrapRepos: return zip_version remote_versions = OpenPypeVersion.get_remote_versions( - staging=staging, production=not staging, - compatible_with=compatible_with + staging=staging, production=not staging ) for remote_version in remote_versions: if remote_version == version: @@ -1186,13 +1165,23 @@ class BootstrapRepos: @staticmethod def find_latest_openpype_version( - staging, compatible_with: OpenPypeVersion = None): + staging: bool + ) -> Union[OpenPypeVersion, None]: + """Find the latest available OpenPype version in all location. + + Args: + staging (bool): True to look for staging versions. + + Returns: + Latest OpenPype version on None if nothing was found. + + """ installed_version = OpenPypeVersion.get_installed_version() local_versions = OpenPypeVersion.get_local_versions( - staging=staging, compatible_with=compatible_with + staging=staging ) remote_versions = OpenPypeVersion.get_remote_versions( - staging=staging, compatible_with=compatible_with + staging=staging ) all_versions = local_versions + remote_versions if not staging: @@ -1217,8 +1206,7 @@ class BootstrapRepos: self, openpype_path: Union[Path, str] = None, staging: bool = False, - include_zips: bool = False, - compatible_with: OpenPypeVersion = None + include_zips: bool = False ) -> Union[List[OpenPypeVersion], None]: """Get ordered dict of detected OpenPype version. @@ -1235,8 +1223,6 @@ class BootstrapRepos: otherwise. include_zips (bool, optional): If set True it will try to find OpenPype in zip files in given directory. - compatible_with (OpenPypeVersion, optional): Find only those - versions compatible with the one specified. Returns: dict of Path: Dictionary of detected OpenPype version. @@ -1255,52 +1241,34 @@ class BootstrapRepos: ("Finding OpenPype in non-filesystem locations is" " not implemented yet.")) - version_dir = "" - if compatible_with: - version_dir = f"{compatible_with.major}.{compatible_with.minor}" - # if checks bellow for OPENPYPE_PATH and registry fails, use data_dir # DEPRECATED: lookup in root of this folder is deprecated in favour # of major.minor sub-folders. - dirs_to_search = [ - self.data_dir - ] - if compatible_with: - dirs_to_search.append(self.data_dir / version_dir) + dirs_to_search = [self.data_dir] if openpype_path: dirs_to_search = [openpype_path] - - if compatible_with: - dirs_to_search.append(openpype_path / version_dir) - else: + elif os.getenv("OPENPYPE_PATH") \ + and Path(os.getenv("OPENPYPE_PATH")).exists(): # first try OPENPYPE_PATH and if that is not available, # try registry. 
- if os.getenv("OPENPYPE_PATH") \ - and Path(os.getenv("OPENPYPE_PATH")).exists(): - dirs_to_search = [Path(os.getenv("OPENPYPE_PATH"))] + dirs_to_search = [Path(os.getenv("OPENPYPE_PATH"))] + else: + try: + registry_dir = Path( + str(self.registry.get_item("openPypePath"))) + if registry_dir.exists(): + dirs_to_search = [registry_dir] - if compatible_with: - dirs_to_search.append( - Path(os.getenv("OPENPYPE_PATH")) / version_dir) - else: - try: - registry_dir = Path( - str(self.registry.get_item("openPypePath"))) - if registry_dir.exists(): - dirs_to_search = [registry_dir] - if compatible_with: - dirs_to_search.append(registry_dir / version_dir) - - except ValueError: - # nothing found in registry, we'll use data dir - pass + except ValueError: + # nothing found in registry, we'll use data dir + pass openpype_versions = [] for dir_to_search in dirs_to_search: try: openpype_versions += self.get_openpype_versions( - dir_to_search, staging, compatible_with=compatible_with) + dir_to_search, staging) except ValueError: # location is invalid, skip it pass @@ -1668,15 +1636,12 @@ class BootstrapRepos: def get_openpype_versions( self, openpype_dir: Path, - staging: bool = False, - compatible_with: OpenPypeVersion = None) -> list: + staging: bool = False) -> list: """Get all detected OpenPype versions in directory. Args: openpype_dir (Path): Directory to scan. staging (bool, optional): Find staging versions if True. - compatible_with (OpenPypeVersion, optional): Get only versions - compatible with the one specified. Returns: list of OpenPypeVersion @@ -1688,12 +1653,18 @@ class BootstrapRepos: if not openpype_dir.exists() and not openpype_dir.is_dir(): raise ValueError(f"specified directory {openpype_dir} is invalid") - _openpype_versions = [] + openpype_versions = [] # iterate over directory in first level and find all that might # contain OpenPype. for item in openpype_dir.iterdir(): + # if the item is directory with major.minor version, dive deeper + if item.is_dir() and re.match(r"^\d+\.\d+$", item.name): + _versions = self.get_openpype_versions( + item, staging=staging) + if _versions: + openpype_versions += _versions - # if file, strip extension, in case of dir not. + # if it is file, strip extension, in case of dir don't. name = item.name if item.is_dir() else item.stem result = OpenPypeVersion.version_in_str(name) @@ -1711,18 +1682,14 @@ class BootstrapRepos: ): continue - if compatible_with and \ - not detected_version.is_compatible(compatible_with): - continue - detected_version.path = item if staging and detected_version.is_staging(): - _openpype_versions.append(detected_version) + openpype_versions.append(detected_version) if not staging and not detected_version.is_staging(): - _openpype_versions.append(detected_version) + openpype_versions.append(detected_version) - return sorted(_openpype_versions) + return sorted(openpype_versions) class OpenPypeVersionExists(Exception): diff --git a/igniter/install_thread.py b/igniter/install_thread.py index 8e31f8cb8f..0cccf664e7 100644 --- a/igniter/install_thread.py +++ b/igniter/install_thread.py @@ -62,7 +62,7 @@ class InstallThread(QThread): progress_callback=self.set_progress, message=self.message) local_version = OpenPypeVersion.get_installed_version_str() - # if user did entered nothing, we install OpenPype from local version. + # if user did enter nothing, we install OpenPype from local version. # zip content of `repos`, copy it to user data dir and append # version to it. 
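Reviewer note: the hunk below in `install_thread.py` filters versions detected by `find_openpype` against the running build before comparing the latest one. A sketch of that filtering, under the assumption that "compatible" means a matching `major.minor` pair (the `SimpleVersion` tuple is a stand-in; the real check is `OpenPypeVersion.is_compatible`):

```python
from collections import namedtuple

SimpleVersion = namedtuple("SimpleVersion", "major minor patch")


def filter_compatible(detected, installed):
    """Keep only versions sharing major.minor with the installed build."""
    return [
        v for v in detected
        if (v.major, v.minor) == (installed.major, installed.minor)
    ]


# usage sketch
assert filter_compatible(
    [SimpleVersion(3, 13, 2), SimpleVersion(3, 14, 1)],
    SimpleVersion(3, 14, 0),
) == [SimpleVersion(3, 14, 1)]
```

One thing worth double-checking in the new block: if the compatibility filter removes every detected version, the later `detected[-1]` comparison would raise an `IndexError`, so an emptiness guard may be warranted.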
if not self._path: @@ -93,6 +93,23 @@ class InstallThread(QThread): detected = bs.find_openpype(include_zips=True) if detected: + if not OpenPypeVersion.get_installed_version().is_compatible( + detected[-1]): + self.message.emit(( + f"Latest detected version {detected[-1]} " + "is not compatible with the currently running " + f"{local_version}" + ), True) + self.message.emit(( + "Filtering detected versions to compatible ones..." + ), False) + + detected = [ + version for version in detected + if version.is_compatible( + OpenPypeVersion.get_installed_version()) + ] + if OpenPypeVersion( version=local_version, path=Path()) < detected[-1]: self.message.emit(( diff --git a/openpype/cli.py b/openpype/cli.py index ffe288040e..398d1a94c0 100644 --- a/openpype/cli.py +++ b/openpype/cli.py @@ -40,18 +40,6 @@ def settings(dev): PypeCommands().launch_settings_gui(dev) -@main.command() -def standalonepublisher(): - """Show Pype Standalone publisher UI.""" - PypeCommands().launch_standalone_publisher() - - -@main.command() -def traypublisher(): - """Show new OpenPype Standalone publisher UI.""" - PypeCommands().launch_traypublisher() - - @main.command() def tray(): """Launch pype tray. diff --git a/openpype/client/entities.py b/openpype/client/entities.py index 7362f57d7f..f1f1d30214 100644 --- a/openpype/client/entities.py +++ b/openpype/client/entities.py @@ -6,6 +6,7 @@ that has project name as a context (e.g. on 'ProjectEntity'?). + We will need more specific functions doing wery specific queires really fast. """ +import re import collections import six @@ -1009,17 +1010,70 @@ def get_representation_by_name( return conn.find_one(query_filter, _prepare_fields(fields)) +def _flatten_dict(data): + flatten_queue = collections.deque() + flatten_queue.append(data) + output = {} + while flatten_queue: + item = flatten_queue.popleft() + for key, value in item.items(): + if not isinstance(value, dict): + output[key] = value + continue + + tmp = {} + for subkey, subvalue in value.items(): + new_key = "{}.{}".format(key, subkey) + tmp[new_key] = subvalue + flatten_queue.append(tmp) + return output + + +def _regex_filters(filters): + output = [] + for key, value in filters.items(): + regexes = [] + a_values = [] + if isinstance(value, re.Pattern): + regexes.append(value) + elif isinstance(value, (list, tuple, set)): + for item in value: + if isinstance(item, re.Pattern): + regexes.append(item) + else: + a_values.append(item) + else: + a_values.append(value) + + key_filters = [] + if len(a_values) == 1: + key_filters.append({key: a_values[0]}) + elif a_values: + key_filters.append({key: {"$in": a_values}}) + + for regex in regexes: + key_filters.append({key: {"$regex": regex}}) + + if len(key_filters) == 1: + output.append(key_filters[0]) + else: + output.append({"$or": key_filters}) + + return output + + def _get_representations( project_name, representation_ids, representation_names, version_ids, - extensions, + context_filters, names_by_version_ids, standard, archived, fields ): + default_output = [] repre_types = [] if standard: repre_types.append("representation") @@ -1027,7 +1081,7 @@ def _get_representations( repre_types.append("archived_representation") if not repre_types: - return [] + return default_output if len(repre_types) == 1: query_filter = {"type": repre_types[0]} @@ -1037,25 +1091,21 @@ def _get_representations( if representation_ids is not None: representation_ids = _convert_ids(representation_ids) if not representation_ids: - return [] + return default_output query_filter["_id"] = {"$in": 
representation_ids} if representation_names is not None: if not representation_names: - return [] + return default_output query_filter["name"] = {"$in": list(representation_names)} if version_ids is not None: version_ids = _convert_ids(version_ids) if not version_ids: - return [] + return default_output query_filter["parent"] = {"$in": version_ids} - if extensions is not None: - if not extensions: - return [] - query_filter["context.ext"] = {"$in": list(extensions)} - + or_queries = [] if names_by_version_ids is not None: or_query = [] for version_id, names in names_by_version_ids.items(): @@ -1065,8 +1115,36 @@ def _get_representations( "name": {"$in": list(names)} }) if not or_query: + return default_output + or_queries.append(or_query) + + if context_filters is not None: + if not context_filters: return [] - query_filter["$or"] = or_query + _flatten_filters = _flatten_dict(context_filters) + flatten_filters = {} + for key, value in _flatten_filters.items(): + if not key.startswith("context"): + key = "context.{}".format(key) + flatten_filters[key] = value + + for item in _regex_filters(flatten_filters): + for key, value in item.items(): + if key != "$or": + query_filter[key] = value + + elif value: + or_queries.append(value) + + if len(or_queries) == 1: + query_filter["$or"] = or_queries[0] + elif or_queries: + and_query = [] + for or_query in or_queries: + if isinstance(or_query, list): + or_query = {"$or": or_query} + and_query.append(or_query) + query_filter["$and"] = and_query conn = get_project_connection(project_name) @@ -1078,7 +1156,7 @@ def get_representations( representation_ids=None, representation_names=None, version_ids=None, - extensions=None, + context_filters=None, names_by_version_ids=None, archived=False, standard=True, @@ -1096,8 +1174,8 @@ def get_representations( as filter. Filter ignored if 'None' is passed. version_ids (Iterable[str]): Subset ids used as parent filter. Filter ignored if 'None' is passed. - extensions (Iterable[str]): Filter by extension of main representation - file (without dot). + context_filters (Dict[str, List[str, re.Pattern]]): Filter by + representation context fields. names_by_version_ids (dict[ObjectId, list[str]]): Complex filtering using version ids and list of names under the version. archived (bool): Output will also contain archived representations. @@ -1113,7 +1191,7 @@ def get_representations( representation_ids=representation_ids, representation_names=representation_names, version_ids=version_ids, - extensions=extensions, + context_filters=context_filters, names_by_version_ids=names_by_version_ids, standard=True, archived=archived, @@ -1126,7 +1204,7 @@ def get_archived_representations( representation_ids=None, representation_names=None, version_ids=None, - extensions=None, + context_filters=None, names_by_version_ids=None, fields=None ): @@ -1142,8 +1220,8 @@ def get_archived_representations( as filter. Filter ignored if 'None' is passed. version_ids (Iterable[str]): Subset ids used as parent filter. Filter ignored if 'None' is passed. - extensions (Iterable[str]): Filter by extension of main representation - file (without dot). + context_filters (Dict[str, List[str, re.Pattern]]): Filter by + representation context fields. names_by_version_ids (dict[ObjectId, List[str]]): Complex filtering using version ids and list of names under the version. fields (Iterable[str]): Fields that should be returned. 
All fields are @@ -1158,7 +1236,7 @@ def get_archived_representations( representation_ids=representation_ids, representation_names=representation_names, version_ids=version_ids, - extensions=extensions, + context_filters=context_filters, names_by_version_ids=names_by_version_ids, standard=False, archived=True, @@ -1181,58 +1259,64 @@ def get_representations_parents(project_name, representations): dict[ObjectId, tuple]: Parents by representation id. """ - repres_by_version_id = collections.defaultdict(list) - versions_by_version_id = {} - versions_by_subset_id = collections.defaultdict(list) - subsets_by_subset_id = {} - subsets_by_asset_id = collections.defaultdict(list) + repre_docs_by_version_id = collections.defaultdict(list) + version_docs_by_version_id = {} + version_docs_by_subset_id = collections.defaultdict(list) + subset_docs_by_subset_id = {} + subset_docs_by_asset_id = collections.defaultdict(list) output = {} - for representation in representations: - repre_id = representation["_id"] + for repre_doc in representations: + repre_id = repre_doc["_id"] + version_id = repre_doc["parent"] output[repre_id] = (None, None, None, None) - version_id = representation["parent"] - repres_by_version_id[version_id].append(representation) + repre_docs_by_version_id[version_id].append(repre_doc) - versions = get_versions( - project_name, version_ids=repres_by_version_id.keys() + version_docs = get_versions( + project_name, + version_ids=repre_docs_by_version_id.keys(), + hero=True ) - for version in versions: - version_id = version["_id"] - subset_id = version["parent"] - versions_by_version_id[version_id] = version - versions_by_subset_id[subset_id].append(version) + for version_doc in version_docs: + version_id = version_doc["_id"] + subset_id = version_doc["parent"] + version_docs_by_version_id[version_id] = version_doc + version_docs_by_subset_id[subset_id].append(version_doc) - subsets = get_subsets( - project_name, subset_ids=versions_by_subset_id.keys() + subset_docs = get_subsets( + project_name, subset_ids=version_docs_by_subset_id.keys() ) - for subset in subsets: - subset_id = subset["_id"] - asset_id = subset["parent"] - subsets_by_subset_id[subset_id] = subset - subsets_by_asset_id[asset_id].append(subset) + for subset_doc in subset_docs: + subset_id = subset_doc["_id"] + asset_id = subset_doc["parent"] + subset_docs_by_subset_id[subset_id] = subset_doc + subset_docs_by_asset_id[asset_id].append(subset_doc) - assets = get_assets(project_name, asset_ids=subsets_by_asset_id.keys()) - assets_by_id = { - asset["_id"]: asset - for asset in assets + asset_docs = get_assets( + project_name, asset_ids=subset_docs_by_asset_id.keys() + ) + asset_docs_by_id = { + asset_doc["_id"]: asset_doc + for asset_doc in asset_docs } - project = get_project(project_name) + project_doc = get_project(project_name) - for version_id, representations in repres_by_version_id.items(): - asset = None - subset = None - version = versions_by_version_id.get(version_id) - if version: - subset_id = version["parent"] - subset = subsets_by_subset_id.get(subset_id) - if subset: - asset_id = subset["parent"] - asset = assets_by_id.get(asset_id) + for version_id, repre_docs in repre_docs_by_version_id.items(): + asset_doc = None + subset_doc = None + version_doc = version_docs_by_version_id.get(version_id) + if version_doc: + subset_id = version_doc["parent"] + subset_doc = subset_docs_by_subset_id.get(subset_id) + if subset_doc: + asset_id = subset_doc["parent"] + asset_doc = asset_docs_by_id.get(asset_id) - for 
representation in representations: - repre_id = representation["_id"] - output[repre_id] = (version, subset, asset, project) + for repre_doc in repre_docs: + repre_id = repre_doc["_id"] + output[repre_id] = ( + version_doc, subset_doc, asset_doc, project_doc + ) return output diff --git a/openpype/host/host.py b/openpype/host/host.py index 48907e7ec7..9cdbb819e1 100644 --- a/openpype/host/host.py +++ b/openpype/host/host.py @@ -19,8 +19,15 @@ class MissingMethodsError(ValueError): joined_missing = ", ".join( ['"{}"'.format(item) for item in missing_methods] ) + if isinstance(host, HostBase): + host_name = host.name + else: + try: + host_name = host.__file__.replace("\\", "/").split("/")[-3] + except Exception: + host_name = str(host) message = ( - "Host \"{}\" miss methods {}".format(host.name, joined_missing) + "Host \"{}\" miss methods {}".format(host_name, joined_missing) ) super(MissingMethodsError, self).__init__(message) diff --git a/openpype/hosts/aftereffects/plugins/create/workfile_creator.py b/openpype/hosts/aftereffects/plugins/create/workfile_creator.py index badb3675fd..f82d15b3c9 100644 --- a/openpype/hosts/aftereffects/plugins/create/workfile_creator.py +++ b/openpype/hosts/aftereffects/plugins/create/workfile_creator.py @@ -11,6 +11,8 @@ class AEWorkfileCreator(AutoCreator): identifier = "workfile" family = "workfile" + default_variant = "Main" + def get_instance_attr_defs(self): return [] @@ -35,7 +37,6 @@ class AEWorkfileCreator(AutoCreator): existing_instance = instance break - variant = '' project_name = legacy_io.Session["AVALON_PROJECT"] asset_name = legacy_io.Session["AVALON_ASSET"] task_name = legacy_io.Session["AVALON_TASK"] @@ -44,15 +45,17 @@ class AEWorkfileCreator(AutoCreator): if existing_instance is None: asset_doc = get_asset_by_name(project_name, asset_name) subset_name = self.get_subset_name( - variant, task_name, asset_doc, project_name, host_name + self.default_variant, task_name, asset_doc, + project_name, host_name ) data = { "asset": asset_name, "task": task_name, - "variant": variant + "variant": self.default_variant } data.update(self.get_dynamic_data( - variant, task_name, asset_doc, project_name, host_name + self.default_variant, task_name, asset_doc, + project_name, host_name )) new_instance = CreatedInstance( @@ -69,7 +72,9 @@ class AEWorkfileCreator(AutoCreator): ): asset_doc = get_asset_by_name(project_name, asset_name) subset_name = self.get_subset_name( - variant, task_name, asset_doc, project_name, host_name + self.default_variant, task_name, asset_doc, + project_name, host_name ) existing_instance["asset"] = asset_name existing_instance["task"] = task_name + existing_instance["subset"] = subset_name diff --git a/openpype/hosts/aftereffects/plugins/publish/collect_workfile.py b/openpype/hosts/aftereffects/plugins/publish/collect_workfile.py index 9cb6900b0a..fef5448a4c 100644 --- a/openpype/hosts/aftereffects/plugins/publish/collect_workfile.py +++ b/openpype/hosts/aftereffects/plugins/publish/collect_workfile.py @@ -11,6 +11,8 @@ class CollectWorkfile(pyblish.api.ContextPlugin): label = "Collect After Effects Workfile Instance" order = pyblish.api.CollectorOrder + 0.1 + default_variant = "Main" + def process(self, context): existing_instance = None for instance in context: @@ -71,7 +73,7 @@ class CollectWorkfile(pyblish.api.ContextPlugin): family = "workfile" subset = get_subset_name_with_asset_doc( family, - "", + self.default_variant, context.data["anatomyData"]["task"]["name"], context.data["assetEntity"], 
context.data["anatomyData"]["project"]["name"], diff --git a/openpype/hosts/blender/plugins/publish/extract_layout.py b/openpype/hosts/blender/plugins/publish/extract_layout.py index 75d9cf440d..8502c6fbd4 100644 --- a/openpype/hosts/blender/plugins/publish/extract_layout.py +++ b/openpype/hosts/blender/plugins/publish/extract_layout.py @@ -180,7 +180,7 @@ class ExtractLayout(openpype.api.Extractor): "rotation": { "x": asset.rotation_euler.x, "y": asset.rotation_euler.y, - "z": asset.rotation_euler.z, + "z": asset.rotation_euler.z }, "scale": { "x": asset.scale.x, @@ -189,6 +189,18 @@ class ExtractLayout(openpype.api.Extractor): } } + json_element["transform_matrix"] = [] + + for row in list(asset.matrix_world.transposed()): + json_element["transform_matrix"].append(list(row)) + + json_element["basis"] = [ + [1, 0, 0, 0], + [0, -1, 0, 0], + [0, 0, 1, 0], + [0, 0, 0, 1] + ] + # Extract the animation as well if family == "rig": f, n = self._export_animation( diff --git a/openpype/hosts/flame/api/__init__.py b/openpype/hosts/flame/api/__init__.py index 2c461e5f16..76c1c93379 100644 --- a/openpype/hosts/flame/api/__init__.py +++ b/openpype/hosts/flame/api/__init__.py @@ -30,7 +30,8 @@ from .lib import ( maintained_temp_file_path, get_clip_segment, get_batch_group_from_desktop, - MediaInfoFile + MediaInfoFile, + TimeEffectMetadata ) from .utils import ( setup, @@ -107,6 +108,7 @@ __all__ = [ "get_clip_segment", "get_batch_group_from_desktop", "MediaInfoFile", + "TimeEffectMetadata", # pipeline "install", diff --git a/openpype/hosts/flame/api/lib.py b/openpype/hosts/flame/api/lib.py index d59308ad6c..94c46fe937 100644 --- a/openpype/hosts/flame/api/lib.py +++ b/openpype/hosts/flame/api/lib.py @@ -5,10 +5,11 @@ import json import pickle import clique import tempfile +import traceback import itertools import contextlib import xml.etree.cElementTree as cET -from copy import deepcopy +from copy import deepcopy, copy from xml.etree import ElementTree as ET from pprint import pformat from .constants import ( @@ -266,7 +267,7 @@ def get_current_sequence(selection): def rescan_hooks(): import flame try: - flame.execute_shortcut('Rescan Python Hooks') + flame.execute_shortcut("Rescan Python Hooks") except Exception: pass @@ -1082,21 +1083,21 @@ class MediaInfoFile(object): xml_data (ET.Element): clip data """ try: - for out_track in xml_data.iter('track'): - for out_feed in out_track.iter('feed'): + for out_track in xml_data.iter("track"): + for out_feed in out_track.iter("feed"): # start frame out_feed_nb_ticks_obj = out_feed.find( - 'startTimecode/nbTicks') + "startTimecode/nbTicks") self.start_frame = out_feed_nb_ticks_obj.text # fps out_feed_fps_obj = out_feed.find( - 'startTimecode/rate') + "startTimecode/rate") self.fps = out_feed_fps_obj.text # drop frame mode out_feed_drop_mode_obj = out_feed.find( - 'startTimecode/dropMode') + "startTimecode/dropMode") self.drop_mode = out_feed_drop_mode_obj.text break except Exception as msg: @@ -1118,8 +1119,153 @@ class MediaInfoFile(object): tree = cET.ElementTree(xml_element_data) tree.write( fpath, xml_declaration=True, - method='xml', encoding='UTF-8' + method="xml", encoding="UTF-8" ) except IOError as error: raise IOError( "Not able to write data to file: {}".format(error)) + + +class TimeEffectMetadata(object): + log = log + _data = {} + _retime_modes = { + 0: "speed", + 1: "timewarp", + 2: "duration" + } + + def __init__(self, segment, logger=None): + if logger: + self.log = logger + + self._data = self._get_metadata(segment) + + @property + def 
data(self): + """ Returns timewarp effect data + + Returns: + dict: retime data + """ + return self._data + + def _get_metadata(self, segment): + effects = segment.effects or [] + for effect in effects: + if effect.type == "Timewarp": + with maintained_temp_file_path(".timewarp_node") as tmp_path: + self.log.info("Temp File: {}".format(tmp_path)) + effect.save_setup(tmp_path) + return self._get_attributes_from_xml(tmp_path) + + return {} + + def _get_attributes_from_xml(self, tmp_path): + with open(tmp_path, "r") as tw_setup_file: + tw_setup_string = tw_setup_file.read() + tw_setup_file.close() + + tw_setup_xml = ET.fromstring(tw_setup_string) + tw_setup = self._dictify(tw_setup_xml) + # pprint(tw_setup) + try: + tw_setup_state = tw_setup["Setup"]["State"][0] + mode = int( + tw_setup_state["TW_RetimerMode"][0]["_text"] + ) + r_data = { + "type": self._retime_modes[mode], + "effectStart": int( + tw_setup["Setup"]["Base"][0]["Range"][0]["Start"]), + "effectEnd": int( + tw_setup["Setup"]["Base"][0]["Range"][0]["End"]) + } + + if mode == 0: # speed + r_data[self._retime_modes[mode]] = float( + tw_setup_state["TW_Speed"] + [0]["Channel"][0]["Value"][0]["_text"] + ) / 100 + elif mode == 1: # timewarp + print("timing") + r_data[self._retime_modes[mode]] = self._get_anim_keys( + tw_setup_state["TW_Timing"] + ) + elif mode == 2: # duration + r_data[self._retime_modes[mode]] = { + "start": { + "source": int( + tw_setup_state["TW_DurationTiming"][0]["Channel"] + [0]["KFrames"][0]["Key"][0]["Value"][0]["_text"] + ), + "timeline": int( + tw_setup_state["TW_DurationTiming"][0]["Channel"] + [0]["KFrames"][0]["Key"][0]["Frame"][0]["_text"] + ) + }, + "end": { + "source": int( + tw_setup_state["TW_DurationTiming"][0]["Channel"] + [0]["KFrames"][0]["Key"][1]["Value"][0]["_text"] + ), + "timeline": int( + tw_setup_state["TW_DurationTiming"][0]["Channel"] + [0]["KFrames"][0]["Key"][1]["Frame"][0]["_text"] + ) + } + } + except Exception: + lines = traceback.format_exception(*sys.exc_info()) + self.log.error("\n".join(lines)) + return + + return r_data + + def _get_anim_keys(self, setup_cat, index=None): + return_data = { + "extrapolation": ( + setup_cat[0]["Channel"][0]["Extrap"][0]["_text"] + ), + "animKeys": [] + } + for key in setup_cat[0]["Channel"][0]["KFrames"][0]["Key"]: + if index and int(key["Index"]) != index: + continue + key_data = { + "source": float(key["Value"][0]["_text"]), + "timeline": float(key["Frame"][0]["_text"]), + "index": int(key["Index"]), + "curveMode": key["CurveMode"][0]["_text"], + "curveOrder": key["CurveOrder"][0]["_text"] + } + if key.get("TangentMode"): + key_data["tangentMode"] = key["TangentMode"][0]["_text"] + + return_data["animKeys"].append(key_data) + + return return_data + + def _dictify(self, xml_, root=True): + """ Convert xml object to dictionary + + Args: + xml_ (xml.etree.ElementTree.Element): xml data + root (bool, optional): is root available. Defaults to True. 
+ + Returns: + dict: dictionarized xml + """ + + if root: + return {xml_.tag: self._dictify(xml_, False)} + + d = copy(xml_.attrib) + if xml_.text: + d["_text"] = xml_.text + + for x in xml_.findall("./*"): + if x.tag not in d: + d[x.tag] = [] + d[x.tag].append(self._dictify(x, False)) + return d diff --git a/openpype/hosts/flame/otio/flame_export.py b/openpype/hosts/flame/otio/flame_export.py index 1e4ef866ed..6d6b33d2a1 100644 --- a/openpype/hosts/flame/otio/flame_export.py +++ b/openpype/hosts/flame/otio/flame_export.py @@ -275,7 +275,7 @@ def create_otio_reference(clip_data, fps=None): def create_otio_clip(clip_data): - from openpype.hosts.flame.api import MediaInfoFile + from openpype.hosts.flame.api import MediaInfoFile, TimeEffectMetadata segment = clip_data["PySegment"] @@ -284,14 +284,31 @@ def create_otio_clip(clip_data): media_timecode_start = media_info.start_frame media_fps = media_info.fps + # Timewarp metadata + tw_data = TimeEffectMetadata(segment, logger=log).data + log.debug("__ tw_data: {}".format(tw_data)) + # define first frame - first_frame = media_timecode_start or utils.get_frame_from_filename( - clip_data["fpath"]) or 0 + file_first_frame = utils.get_frame_from_filename( + clip_data["fpath"]) + if file_first_frame: + file_first_frame = int(file_first_frame) + + first_frame = media_timecode_start or file_first_frame or 0 _clip_source_in = int(clip_data["source_in"]) _clip_source_out = int(clip_data["source_out"]) + _clip_record_in = clip_data["record_in"] + _clip_record_out = clip_data["record_out"] _clip_record_duration = int(clip_data["record_duration"]) + log.debug("_ file_first_frame: {}".format(file_first_frame)) + log.debug("_ first_frame: {}".format(first_frame)) + log.debug("_ _clip_source_in: {}".format(_clip_source_in)) + log.debug("_ _clip_source_out: {}".format(_clip_source_out)) + log.debug("_ _clip_record_in: {}".format(_clip_record_in)) + log.debug("_ _clip_record_out: {}".format(_clip_record_out)) + # first solve if the reverse timing speed = 1 if clip_data["source_in"] > clip_data["source_out"]: @@ -302,16 +319,28 @@ def create_otio_clip(clip_data): source_in = _clip_source_in - int(first_frame) source_out = _clip_source_out - int(first_frame) + log.debug("_ source_in: {}".format(source_in)) + log.debug("_ source_out: {}".format(source_out)) + + if file_first_frame: + log.debug("_ file_source_in: {}".format( + file_first_frame + source_in)) + log.debug("_ file_source_in: {}".format( + file_first_frame + source_out)) + source_duration = (source_out - source_in + 1) # secondly check if any change of speed if source_duration != _clip_record_duration: retime_speed = float(source_duration) / float(_clip_record_duration) - log.debug("_ retime_speed: {}".format(retime_speed)) + log.debug("_ calculated speed: {}".format(retime_speed)) speed *= retime_speed - log.debug("_ source_in: {}".format(source_in)) - log.debug("_ source_out: {}".format(source_out)) + # get speed from metadata if available + if tw_data.get("speed"): + speed = tw_data["speed"] + log.debug("_ metadata speed: {}".format(speed)) + log.debug("_ speed: {}".format(speed)) log.debug("_ source_duration: {}".format(source_duration)) log.debug("_ _clip_record_duration: {}".format(_clip_record_duration)) diff --git a/openpype/hosts/flame/plugins/publish/extract_subset_resources.py b/openpype/hosts/flame/plugins/publish/extract_subset_resources.py index d34f5d5854..3e1e8db986 100644 --- a/openpype/hosts/flame/plugins/publish/extract_subset_resources.py +++ 
b/openpype/hosts/flame/plugins/publish/extract_subset_resources.py @@ -8,6 +8,9 @@ import pyblish.api import openpype.api from openpype.hosts.flame import api as opfapi from openpype.hosts.flame.api import MediaInfoFile +from openpype.pipeline.editorial import ( + get_media_range_with_retimes +) import flame @@ -47,7 +50,6 @@ class ExtractSubsetResources(openpype.api.Extractor): export_presets_mapping = {} def process(self, instance): - if not self.keep_original_representation: # remove previeous representation if not needed instance.data["representations"] = [] @@ -67,18 +69,60 @@ class ExtractSubsetResources(openpype.api.Extractor): # get media source first frame source_first_frame = instance.data["sourceFirstFrame"] + self.log.debug("_ frame_start: {}".format(frame_start)) + self.log.debug("_ source_first_frame: {}".format(source_first_frame)) + # get timeline in/out of segment clip_in = instance.data["clipIn"] clip_out = instance.data["clipOut"] + # get retimed attributres + retimed_data = self._get_retimed_attributes(instance) + + # get individual keys + r_handle_start = retimed_data["handle_start"] + r_handle_end = retimed_data["handle_end"] + r_source_dur = retimed_data["source_duration"] + r_speed = retimed_data["speed"] + # get handles value - take only the max from both handle_start = instance.data["handleStart"] - handle_end = instance.data["handleStart"] + handle_end = instance.data["handleEnd"] handles = max(handle_start, handle_end) + include_handles = instance.data.get("includeHandles") # get media source range with handles source_start_handles = instance.data["sourceStartH"] source_end_handles = instance.data["sourceEndH"] + # retime if needed + if r_speed != 1.0: + source_start_handles = ( + instance.data["sourceStart"] - r_handle_start) + source_end_handles = ( + source_start_handles + + (r_source_dur - 1) + + r_handle_start + + r_handle_end + ) + + # get frame range with handles for representation range + frame_start_handle = frame_start - handle_start + repre_frame_start = frame_start_handle + if include_handles: + if r_speed == 1.0: + frame_start_handle = frame_start + else: + frame_start_handle = ( + frame_start - handle_start) + r_handle_start + + self.log.debug("_ frame_start_handle: {}".format( + frame_start_handle)) + self.log.debug("_ repre_frame_start: {}".format( + repre_frame_start)) + + # calculate duration with handles + source_duration_handles = ( + source_end_handles - source_start_handles) + 1 # create staging dir path staging_dir = self.staging_dir(instance) @@ -93,6 +137,28 @@ class ExtractSubsetResources(openpype.api.Extractor): } export_presets.update(self.export_presets_mapping) + if not instance.data.get("versionData"): + instance.data["versionData"] = {} + + # set versiondata if any retime + version_data = retimed_data.get("version_data") + self.log.debug("_ version_data: {}".format(version_data)) + + if version_data: + instance.data["versionData"].update(version_data) + + if r_speed != 1.0: + instance.data["versionData"].update({ + "frameStart": frame_start_handle, + "frameEnd": ( + (frame_start_handle + source_duration_handles - 1) + - (r_handle_start + r_handle_end) + ) + }) + self.log.debug("_ i_version_data: {}".format( + instance.data["versionData"] + )) + # loop all preset names and for unique_name, preset_config in export_presets.items(): modify_xml_data = {} @@ -115,20 +181,10 @@ class ExtractSubsetResources(openpype.api.Extractor): ) ) - # get frame range with handles for representation range - frame_start_handle = frame_start - 
handle_start - - # calculate duration with handles - source_duration_handles = ( - source_end_handles - source_start_handles) - - # define in/out marks - in_mark = (source_start_handles - source_first_frame) + 1 - out_mark = in_mark + source_duration_handles - exporting_clip = None name_patern_xml = "_{}.".format( unique_name) + if export_type == "Sequence Publish": # change export clip to sequence exporting_clip = flame.duplicate(sequence_clip) @@ -142,19 +198,25 @@ class ExtractSubsetResources(openpype.api.Extractor): "__{}.").format( unique_name) - # change in/out marks to timeline in/out + # only for h264 with baked retime in_mark = clip_in - out_mark = clip_out + out_mark = clip_out + 1 + modify_xml_data.update({ + "exportHandles": True, + "nbHandles": handles + }) else: + in_mark = (source_start_handles - source_first_frame) + 1 + out_mark = in_mark + source_duration_handles exporting_clip = self.import_clip(clip_path) exporting_clip.name.set_value("{}_{}".format( asset_name, segment_name)) # add xml tags modifications modify_xml_data.update({ - "exportHandles": True, - "nbHandles": handles, - "startFrame": frame_start, + # enum position low start from 0 + "frameIndex": 0, + "startFrame": repre_frame_start, "namePattern": name_patern_xml }) @@ -162,6 +224,9 @@ class ExtractSubsetResources(openpype.api.Extractor): # add any xml overrides collected form segment.comment modify_xml_data.update(instance.data["xml_overrides"]) + self.log.debug("_ in_mark: {}".format(in_mark)) + self.log.debug("_ out_mark: {}".format(out_mark)) + export_kwargs = {} # validate xml preset file is filled if preset_file == "": @@ -196,9 +261,8 @@ class ExtractSubsetResources(openpype.api.Extractor): "namePattern": "__thumbnail" }) thumb_frame_number = int(in_mark + ( - source_duration_handles / 2)) + (out_mark - in_mark + 1) / 2)) - self.log.debug("__ in_mark: {}".format(in_mark)) self.log.debug("__ thumb_frame_number: {}".format( thumb_frame_number )) @@ -210,9 +274,6 @@ class ExtractSubsetResources(openpype.api.Extractor): "out_mark": out_mark }) - self.log.debug("__ modify_xml_data: {}".format( - pformat(modify_xml_data) - )) preset_path = opfapi.modify_preset_file( preset_orig_xml_path, staging_dir, modify_xml_data) @@ -281,9 +342,9 @@ class ExtractSubsetResources(openpype.api.Extractor): # add frame range if preset_config["representation_add_range"]: representation_data.update({ - "frameStart": frame_start_handle, + "frameStart": repre_frame_start, "frameEnd": ( - frame_start_handle + source_duration_handles), + repre_frame_start + source_duration_handles) - 1, "fps": instance.data["fps"] }) @@ -300,8 +361,32 @@ class ExtractSubsetResources(openpype.api.Extractor): # at the end remove the duplicated clip flame.delete(exporting_clip) - self.log.debug("All representations: {}".format( - pformat(instance.data["representations"]))) + def _get_retimed_attributes(self, instance): + handle_start = instance.data["handleStart"] + handle_end = instance.data["handleEnd"] + + # get basic variables + otio_clip = instance.data["otioClip"] + + # get available range trimmed with processed retimes + retimed_attributes = get_media_range_with_retimes( + otio_clip, handle_start, handle_end) + self.log.debug( + ">> retimed_attributes: {}".format(retimed_attributes)) + + r_media_in = int(retimed_attributes["mediaIn"]) + r_media_out = int(retimed_attributes["mediaOut"]) + version_data = retimed_attributes.get("versionData") + + return { + "version_data": version_data, + "handle_start": int(retimed_attributes["handleStart"]), + 
"handle_end": int(retimed_attributes["handleEnd"]), + "source_duration": ( + (r_media_out - r_media_in) + 1 + ), + "speed": float(retimed_attributes["speed"]) + } def _should_skip(self, preset_config, clip_path, unique_name): # get activating attributes @@ -313,8 +398,6 @@ class ExtractSubsetResources(openpype.api.Extractor): unique_name, activated_preset, filter_path_regex ) ) - self.log.debug( - "__ clip_path: `{}`".format(clip_path)) # skip if not activated presete if not activated_preset: diff --git a/openpype/hosts/maya/__init__.py b/openpype/hosts/maya/__init__.py index c1c82c62e5..72b4d5853c 100644 --- a/openpype/hosts/maya/__init__.py +++ b/openpype/hosts/maya/__init__.py @@ -1,27 +1,6 @@ -import os +from .module import OpenPypeMaya -def add_implementation_envs(env, _app): - # Add requirements to PYTHONPATH - pype_root = os.environ["OPENPYPE_REPOS_ROOT"] - new_python_paths = [ - os.path.join(pype_root, "openpype", "hosts", "maya", "startup") - ] - old_python_path = env.get("PYTHONPATH") or "" - for path in old_python_path.split(os.pathsep): - if not path: - continue - - norm_path = os.path.normpath(path) - if norm_path not in new_python_paths: - new_python_paths.append(norm_path) - - env["PYTHONPATH"] = os.pathsep.join(new_python_paths) - - # Set default values if are not already set via settings - defaults = { - "OPENPYPE_LOG_NO_COLORS": "Yes" - } - for key, value in defaults.items(): - if not env.get(key): - env[key] = value +__all__ = ( + "OpenPypeMaya", +) diff --git a/openpype/hosts/maya/api/lib_rendersettings.py b/openpype/hosts/maya/api/lib_rendersettings.py index 9aea55a03b..7cd2193086 100644 --- a/openpype/hosts/maya/api/lib_rendersettings.py +++ b/openpype/hosts/maya/api/lib_rendersettings.py @@ -60,8 +60,7 @@ class RenderSettings(object): try: aov_separator = self._aov_chars[( self._project_settings["maya"] - ["create"] - ["CreateRender"] + ["RenderSettings"] ["aov_separator"] )] except KeyError: diff --git a/openpype/hosts/maya/api/lib_template_builder.py b/openpype/hosts/maya/api/lib_template_builder.py new file mode 100644 index 0000000000..34a8450a26 --- /dev/null +++ b/openpype/hosts/maya/api/lib_template_builder.py @@ -0,0 +1,253 @@ +import json +from collections import OrderedDict +import maya.cmds as cmds + +import qargparse +from openpype.tools.utils.widgets import OptionDialog +from .lib import get_main_window, imprint + +# To change as enum +build_types = ["context_asset", "linked_asset", "all_assets"] + + +def get_placeholder_attributes(node): + return { + attr: cmds.getAttr("{}.{}".format(node, attr)) + for attr in cmds.listAttr(node, userDefined=True)} + + +def delete_placeholder_attributes(node): + ''' + function to delete all extra placeholder attributes + ''' + extra_attributes = get_placeholder_attributes(node) + for attribute in extra_attributes: + cmds.deleteAttr(node + '.' 
+ attribute) + + +def create_placeholder(): + args = placeholder_window() + + if not args: + return # operation canceled, no locator created + + # custom arg parse to force empty data query + # and still imprint them on placeholder + # and getting items when arg is of type Enumerator + options = create_options(args) + + # create placeholder name dynamically from args and options + placeholder_name = create_placeholder_name(args, options) + + selection = cmds.ls(selection=True) + if not selection: + raise ValueError("Nothing is selected") + + placeholder = cmds.spaceLocator(name=placeholder_name)[0] + + # get the long name of the placeholder (with the groups) + placeholder_full_name = cmds.ls(selection[0], long=True)[ + 0] + '|' + placeholder.replace('|', '') + + if selection: + cmds.parent(placeholder, selection[0]) + + imprint(placeholder_full_name, options) + + # Some tweaks because imprint force enums to to default value so we get + # back arg read and force them to attributes + imprint_enum(placeholder_full_name, args) + + # Add helper attributes to keep placeholder info + cmds.addAttr( + placeholder_full_name, + longName="parent", + hidden=True, + dataType="string" + ) + cmds.addAttr( + placeholder_full_name, + longName="index", + hidden=True, + attributeType="short", + defaultValue=-1 + ) + + cmds.setAttr(placeholder_full_name + '.parent', "", type="string") + + +def create_placeholder_name(args, options): + placeholder_builder_type = [ + arg.read() for arg in args if 'builder_type' in str(arg) + ][0] + placeholder_family = options['family'] + placeholder_name = placeholder_builder_type.split('_') + + # add famlily in any + if placeholder_family: + placeholder_name.insert(1, placeholder_family) + + # add loader arguments if any + if options['loader_args']: + pos = 2 + loader_args = options['loader_args'].replace('\'', '\"') + loader_args = json.loads(loader_args) + values = [v for v in loader_args.values()] + for i in range(len(values)): + placeholder_name.insert(i + pos, values[i]) + + placeholder_name = '_'.join(placeholder_name) + + return placeholder_name.capitalize() + + +def update_placeholder(): + placeholder = cmds.ls(selection=True) + if len(placeholder) == 0: + raise ValueError("No node selected") + if len(placeholder) > 1: + raise ValueError("Too many selected nodes") + placeholder = placeholder[0] + + args = placeholder_window(get_placeholder_attributes(placeholder)) + + if not args: + return # operation canceled + + # delete placeholder attributes + delete_placeholder_attributes(placeholder) + + options = create_options(args) + + imprint(placeholder, options) + imprint_enum(placeholder, args) + + cmds.addAttr( + placeholder, + longName="parent", + hidden=True, + dataType="string" + ) + cmds.addAttr( + placeholder, + longName="index", + hidden=True, + attributeType="short", + defaultValue=-1 + ) + + cmds.setAttr(placeholder + '.parent', '', type="string") + + +def create_options(args): + options = OrderedDict() + for arg in args: + if not type(arg) == qargparse.Separator: + options[str(arg)] = arg._data.get("items") or arg.read() + return options + + +def imprint_enum(placeholder, args): + """ + Imprint method doesn't act properly with enums. + Replacing the functionnality with this for now + """ + enum_values = {str(arg): arg.read() + for arg in args if arg._data.get("items")} + string_to_value_enum_table = { + build: i for i, build + in enumerate(build_types)} + for key, value in enum_values.items(): + cmds.setAttr( + placeholder + "." 
+ key, + string_to_value_enum_table[value]) + + +def placeholder_window(options=None): + options = options or dict() + dialog = OptionDialog(parent=get_main_window()) + dialog.setWindowTitle("Create Placeholder") + + args = [ + qargparse.Separator("Main attributes"), + qargparse.Enum( + "builder_type", + label="Asset Builder Type", + default=options.get("builder_type", 0), + items=build_types, + help="""Asset Builder Type +Builder type describe what template loader will look for. +context_asset : Template loader will look for subsets of +current context asset (Asset bob will find asset) +linked_asset : Template loader will look for assets linked +to current context asset. +Linked asset are looked in avalon database under field "inputLinks" +""" + ), + qargparse.String( + "family", + default=options.get("family", ""), + label="OpenPype Family", + placeholder="ex: model, look ..."), + qargparse.String( + "representation", + default=options.get("representation", ""), + label="OpenPype Representation", + placeholder="ex: ma, abc ..."), + qargparse.String( + "loader", + default=options.get("loader", ""), + label="Loader", + placeholder="ex: ReferenceLoader, LightLoader ...", + help="""Loader +Defines what openpype loader will be used to load assets. +Useable loader depends on current host's loader list. +Field is case sensitive. +"""), + qargparse.String( + "loader_args", + default=options.get("loader_args", ""), + label="Loader Arguments", + placeholder='ex: {"camera":"persp", "lights":True}', + help="""Loader +Defines a dictionnary of arguments used to load assets. +Useable arguments depend on current placeholder Loader. +Field should be a valid python dict. Anything else will be ignored. +"""), + qargparse.Integer( + "order", + default=options.get("order", 0), + min=0, + max=999, + label="Order", + placeholder="ex: 0, 100 ... 
(smallest order loaded first)", + help="""Order +Order defines asset loading priority (0 to 999) +Priority rule is : "lowest is first to load"."""), + qargparse.Separator( + "Optional attributes"), + qargparse.String( + "asset", + default=options.get("asset", ""), + label="Asset filter", + placeholder="regex filtering by asset name", + help="Filtering assets by matching field regex to asset's name"), + qargparse.String( + "subset", + default=options.get("subset", ""), + label="Subset filter", + placeholder="regex filtering by subset name", + help="Filtering assets by matching field regex to subset's name"), + qargparse.String( + "hierarchy", + default=options.get("hierarchy", ""), + label="Hierarchy filter", + placeholder="regex filtering by asset's hierarchy", + help="Filtering assets by matching field asset's hierarchy") + ] + dialog.create(args) + + if not dialog.exec_(): + return None + + return args diff --git a/openpype/hosts/maya/api/menu.py b/openpype/hosts/maya/api/menu.py index b7ab529a55..ebba706a6c 100644 --- a/openpype/hosts/maya/api/menu.py +++ b/openpype/hosts/maya/api/menu.py @@ -9,10 +9,15 @@ import maya.cmds as cmds from openpype.settings import get_project_settings from openpype.pipeline import legacy_io from openpype.pipeline.workfile import BuildWorkfile +from openpype.pipeline.workfile.build_template import ( + build_workfile_template, + update_workfile_template +) from openpype.tools.utils import host_tools from openpype.hosts.maya.api import lib, lib_rendersettings from .lib import get_main_window, IS_HEADLESS from .commands import reset_frame_range +from .lib_template_builder import create_placeholder, update_placeholder log = logging.getLogger(__name__) @@ -147,6 +152,34 @@ def install(): parent_widget ) ) + + builder_menu = cmds.menuItem( + "Template Builder", + subMenu=True, + tearOff=True, + parent=MENU_NAME + ) + cmds.menuItem( + "Create Placeholder", + parent=builder_menu, + command=lambda *args: create_placeholder() + ) + cmds.menuItem( + "Update Placeholder", + parent=builder_menu, + command=lambda *args: update_placeholder() + ) + cmds.menuItem( + "Build Workfile from template", + parent=builder_menu, + command=build_workfile_template + ) + cmds.menuItem( + "Update Workfile from template", + parent=builder_menu, + command=update_workfile_template + ) + cmds.setParent(MENU_NAME, menu=True) def add_scripts_menu(): diff --git a/openpype/hosts/maya/api/template_loader.py b/openpype/hosts/maya/api/template_loader.py new file mode 100644 index 0000000000..ecffafc93d --- /dev/null +++ b/openpype/hosts/maya/api/template_loader.py @@ -0,0 +1,252 @@ +import re +from maya import cmds + +from openpype.client import get_representations +from openpype.pipeline import legacy_io +from openpype.pipeline.workfile.abstract_template_loader import ( + AbstractPlaceholder, + AbstractTemplateLoader +) +from openpype.pipeline.workfile.build_template_exceptions import ( + TemplateAlreadyImported +) + +PLACEHOLDER_SET = 'PLACEHOLDERS_SET' + + +class MayaTemplateLoader(AbstractTemplateLoader): + """Concrete implementation of AbstractTemplateLoader for maya + """ + + def import_template(self, path): + """Import template into current scene. + Block if a template is already loaded. 
+        Args:
+            path (str): A path to current template (usually given by
+                get_template_path implementation)
+        Returns:
+            bool: Whether the template was successfully imported or not
+        """
+        if cmds.objExists(PLACEHOLDER_SET):
+            raise TemplateAlreadyImported(
+                "Build template already loaded\n"
+                "Clean scene if needed (File > New Scene)")
+
+        cmds.sets(name=PLACEHOLDER_SET, empty=True)
+        self.new_nodes = cmds.file(path, i=True, returnNewNodes=True)
+        cmds.setAttr(PLACEHOLDER_SET + '.hiddenInOutliner', True)
+
+        for set in cmds.listSets(allSets=True):
+            if (cmds.objExists(set) and
+                    cmds.attributeQuery('id', node=set, exists=True) and
+                    cmds.getAttr(set + '.id') == 'pyblish.avalon.instance'):
+                if cmds.attributeQuery('asset', node=set, exists=True):
+                    cmds.setAttr(
+                        set + '.asset',
+                        legacy_io.Session['AVALON_ASSET'], type='string'
+                    )
+
+        return True
+
+    def template_already_imported(self, err_msg):
+        clearButton = "Clear scene and build"
+        updateButton = "Update template"
+        abortButton = "Abort"
+
+        title = "Scene already built"
+        message = (
+            "It seems a template was already built for this scene.\n"
+            "Error message received:\n\n\"{}\"".format(err_msg))
+        buttons = [clearButton, updateButton, abortButton]
+        defaultButton = clearButton
+        cancelButton = abortButton
+        dismissString = abortButton
+        answer = cmds.confirmDialog(
+            t=title,
+            m=message,
+            b=buttons,
+            db=defaultButton,
+            cb=cancelButton,
+            ds=dismissString)
+
+        if answer == clearButton:
+            cmds.file(newFile=True, force=True)
+            self.import_template(self.template_path)
+            self.populate_template()
+        elif answer == updateButton:
+            self.update_missing_containers()
+        elif answer == abortButton:
+            return
+
+    @staticmethod
+    def get_template_nodes():
+        attributes = cmds.ls('*.builder_type', long=True)
+        return [attribute.rpartition('.')[0] for attribute in attributes]
+
+    def get_loaded_containers_by_id(self):
+        try:
+            containers = cmds.sets("AVALON_CONTAINERS", q=True)
+        except ValueError:
+            return None
+
+        return [
+            cmds.getAttr(container + '.representation')
+            for container in containers]
+
+
+class MayaPlaceholder(AbstractPlaceholder):
+    """Concrete implementation of AbstractPlaceholder for Maya
+    """
+
+    optional_keys = {'asset', 'subset', 'hierarchy'}
+
+    def get_data(self, node):
+        user_data = dict()
+        for attr in self.required_keys.union(self.optional_keys):
+            attribute_name = '{}.{}'.format(node, attr)
+            if not cmds.attributeQuery(attr, node=node, exists=True):
+                print("{} not found".format(attribute_name))
+                continue
+            user_data[attr] = cmds.getAttr(
+                attribute_name,
+                asString=True)
+        user_data['parent'] = (
+            cmds.getAttr(node + '.parent', asString=True)
+            or node.rpartition('|')[0]
+            or ""
+        )
+        user_data['node'] = node
+        if user_data['parent']:
+            siblings = cmds.listRelatives(user_data['parent'], children=True)
+        else:
+            siblings = cmds.ls(assemblies=True)
+        node_shortname = user_data['node'].rpartition('|')[2]
+        current_index = cmds.getAttr(node + '.index', asString=True)
+        user_data['index'] = (
+            current_index if current_index >= 0
+            else siblings.index(node_shortname))
+
+        self.data = user_data
+
+    def parent_in_hierarchy(self, containers):
+        """Parent loaded containers to the placeholder's parent,
+        i.e. set loaded content as the placeholder's siblings.
+        Args:
+            containers (String): Placeholder loaded containers
+        """
+        if not containers:
+            return
+
+        roots = cmds.sets(containers, q=True)
+        nodes_to_parent = []
+        for root in roots:
+            if root.endswith("_RN"):
+                refRoot = cmds.referenceQuery(root, n=True)[0]
+                refRoot = cmds.listRelatives(refRoot, parent=True) or [refRoot]
+                nodes_to_parent.extend(refRoot)
+            elif root in cmds.listSets(allSets=True):
+                if not cmds.sets(root, q=True):
+                    return
+                else:
+                    continue
+            else:
+                nodes_to_parent.append(root)
+
+        if self.data['parent']:
+            cmds.parent(nodes_to_parent, self.data['parent'])
+        # Move loaded nodes to correct index in outliner hierarchy
+        placeholder_node = self.data['node']
+        placeholder_form = cmds.xform(
+            placeholder_node,
+            q=True,
+            matrix=True,
+            worldSpace=True
+        )
+        for node in set(nodes_to_parent):
+            cmds.reorder(node, front=True)
+            cmds.reorder(node, relative=self.data['index'])
+            cmds.xform(node, matrix=placeholder_form, ws=True)
+
+        holding_sets = cmds.listSets(object=placeholder_node)
+        if not holding_sets:
+            return
+        for holding_set in holding_sets:
+            cmds.sets(roots, forceElement=holding_set)
+
+    def clean(self):
+        """Hide the placeholder, parent it to root,
+        add it to the placeholder set and register the placeholder's parent
+        to keep placeholder info available for future use
+        """
+        node = self.data['node']
+        if self.data['parent']:
+            cmds.setAttr(node + '.parent', self.data['parent'], type='string')
+        if cmds.getAttr(node + '.index') < 0:
+            cmds.setAttr(node + '.index', self.data['index'])
+
+        holding_sets = cmds.listSets(object=node)
+        if holding_sets:
+            for set in holding_sets:
+                cmds.sets(node, remove=set)
+
+        if cmds.listRelatives(node, p=True):
+            node = cmds.parent(node, world=True)[0]
+        cmds.sets(node, addElement=PLACEHOLDER_SET)
+        cmds.hide(node)
+        cmds.setAttr(node + '.hiddenInOutliner', True)
+
+    def get_representations(self, current_asset_doc, linked_asset_docs):
+        project_name = legacy_io.active_project()
+
+        builder_type = self.data["builder_type"]
+        if builder_type == "context_asset":
+            context_filters = {
+                "asset": [current_asset_doc["name"]],
+                "subset": [re.compile(self.data["subset"])],
+                "hierarchy": [re.compile(self.data["hierarchy"])],
+                "representations": [self.data["representation"]],
+                "family": [self.data["family"]]
+            }
+
+        elif builder_type != "linked_asset":
+            context_filters = {
+                "asset": [re.compile(self.data["asset"])],
+                "subset": [re.compile(self.data["subset"])],
+                "hierarchy": [re.compile(self.data["hierarchy"])],
+                "representation": [self.data["representation"]],
+                "family": [self.data["family"]]
+            }
+
+        else:
+            asset_regex = re.compile(self.data["asset"])
+            linked_asset_names = []
+            for asset_doc in linked_asset_docs:
+                asset_name = asset_doc["name"]
+                if asset_regex.match(asset_name):
+                    linked_asset_names.append(asset_name)
+
+            context_filters = {
+                "asset": linked_asset_names,
+                "subset": [re.compile(self.data["subset"])],
+                "hierarchy": [re.compile(self.data["hierarchy"])],
+                "representation": [self.data["representation"]],
+                "family": [self.data["family"]],
+            }
+
+        return list(get_representations(
+            project_name,
+            context_filters=context_filters
+        ))
+
+    def err_message(self):
+        return (
+            "Error while trying to load a representation.\n"
+            "Either the subset wasn't published or the template is malformed."
+ "\n\n" + "Builder was looking for :\n{attributes}".format( + attributes="\n".join([ + "{}: {}".format(key.title(), value) + for key, value in self.data.items()] + ) + ) + ) diff --git a/openpype/hosts/maya/api/workio.py b/openpype/hosts/maya/api/workio.py index fd4961c4bf..8c31974c73 100644 --- a/openpype/hosts/maya/api/workio.py +++ b/openpype/hosts/maya/api/workio.py @@ -2,11 +2,9 @@ import os from maya import cmds -from openpype.pipeline import HOST_WORKFILE_EXTENSIONS - def file_extensions(): - return HOST_WORKFILE_EXTENSIONS["maya"] + return [".ma", ".mb"] def has_unsaved_changes(): diff --git a/openpype/hosts/maya/module.py b/openpype/hosts/maya/module.py new file mode 100644 index 0000000000..5a215be8d2 --- /dev/null +++ b/openpype/hosts/maya/module.py @@ -0,0 +1,47 @@ +import os +from openpype.modules import OpenPypeModule +from openpype.modules.interfaces import IHostModule + +MAYA_ROOT_DIR = os.path.dirname(os.path.abspath(__file__)) + + +class OpenPypeMaya(OpenPypeModule, IHostModule): + name = "openpype_maya" + host_name = "maya" + + def initialize(self, module_settings): + self.enabled = True + + def add_implementation_envs(self, env, _app): + # Add requirements to PYTHONPATH + new_python_paths = [ + os.path.join(MAYA_ROOT_DIR, "startup") + ] + old_python_path = env.get("PYTHONPATH") or "" + for path in old_python_path.split(os.pathsep): + if not path: + continue + + norm_path = os.path.normpath(path) + if norm_path not in new_python_paths: + new_python_paths.append(norm_path) + + env["PYTHONPATH"] = os.pathsep.join(new_python_paths) + + # Set default values if are not already set via settings + defaults = { + "OPENPYPE_LOG_NO_COLORS": "Yes" + } + for key, value in defaults.items(): + if not env.get(key): + env[key] = value + + def get_launch_hook_paths(self, app): + if app.host_name != self.host_name: + return [] + return [ + os.path.join(MAYA_ROOT_DIR, "hooks") + ] + + def get_workfile_extensions(self): + return [".ma", ".mb"] diff --git a/openpype/hosts/maya/plugins/create/create_render.py b/openpype/hosts/maya/plugins/create/create_render.py index fbe670b1ea..5418ec1f2f 100644 --- a/openpype/hosts/maya/plugins/create/create_render.py +++ b/openpype/hosts/maya/plugins/create/create_render.py @@ -71,7 +71,6 @@ class CreateRender(plugin.Creator): label = "Render" family = "rendering" icon = "eye" - _token = None _user = None _password = None @@ -220,6 +219,12 @@ class CreateRender(plugin.Creator): self.data["tilesY"] = 2 self.data["convertToScanline"] = False self.data["useReferencedAovs"] = False + self.data["renderSetupIncludeLights"] = ( + self._project_settings.get( + "maya", {}).get( + "RenderSettings", {}).get( + "enable_all_lights", False) + ) # Disable for now as this feature is not working yet # self.data["assScene"] = False diff --git a/openpype/hosts/maya/plugins/publish/collect_render.py b/openpype/hosts/maya/plugins/publish/collect_render.py index e132cffe53..ebda5e190d 100644 --- a/openpype/hosts/maya/plugins/publish/collect_render.py +++ b/openpype/hosts/maya/plugins/publish/collect_render.py @@ -154,12 +154,6 @@ class CollectMayaRender(pyblish.api.ContextPlugin): layer_name = "rs_{}".format(expected_layer_name) # collect all frames we are expecting to be rendered - renderer = self.get_render_attribute("currentRenderer", - layer=layer_name) - # handle various renderman names - if renderer.startswith("renderman"): - renderer = "renderman" - # return all expected files for all cameras and aovs in given # frame range layer_render_products = 
get_layer_render_products(layer_name) @@ -202,8 +196,7 @@ class CollectMayaRender(pyblish.api.ContextPlugin): aov_dict = {} default_render_file = context.data.get('project_settings')\ .get('maya')\ - .get('create')\ - .get('CreateRender')\ + .get('RenderSettings')\ .get('default_render_image_folder') or "" # replace relative paths with absolute. Render products are # returned as list of dictionaries. @@ -318,7 +311,10 @@ class CollectMayaRender(pyblish.api.ContextPlugin): "useReferencedAovs": render_instance.data.get( "useReferencedAovs") or render_instance.data.get( "vrayUseReferencedAovs") or False, - "aovSeparator": layer_render_products.layer_data.aov_separator # noqa: E501 + "aovSeparator": layer_render_products.layer_data.aov_separator, # noqa: E501 + "renderSetupIncludeLights": render_instance.data.get( + "renderSetupIncludeLights" + ) } # Collect Deadline url if Deadline module is enabled @@ -361,6 +357,7 @@ class CollectMayaRender(pyblish.api.ContextPlugin): instance = context.create_instance(expected_layer_name) instance.data["label"] = label + instance.data["farm"] = True instance.data.update(data) self.log.debug("data: {}".format(json.dumps(data, indent=4))) diff --git a/openpype/hosts/maya/plugins/publish/extract_layout.py b/openpype/hosts/maya/plugins/publish/extract_layout.py new file mode 100644 index 0000000000..991217684a --- /dev/null +++ b/openpype/hosts/maya/plugins/publish/extract_layout.py @@ -0,0 +1,146 @@ +import math +import os +import json + +from maya import cmds +from maya.api import OpenMaya as om + +from bson.objectid import ObjectId + +from openpype.pipeline import legacy_io +import openpype.api + + +class ExtractLayout(openpype.api.Extractor): + """Extract a layout.""" + + label = "Extract Layout" + hosts = ["maya"] + families = ["layout"] + optional = True + + def process(self, instance): + # Define extract output file path + stagingdir = self.staging_dir(instance) + + # Perform extraction + self.log.info("Performing extraction..") + + if "representations" not in instance.data: + instance.data["representations"] = [] + + json_data = [] + + for asset in cmds.sets(str(instance), query=True): + # Find the container + grp_name = asset.split(':')[0] + containers = cmds.ls(f"{grp_name}*_CON") + + assert len(containers) == 1, \ + f"More than one container found for {asset}" + + container = containers[0] + + representation_id = cmds.getAttr(f"{container}.representation") + + representation = legacy_io.find_one( + { + "type": "representation", + "_id": ObjectId(representation_id) + }, projection={"parent": True, "context.family": True}) + + self.log.info(representation) + + version_id = representation.get("parent") + family = representation.get("context").get("family") + + json_element = { + "family": family, + "instance_name": cmds.getAttr(f"{container}.name"), + "representation": str(representation_id), + "version": str(version_id) + } + + loc = cmds.xform(asset, query=True, translation=True) + rot = cmds.xform(asset, query=True, rotation=True, euler=True) + scl = cmds.xform(asset, query=True, relative=True, scale=True) + + json_element["transform"] = { + "translation": { + "x": loc[0], + "y": loc[1], + "z": loc[2] + }, + "rotation": { + "x": math.radians(rot[0]), + "y": math.radians(rot[1]), + "z": math.radians(rot[2]) + }, + "scale": { + "x": scl[0], + "y": scl[1], + "z": scl[2] + } + } + + row_length = 4 + t_matrix_list = cmds.xform(asset, query=True, matrix=True) + + transform_mm = om.MMatrix(t_matrix_list) + transform = om.MTransformationMatrix(transform_mm) 
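+
+            # The steps below appear to remap the transform from Maya's
+            # Y-up, right-handed space into a Z-up, left-handed convention
+            # (swap Y/Z, rotate -90 degrees around X, flip one scale axis),
+            # presumably for a game-engine style consumer of this JSON.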
+
+            t = transform.translation(om.MSpace.kWorld)
+            t = om.MVector(t.x, t.z, -t.y)
+            transform.setTranslation(t, om.MSpace.kWorld)
+            transform.rotateBy(
+                om.MEulerRotation(math.radians(-90), 0, 0), om.MSpace.kWorld)
+            transform.scaleBy([1.0, 1.0, -1.0], om.MSpace.kObject)
+
+            t_matrix_list = list(transform.asMatrix())
+
+            t_matrix = []
+            for i in range(0, len(t_matrix_list), row_length):
+                t_matrix.append(t_matrix_list[i:i + row_length])
+
+            json_element["transform_matrix"] = []
+            for row in t_matrix:
+                json_element["transform_matrix"].append(list(row))
+
+            basis_list = [
+                1, 0, 0, 0,
+                0, 1, 0, 0,
+                0, 0, -1, 0,
+                0, 0, 0, 1
+            ]
+
+            basis_mm = om.MMatrix(basis_list)
+            basis = om.MTransformationMatrix(basis_mm)
+
+            b_matrix_list = list(basis.asMatrix())
+            b_matrix = []
+
+            for i in range(0, len(b_matrix_list), row_length):
+                b_matrix.append(b_matrix_list[i:i + row_length])
+
+            json_element["basis"] = []
+            for row in b_matrix:
+                json_element["basis"].append(list(row))
+
+            json_data.append(json_element)
+
+        json_filename = "{}.json".format(instance.name)
+        json_path = os.path.join(stagingdir, json_filename)
+
+        with open(json_path, "w+") as file:
+            json.dump(json_data, fp=file, indent=2)
+
+        json_representation = {
+            'name': 'json',
+            'ext': 'json',
+            'files': json_filename,
+            "stagingDir": stagingdir,
+        }
+        instance.data["representations"].append(json_representation)
+
+        self.log.info("Extracted instance '%s' to: %s",
+                      instance.name, json_representation)
diff --git a/openpype/hosts/maya/plugins/publish/extract_look.py b/openpype/hosts/maya/plugins/publish/extract_look.py
index 8be0c7aae5..ce3b265566 100644
--- a/openpype/hosts/maya/plugins/publish/extract_look.py
+++ b/openpype/hosts/maya/plugins/publish/extract_look.py
@@ -40,11 +40,13 @@ def get_ocio_config_path(profile_folder):
     Returns:
         str: Path to vendorized config file.
     """
+
     return os.path.join(
         os.environ["OPENPYPE_ROOT"],
         "vendor",
-        "configs",
-        "OpenColorIO-Configs",
+        "bin",
+        "ocioconfig",
+        "OpenColorIOConfigs",
         profile_folder,
         "config.ocio"
     )
diff --git a/openpype/hosts/maya/plugins/publish/validate_render_image_rule.py b/openpype/hosts/maya/plugins/publish/validate_render_image_rule.py
index 642ca9e25d..4d3796e429 100644
--- a/openpype/hosts/maya/plugins/publish/validate_render_image_rule.py
+++ b/openpype/hosts/maya/plugins/publish/validate_render_image_rule.py
@@ -1,17 +1,15 @@
-import maya.mel as mel
-import pymel.core as pm
+from maya import cmds

 import pyblish.api
 import openpype.api


-def get_file_rule(rule):
-    """Workaround for a bug in python with cmds.workspace"""
-    return mel.eval('workspace -query -fileRuleEntry "{}"'.format(rule))
-
-
 class ValidateRenderImageRule(pyblish.api.InstancePlugin):
-    """Validates "images" file rule is set to "renders/"
+    """Validates Maya Workspace "images" file rule matches project settings.
+
+    This validates against the configured default render image folder:
+    Studio Settings > Project > Maya >
+    Render Settings > Default render image folder.
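+    The repair action sets the workspace "images" file rule to that folder
+    and saves the workspace.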
""" @@ -23,24 +21,29 @@ class ValidateRenderImageRule(pyblish.api.InstancePlugin): def process(self, instance): - default_render_file = self.get_default_render_image_folder(instance) + required_images_rule = self.get_default_render_image_folder(instance) + current_images_rule = cmds.workspace(fileRuleEntry="images") - assert get_file_rule("images") == default_render_file, ( - "Workspace's `images` file rule must be set to: {}".format( - default_render_file + assert current_images_rule == required_images_rule, ( + "Invalid workspace `images` file rule value: '{}'. " + "Must be set to: '{}'".format( + current_images_rule, required_images_rule ) ) @classmethod def repair(cls, instance): - default = cls.get_default_render_image_folder(instance) - pm.workspace.fileRules["images"] = default - pm.system.Workspace.save() + + required_images_rule = cls.get_default_render_image_folder(instance) + current_images_rule = cmds.workspace(fileRuleEntry="images") + + if current_images_rule != required_images_rule: + cmds.workspace(fileRule=("images", required_images_rule)) + cmds.workspace(saveWorkspace=True) @staticmethod def get_default_render_image_folder(instance): return instance.context.data.get('project_settings')\ .get('maya') \ - .get('create') \ - .get('CreateRender') \ + .get('RenderSettings') \ .get('default_render_image_folder') diff --git a/openpype/hosts/maya/plugins/publish/validate_rendersettings.py b/openpype/hosts/maya/plugins/publish/validate_rendersettings.py index 1dab3274a0..f19c0bff36 100644 --- a/openpype/hosts/maya/plugins/publish/validate_rendersettings.py +++ b/openpype/hosts/maya/plugins/publish/validate_rendersettings.py @@ -242,6 +242,14 @@ class ValidateRenderSettings(pyblish.api.InstancePlugin): instance.context.data["project_settings"]["maya"]["publish"]["ValidateRenderSettings"].get( # noqa: E501 "{}_render_attributes".format(renderer)) or [] ) + settings_lights_flag = instance.context.data["project_settings"].get( + "maya", {}).get( + "RenderSettings", {}).get( + "enable_all_lights", False) + + instance_lights_flag = instance.data.get("renderSetupIncludeLights") + if settings_lights_flag != instance_lights_flag: + cls.log.warning('Instance flag for "Render Setup Include Lights" is set to {0} and Settings flag is set to {1}'.format(instance_lights_flag, settings_lights_flag)) # noqa # go through definitions and test if such node.attribute exists. # if so, compare its value from the one required. 
diff --git a/openpype/hosts/nuke/plugins/load/load_clip.py b/openpype/hosts/nuke/plugins/load/load_clip.py index b2dc4a52d7..346773b5af 100644 --- a/openpype/hosts/nuke/plugins/load/load_clip.py +++ b/openpype/hosts/nuke/plugins/load/load_clip.py @@ -162,7 +162,15 @@ class LoadClip(plugin.NukeLoader): data_imprint = {} for k in add_keys: if k == 'version': - data_imprint[k] = context["version"]['name'] + version_doc = context["version"] + if version_doc["type"] == "hero_version": + version = "hero" + else: + version = version_doc.get("name") + + if version: + data_imprint[k] = version + elif k == 'colorspace': colorspace = repre["data"].get(k) colorspace = colorspace or version_data.get(k) diff --git a/openpype/hosts/photoshop/plugins/create/workfile_creator.py b/openpype/hosts/photoshop/plugins/create/workfile_creator.py index 43302329f1..e79d16d154 100644 --- a/openpype/hosts/photoshop/plugins/create/workfile_creator.py +++ b/openpype/hosts/photoshop/plugins/create/workfile_creator.py @@ -11,6 +11,8 @@ class PSWorkfileCreator(AutoCreator): identifier = "workfile" family = "workfile" + default_variant = "Main" + def get_instance_attr_defs(self): return [] @@ -35,7 +37,6 @@ class PSWorkfileCreator(AutoCreator): existing_instance = instance break - variant = '' project_name = legacy_io.Session["AVALON_PROJECT"] asset_name = legacy_io.Session["AVALON_ASSET"] task_name = legacy_io.Session["AVALON_TASK"] @@ -43,15 +44,17 @@ class PSWorkfileCreator(AutoCreator): if existing_instance is None: asset_doc = get_asset_by_name(project_name, asset_name) subset_name = self.get_subset_name( - variant, task_name, asset_doc, project_name, host_name + self.default_variant, task_name, asset_doc, + project_name, host_name ) data = { "asset": asset_name, "task": task_name, - "variant": variant + "variant": self.default_variant } data.update(self.get_dynamic_data( - variant, task_name, asset_doc, project_name, host_name + self.default_variant, task_name, asset_doc, + project_name, host_name )) new_instance = CreatedInstance( @@ -67,7 +70,9 @@ class PSWorkfileCreator(AutoCreator): ): asset_doc = get_asset_by_name(project_name, asset_name) subset_name = self.get_subset_name( - variant, task_name, asset_doc, project_name, host_name + self.default_variant, task_name, asset_doc, + project_name, host_name ) existing_instance["asset"] = asset_name existing_instance["task"] = task_name + existing_instance["subset"] = subset_name diff --git a/openpype/hosts/photoshop/plugins/publish/collect_workfile.py b/openpype/hosts/photoshop/plugins/publish/collect_workfile.py index e4f0a07b34..9cf6d5227e 100644 --- a/openpype/hosts/photoshop/plugins/publish/collect_workfile.py +++ b/openpype/hosts/photoshop/plugins/publish/collect_workfile.py @@ -11,6 +11,8 @@ class CollectWorkfile(pyblish.api.ContextPlugin): label = "Collect Workfile" hosts = ["photoshop"] + default_variant = "Main" + def process(self, context): existing_instance = None for instance in context: @@ -20,9 +22,11 @@ class CollectWorkfile(pyblish.api.ContextPlugin): break family = "workfile" + # context.data["variant"] might come only from collect_batch_data + variant = context.data.get("variant") or self.default_variant subset = get_subset_name_with_asset_doc( family, - "", + variant, context.data["anatomyData"]["task"]["name"], context.data["assetEntity"], context.data["anatomyData"]["project"]["name"], diff --git a/openpype/hosts/photoshop/plugins/publish/extract_review.py b/openpype/hosts/photoshop/plugins/publish/extract_review.py index d076610ead..5d37c86ed8 
100644
--- a/openpype/hosts/photoshop/plugins/publish/extract_review.py
+++ b/openpype/hosts/photoshop/plugins/publish/extract_review.py
@@ -1,5 +1,6 @@
 import os
 import shutil
+from PIL import Image

 import openpype.api
 import openpype.lib
@@ -8,10 +9,17 @@ from openpype.hosts.photoshop import api as photoshop

 class ExtractReview(openpype.api.Extractor):
     """
-        Produce a flattened or sequence image file from all 'image' instances.
+        Produce flattened or sequence image files from all 'image' instances.

         If no 'image' instance is created, it produces flattened image from
         all visible layers.
+
+        It creates review, thumbnail and mov representations.
+
+        'review' family could be used in other steps as a reference, as it
+        contains flattened image by default. (E.g. an artist could load this
+        review as a single item and see the full image. In most cases the
+        'image' family is separated by layers for better use in animation
+        or comp.)
     """

     label = "Extract Review"
@@ -22,6 +30,7 @@ class ExtractReview(openpype.api.Extractor):
     jpg_options = None
     mov_options = None
     make_image_sequence = None
+    max_downscale_size = 8192

     def process(self, instance):
         staging_dir = self.staging_dir(instance)
@@ -49,7 +58,7 @@ class ExtractReview(openpype.api.Extractor):
                 "stagingDir": staging_dir,
                 "tags": self.jpg_options['tags'],
             })
-
+            processed_img_names = img_list
         else:
             self.log.info("Extract layers to flatten image.")
             img_list = self._saves_flattened_layers(staging_dir, layers)
@@ -57,26 +66,33 @@ class ExtractReview(openpype.api.Extractor):
             instance.data["representations"].append({
                 "name": "jpg",
                 "ext": "jpg",
-                "files": img_list,
+                "files": img_list,  # cannot be [] for single frame
                 "stagingDir": staging_dir,
                 "tags": self.jpg_options['tags']
             })
+            processed_img_names = [img_list]

         ffmpeg_path = openpype.lib.get_ffmpeg_tool_path("ffmpeg")

         instance.data["stagingDir"] = staging_dir

-        # Generate thumbnail.
+        source_files_pattern = os.path.join(staging_dir,
+                                            self.output_seq_filename)
+        source_files_pattern = self._check_and_resize(processed_img_names,
+                                                      source_files_pattern,
+                                                      staging_dir)
+        # Generate thumbnail
         thumbnail_path = os.path.join(staging_dir, "thumbnail.jpg")
         self.log.info(f"Generate thumbnail {thumbnail_path}")
         args = [
             ffmpeg_path, "-y",
-            "-i", os.path.join(staging_dir, self.output_seq_filename),
+            "-i", source_files_pattern,
             "-vf", "scale=300:-1",
             "-vframes", "1",
             thumbnail_path
         ]
+        self.log.debug("thumbnail args:: {}".format(args))
         output = openpype.lib.run_subprocess(args)

         instance.data["representations"].append({
@@ -94,11 +110,12 @@ class ExtractReview(openpype.api.Extractor):
         args = [
             ffmpeg_path, "-y",
-            "-i", os.path.join(staging_dir, self.output_seq_filename),
+            "-i", source_files_pattern,
             "-vf", "pad=ceil(iw/2)*2:ceil(ih/2)*2",
             "-vframes", str(img_number),
             mov_path
         ]
+        self.log.debug("mov args:: {}".format(args))
         output = openpype.lib.run_subprocess(args)
         self.log.debug(output)
         instance.data["representations"].append({
@@ -120,6 +137,34 @@ class ExtractReview(openpype.api.Extractor):

         self.log.info(f"Extracted {instance} to {staging_dir}")

+    def _check_and_resize(self, processed_img_names, source_files_pattern,
+                          staging_dir):
+        """Check if saved images can be used in ffmpeg.
+
+        Ffmpeg has a max size of 16384x16384. Saved image(s) must be resized
+        to be usable as a source for the thumbnail or review mov.
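+        Resized copies are written into a "resized" subfolder of the staging
+        dir and the source pattern is repointed there.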
+ """ + Image.MAX_IMAGE_PIXELS = None + first_url = os.path.join(staging_dir, processed_img_names[0]) + with Image.open(first_url) as im: + width, height = im.size + + if width > self.max_downscale_size or height > self.max_downscale_size: + resized_dir = os.path.join(staging_dir, "resized") + os.mkdir(resized_dir) + source_files_pattern = os.path.join(resized_dir, + self.output_seq_filename) + for file_name in processed_img_names: + source_url = os.path.join(staging_dir, file_name) + with Image.open(source_url) as res_img: + # 'thumbnail' automatically keeps aspect ratio + res_img.thumbnail((self.max_downscale_size, + self.max_downscale_size), + Image.ANTIALIAS) + res_img.save(os.path.join(resized_dir, file_name)) + + return source_files_pattern + def _get_image_path_from_instances(self, instance): img_list = [] diff --git a/openpype/hosts/resolve/__init__.py b/openpype/hosts/resolve/__init__.py index 3e49ce3b9b..e69de29bb2 100644 --- a/openpype/hosts/resolve/__init__.py +++ b/openpype/hosts/resolve/__init__.py @@ -1,129 +0,0 @@ -from .api.utils import ( - setup, - get_resolve_module -) - -from .api.pipeline import ( - install, - uninstall, - ls, - containerise, - update_container, - publish, - launch_workfiles_app, - maintained_selection, - remove_instance, - list_instances -) - -from .api.lib import ( - maintain_current_timeline, - publish_clip_color, - get_project_manager, - get_current_project, - get_current_timeline, - create_bin, - get_media_pool_item, - create_media_pool_item, - create_timeline_item, - get_timeline_item, - get_video_track_names, - get_current_timeline_items, - get_pype_timeline_item_by_name, - get_timeline_item_pype_tag, - set_timeline_item_pype_tag, - imprint, - set_publish_attribute, - get_publish_attribute, - create_compound_clip, - swap_clips, - get_pype_clip_metadata, - set_project_manager_to_folder_name, - get_otio_clip_instance_data, - get_reformated_path -) - -from .api.menu import launch_pype_menu - -from .api.plugin import ( - ClipLoader, - TimelineItemLoader, - Creator, - PublishClip -) - -from .api.workio import ( - open_file, - save_file, - current_file, - has_unsaved_changes, - file_extensions, - work_root -) - -from .api.testing_utils import TestGUI - - -__all__ = [ - # pipeline - "install", - "uninstall", - "ls", - "containerise", - "update_container", - "reload_pipeline", - "publish", - "launch_workfiles_app", - "maintained_selection", - "remove_instance", - "list_instances", - - # utils - "setup", - "get_resolve_module", - - # lib - "maintain_current_timeline", - "publish_clip_color", - "get_project_manager", - "get_current_project", - "get_current_timeline", - "create_bin", - "get_media_pool_item", - "create_media_pool_item", - "create_timeline_item", - "get_timeline_item", - "get_video_track_names", - "get_current_timeline_items", - "get_pype_timeline_item_by_name", - "get_timeline_item_pype_tag", - "set_timeline_item_pype_tag", - "imprint", - "set_publish_attribute", - "get_publish_attribute", - "create_compound_clip", - "swap_clips", - "get_pype_clip_metadata", - "set_project_manager_to_folder_name", - "get_otio_clip_instance_data", - "get_reformated_path", - - # menu - "launch_pype_menu", - - # plugin - "ClipLoader", - "TimelineItemLoader", - "Creator", - "PublishClip", - - # workio - "open_file", - "save_file", - "current_file", - "has_unsaved_changes", - "file_extensions", - "work_root", - - "TestGUI" -] diff --git a/openpype/hosts/resolve/api/__init__.py b/openpype/hosts/resolve/api/__init__.py index 48bd938e57..cf1edb4c35 100644 --- 
a/openpype/hosts/resolve/api/__init__.py +++ b/openpype/hosts/resolve/api/__init__.py @@ -1,11 +1,137 @@ """ resolve api """ -import os bmdvr = None bmdvf = None -API_DIR = os.path.dirname(os.path.abspath(__file__)) -HOST_DIR = os.path.dirname(API_DIR) -PLUGINS_DIR = os.path.join(HOST_DIR, "plugins") +from .utils import ( + get_resolve_module +) + +from .pipeline import ( + install, + uninstall, + ls, + containerise, + update_container, + publish, + launch_workfiles_app, + maintained_selection, + remove_instance, + list_instances +) + +from .lib import ( + maintain_current_timeline, + publish_clip_color, + get_project_manager, + get_current_project, + get_current_timeline, + create_bin, + get_media_pool_item, + create_media_pool_item, + create_timeline_item, + get_timeline_item, + get_video_track_names, + get_current_timeline_items, + get_pype_timeline_item_by_name, + get_timeline_item_pype_tag, + set_timeline_item_pype_tag, + imprint, + set_publish_attribute, + get_publish_attribute, + create_compound_clip, + swap_clips, + get_pype_clip_metadata, + set_project_manager_to_folder_name, + get_otio_clip_instance_data, + get_reformated_path +) + +from .menu import launch_pype_menu + +from .plugin import ( + ClipLoader, + TimelineItemLoader, + Creator, + PublishClip +) + +from .workio import ( + open_file, + save_file, + current_file, + has_unsaved_changes, + file_extensions, + work_root +) + +from .testing_utils import TestGUI + + +__all__ = [ + "bmdvr", + "bmdvf", + + # pipeline + "install", + "uninstall", + "ls", + "containerise", + "update_container", + "reload_pipeline", + "publish", + "launch_workfiles_app", + "maintained_selection", + "remove_instance", + "list_instances", + + # utils + "get_resolve_module", + + # lib + "maintain_current_timeline", + "publish_clip_color", + "get_project_manager", + "get_current_project", + "get_current_timeline", + "create_bin", + "get_media_pool_item", + "create_media_pool_item", + "create_timeline_item", + "get_timeline_item", + "get_video_track_names", + "get_current_timeline_items", + "get_pype_timeline_item_by_name", + "get_timeline_item_pype_tag", + "set_timeline_item_pype_tag", + "imprint", + "set_publish_attribute", + "get_publish_attribute", + "create_compound_clip", + "swap_clips", + "get_pype_clip_metadata", + "set_project_manager_to_folder_name", + "get_otio_clip_instance_data", + "get_reformated_path", + + # menu + "launch_pype_menu", + + # plugin + "ClipLoader", + "TimelineItemLoader", + "Creator", + "PublishClip", + + # workio + "open_file", + "save_file", + "current_file", + "has_unsaved_changes", + "file_extensions", + "work_root", + + "TestGUI" +] diff --git a/openpype/hosts/resolve/api/action.py b/openpype/hosts/resolve/api/action.py index f8f338a850..d55a24a39a 100644 --- a/openpype/hosts/resolve/api/action.py +++ b/openpype/hosts/resolve/api/action.py @@ -4,7 +4,7 @@ from __future__ import absolute_import import pyblish.api -from ...action import get_errored_instances_from_context +from openpype.action import get_errored_instances_from_context class SelectInvalidAction(pyblish.api.Action): diff --git a/openpype/hosts/resolve/api/lib.py b/openpype/hosts/resolve/api/lib.py index 93ccdaf812..f41eb36caf 100644 --- a/openpype/hosts/resolve/api/lib.py +++ b/openpype/hosts/resolve/api/lib.py @@ -4,13 +4,13 @@ import re import os import contextlib from opentimelineio import opentime + +from openpype.lib import Logger from openpype.pipeline.editorial import is_overlapping_otio_ranges from ..otio import davinci_export as otio_export -from 
openpype.api import Logger - -log = Logger().get_logger(__name__) +log = Logger.get_logger(__name__) self = sys.modules[__name__] self.project_manager = None diff --git a/openpype/hosts/resolve/api/menu.py b/openpype/hosts/resolve/api/menu.py index 9e0dd12376..2c7678ee5b 100644 --- a/openpype/hosts/resolve/api/menu.py +++ b/openpype/hosts/resolve/api/menu.py @@ -3,13 +3,13 @@ import sys from Qt import QtWidgets, QtCore +from openpype.tools.utils import host_tools + from .pipeline import ( publish, launch_workfiles_app ) -from openpype.tools.utils import host_tools - def load_stylesheet(): path = os.path.join(os.path.dirname(__file__), "menu_style.qss") diff --git a/openpype/hosts/resolve/api/pipeline.py b/openpype/hosts/resolve/api/pipeline.py index 4a7d1c5bea..1c8d9dc01c 100644 --- a/openpype/hosts/resolve/api/pipeline.py +++ b/openpype/hosts/resolve/api/pipeline.py @@ -7,7 +7,7 @@ from collections import OrderedDict from pyblish import api as pyblish -from openpype.api import Logger +from openpype.lib import Logger from openpype.pipeline import ( schema, register_loader_plugin_path, @@ -16,11 +16,15 @@ from openpype.pipeline import ( deregister_creator_plugin_path, AVALON_CONTAINER_ID, ) -from . import lib -from . import PLUGINS_DIR from openpype.tools.utils import host_tools -log = Logger().get_logger(__name__) +from . import lib +from .utils import get_resolve_module + +log = Logger.get_logger(__name__) + +HOST_DIR = os.path.dirname(os.path.abspath(os.path.dirname(__file__))) +PLUGINS_DIR = os.path.join(HOST_DIR, "plugins") PUBLISH_PATH = os.path.join(PLUGINS_DIR, "publish") LOAD_PATH = os.path.join(PLUGINS_DIR, "load") CREATE_PATH = os.path.join(PLUGINS_DIR, "create") @@ -39,7 +43,6 @@ def install(): See the Maya equivalent for inspiration on how to implement this. """ - from .. import get_resolve_module log.info("openpype.hosts.resolve installed") diff --git a/openpype/hosts/resolve/api/preload_console.py b/openpype/hosts/resolve/api/preload_console.py index 1e3a56b4dd..a822ea2460 100644 --- a/openpype/hosts/resolve/api/preload_console.py +++ b/openpype/hosts/resolve/api/preload_console.py @@ -1,9 +1,9 @@ #!/usr/bin/env python import time from openpype.hosts.resolve.utils import get_resolve_module -from openpype.api import Logger +from openpype.lib import Logger -log = Logger().get_logger(__name__) +log = Logger.get_logger(__name__) wait_delay = 2.5 wait = 0.00 diff --git a/openpype/hosts/resolve/api/utils.py b/openpype/hosts/resolve/api/utils.py index 9b3762f328..871b3af38d 100644 --- a/openpype/hosts/resolve/api/utils.py +++ b/openpype/hosts/resolve/api/utils.py @@ -4,21 +4,21 @@ Resolve's tools for setting environment """ -import sys import os -import shutil -from . 
import HOST_DIR -from openpype.api import Logger -log = Logger().get_logger(__name__) +import sys + +from openpype.lib import Logger + +log = Logger.get_logger(__name__) def get_resolve_module(): - from openpype.hosts import resolve + from openpype.hosts.resolve import api # dont run if already loaded - if resolve.api.bmdvr: + if api.bmdvr: log.info(("resolve module is assigned to " - f"`pype.hosts.resolve.api.bmdvr`: {resolve.api.bmdvr}")) - return resolve.api.bmdvr + f"`pype.hosts.resolve.api.bmdvr`: {api.bmdvr}")) + return api.bmdvr try: """ The PYTHONPATH needs to be set correctly for this import @@ -71,79 +71,9 @@ def get_resolve_module(): # assign global var and return bmdvr = bmd.scriptapp("Resolve") bmdvf = bmd.scriptapp("Fusion") - resolve.api.bmdvr = bmdvr - resolve.api.bmdvf = bmdvf + api.bmdvr = bmdvr + api.bmdvf = bmdvf log.info(("Assigning resolve module to " - f"`pype.hosts.resolve.api.bmdvr`: {resolve.api.bmdvr}")) + f"`pype.hosts.resolve.api.bmdvr`: {api.bmdvr}")) log.info(("Assigning resolve module to " - f"`pype.hosts.resolve.api.bmdvf`: {resolve.api.bmdvf}")) - - -def _sync_utility_scripts(env=None): - """ Synchronizing basic utlility scripts for resolve. - - To be able to run scripts from inside `Resolve/Workspace/Scripts` menu - all scripts has to be accessible from defined folder. - """ - if not env: - env = os.environ - - # initiate inputs - scripts = {} - us_env = env.get("RESOLVE_UTILITY_SCRIPTS_SOURCE_DIR") - us_dir = env.get("RESOLVE_UTILITY_SCRIPTS_DIR", "") - us_paths = [os.path.join( - HOST_DIR, - "utility_scripts" - )] - - # collect script dirs - if us_env: - log.info(f"Utility Scripts Env: `{us_env}`") - us_paths = us_env.split( - os.pathsep) + us_paths - - # collect scripts from dirs - for path in us_paths: - scripts.update({path: os.listdir(path)}) - - log.info(f"Utility Scripts Dir: `{us_paths}`") - log.info(f"Utility Scripts: `{scripts}`") - - # make sure no script file is in folder - if next((s for s in os.listdir(us_dir)), None): - for s in os.listdir(us_dir): - path = os.path.join(us_dir, s) - log.info(f"Removing `{path}`...") - if os.path.isdir(path): - shutil.rmtree(path, onerror=None) - else: - os.remove(path) - - # copy scripts into Resolve's utility scripts dir - for d, sl in scripts.items(): - # directory and scripts list - for s in sl: - # script in script list - src = os.path.join(d, s) - dst = os.path.join(us_dir, s) - log.info(f"Copying `{src}` to `{dst}`...") - if os.path.isdir(src): - shutil.copytree( - src, dst, symlinks=False, - ignore=None, ignore_dangling_symlinks=False - ) - else: - shutil.copy2(src, dst) - - -def setup(env=None): - """ Wrapper installer started from pype.hooks.resolve.ResolvePrelaunch() - """ - if not env: - env = os.environ - - # synchronize resolve utility scripts - _sync_utility_scripts(env) - - log.info("Resolve OpenPype wrapper has been installed") + f"`pype.hosts.resolve.api.bmdvf`: {api.bmdvf}")) diff --git a/openpype/hosts/resolve/api/workio.py b/openpype/hosts/resolve/api/workio.py index f175769387..5a742ecf7e 100644 --- a/openpype/hosts/resolve/api/workio.py +++ b/openpype/hosts/resolve/api/workio.py @@ -2,14 +2,14 @@ import os from openpype.api import Logger -from .. 
import ( +from .lib import ( get_project_manager, get_current_project, set_project_manager_to_folder_name ) -log = Logger().get_logger(__name__) +log = Logger.get_logger(__name__) exported_projet_ext = ".drp" @@ -60,7 +60,7 @@ def open_file(filepath): # load project from input path project = pm.LoadProject(fname) log.info(f"Project {project.GetName()} opened...") - return True + except AttributeError: log.warning((f"Project with name `{fname}` does not exist! It will " f"be imported from {filepath} and then loaded...")) @@ -69,9 +69,8 @@ def open_file(filepath): project = pm.LoadProject(fname) log.info(f"Project imported/loaded {project.GetName()}...") return True - else: - return False - + return False + return True def current_file(): pm = get_project_manager() @@ -80,13 +79,9 @@ def current_file(): name = project.GetName() fname = name + exported_projet_ext current_file = os.path.join(current_dir, fname) - normalised = os.path.normpath(current_file) - - # Unsaved current file - if normalised == "": + if not current_file: return None - - return normalised + return os.path.normpath(current_file) def work_root(session): diff --git a/openpype/hosts/resolve/hooks/pre_resolve_setup.py b/openpype/hosts/resolve/hooks/pre_resolve_setup.py index 978e3760fd..1d977e2d8e 100644 --- a/openpype/hosts/resolve/hooks/pre_resolve_setup.py +++ b/openpype/hosts/resolve/hooks/pre_resolve_setup.py @@ -1,7 +1,7 @@ import os -import importlib + from openpype.lib import PreLaunchHook -from openpype.hosts.resolve.api import utils +from openpype.hosts.resolve.utils import setup class ResolvePrelaunch(PreLaunchHook): @@ -43,18 +43,6 @@ class ResolvePrelaunch(PreLaunchHook): self.launch_context.env.get("PRE_PYTHON_SCRIPT", "")) self.launch_context.env["PRE_PYTHON_SCRIPT"] = pre_py_sc self.log.debug(f"-- pre_py_sc: `{pre_py_sc}`...") - try: - __import__("openpype.hosts.resolve") - __import__("pyblish") - except ImportError: - self.log.warning( - "pyblish: Could not load Resolve integration.", - exc_info=True - ) - - else: - # Resolve Setup integration - importlib.reload(utils) - self.log.debug(f"-- utils.__file__: `{utils.__file__}`") - utils.setup(self.launch_context.env) + # Resolve Setup integration + setup(self.launch_context.env) diff --git a/openpype/hosts/resolve/plugins/create/create_shot_clip.py b/openpype/hosts/resolve/plugins/create/create_shot_clip.py index dbf10c5163..4b14f2493f 100644 --- a/openpype/hosts/resolve/plugins/create/create_shot_clip.py +++ b/openpype/hosts/resolve/plugins/create/create_shot_clip.py @@ -1,9 +1,12 @@ # from pprint import pformat -from openpype.hosts import resolve -from openpype.hosts.resolve.api import lib +from openpype.hosts.resolve.api import plugin, lib +from openpype.hosts.resolve.api.lib import ( + get_video_track_names, + create_bin, +) -class CreateShotClip(resolve.Creator): +class CreateShotClip(plugin.Creator): """Publishable clip""" label = "Create Publishable Clip" @@ -11,7 +14,7 @@ class CreateShotClip(resolve.Creator): icon = "film" defaults = ["Main"] - gui_tracks = resolve.get_video_track_names() + gui_tracks = get_video_track_names() gui_name = "OpenPype publish attributes creator" gui_info = "Define sequential rename and fill hierarchy data." 
gui_inputs = { @@ -250,7 +253,7 @@ class CreateShotClip(resolve.Creator): sq_markers = self.timeline.GetMarkers() # create media bin for compound clips (trackItems) - mp_folder = resolve.create_bin(self.timeline.GetName()) + mp_folder = create_bin(self.timeline.GetName()) kwargs = { "ui_inputs": widget.result, @@ -264,6 +267,6 @@ class CreateShotClip(resolve.Creator): self.rename_index = i self.log.info(track_item_data) # convert track item to timeline media pool item - track_item = resolve.PublishClip( + track_item = plugin.PublishClip( self, track_item_data, **kwargs).convert() track_item.SetClipColor(lib.publish_clip_color) diff --git a/openpype/hosts/resolve/plugins/load/load_clip.py b/openpype/hosts/resolve/plugins/load/load_clip.py index 190a5a7206..a0c78c182f 100644 --- a/openpype/hosts/resolve/plugins/load/load_clip.py +++ b/openpype/hosts/resolve/plugins/load/load_clip.py @@ -1,21 +1,22 @@ from copy import deepcopy -from importlib import reload from openpype.client import ( get_version_by_id, get_last_version_by_subset_id, ) -from openpype.hosts import resolve +# from openpype.hosts import resolve from openpype.pipeline import ( get_representation_path, legacy_io, ) from openpype.hosts.resolve.api import lib, plugin -reload(plugin) -reload(lib) +from openpype.hosts.resolve.api.pipeline import ( + containerise, + update_container, +) -class LoadClip(resolve.TimelineItemLoader): +class LoadClip(plugin.TimelineItemLoader): """Load a subset to timeline as clip Place clip to timeline on its asset origin timings collected @@ -46,7 +47,7 @@ class LoadClip(resolve.TimelineItemLoader): }) # load clip to timeline and get main variables - timeline_item = resolve.ClipLoader( + timeline_item = plugin.ClipLoader( self, context, **options).load() namespace = namespace or timeline_item.GetName() version = context['version'] @@ -80,7 +81,7 @@ class LoadClip(resolve.TimelineItemLoader): self.log.info("Loader done: `{}`".format(name)) - return resolve.containerise( + return containerise( timeline_item, name, namespace, context, self.__class__.__name__, @@ -98,7 +99,7 @@ class LoadClip(resolve.TimelineItemLoader): context.update({"representation": representation}) name = container['name'] namespace = container['namespace'] - timeline_item_data = resolve.get_pype_timeline_item_by_name(namespace) + timeline_item_data = lib.get_pype_timeline_item_by_name(namespace) timeline_item = timeline_item_data["clip"]["item"] project_name = legacy_io.active_project() version = get_version_by_id(project_name, representation["parent"]) @@ -109,7 +110,7 @@ class LoadClip(resolve.TimelineItemLoader): self.fname = get_representation_path(representation) context["version"] = {"data": version_data} - loader = resolve.ClipLoader(self, context) + loader = plugin.ClipLoader(self, context) timeline_item = loader.update(timeline_item) # add additional metadata from the version to imprint Avalon knob @@ -136,7 +137,7 @@ class LoadClip(resolve.TimelineItemLoader): # update color of clip regarding the version order self.set_item_color(timeline_item, version) - return resolve.update_container(timeline_item, data_imprint) + return update_container(timeline_item, data_imprint) @classmethod def set_item_color(cls, timeline_item, version): diff --git a/openpype/hosts/resolve/plugins/publish/extract_workfile.py b/openpype/hosts/resolve/plugins/publish/extract_workfile.py index e3d60465a2..ea8f19cd8c 100644 --- a/openpype/hosts/resolve/plugins/publish/extract_workfile.py +++ b/openpype/hosts/resolve/plugins/publish/extract_workfile.py 
@@ -1,7 +1,7 @@ import os import pyblish.api import openpype.api -from openpype.hosts import resolve +from openpype.hosts.resolve.api.lib import get_project_manager class ExtractWorkfile(openpype.api.Extractor): @@ -29,7 +29,7 @@ class ExtractWorkfile(openpype.api.Extractor): os.path.join(staging_dir, drp_file_name)) # write out the drp workfile - resolve.get_project_manager().ExportProject( + get_project_manager().ExportProject( project.GetName(), drp_file_path) # create drp workfile representation diff --git a/openpype/hosts/resolve/plugins/publish/precollect_instances.py b/openpype/hosts/resolve/plugins/publish/precollect_instances.py index ee51998c0d..8ec169ad65 100644 --- a/openpype/hosts/resolve/plugins/publish/precollect_instances.py +++ b/openpype/hosts/resolve/plugins/publish/precollect_instances.py @@ -1,9 +1,15 @@ -import pyblish -from openpype.hosts import resolve - -# # developer reload modules from pprint import pformat +import pyblish + +from openpype.hosts.resolve.api.lib import ( + get_current_timeline_items, + get_timeline_item_pype_tag, + publish_clip_color, + get_publish_attribute, + get_otio_clip_instance_data, +) + class PrecollectInstances(pyblish.api.ContextPlugin): """Collect all Track items selection.""" @@ -14,8 +20,8 @@ class PrecollectInstances(pyblish.api.ContextPlugin): def process(self, context): otio_timeline = context.data["otioTimeline"] - selected_timeline_items = resolve.get_current_timeline_items( - filter=True, selecting_color=resolve.publish_clip_color) + selected_timeline_items = get_current_timeline_items( + filter=True, selecting_color=publish_clip_color) self.log.info( "Processing enabled track items: {}".format( @@ -27,7 +33,7 @@ class PrecollectInstances(pyblish.api.ContextPlugin): timeline_item = timeline_item_data["clip"]["item"] # get pype tag data - tag_data = resolve.get_timeline_item_pype_tag(timeline_item) + tag_data = get_timeline_item_pype_tag(timeline_item) self.log.debug(f"__ tag_data: {pformat(tag_data)}") if not tag_data: @@ -67,7 +73,7 @@ class PrecollectInstances(pyblish.api.ContextPlugin): "asset": asset, "item": timeline_item, "families": families, - "publish": resolve.get_publish_attribute(timeline_item), + "publish": get_publish_attribute(timeline_item), "fps": context.data["fps"], "handleStart": handle_start, "handleEnd": handle_end, @@ -75,7 +81,7 @@ class PrecollectInstances(pyblish.api.ContextPlugin): }) # otio clip data - otio_data = resolve.get_otio_clip_instance_data( + otio_data = get_otio_clip_instance_data( otio_timeline, timeline_item_data) or {} data.update(otio_data) @@ -134,7 +140,7 @@ class PrecollectInstances(pyblish.api.ContextPlugin): "asset": asset, "family": family, "families": [], - "publish": resolve.get_publish_attribute(timeline_item) + "publish": get_publish_attribute(timeline_item) }) context.create_instance(**data) diff --git a/openpype/hosts/resolve/plugins/publish/precollect_workfile.py b/openpype/hosts/resolve/plugins/publish/precollect_workfile.py index 53e67aee0e..0f94216556 100644 --- a/openpype/hosts/resolve/plugins/publish/precollect_workfile.py +++ b/openpype/hosts/resolve/plugins/publish/precollect_workfile.py @@ -1,11 +1,9 @@ import pyblish.api from pprint import pformat -from importlib import reload -from openpype.hosts import resolve +from openpype.hosts.resolve import api as rapi from openpype.pipeline import legacy_io from openpype.hosts.resolve.otio import davinci_export -reload(davinci_export) class PrecollectWorkfile(pyblish.api.ContextPlugin): @@ -18,9 +16,9 @@ class 
PrecollectWorkfile(pyblish.api.ContextPlugin): asset = legacy_io.Session["AVALON_ASSET"] subset = "workfile" - project = resolve.get_current_project() + project = rapi.get_current_project() fps = project.GetSetting("timelineFrameRate") - video_tracks = resolve.get_video_track_names() + video_tracks = rapi.get_video_track_names() # adding otio timeline to context otio_timeline = davinci_export.create_otio_timeline(project) diff --git a/openpype/hosts/resolve/utility_scripts/OpenPype_sync_util_scripts.py b/openpype/hosts/resolve/utility_scripts/OpenPype_sync_util_scripts.py index 3a16b9c966..8f3917bece 100644 --- a/openpype/hosts/resolve/utility_scripts/OpenPype_sync_util_scripts.py +++ b/openpype/hosts/resolve/utility_scripts/OpenPype_sync_util_scripts.py @@ -6,10 +6,11 @@ from openpype.pipeline import install_host def main(env): - import openpype.hosts.resolve as bmdvr + from openpype.hosts.resolve.utils import setup + import openpype.hosts.resolve.api as bmdvr # Registers openpype's Global pyblish plugins install_host(bmdvr) - bmdvr.setup(env) + setup(env) if __name__ == "__main__": diff --git a/openpype/hosts/resolve/utility_scripts/__OpenPype__Menu__.py b/openpype/hosts/resolve/utility_scripts/__OpenPype__Menu__.py index 89ade9238b..1087a7b7a0 100644 --- a/openpype/hosts/resolve/utility_scripts/__OpenPype__Menu__.py +++ b/openpype/hosts/resolve/utility_scripts/__OpenPype__Menu__.py @@ -2,13 +2,13 @@ import os import sys from openpype.pipeline import install_host -from openpype.api import Logger +from openpype.lib import Logger -log = Logger().get_logger(__name__) +log = Logger.get_logger(__name__) def main(env): - import openpype.hosts.resolve as bmdvr + import openpype.hosts.resolve.api as bmdvr # activate resolve from openpype install_host(bmdvr) diff --git a/openpype/hosts/resolve/utility_scripts/tests/test_otio_as_edl.py b/openpype/hosts/resolve/utility_scripts/tests/test_otio_as_edl.py index 8433bd9172..92f2e43a72 100644 --- a/openpype/hosts/resolve/utility_scripts/tests/test_otio_as_edl.py +++ b/openpype/hosts/resolve/utility_scripts/tests/test_otio_as_edl.py @@ -6,8 +6,8 @@ import opentimelineio as otio from openpype.pipeline import install_host -from openpype.hosts.resolve import TestGUI -import openpype.hosts.resolve as bmdvr +import openpype.hosts.resolve.api as bmdvr +from openpype.hosts.resolve.api.testing_utils import TestGUI from openpype.hosts.resolve.otio import davinci_export as otio_export diff --git a/openpype/hosts/resolve/utility_scripts/tests/testing_create_timeline_item_from_path.py b/openpype/hosts/resolve/utility_scripts/tests/testing_create_timeline_item_from_path.py index 477955d527..91a361ec08 100644 --- a/openpype/hosts/resolve/utility_scripts/tests/testing_create_timeline_item_from_path.py +++ b/openpype/hosts/resolve/utility_scripts/tests/testing_create_timeline_item_from_path.py @@ -2,11 +2,16 @@ import os import sys -from openpype.pipeline import install_host -from openpype.hosts.resolve import TestGUI -import openpype.hosts.resolve as bmdvr import clique +from openpype.pipeline import install_host +from openpype.hosts.resolve.api.testing_utils import TestGUI +import openpype.hosts.resolve.api as bmdvr +from openpype.hosts.resolve.api.lib import ( + create_media_pool_item, + create_timeline_item, +) + class ThisTestGUI(TestGUI): extensions = [".exr", ".jpg", ".mov", ".png", ".mp4", ".ari", ".arx"] @@ -55,10 +60,10 @@ class ThisTestGUI(TestGUI): # skip if unwanted extension if ext not in self.extensions: return - media_pool_item = 
bmdvr.create_media_pool_item(fpath) + media_pool_item = create_media_pool_item(fpath) print(media_pool_item) - track_item = bmdvr.create_timeline_item(media_pool_item) + track_item = create_timeline_item(media_pool_item) print(track_item) diff --git a/openpype/hosts/resolve/utility_scripts/tests/testing_load_media_pool_item.py b/openpype/hosts/resolve/utility_scripts/tests/testing_load_media_pool_item.py index 872d620162..2e83188bde 100644 --- a/openpype/hosts/resolve/utility_scripts/tests/testing_load_media_pool_item.py +++ b/openpype/hosts/resolve/utility_scripts/tests/testing_load_media_pool_item.py @@ -1,13 +1,17 @@ #! python3 from openpype.pipeline import install_host -import openpype.hosts.resolve as bmdvr +from openpype.hosts.resolve import api as bmdvr +from openpype.hosts.resolve.api.lib import ( + create_media_pool_item, + create_timeline_item, +) def file_processing(fpath): - media_pool_item = bmdvr.create_media_pool_item(fpath) + media_pool_item = create_media_pool_item(fpath) print(media_pool_item) - track_item = bmdvr.create_timeline_item(media_pool_item) + track_item = create_timeline_item(media_pool_item) print(track_item) diff --git a/openpype/hosts/resolve/utils.py b/openpype/hosts/resolve/utils.py new file mode 100644 index 0000000000..382a7cf344 --- /dev/null +++ b/openpype/hosts/resolve/utils.py @@ -0,0 +1,54 @@ +import os +import shutil +from openpype.lib import Logger + +RESOLVE_ROOT_DIR = os.path.dirname(os.path.abspath(__file__)) + + +def setup(env): + log = Logger.get_logger("ResolveSetup") + scripts = {} + us_env = env.get("RESOLVE_UTILITY_SCRIPTS_SOURCE_DIR") + us_dir = env.get("RESOLVE_UTILITY_SCRIPTS_DIR", "") + us_paths = [os.path.join( + RESOLVE_ROOT_DIR, + "utility_scripts" + )] + + # collect script dirs + if us_env: + log.info(f"Utility Scripts Env: `{us_env}`") + us_paths = us_env.split( + os.pathsep) + us_paths + + # collect scripts from dirs + for path in us_paths: + scripts.update({path: os.listdir(path)}) + + log.info(f"Utility Scripts Dir: `{us_paths}`") + log.info(f"Utility Scripts: `{scripts}`") + + # make sure no script file is in folder + for s in os.listdir(us_dir): + path = os.path.join(us_dir, s) + log.info(f"Removing `{path}`...") + if os.path.isdir(path): + shutil.rmtree(path, onerror=None) + else: + os.remove(path) + + # copy scripts into Resolve's utility scripts dir + for d, sl in scripts.items(): + # directory and scripts list + for s in sl: + # script in script list + src = os.path.join(d, s) + dst = os.path.join(us_dir, s) + log.info(f"Copying `{src}` to `{dst}`...") + if os.path.isdir(src): + shutil.copytree( + src, dst, symlinks=False, + ignore=None, ignore_dangling_symlinks=False + ) + else: + shutil.copy2(src, dst) diff --git a/openpype/hosts/standalonepublisher/__init__.py b/openpype/hosts/standalonepublisher/__init__.py index e69de29bb2..394d5be397 100644 --- a/openpype/hosts/standalonepublisher/__init__.py +++ b/openpype/hosts/standalonepublisher/__init__.py @@ -0,0 +1,6 @@ +from .standalonepublish_module import StandAlonePublishModule + + +__all__ = ( + "StandAlonePublishModule", +) diff --git a/openpype/hosts/standalonepublisher/standalonepublish_module.py b/openpype/hosts/standalonepublisher/standalonepublish_module.py new file mode 100644 index 0000000000..bf8e1d2c23 --- /dev/null +++ b/openpype/hosts/standalonepublisher/standalonepublish_module.py @@ -0,0 +1,57 @@ +import os + +import click + +from openpype.lib import get_openpype_execute_args +from openpype.lib.execute import run_detached_process +from openpype.modules 
import OpenPypeModule +from openpype.modules.interfaces import ITrayAction, IHostModule + +STANDALONEPUBLISH_ROOT_DIR = os.path.dirname(os.path.abspath(__file__)) + + +class StandAlonePublishModule(OpenPypeModule, ITrayAction, IHostModule): + label = "Publish" + name = "standalonepublish_tool" + host_name = "standalonepublisher" + + def initialize(self, modules_settings): + self.enabled = modules_settings[self.name]["enabled"] + self.publish_paths = [ + os.path.join(STANDALONEPUBLISH_ROOT_DIR, "plugins", "publish") + ] + + def tray_init(self): + return + + def on_action_trigger(self): + self.run_standalone_publisher() + + def connect_with_modules(self, enabled_modules): + """Collect publish paths from other modules.""" + + publish_paths = self.manager.collect_plugin_paths()["publish"] + self.publish_paths.extend(publish_paths) + + def run_standalone_publisher(self): + args = get_openpype_execute_args("module", self.name, "launch") + run_detached_process(args) + + def cli(self, click_group): + click_group.add_command(cli_main) + + +@click.group( + StandAlonePublishModule.name, + help="StandalonePublisher related commands.") +def cli_main(): + pass + + +@cli_main.command() +def launch(): + """Launch StandalonePublisher tool UI.""" + + from openpype.tools import standalonepublish + + standalonepublish.main() diff --git a/openpype/hosts/traypublisher/__init__.py b/openpype/hosts/traypublisher/__init__.py new file mode 100644 index 0000000000..4eb7bf3eef --- /dev/null +++ b/openpype/hosts/traypublisher/__init__.py @@ -0,0 +1,6 @@ +from .module import TrayPublishModule + + +__all__ = ( + "TrayPublishModule", +) diff --git a/openpype/modules/traypublish_action.py b/openpype/hosts/traypublisher/module.py similarity index 59% rename from openpype/modules/traypublish_action.py rename to openpype/hosts/traypublisher/module.py index 39163b8eb8..92a2312fec 100644 --- a/openpype/modules/traypublish_action.py +++ b/openpype/hosts/traypublisher/module.py @@ -1,25 +1,24 @@ import os + +import click + from openpype.lib import get_openpype_execute_args from openpype.lib.execute import run_detached_process from openpype.modules import OpenPypeModule -from openpype_interfaces import ITrayAction +from openpype.modules.interfaces import ITrayAction, IHostModule + +TRAYPUBLISH_ROOT_DIR = os.path.dirname(os.path.abspath(__file__)) -class TrayPublishAction(OpenPypeModule, ITrayAction): +class TrayPublishModule(OpenPypeModule, IHostModule, ITrayAction): label = "New Publish (beta)" name = "traypublish_tool" + host_name = "traypublish" def initialize(self, modules_settings): - import openpype self.enabled = True self.publish_paths = [ - os.path.join( - openpype.PACKAGE_DIR, - "hosts", - "traypublisher", - "plugins", - "publish" - ) + os.path.join(TRAYPUBLISH_ROOT_DIR, "plugins", "publish") ] self._experimental_tools = None @@ -29,7 +28,7 @@ class TrayPublishAction(OpenPypeModule, ITrayAction): self._experimental_tools = ExperimentalTools() def tray_menu(self, *args, **kwargs): - super(TrayPublishAction, self).tray_menu(*args, **kwargs) + super(TrayPublishModule, self).tray_menu(*args, **kwargs) traypublisher = self._experimental_tools.get("traypublisher") visible = False if traypublisher and traypublisher.enabled: @@ -45,5 +44,24 @@ class TrayPublishAction(OpenPypeModule, ITrayAction): self.publish_paths.extend(publish_paths) def run_traypublisher(self): - args = get_openpype_execute_args("traypublisher") + args = get_openpype_execute_args( + "module", self.name, "launch" + ) run_detached_process(args) + + def 
cli(self, click_group): + click_group.add_command(cli_main) + + +@click.group(TrayPublishModule.name, help="TrayPublisher related commands.") +def cli_main(): + pass + + +@cli_main.command() +def launch(): + """Launch TrayPublish tool UI.""" + + from openpype.tools import traypublisher + + traypublisher.main() diff --git a/openpype/hosts/tvpaint/__init__.py b/openpype/hosts/tvpaint/__init__.py index 09b7c52cd1..0a84b575dc 100644 --- a/openpype/hosts/tvpaint/__init__.py +++ b/openpype/hosts/tvpaint/__init__.py @@ -1,20 +1,12 @@ -import os +from .tvpaint_module import ( + get_launch_script_path, + TVPaintModule, + TVPAINT_ROOT_DIR, +) -def add_implementation_envs(env, _app): - """Modify environments to contain all required for implementation.""" - defaults = { - "OPENPYPE_LOG_NO_COLORS": "True" - } - for key, value in defaults.items(): - if not env.get(key): - env[key] = value - - -def get_launch_script_path(): - current_dir = os.path.dirname(os.path.abspath(__file__)) - return os.path.join( - current_dir, - "api", - "launch_script.py" - ) +__all__ = ( + "get_launch_script_path", + "TVPaintModule", + "TVPAINT_ROOT_DIR", +) diff --git a/openpype/hosts/tvpaint/api/__init__.py b/openpype/hosts/tvpaint/api/__init__.py index c461b33f4b..43d411d8f9 100644 --- a/openpype/hosts/tvpaint/api/__init__.py +++ b/openpype/hosts/tvpaint/api/__init__.py @@ -6,7 +6,6 @@ from . import pipeline from . import plugin from .pipeline import ( install, - uninstall, maintained_selection, remove_instance, list_instances, @@ -33,7 +32,6 @@ __all__ = ( "plugin", "install", - "uninstall", "maintained_selection", "remove_instance", "list_instances", diff --git a/openpype/hosts/tvpaint/api/pipeline.py b/openpype/hosts/tvpaint/api/pipeline.py index 0118c0104b..427c927264 100644 --- a/openpype/hosts/tvpaint/api/pipeline.py +++ b/openpype/hosts/tvpaint/api/pipeline.py @@ -16,8 +16,6 @@ from openpype.pipeline import ( legacy_io, register_loader_plugin_path, register_creator_plugin_path, - deregister_loader_plugin_path, - deregister_creator_plugin_path, AVALON_CONTAINER_ID, ) @@ -91,19 +89,6 @@ def install(): register_event_callback("application.exit", application_exit) -def uninstall(): - """Uninstall TVPaint-specific functionality. - - This function is called automatically on calling `uninstall_host()`. 
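Both publisher modules in this diff register their tooling on OpenPype's CLI the same way: the module exposes a `cli` method that attaches a click group named after the module itself. A minimal sketch of that pattern, with a hypothetical ExampleModule standing in for the real classes:

import click


class ExampleModule:
    """Hypothetical module following the CLI pattern used in this diff."""

    name = "example_tool"

    def cli(self, click_group):
        # Attach this module's command group to OpenPype's root CLI group.
        click_group.add_command(cli_main)


@click.group(ExampleModule.name, help="Example tool related commands.")
def cli_main():
    pass


@cli_main.command()
def launch():
    """Launch the example tool UI."""
    print("launching example tool...")

With this in place, get_openpype_execute_args("module", "example_tool", "launch") as used above resolves to the same subcommand chain.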
- """ - - log.info("OpenPype - Uninstalling TVPaint integration") - pyblish.api.deregister_host("tvpaint") - pyblish.api.deregister_plugin_path(PUBLISH_PATH) - deregister_loader_plugin_path(LOAD_PATH) - deregister_creator_plugin_path(CREATE_PATH) - - def containerise( name, namespace, members, context, loader, current_containers=None ): diff --git a/openpype/hosts/tvpaint/tvpaint_module.py b/openpype/hosts/tvpaint/tvpaint_module.py new file mode 100644 index 0000000000..a004359231 --- /dev/null +++ b/openpype/hosts/tvpaint/tvpaint_module.py @@ -0,0 +1,41 @@ +import os +from openpype.modules import OpenPypeModule +from openpype.modules.interfaces import IHostModule + +TVPAINT_ROOT_DIR = os.path.dirname(os.path.abspath(__file__)) + + +def get_launch_script_path(): + return os.path.join( + TVPAINT_ROOT_DIR, + "api", + "launch_script.py" + ) + + +class TVPaintModule(OpenPypeModule, IHostModule): + name = "tvpaint" + host_name = "tvpaint" + + def initialize(self, module_settings): + self.enabled = True + + def add_implementation_envs(self, env, _app): + """Modify environments to contain all required for implementation.""" + + defaults = { + "OPENPYPE_LOG_NO_COLORS": "True" + } + for key, value in defaults.items(): + if not env.get(key): + env[key] = value + + def get_launch_hook_paths(self, app): + if app.host_name != self.host_name: + return [] + return [ + os.path.join(TVPAINT_ROOT_DIR, "hooks") + ] + + def get_workfile_extensions(self): + return [".tvpp"] diff --git a/openpype/hosts/unreal/__init__.py b/openpype/hosts/unreal/__init__.py index 10e9c5100e..41222f4f94 100644 --- a/openpype/hosts/unreal/__init__.py +++ b/openpype/hosts/unreal/__init__.py @@ -1,24 +1,6 @@ -import os -import openpype.hosts -from openpype.lib.applications import Application +from .module import UnrealModule -def add_implementation_envs(env: dict, _app: Application) -> None: - """Modify environments to contain all required for implementation.""" - # Set OPENPYPE_UNREAL_PLUGIN required for Unreal implementation - - ue_plugin = "UE_5.0" if _app.name[:1] == "5" else "UE_4.7" - unreal_plugin_path = os.path.join( - os.path.dirname(os.path.abspath(openpype.hosts.__file__)), - "unreal", "integration", ue_plugin - ) - if not env.get("OPENPYPE_UNREAL_PLUGIN"): - env["OPENPYPE_UNREAL_PLUGIN"] = unreal_plugin_path - - # Set default environments if are not set via settings - defaults = { - "OPENPYPE_LOG_NO_COLORS": "True" - } - for key, value in defaults.items(): - if not env.get(key): - env[key] = value +__all__ = ( + "UnrealModule", +) diff --git a/openpype/hosts/unreal/api/__init__.py b/openpype/hosts/unreal/api/__init__.py index ede71aa218..870982f5f9 100644 --- a/openpype/hosts/unreal/api/__init__.py +++ b/openpype/hosts/unreal/api/__init__.py @@ -19,6 +19,7 @@ from .pipeline import ( show_tools_dialog, show_tools_popup, instantiate, + UnrealHost, ) __all__ = [ @@ -36,5 +37,6 @@ __all__ = [ "show_experimental_tools", "show_tools_dialog", "show_tools_popup", - "instantiate" + "instantiate", + "UnrealHost", ] diff --git a/openpype/hosts/unreal/api/pipeline.py b/openpype/hosts/unreal/api/pipeline.py index bbca7916d3..d396b64072 100644 --- a/openpype/hosts/unreal/api/pipeline.py +++ b/openpype/hosts/unreal/api/pipeline.py @@ -14,6 +14,7 @@ from openpype.pipeline import ( ) from openpype.tools.utils import host_tools import openpype.hosts.unreal +from openpype.host import HostBase, ILoadHost import unreal # noqa @@ -29,6 +30,32 @@ CREATE_PATH = os.path.join(PLUGINS_DIR, "create") INVENTORY_PATH = os.path.join(PLUGINS_DIR, 
"inventory") +class UnrealHost(HostBase, ILoadHost): + """Unreal host implementation. + + For some time this class will re-use functions from module based + implementation for backwards compatibility of older unreal projects. + """ + + name = "unreal" + + def install(self): + install() + + def get_containers(self): + return ls() + + def show_tools_popup(self): + """Show tools popup with actions leading to show other tools.""" + + show_tools_popup() + + def show_tools_dialog(self): + """Show tools dialog with actions leading to show other tools.""" + + show_tools_dialog() + + def install(): """Install Unreal configuration for OpenPype.""" print("-=" * 40) diff --git a/openpype/hosts/unreal/integration/UE_4.7/Content/Python/init_unreal.py b/openpype/hosts/unreal/integration/UE_4.7/Content/Python/init_unreal.py index 4bb03b07ed..b85f970699 100644 --- a/openpype/hosts/unreal/integration/UE_4.7/Content/Python/init_unreal.py +++ b/openpype/hosts/unreal/integration/UE_4.7/Content/Python/init_unreal.py @@ -3,7 +3,9 @@ import unreal openpype_detected = True try: from openpype.pipeline import install_host - from openpype.hosts.unreal import api as openpype_host + from openpype.hosts.unreal.api import UnrealHost + + openpype_host = UnrealHost() except ImportError as exc: openpype_host = None openpype_detected = False diff --git a/openpype/hosts/unreal/integration/UE_5.0/Content/Python/init_unreal.py b/openpype/hosts/unreal/integration/UE_5.0/Content/Python/init_unreal.py index 4bb03b07ed..b85f970699 100644 --- a/openpype/hosts/unreal/integration/UE_5.0/Content/Python/init_unreal.py +++ b/openpype/hosts/unreal/integration/UE_5.0/Content/Python/init_unreal.py @@ -3,7 +3,9 @@ import unreal openpype_detected = True try: from openpype.pipeline import install_host - from openpype.hosts.unreal import api as openpype_host + from openpype.hosts.unreal.api import UnrealHost + + openpype_host = UnrealHost() except ImportError as exc: openpype_host = None openpype_detected = False diff --git a/openpype/hosts/unreal/module.py b/openpype/hosts/unreal/module.py new file mode 100644 index 0000000000..aa08c8c130 --- /dev/null +++ b/openpype/hosts/unreal/module.py @@ -0,0 +1,42 @@ +import os +from openpype.modules import OpenPypeModule +from openpype.modules.interfaces import IHostModule + +UNREAL_ROOT_DIR = os.path.dirname(os.path.abspath(__file__)) + + +class UnrealModule(OpenPypeModule, IHostModule): + name = "unreal" + host_name = "unreal" + + def initialize(self, module_settings): + self.enabled = True + + def add_implementation_envs(self, env, app) -> None: + """Modify environments to contain all required for implementation.""" + # Set OPENPYPE_UNREAL_PLUGIN required for Unreal implementation + + ue_plugin = "UE_5.0" if app.name[:1] == "5" else "UE_4.7" + unreal_plugin_path = os.path.join( + UNREAL_ROOT_DIR, "integration", ue_plugin + ) + if not env.get("OPENPYPE_UNREAL_PLUGIN"): + env["OPENPYPE_UNREAL_PLUGIN"] = unreal_plugin_path + + # Set default environments if are not set via settings + defaults = { + "OPENPYPE_LOG_NO_COLORS": "True" + } + for key, value in defaults.items(): + if not env.get(key): + env[key] = value + + def get_launch_hook_paths(self, app): + if app.host_name != self.host_name: + return [] + return [ + os.path.join(UNREAL_ROOT_DIR, "hooks") + ] + + def get_workfile_extensions(self): + return [".uproject"] diff --git a/openpype/hosts/unreal/plugins/load/load_alembic_skeletalmesh.py b/openpype/hosts/unreal/plugins/load/load_alembic_skeletalmesh.py index b2c3889f68..9fe5f3ab4b 100644 --- 
a/openpype/hosts/unreal/plugins/load/load_alembic_skeletalmesh.py +++ b/openpype/hosts/unreal/plugins/load/load_alembic_skeletalmesh.py @@ -20,6 +20,34 @@ class SkeletalMeshAlembicLoader(plugin.Loader): icon = "cube" color = "orange" + def get_task(self, filename, asset_dir, asset_name, replace): + task = unreal.AssetImportTask() + options = unreal.AbcImportSettings() + sm_settings = unreal.AbcStaticMeshSettings() + conversion_settings = unreal.AbcConversionSettings( + preset=unreal.AbcConversionPreset.CUSTOM, + flip_u=False, flip_v=False, + rotation=[0.0, 0.0, 0.0], + scale=[1.0, 1.0, 1.0]) + + task.set_editor_property('filename', filename) + task.set_editor_property('destination_path', asset_dir) + task.set_editor_property('destination_name', asset_name) + task.set_editor_property('replace_existing', replace) + task.set_editor_property('automated', True) + task.set_editor_property('save', True) + + # set import options here + # Unreal 4.24 ignores the settings. It works with Unreal 4.26 + options.set_editor_property( + 'import_type', unreal.AlembicImportType.SKELETAL) + + options.static_mesh_settings = sm_settings + options.conversion_settings = conversion_settings + task.options = options + + return task + def load(self, context, name, namespace, data): """Load and containerise representation into Content Browser. @@ -50,36 +78,24 @@ class SkeletalMeshAlembicLoader(plugin.Loader): asset_name = "{}_{}".format(asset, name) else: asset_name = "{}".format(name) + version = context.get('version').get('name') tools = unreal.AssetToolsHelpers().get_asset_tools() asset_dir, container_name = tools.create_unique_asset_name( - "{}/{}/{}".format(root, asset, name), suffix="") + f"{root}/{asset}/{name}_v{version:03d}", suffix="") container_name += suffix - unreal.EditorAssetLibrary.make_directory(asset_dir) + if not unreal.EditorAssetLibrary.does_directory_exist(asset_dir): + unreal.EditorAssetLibrary.make_directory(asset_dir) - task = unreal.AssetImportTask() + task = self.get_task(self.fname, asset_dir, asset_name, False) - task.set_editor_property('filename', self.fname) - task.set_editor_property('destination_path', asset_dir) - task.set_editor_property('destination_name', asset_name) - task.set_editor_property('replace_existing', False) - task.set_editor_property('automated', True) - task.set_editor_property('save', True) + unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task]) # noqa: E501 - # set import options here - # Unreal 4.24 ignores the settings. 
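The destination folder above now embeds a zero-padded version number read from the loaded context, so loads of new versions land side by side instead of colliding. A tiny illustration of the naming scheme, with made-up values:

root, asset, name = "/Game/OpenPype", "hero", "modelMain"
version = 7  # 'name' of the version document taken from the context

# Mirrors the f-string passed to create_unique_asset_name in this hunk.
asset_dir = f"{root}/{asset}/{name}_v{version:03d}"
print(asset_dir)  # /Game/OpenPype/hero/modelMain_v007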
It works with Unreal 4.26 - options = unreal.AbcImportSettings() - options.set_editor_property( - 'import_type', unreal.AlembicImportType.SKELETAL) - - task.options = options - unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task]) # noqa: E501 - - # Create Asset Container - unreal_pipeline.create_container( - container=container_name, path=asset_dir) + # Create Asset Container + unreal_pipeline.create_container( + container=container_name, path=asset_dir) data = { "schema": "openpype:container-2.0", @@ -110,23 +126,8 @@ class SkeletalMeshAlembicLoader(plugin.Loader): source_path = get_representation_path(representation) destination_path = container["namespace"] - task = unreal.AssetImportTask() + task = self.get_task(source_path, destination_path, name, True) - task.set_editor_property('filename', source_path) - task.set_editor_property('destination_path', destination_path) - # strip suffix - task.set_editor_property('destination_name', name) - task.set_editor_property('replace_existing', True) - task.set_editor_property('automated', True) - task.set_editor_property('save', True) - - # set import options here - # Unreal 4.24 ignores the settings. It works with Unreal 4.26 - options = unreal.AbcImportSettings() - options.set_editor_property( - 'import_type', unreal.AlembicImportType.SKELETAL) - - task.options = options # do import fbx and replace existing data unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task]) container_path = "{}/{}".format(container["namespace"], diff --git a/openpype/hosts/unreal/plugins/load/load_alembic_staticmesh.py b/openpype/hosts/unreal/plugins/load/load_alembic_staticmesh.py index 5a73c72c64..50e498dbb0 100644 --- a/openpype/hosts/unreal/plugins/load/load_alembic_staticmesh.py +++ b/openpype/hosts/unreal/plugins/load/load_alembic_staticmesh.py @@ -24,7 +24,11 @@ class StaticMeshAlembicLoader(plugin.Loader): task = unreal.AssetImportTask() options = unreal.AbcImportSettings() sm_settings = unreal.AbcStaticMeshSettings() - conversion_settings = unreal.AbcConversionSettings() + conversion_settings = unreal.AbcConversionSettings( + preset=unreal.AbcConversionPreset.CUSTOM, + flip_u=False, flip_v=False, + rotation=[0.0, 0.0, 0.0], + scale=[1.0, 1.0, 1.0]) task.set_editor_property('filename', filename) task.set_editor_property('destination_path', asset_dir) @@ -40,13 +44,6 @@ class StaticMeshAlembicLoader(plugin.Loader): sm_settings.set_editor_property('merge_meshes', True) - conversion_settings.set_editor_property('flip_u', False) - conversion_settings.set_editor_property('flip_v', True) - conversion_settings.set_editor_property( - 'scale', unreal.Vector(x=100.0, y=100.0, z=100.0)) - conversion_settings.set_editor_property( - 'rotation', unreal.Vector(x=-90.0, y=0.0, z=180.0)) - options.static_mesh_settings = sm_settings options.conversion_settings = conversion_settings task.options = options @@ -83,22 +80,24 @@ class StaticMeshAlembicLoader(plugin.Loader): asset_name = "{}_{}".format(asset, name) else: asset_name = "{}".format(name) + version = context.get('version').get('name') tools = unreal.AssetToolsHelpers().get_asset_tools() asset_dir, container_name = tools.create_unique_asset_name( - "{}/{}/{}".format(root, asset, name), suffix="") + f"{root}/{asset}/{name}_v{version:03d}", suffix="") container_name += suffix - unreal.EditorAssetLibrary.make_directory(asset_dir) + if not unreal.EditorAssetLibrary.does_directory_exist(asset_dir): + unreal.EditorAssetLibrary.make_directory(asset_dir) - task = self.get_task(self.fname, 
asset_dir, asset_name, False) + task = self.get_task(self.fname, asset_dir, asset_name, False) - unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task]) # noqa: E501 + unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task]) # noqa: E501 - # Create Asset Container - unreal_pipeline.create_container( - container=container_name, path=asset_dir) + # Create Asset Container + unreal_pipeline.create_container( + container=container_name, path=asset_dir) data = { "schema": "openpype:container-2.0", diff --git a/openpype/hosts/unreal/plugins/load/load_layout.py b/openpype/hosts/unreal/plugins/load/load_layout.py index 01d589c69b..926c932a85 100644 --- a/openpype/hosts/unreal/plugins/load/load_layout.py +++ b/openpype/hosts/unreal/plugins/load/load_layout.py @@ -9,7 +9,10 @@ from unreal import EditorLevelLibrary from unreal import EditorLevelUtils from unreal import AssetToolsHelpers from unreal import FBXImportType -from unreal import MathLibrary as umath +from unreal import MovieSceneLevelVisibilityTrack +from unreal import MovieSceneSubTrack + +from bson.objectid import ObjectId from openpype.client import get_asset_by_name, get_assets from openpype.pipeline import ( @@ -21,6 +24,7 @@ from openpype.pipeline import ( legacy_io, ) from openpype.pipeline.context_tools import get_current_project_asset +from openpype.api import get_current_project_settings from openpype.hosts.unreal.api import plugin from openpype.hosts.unreal.api import pipeline as unreal_pipeline @@ -159,9 +163,29 @@ class LayoutLoader(plugin.Loader): hid_section.set_row_index(index) hid_section.set_level_names(maps) - @staticmethod + def _transform_from_basis(self, transform, basis): + """Transform a transform from a basis to a new basis.""" + # Get the basis matrix + basis_matrix = unreal.Matrix( + basis[0], + basis[1], + basis[2], + basis[3] + ) + transform_matrix = unreal.Matrix( + transform[0], + transform[1], + transform[2], + transform[3] + ) + + new_transform = ( + basis_matrix.get_inverse() * transform_matrix * basis_matrix) + + return new_transform.transform() + def _process_family( - assets, class_name, transform, sequence, inst_name=None + self, assets, class_name, transform, basis, sequence, inst_name=None ): ar = unreal.AssetRegistryHelpers.get_asset_registry() @@ -171,30 +195,12 @@ class LayoutLoader(plugin.Loader): for asset in assets: obj = ar.get_asset_by_object_path(asset).get_asset() if obj.get_class().get_name() == class_name: + t = self._transform_from_basis(transform, basis) actor = EditorLevelLibrary.spawn_actor_from_object( - obj, - transform.get('translation') + obj, t.translation ) - if inst_name: - try: - # Rename method leads to crash - # actor.rename(name=inst_name) - - # The label works, although it make it slightly more - # complicated to check for the names, as we need to - # loop through all the actors in the level - actor.set_actor_label(inst_name) - except Exception as e: - print(e) - actor.set_actor_rotation(unreal.Rotator( - umath.radians_to_degrees( - transform.get('rotation').get('x')), - -umath.radians_to_degrees( - transform.get('rotation').get('y')), - umath.radians_to_degrees( - transform.get('rotation').get('z')), - ), False) - actor.set_actor_scale3d(transform.get('scale')) + actor.set_actor_rotation(t.rotation.rotator(), False) + actor.set_actor_scale3d(t.scale3d) if class_name == 'SkeletalMesh': skm_comp = actor.get_editor_property( @@ -203,16 +209,17 @@ class LayoutLoader(plugin.Loader): actors.append(actor) - binding = None - for p in 
sequence.get_possessables(): - if p.get_name() == actor.get_name(): - binding = p - break + if sequence: + binding = None + for p in sequence.get_possessables(): + if p.get_name() == actor.get_name(): + binding = p + break - if not binding: - binding = sequence.add_possessable(actor) + if not binding: + binding = sequence.add_possessable(actor) - bindings.append(binding) + bindings.append(binding) return actors, bindings @@ -301,52 +308,53 @@ class LayoutLoader(plugin.Loader): actor.skeletal_mesh_component.animation_data.set_editor_property( 'anim_to_play', animation) - # Add animation to the sequencer - bindings = bindings_dict.get(instance_name) + if sequence: + # Add animation to the sequencer + bindings = bindings_dict.get(instance_name) - ar = unreal.AssetRegistryHelpers.get_asset_registry() + ar = unreal.AssetRegistryHelpers.get_asset_registry() - for binding in bindings: - tracks = binding.get_tracks() - track = None - track = tracks[0] if tracks else binding.add_track( - unreal.MovieSceneSkeletalAnimationTrack) + for binding in bindings: + tracks = binding.get_tracks() + track = None + track = tracks[0] if tracks else binding.add_track( + unreal.MovieSceneSkeletalAnimationTrack) - sections = track.get_sections() - section = None - if not sections: - section = track.add_section() - else: - section = sections[0] + sections = track.get_sections() + section = None + if not sections: + section = track.add_section() + else: + section = sections[0] + sec_params = section.get_editor_property('params') + curr_anim = sec_params.get_editor_property('animation') + + if curr_anim: + # Checks if the animation path has a container. + # If it does, it means that the animation is + # already in the sequencer. + anim_path = str(Path( + curr_anim.get_path_name()).parent + ).replace('\\', '/') + + _filter = unreal.ARFilter( + class_names=["AssetContainer"], + package_paths=[anim_path], + recursive_paths=False) + containers = ar.get_assets(_filter) + + if len(containers) > 0: + return + + section.set_range( + sequence.get_playback_start(), + sequence.get_playback_end()) sec_params = section.get_editor_property('params') - curr_anim = sec_params.get_editor_property('animation') - - if curr_anim: - # Checks if the animation path has a container. - # If it does, it means that the animation is already - # in the sequencer. 
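The container check above asks Unreal's asset registry whether the folder holding the currently assigned animation already contains an OpenPype AssetContainer. The same query, isolated into a sketch (runs only inside Unreal's embedded Python; the helper name is illustrative):

import unreal


def folder_has_container(anim_path):
    """Return True if the folder already holds an OpenPype AssetContainer."""
    ar = unreal.AssetRegistryHelpers.get_asset_registry()
    _filter = unreal.ARFilter(
        class_names=["AssetContainer"],
        package_paths=[anim_path],
        recursive_paths=False)
    return len(ar.get_assets(_filter)) > 0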
- anim_path = str(Path( - curr_anim.get_path_name()).parent - ).replace('\\', '/') - - _filter = unreal.ARFilter( - class_names=["AssetContainer"], - package_paths=[anim_path], - recursive_paths=False) - containers = ar.get_assets(_filter) - - if len(containers) > 0: - return - - section.set_range( - sequence.get_playback_start(), - sequence.get_playback_end()) - sec_params = section.get_editor_property('params') - sec_params.set_editor_property('animation', animation) + sec_params.set_editor_property('animation', animation) @staticmethod - def _generate_sequence(self, h, h_dir): + def _generate_sequence(h, h_dir): tools = unreal.AssetToolsHelpers().get_asset_tools() sequence = tools.create_asset( @@ -402,7 +410,7 @@ class LayoutLoader(plugin.Loader): return sequence, (min_frame, max_frame) - def _process(self, lib_path, asset_dir, sequence, loaded=None): + def _process(self, lib_path, asset_dir, sequence, repr_loaded=None): ar = unreal.AssetRegistryHelpers.get_asset_registry() with open(lib_path, "r") as fp: @@ -410,8 +418,8 @@ class LayoutLoader(plugin.Loader): all_loaders = discover_loader_plugins() - if not loaded: - loaded = [] + if not repr_loaded: + repr_loaded = [] path = Path(lib_path) @@ -422,36 +430,65 @@ class LayoutLoader(plugin.Loader): loaded_assets = [] for element in data: - reference = None - if element.get('reference_fbx'): - reference = element.get('reference_fbx') + representation = None + repr_format = None + if element.get('representation'): + # representation = element.get('representation') + + self.log.info(element.get("version")) + + valid_formats = ['fbx', 'abc'] + + repr_data = legacy_io.find_one({ + "type": "representation", + "parent": ObjectId(element.get("version")), + "name": {"$in": valid_formats} + }) + + if not repr_data: + self.log.error( + f"No valid representation found for version " + f"{element.get('version')}") + continue + + repr_format = repr_data.get('name') + representation = str(repr_data.get('_id')) + print(representation) + # This is to keep compatibility with old versions of the + # json format. 
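A condensed sketch of the representation lookup introduced above, assuming the same Mongo document shape; the helper name and tuple return are illustrative, and the missing-document guard runs before any field access:

from bson.objectid import ObjectId

from openpype.pipeline import legacy_io


def find_layout_representation(version_id, valid_formats=("fbx", "abc")):
    """Return (representation_id, format) for a version, or (None, None)."""
    repr_data = legacy_io.find_one({
        "type": "representation",
        "parent": ObjectId(version_id),
        "name": {"$in": list(valid_formats)},
    })
    if not repr_data:
        return None, None
    return str(repr_data["_id"]), repr_data["name"]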
+ elif element.get('reference_fbx'): + representation = element.get('reference_fbx') + repr_format = 'fbx' elif element.get('reference_abc'): - reference = element.get('reference_abc') + representation = element.get('reference_abc') + repr_format = 'abc' # If reference is None, this element is skipped, as it cannot be # imported in Unreal - if not reference: + if not representation: continue instance_name = element.get('instance_name') skeleton = None - if reference not in loaded: - loaded.append(reference) + if representation not in repr_loaded: + repr_loaded.append(representation) family = element.get('family') loaders = loaders_from_representation( - all_loaders, reference) + all_loaders, representation) loader = None - if reference == element.get('reference_fbx'): + if repr_format == 'fbx': loader = self._get_fbx_loader(loaders, family) - elif reference == element.get('reference_abc'): + elif repr_format == 'abc': loader = self._get_abc_loader(loaders, family) if not loader: + self.log.error( + f"No valid loader found for {representation}") continue options = { @@ -460,7 +497,7 @@ class LayoutLoader(plugin.Loader): assets = load_container( loader, - reference, + representation, namespace=instance_name, options=options ) @@ -478,28 +515,36 @@ class LayoutLoader(plugin.Loader): instances = [ item for item in data - if (item.get('reference_fbx') == reference or - item.get('reference_abc') == reference)] + if ((item.get('version') and + item.get('version') == element.get('version')) or + item.get('reference_fbx') == representation or + item.get('reference_abc') == representation)] for instance in instances: - transform = instance.get('transform') + # transform = instance.get('transform') + transform = instance.get('transform_matrix') + basis = instance.get('basis') inst = instance.get('instance_name') actors = [] if family == 'model': actors, _ = self._process_family( - assets, 'StaticMesh', transform, sequence, inst) + assets, 'StaticMesh', transform, basis, + sequence, inst + ) elif family == 'rig': actors, bindings = self._process_family( - assets, 'SkeletalMesh', transform, sequence, inst) + assets, 'SkeletalMesh', transform, basis, + sequence, inst + ) actors_dict[inst] = actors bindings_dict[inst] = bindings if skeleton: - skeleton_dict[reference] = skeleton + skeleton_dict[representation] = skeleton else: - skeleton = skeleton_dict.get(reference) + skeleton = skeleton_dict.get(representation) animation_file = element.get('animation') @@ -573,6 +618,9 @@ class LayoutLoader(plugin.Loader): Returns: list(str): list of container content """ + data = get_current_project_settings() + create_sequences = data["unreal"]["level_sequences_for_layouts"] + # Create directory for asset and avalon container hierarchy = context.get('asset').get('data').get('parents') root = self.ASSET_ROOT @@ -593,81 +641,88 @@ class LayoutLoader(plugin.Loader): EditorAssetLibrary.make_directory(asset_dir) - # Create map for the shot, and create hierarchy of map. If the maps - # already exist, we will use them. 
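load(), update() and remove() in this loader all branch on a single project setting. A minimal sketch of reading that switch, with the settings path taken from this hunk and no error handling for missing keys:

from openpype.api import get_current_project_settings


def level_sequences_enabled():
    """Read the Unreal setting that gates sequence creation for layouts."""
    settings = get_current_project_settings()
    return settings["unreal"]["level_sequences_for_layouts"]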
- h_dir = hierarchy_dir_list[0] - h_asset = hierarchy[0] - master_level = f"{h_dir}/{h_asset}_map.{h_asset}_map" - if not EditorAssetLibrary.does_asset_exist(master_level): - EditorLevelLibrary.new_level(f"{h_dir}/{h_asset}_map") + master_level = None + shot = None + sequences = [] level = f"{asset_dir}/{asset}_map.{asset}_map" EditorLevelLibrary.new_level(f"{asset_dir}/{asset}_map") - EditorLevelLibrary.load_level(master_level) - EditorLevelUtils.add_level_to_world( - EditorLevelLibrary.get_editor_world(), - level, - unreal.LevelStreamingDynamic - ) - EditorLevelLibrary.save_all_dirty_levels() - EditorLevelLibrary.load_level(level) + if create_sequences: + # Create map for the shot, and create hierarchy of map. If the + # maps already exist, we will use them. + if hierarchy: + h_dir = hierarchy_dir_list[0] + h_asset = hierarchy[0] + master_level = f"{h_dir}/{h_asset}_map.{h_asset}_map" + if not EditorAssetLibrary.does_asset_exist(master_level): + EditorLevelLibrary.new_level(f"{h_dir}/{h_asset}_map") - # Get all the sequences in the hierarchy. It will create them, if - # they don't exist. - sequences = [] - frame_ranges = [] - for (h_dir, h) in zip(hierarchy_dir_list, hierarchy): - root_content = EditorAssetLibrary.list_assets( - h_dir, recursive=False, include_folder=False) + if master_level: + EditorLevelLibrary.load_level(master_level) + EditorLevelUtils.add_level_to_world( + EditorLevelLibrary.get_editor_world(), + level, + unreal.LevelStreamingDynamic + ) + EditorLevelLibrary.save_all_dirty_levels() + EditorLevelLibrary.load_level(level) - existing_sequences = [ - EditorAssetLibrary.find_asset_data(asset) - for asset in root_content - if EditorAssetLibrary.find_asset_data( - asset).get_class().get_name() == 'LevelSequence' - ] + # Get all the sequences in the hierarchy. It will create them, if + # they don't exist. 
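Wiring the shot map into the master map, which now happens only when a master level exists, comes down to three editor calls. A sketch assuming both arguments are valid /Game/ package paths:

import unreal
from unreal import EditorLevelLibrary, EditorLevelUtils


def stream_into_master(master_level, shot_level):
    """Load the master map and add the shot map as a dynamic sub-level."""
    EditorLevelLibrary.load_level(master_level)
    EditorLevelUtils.add_level_to_world(
        EditorLevelLibrary.get_editor_world(),
        shot_level,
        unreal.LevelStreamingDynamic,
    )
    EditorLevelLibrary.save_all_dirty_levels()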
+ frame_ranges = [] + for (h_dir, h) in zip(hierarchy_dir_list, hierarchy): + root_content = EditorAssetLibrary.list_assets( + h_dir, recursive=False, include_folder=False) - if not existing_sequences: - sequence, frame_range = self._generate_sequence(h, h_dir) + existing_sequences = [ + EditorAssetLibrary.find_asset_data(asset) + for asset in root_content + if EditorAssetLibrary.find_asset_data( + asset).get_class().get_name() == 'LevelSequence' + ] - sequences.append(sequence) - frame_ranges.append(frame_range) - else: - for e in existing_sequences: - sequences.append(e.get_asset()) - frame_ranges.append(( - e.get_asset().get_playback_start(), - e.get_asset().get_playback_end())) + if not existing_sequences: + sequence, frame_range = self._generate_sequence(h, h_dir) - shot = tools.create_asset( - asset_name=asset, - package_path=asset_dir, - asset_class=unreal.LevelSequence, - factory=unreal.LevelSequenceFactoryNew() - ) + sequences.append(sequence) + frame_ranges.append(frame_range) + else: + for e in existing_sequences: + sequences.append(e.get_asset()) + frame_ranges.append(( + e.get_asset().get_playback_start(), + e.get_asset().get_playback_end())) - # sequences and frame_ranges have the same length - for i in range(0, len(sequences) - 1): - self._set_sequence_hierarchy( - sequences[i], sequences[i + 1], - frame_ranges[i][1], - frame_ranges[i + 1][0], frame_ranges[i + 1][1], - [level]) + shot = tools.create_asset( + asset_name=asset, + package_path=asset_dir, + asset_class=unreal.LevelSequence, + factory=unreal.LevelSequenceFactoryNew() + ) - project_name = legacy_io.active_project() - data = get_asset_by_name(project_name, asset)["data"] - shot.set_display_rate( - unreal.FrameRate(data.get("fps"), 1.0)) - shot.set_playback_start(0) - shot.set_playback_end(data.get('clipOut') - data.get('clipIn') + 1) - self._set_sequence_hierarchy( - sequences[-1], shot, - frame_ranges[-1][1], - data.get('clipIn'), data.get('clipOut'), - [level]) + # sequences and frame_ranges have the same length + for i in range(0, len(sequences) - 1): + self._set_sequence_hierarchy( + sequences[i], sequences[i + 1], + frame_ranges[i][1], + frame_ranges[i + 1][0], frame_ranges[i + 1][1], + [level]) - EditorLevelLibrary.load_level(level) + project_name = legacy_io.active_project() + data = get_asset_by_name(project_name, asset)["data"] + shot.set_display_rate( + unreal.FrameRate(data.get("fps"), 1.0)) + shot.set_playback_start(0) + shot.set_playback_end(data.get('clipOut') - data.get('clipIn') + 1) + if sequences: + self._set_sequence_hierarchy( + sequences[-1], shot, + frame_ranges[-1][1], + data.get('clipIn'), data.get('clipOut'), + [level]) + + EditorLevelLibrary.load_level(level) loaded_assets = self._process(self.fname, asset_dir, shot) @@ -702,32 +757,47 @@ class LayoutLoader(plugin.Loader): for a in asset_content: EditorAssetLibrary.save_asset(a) - EditorLevelLibrary.load_level(master_level) + if master_level: + EditorLevelLibrary.load_level(master_level) return asset_content def update(self, container, representation): + data = get_current_project_settings() + create_sequences = data["unreal"]["level_sequences_for_layouts"] + ar = unreal.AssetRegistryHelpers.get_asset_registry() root = "/Game/OpenPype" asset_dir = container.get('namespace') - context = representation.get("context") - hierarchy = context.get('hierarchy').split("/") - h_dir = f"{root}/{hierarchy[0]}" - h_asset = hierarchy[0] - master_level = f"{h_dir}/{h_asset}_map.{h_asset}_map" + sequence = None + master_level = None - # # Create a 
temporary level to delete the layout level. - # EditorLevelLibrary.save_all_dirty_levels() - # EditorAssetLibrary.make_directory(f"{root}/tmp") - # tmp_level = f"{root}/tmp/temp_map" - # if not EditorAssetLibrary.does_asset_exist(f"{tmp_level}.temp_map"): - # EditorLevelLibrary.new_level(tmp_level) - # else: - # EditorLevelLibrary.load_level(tmp_level) + if create_sequences: + hierarchy = context.get('hierarchy').split("/") + h_dir = f"{root}/{hierarchy[0]}" + h_asset = hierarchy[0] + master_level = f"{h_dir}/{h_asset}_map.{h_asset}_map" + + filter = unreal.ARFilter( + class_names=["LevelSequence"], + package_paths=[asset_dir], + recursive_paths=False) + sequences = ar.get_assets(filter) + sequence = sequences[0].get_asset() + + prev_level = None + + if not master_level: + curr_level = unreal.LevelEditorSubsystem().get_current_level() + curr_level_path = curr_level.get_outer().get_path_name() + # If the level path does not start with "/Game/", the current + # level is a temporary, unsaved level. + if curr_level_path.startswith("/Game/"): + prev_level = curr_level_path # Get layout level filter = unreal.ARFilter( @@ -735,11 +805,6 @@ class LayoutLoader(plugin.Loader): package_paths=[asset_dir], recursive_paths=False) levels = ar.get_assets(filter) - filter = unreal.ARFilter( - class_names=["LevelSequence"], - package_paths=[asset_dir], - recursive_paths=False) - sequences = ar.get_assets(filter) layout_level = levels[0].get_editor_property('object_path') @@ -751,14 +816,14 @@ class LayoutLoader(plugin.Loader): for actor in actors: unreal.EditorLevelLibrary.destroy_actor(actor) - EditorLevelLibrary.save_current_level() + if create_sequences: + EditorLevelLibrary.save_current_level() EditorAssetLibrary.delete_directory(f"{asset_dir}/animations/") source_path = get_representation_path(representation) - loaded_assets = self._process( - source_path, asset_dir, sequences[0].get_asset()) + loaded_assets = self._process(source_path, asset_dir, sequence) data = { "representation": str(representation["_id"]), @@ -776,13 +841,20 @@ class LayoutLoader(plugin.Loader): for a in asset_content: EditorAssetLibrary.save_asset(a) - EditorLevelLibrary.load_level(master_level) + if master_level: + EditorLevelLibrary.load_level(master_level) + elif prev_level: + EditorLevelLibrary.load_level(prev_level) def remove(self, container): """ Delete the layout. First, check if the assets loaded with the layout are used by other layouts. If not, delete the assets. """ + data = get_current_project_settings() + create_sequences = data["unreal"]["level_sequences_for_layouts"] + + root = "/Game/OpenPype" path = Path(container.get("namespace")) containers = unreal_pipeline.ls() @@ -793,7 +865,7 @@ class LayoutLoader(plugin.Loader): # Check if the assets have been loaded by other layouts, and deletes # them if they haven't. - for asset in container.get('loaded_assets'): + for asset in eval(container.get('loaded_assets')): layouts = [ lc for lc in layout_containers if asset in lc.get('loaded_assets')] @@ -801,71 +873,87 @@ class LayoutLoader(plugin.Loader): if not layouts: EditorAssetLibrary.delete_directory(str(Path(asset).parent)) - # Remove the Level Sequence from the parent. - # We need to traverse the hierarchy from the master sequence to find - # the level sequence. 
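update() above now remembers which level was open before the swap, but only when it is a level saved under /Game/. That check, isolated into a sketch (same editor-subsystem calls as the hunk; the helper name is illustrative):

import unreal


def get_restorable_level_path():
    """Return the current level's path, or None for temporary levels."""
    curr_level = unreal.LevelEditorSubsystem().get_current_level()
    curr_level_path = curr_level.get_outer().get_path_name()
    # Levels whose path does not start with /Game/ are unsaved temporary maps.
    if curr_level_path.startswith("/Game/"):
        return curr_level_path
    return None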
- root = "/Game/OpenPype" - namespace = container.get('namespace').replace(f"{root}/", "") - ms_asset = namespace.split('/')[0] - ar = unreal.AssetRegistryHelpers.get_asset_registry() - _filter = unreal.ARFilter( - class_names=["LevelSequence"], - package_paths=[f"{root}/{ms_asset}"], - recursive_paths=False) - sequences = ar.get_assets(_filter) - master_sequence = sequences[0].get_asset() - _filter = unreal.ARFilter( - class_names=["World"], - package_paths=[f"{root}/{ms_asset}"], - recursive_paths=False) - levels = ar.get_assets(_filter) - master_level = levels[0].get_editor_property('object_path') + # Delete the parent folder if there aren't any more + # layouts in it. + asset_content = EditorAssetLibrary.list_assets( + str(Path(asset).parent.parent), recursive=False, + include_folder=True + ) - sequences = [master_sequence] + if len(asset_content) == 0: + EditorAssetLibrary.delete_directory( + str(Path(asset).parent.parent)) - parent = None - for s in sequences: - tracks = s.get_master_tracks() - subscene_track = None - visibility_track = None - for t in tracks: - if t.get_class() == unreal.MovieSceneSubTrack.static_class(): - subscene_track = t - if (t.get_class() == - unreal.MovieSceneLevelVisibilityTrack.static_class()): - visibility_track = t - if subscene_track: - sections = subscene_track.get_sections() - for ss in sections: - if ss.get_sequence().get_name() == container.get('asset'): - parent = s - subscene_track.remove_section(ss) - break - sequences.append(ss.get_sequence()) - # Update subscenes indexes. - i = 0 - for ss in sections: - ss.set_row_index(i) - i += 1 + master_sequence = None + master_level = None + sequences = [] - if visibility_track: - sections = visibility_track.get_sections() - for ss in sections: - if (unreal.Name(f"{container.get('asset')}_map") - in ss.get_level_names()): - visibility_track.remove_section(ss) - # Update visibility sections indexes. - i = -1 - prev_name = [] - for ss in sections: - if prev_name != ss.get_level_names(): + if create_sequences: + # Remove the Level Sequence from the parent. + # We need to traverse the hierarchy from the master sequence to + # find the level sequence. + namespace = container.get('namespace').replace(f"{root}/", "") + ms_asset = namespace.split('/')[0] + ar = unreal.AssetRegistryHelpers.get_asset_registry() + _filter = unreal.ARFilter( + class_names=["LevelSequence"], + package_paths=[f"{root}/{ms_asset}"], + recursive_paths=False) + sequences = ar.get_assets(_filter) + master_sequence = sequences[0].get_asset() + _filter = unreal.ARFilter( + class_names=["World"], + package_paths=[f"{root}/{ms_asset}"], + recursive_paths=False) + levels = ar.get_assets(_filter) + master_level = levels[0].get_editor_property('object_path') + + sequences = [master_sequence] + + parent = None + for s in sequences: + tracks = s.get_master_tracks() + subscene_track = None + visibility_track = None + for t in tracks: + if t.get_class() == MovieSceneSubTrack.static_class(): + subscene_track = t + if (t.get_class() == + MovieSceneLevelVisibilityTrack.static_class()): + visibility_track = t + if subscene_track: + sections = subscene_track.get_sections() + for ss in sections: + if (ss.get_sequence().get_name() == + container.get('asset')): + parent = s + subscene_track.remove_section(ss) + break + sequences.append(ss.get_sequence()) + # Update subscenes indexes. 
+ i = 0 + for ss in sections: + ss.set_row_index(i) i += 1 - ss.set_row_index(i) - prev_name = ss.get_level_names() - if parent: - break - assert parent, "Could not find the parent sequence" + if visibility_track: + sections = visibility_track.get_sections() + for ss in sections: + if (unreal.Name(f"{container.get('asset')}_map") + in ss.get_level_names()): + visibility_track.remove_section(ss) + # Update visibility sections indexes. + i = -1 + prev_name = [] + for ss in sections: + if prev_name != ss.get_level_names(): + i += 1 + ss.set_row_index(i) + prev_name = ss.get_level_names() + if parent: + break + + assert parent, "Could not find the parent sequence" # Create a temporary level to delete the layout level. EditorLevelLibrary.save_all_dirty_levels() @@ -879,10 +967,9 @@ class LayoutLoader(plugin.Loader): # Delete the layout directory. EditorAssetLibrary.delete_directory(str(path)) - EditorLevelLibrary.load_level(master_level) - EditorAssetLibrary.delete_directory(f"{root}/tmp") - - EditorLevelLibrary.save_current_level() + if create_sequences: + EditorLevelLibrary.load_level(master_level) + EditorAssetLibrary.delete_directory(f"{root}/tmp") # Delete the parent folder if there aren't any more layouts in it. asset_content = EditorAssetLibrary.list_assets( diff --git a/openpype/lib/applications.py b/openpype/lib/applications.py index 8c92665366..074e815160 100644 --- a/openpype/lib/applications.py +++ b/openpype/lib/applications.py @@ -957,32 +957,24 @@ class ApplicationLaunchContext: # TODO load additional studio paths from settings import openpype - pype_dir = os.path.dirname(os.path.abspath(openpype.__file__)) + openpype_dir = os.path.dirname(os.path.abspath(openpype.__file__)) - # --- START: Backwards compatibility --- - hooks_dir = os.path.join(pype_dir, "hooks") + global_hooks_dir = os.path.join(openpype_dir, "hooks") - subfolder_names = ["global"] - if self.host_name: - subfolder_names.append(self.host_name) - for subfolder_name in subfolder_names: - path = os.path.join(hooks_dir, subfolder_name) - if ( - os.path.exists(path) - and os.path.isdir(path) - and path not in paths - ): - paths.append(path) - # --- END: Backwards compatibility --- - - subfolders_list = [ - ["hooks"] + hooks_dirs = [ + global_hooks_dir ] if self.host_name: - subfolders_list.append(["hosts", self.host_name, "hooks"]) + # If host requires launch hooks and is module then launch hooks + # should be collected using 'collect_launch_hook_paths' + # - module have to implement 'get_launch_hook_paths' + host_module = self.modules_manager.get_host_module(self.host_name) + if not host_module: + hooks_dirs.append(os.path.join( + openpype_dir, "hosts", self.host_name, "hooks" + )) - for subfolders in subfolders_list: - path = os.path.join(pype_dir, *subfolders) + for path in hooks_dirs: if ( os.path.exists(path) and os.path.isdir(path) @@ -991,7 +983,9 @@ class ApplicationLaunchContext: paths.append(path) # Load modules paths - paths.extend(self.modules_manager.collect_launch_hook_paths()) + paths.extend( + self.modules_manager.collect_launch_hook_paths(self.application) + ) return paths @@ -1304,6 +1298,7 @@ def get_app_environments_for_context( dict: Environments for passed context and application. 
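The hunk that follows hoists the ModulesManager import to module scope in applications.py while keeping the lazy default for callers that pass no manager. The pattern in isolation (helper name is illustrative):

from openpype.modules import ModulesManager


def ensure_manager(modules_manager=None):
    # Construct a manager only when the caller did not supply one.
    if modules_manager is None:
        modules_manager = ModulesManager()
    return modules_manager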
""" + from openpype.modules import ModulesManager from openpype.pipeline import AvalonMongoDB, Anatomy # Avalon database connection @@ -1316,8 +1311,6 @@ def get_app_environments_for_context( asset_doc = get_asset_by_name(project_name, asset_name) if modules_manager is None: - from openpype.modules import ModulesManager - modules_manager = ModulesManager() # Prepare app object which can be obtained only from ApplciationManager @@ -1344,7 +1337,7 @@ def get_app_environments_for_context( }) prepare_app_environments(data, env_group, modules_manager) - prepare_context_environments(data, env_group) + prepare_context_environments(data, env_group, modules_manager) # Discard avalon connection dbcon.uninstall() @@ -1503,8 +1496,10 @@ def prepare_app_environments( final_env = None # Add host specific environments if app.host_name and implementation_envs: - module = __import__("openpype.hosts", fromlist=[app.host_name]) - host_module = getattr(module, app.host_name, None) + host_module = modules_manager.get_host_module(app.host_name) + if not host_module: + module = __import__("openpype.hosts", fromlist=[app.host_name]) + host_module = getattr(module, app.host_name, None) add_implementation_envs = None if host_module: add_implementation_envs = getattr( @@ -1563,7 +1558,7 @@ def apply_project_environments_value( return env -def prepare_context_environments(data, env_group=None): +def prepare_context_environments(data, env_group=None, modules_manager=None): """Modify launch environments with context data for launched host. Args: @@ -1658,10 +1653,10 @@ def prepare_context_environments(data, env_group=None): data["env"]["AVALON_APP"] = app.host_name data["env"]["AVALON_WORKDIR"] = workdir - _prepare_last_workfile(data, workdir) + _prepare_last_workfile(data, workdir, modules_manager) -def _prepare_last_workfile(data, workdir): +def _prepare_last_workfile(data, workdir, modules_manager): """last workfile workflow preparation. Function check if should care about last workfile workflow and tries @@ -1676,8 +1671,13 @@ def _prepare_last_workfile(data, workdir): result will be stored. workdir (str): Path to folder where workfiles should be stored. 
""" + + from openpype.modules import ModulesManager from openpype.pipeline import HOST_WORKFILE_EXTENSIONS + if not modules_manager: + modules_manager = ModulesManager() + log = data["log"] _workdir_data = data.get("workdir_data") @@ -1725,7 +1725,12 @@ def _prepare_last_workfile(data, workdir): # Last workfile path last_workfile_path = data.get("last_workfile_path") or "" if not last_workfile_path: - extensions = HOST_WORKFILE_EXTENSIONS.get(app.host_name) + host_module = modules_manager.get_host_module(app.host_name) + if host_module: + extensions = host_module.get_workfile_extensions() + else: + extensions = HOST_WORKFILE_EXTENSIONS.get(app.host_name) + if extensions: from openpype.pipeline.workfile import ( get_workfile_template_key, diff --git a/openpype/lib/avalon_context.py b/openpype/lib/avalon_context.py index 3c7e9b6289..eed17fce9d 100644 --- a/openpype/lib/avalon_context.py +++ b/openpype/lib/avalon_context.py @@ -1,11 +1,9 @@ """Should be used only inside of hosts.""" import os -import json import re import copy import platform import logging -import collections import functools import warnings @@ -13,13 +11,9 @@ from openpype.client import ( get_project, get_assets, get_asset_by_name, - get_subsets, - get_last_versions, get_last_version_by_subset_name, - get_representations, get_workfile_info, ) -from openpype.settings import get_project_settings from .profiles_filtering import filter_profiles from .events import emit_event from .path_templates import StringTemplate diff --git a/openpype/modules/base.py b/openpype/modules/base.py index 1bd343fd07..e26075283d 100644 --- a/openpype/modules/base.py +++ b/openpype/modules/base.py @@ -140,7 +140,7 @@ class _LoadCache: def get_default_modules_dir(): """Path to default OpenPype modules.""" - current_dir = os.path.abspath(os.path.dirname(__file__)) + current_dir = os.path.dirname(os.path.abspath(__file__)) output = [] for folder_name in ("default_modules", ): @@ -298,6 +298,8 @@ def _load_modules(): # Add current directory at first place # - has small differences in import logic current_dir = os.path.abspath(os.path.dirname(__file__)) + hosts_dir = os.path.join(os.path.dirname(current_dir), "hosts") + module_dirs.insert(0, hosts_dir) module_dirs.insert(0, current_dir) processed_paths = set() @@ -314,6 +316,7 @@ def _load_modules(): continue is_in_current_dir = dirpath == current_dir + is_in_host_dir = dirpath == hosts_dir for filename in os.listdir(dirpath): # Ignore filenames if filename in IGNORED_FILENAMES: @@ -353,6 +356,24 @@ def _load_modules(): sys.modules[new_import_str] = default_module setattr(openpype_modules, basename, default_module) + elif is_in_host_dir: + import_str = "openpype.hosts.{}".format(basename) + new_import_str = "{}.{}".format(modules_key, basename) + # Until all hosts are converted to be able use them as + # modules is this error check needed + try: + default_module = __import__( + import_str, fromlist=("", ) + ) + sys.modules[new_import_str] = default_module + setattr(openpype_modules, basename, default_module) + + except Exception: + log.warning( + "Failed to import host folder {}".format(basename), + exc_info=True + ) + elif os.path.isdir(fullpath): import_module_from_dirpath(dirpath, filename, modules_key) @@ -768,24 +789,50 @@ class ModulesManager: output.extend(paths) return output - def collect_launch_hook_paths(self): - """Helper to collect hooks from modules inherited ILaunchHookPaths. + def collect_launch_hook_paths(self, app): + """Helper to collect application launch hooks. 
+ + This used to be based on 'ILaunchHookPaths', which is no longer required. + A module only has to implement a 'get_launch_hook_paths' method. + + Args: + app (Application): Application object which can be used for + filtering of which launch hook paths are returned. Returns: list: Paths to launch hook directories. """ - from openpype_interfaces import ILaunchHookPaths str_type = type("") expected_types = (list, tuple, set) output = [] for module in self.get_enabled_modules(): - # Skip module that do not inherit from `ILaunchHookPaths` - if not isinstance(module, ILaunchHookPaths): + # Skip modules that do not implement 'get_launch_hook_paths' + func = getattr(module, "get_launch_hook_paths", None) + if func is None: + continue + + func = module.get_launch_hook_paths + if hasattr(inspect, "signature"): + sig = inspect.signature(func) + expect_args = len(sig.parameters) > 0 + else: + expect_args = len(inspect.getargspec(func)[0]) > 0 + + # Pass the application argument if the method expects it. + try: + if expect_args: + hook_paths = func(app) + else: + hook_paths = func() + except Exception: + self.log.warning( + "Failed to call 'get_launch_hook_paths'", + exc_info=True + ) continue - hook_paths = module.get_launch_hook_paths() if not hook_paths: continue @@ -804,6 +851,45 @@ class ModulesManager: output.extend(hook_paths) return output + def get_host_module(self, host_name): + """Find host module by host name. + + Args: + host_name (str): Host name for which the host module is searched. + + Returns: + OpenPypeModule: Host module found by name. + None: No enabled module inheriting IHostModule has its host name + set to the passed 'host_name'. + """ + + from openpype_interfaces import IHostModule + + for module in self.get_enabled_modules(): + if ( + isinstance(module, IHostModule) + and module.host_name == host_name + ): + return module + return None + + def get_host_names(self): + """List of available host names based on host modules. + + Returns: + Iterable[str]: All available host names based on enabled modules + inheriting 'IHostModule'. + """ + + from openpype_interfaces import IHostModule + + host_names = { + module.host_name + for module in self.get_enabled_modules() + if isinstance(module, IHostModule) + } + return host_names + def print_report(self): """Print out report of time spent on modules initialization parts. diff --git a/openpype/modules/deadline/abstract_submit_deadline.py b/openpype/modules/deadline/abstract_submit_deadline.py index 3f54273a56..0bad981fdf 100644 --- a/openpype/modules/deadline/abstract_submit_deadline.py +++ b/openpype/modules/deadline/abstract_submit_deadline.py @@ -4,6 +4,7 @@ It provides Deadline JobInfo data class. """ +import json.decoder import os from abc import abstractmethod import platform @@ -15,7 +16,12 @@ import attr import requests import pyblish.api -from openpype.pipeline.publish import AbstractMetaInstancePlugin +from openpype.pipeline.publish import ( + AbstractMetaInstancePlugin, + KnownPublishError +) + +JSONDecodeError = getattr(json.decoder, "JSONDecodeError", ValueError) def requests_post(*args, **kwargs): @@ -615,7 +621,7 @@ class AbstractSubmitDeadline(pyblish.api.InstancePlugin): str: resulting Deadline job id. Throws: - RuntimeError: if submission fails. + KnownPublishError: if submission fails. 
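The hardened response handling in the hunk below can be summarised in a few lines. A sketch using the same KnownPublishError and JSONDecodeError alias as the imports above; the function name is illustrative:

import json.decoder

from openpype.pipeline.publish import KnownPublishError

JSONDecodeError = getattr(json.decoder, "JSONDecodeError", ValueError)


def parse_submission_response(response, log):
    """Return the decoded JSON body or raise a readable publish error."""
    try:
        return response.json()
    except JSONDecodeError:
        log.warning(
            "Broken response {}. "
            "Try restarting the Deadline Webservice.".format(response),
            exc_info=True)
        raise KnownPublishError("Broken response from DL")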
""" url = "{}/api/jobs".format(self._deadline_url) @@ -625,9 +631,16 @@ class AbstractSubmitDeadline(pyblish.api.InstancePlugin): self.log.error(response.status_code) self.log.error(response.content) self.log.debug(payload) - raise RuntimeError(response.text) + raise KnownPublishError(response.text) + + try: + result = response.json() + except JSONDecodeError: + msg = "Broken response {}. ".format(response) + msg += "Try restarting the Deadline Webservice." + self.log.warning(msg, exc_info=True) + raise KnownPublishError("Broken response from DL") - result = response.json() # for submit publish job self._instance.data["deadlineSubmissionJob"] = result diff --git a/openpype/modules/deadline/plugins/publish/submit_maya_deadline.py b/openpype/modules/deadline/plugins/publish/submit_maya_deadline.py index f253ceb21a..7966861358 100644 --- a/openpype/modules/deadline/plugins/publish/submit_maya_deadline.py +++ b/openpype/modules/deadline/plugins/publish/submit_maya_deadline.py @@ -62,6 +62,7 @@ payload_skeleton_template = { "RenderLayer": None, # Render only this layer "Renderer": None, "ProjectPath": None, # Resolve relative references + "RenderSetupIncludeLights": None, # Include all lights flag. }, "AuxFiles": [] # Mandatory for Deadline, may be empty } @@ -413,8 +414,7 @@ class MayaSubmitDeadline(pyblish.api.InstancePlugin): # Gather needed data ------------------------------------------------ default_render_file = instance.context.data.get('project_settings')\ .get('maya')\ - .get('create')\ - .get('CreateRender')\ + .get('RenderSettings')\ .get('default_render_image_folder') filename = os.path.basename(filepath) comment = context.data.get("comment", "") @@ -505,6 +505,7 @@ class MayaSubmitDeadline(pyblish.api.InstancePlugin): self.payload_skeleton["JobInfo"]["Comment"] = comment self.payload_skeleton["PluginInfo"]["RenderLayer"] = renderlayer + self.payload_skeleton["PluginInfo"]["RenderSetupIncludeLights"] = instance.data.get("renderSetupIncludeLights") # noqa # Adding file dependencies. dependencies = instance.context.data["fileDependencies"] dependencies.append(filepath) diff --git a/openpype/modules/deadline/repository/custom/plugins/GlobalJobPreLoad.py b/openpype/modules/deadline/repository/custom/plugins/GlobalJobPreLoad.py index 172649c951..61b95cf06d 100644 --- a/openpype/modules/deadline/repository/custom/plugins/GlobalJobPreLoad.py +++ b/openpype/modules/deadline/repository/custom/plugins/GlobalJobPreLoad.py @@ -34,7 +34,7 @@ def get_openpype_version_from_path(path, build=True): # if only builds are requested if build and not os.path.isfile(exe): # noqa: E501 - print(f" ! path is not a build: {path}") + print(" ! path is not a build: {}".format(path)) return None version = {} @@ -70,11 +70,12 @@ def inject_openpype_environment(deadlinePlugin): # lets go over all available and find compatible build. 
requested_version = job.GetJobEnvironmentKeyValue("OPENPYPE_VERSION") if requested_version: - print((">>> Scanning for compatible requested " - f"version {requested_version}")) + print(( + ">>> Scanning for compatible requested version {}" + ).format(requested_version)) install_dir = DirectoryUtils.SearchDirectoryList(dir_list) if install_dir: - print(f"--- Looking for OpenPype at: {install_dir}") + print("--- Looking for OpenPype at: {}".format(install_dir)) sub_dirs = [ f.path for f in os.scandir(install_dir) if f.is_dir() @@ -83,18 +84,20 @@ def inject_openpype_environment(deadlinePlugin): version = get_openpype_version_from_path(subdir) if not version: continue - print(f" - found: {version} - {subdir}") + print(" - found: {} - {}".format(version, subdir)) openpype_versions.append((version, subdir)) exe = FileUtils.SearchFileList(exe_list) if openpype_versions: # if looking for requested compatible version, # add the implicitly specified to the list too. - print(f"Looking for OpenPype at: {os.path.dirname(exe)}") + print("Looking for OpenPype at: {}".format(os.path.dirname(exe))) version = get_openpype_version_from_path( os.path.dirname(exe)) if version: - print(f" - found: {version} - {os.path.dirname(exe)}") + print(" - found: {} - {}".format( + version, os.path.dirname(exe) + )) openpype_versions.append((version, os.path.dirname(exe))) if requested_version: @@ -106,8 +109,9 @@ def inject_openpype_environment(deadlinePlugin): int(t) if t.isdigit() else t.lower() for t in re.split(r"(\d+)", ver[0]) ]) - print(("*** Latest available version found is " - f"{openpype_versions[-1][0]}")) + print(( + "*** Latest available version found is {}" + ).format(openpype_versions[-1][0])) requested_major, requested_minor, _ = requested_version.split(".")[:3] # noqa: E501 compatible_versions = [] for version in openpype_versions: @@ -127,8 +131,9 @@ def inject_openpype_environment(deadlinePlugin): int(t) if t.isdigit() else t.lower() for t in re.split(r"(\d+)", ver[0]) ]) - print(("*** Latest compatible version found is " - f"{compatible_versions[-1][0]}")) + print(( + "*** Latest compatible version found is {}" + ).format(compatible_versions[-1][0])) # create list of executables for different platform and let # Deadline decide. exe_list = [ @@ -234,78 +239,6 @@ def inject_render_job_id(deadlinePlugin): print(">>> Injection end.") -def pype_command_line(executable, arguments, workingDirectory): - """Remap paths in comand line argument string. - - Using Deadline rempper it will remap all path found in command-line. - - Args: - executable (str): path to executable - arguments (str): arguments passed to executable - workingDirectory (str): working directory path - - Returns: - Tuple(executable, arguments, workingDirectory) - - """ - print("-" * 40) - print("executable: {}".format(executable)) - print("arguments: {}".format(arguments)) - print("workingDirectory: {}".format(workingDirectory)) - print("-" * 40) - print("Remapping arguments ...") - arguments = RepositoryUtils.CheckPathMapping(arguments) - print("* {}".format(arguments)) - print("-" * 40) - return executable, arguments, workingDirectory - - -def pype(deadlinePlugin): - """Remaps `PYPE_METADATA_FILE` and `PYPE_PYTHON_EXE` environment vars. - - `PYPE_METADATA_FILE` is used on farm to point to rendered data. This path - originates on platform from which this job was published. To be able to - publish on different platform, this path needs to be remapped. 
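Aside on the version selection above: the sort key built with `re.split(r"(\d+)")` compares digit runs numerically, which is what makes "3.9.10" rank above "3.9.2". A standalone illustration of the same key:

```python
import re


def version_key(version_str):
    # Natural-sort key as used above: digit runs compare as integers,
    # everything else compares as lowercase text.
    return [
        int(t) if t.isdigit() else t.lower()
        for t in re.split(r"(\d+)", version_str)
    ]


versions = ["3.9.10", "3.14.1", "3.9.2"]
print(sorted(versions, key=version_key))  # ['3.9.2', '3.9.10', '3.14.1']
```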
- - `PYPE_PYTHON_EXE` can be used to specify custom location of python - interpreter to use for Pype. This is remappeda also if present even - though it probably doesn't make much sense. - - Arguments: - deadlinePlugin: Deadline job plugin passed by Deadline - - """ - print(">>> Getting job ...") - job = deadlinePlugin.GetJob() - # PYPE should be here, not OPENPYPE - backward compatibility!! - pype_metadata = job.GetJobEnvironmentKeyValue("PYPE_METADATA_FILE") - pype_python = job.GetJobEnvironmentKeyValue("PYPE_PYTHON_EXE") - print(">>> Having backward compatible env vars {}/{}".format(pype_metadata, - pype_python)) - # test if it is pype publish job. - if pype_metadata: - pype_metadata = RepositoryUtils.CheckPathMapping(pype_metadata) - if platform.system().lower() == "linux": - pype_metadata = pype_metadata.replace("\\", "/") - - print("- remapping PYPE_METADATA_FILE: {}".format(pype_metadata)) - job.SetJobEnvironmentKeyValue("PYPE_METADATA_FILE", pype_metadata) - deadlinePlugin.SetProcessEnvironmentVariable( - "PYPE_METADATA_FILE", pype_metadata) - - if pype_python: - pype_python = RepositoryUtils.CheckPathMapping(pype_python) - if platform.system().lower() == "linux": - pype_python = pype_python.replace("\\", "/") - - print("- remapping PYPE_PYTHON_EXE: {}".format(pype_python)) - job.SetJobEnvironmentKeyValue("PYPE_PYTHON_EXE", pype_python) - deadlinePlugin.SetProcessEnvironmentVariable( - "PYPE_PYTHON_EXE", pype_python) - - deadlinePlugin.ModifyCommandLineCallback += pype_command_line - - def __main__(deadlinePlugin): print("*** GlobalJobPreload start ...") print(">>> Getting job ...") @@ -329,5 +262,3 @@ def __main__(deadlinePlugin): inject_render_job_id(deadlinePlugin) elif openpype_render_job == '1' or openpype_remote_job == '1': inject_openpype_environment(deadlinePlugin) - else: - pype(deadlinePlugin) # backward compatibility with Pype2 diff --git a/openpype/modules/ftrack/event_handlers_server/event_sync_to_avalon.py b/openpype/modules/ftrack/event_handlers_server/event_sync_to_avalon.py index a4e791aaf0..738181dc9a 100644 --- a/openpype/modules/ftrack/event_handlers_server/event_sync_to_avalon.py +++ b/openpype/modules/ftrack/event_handlers_server/event_sync_to_avalon.py @@ -697,13 +697,22 @@ class SyncToAvalonEvent(BaseEvent): continue auto_sync = changes[CUST_ATTR_AUTO_SYNC]["new"] - if auto_sync == "1": + turned_on = auto_sync == "1" + ft_project = self.cur_project + username = self._get_username(session, event) + message = ( + "Auto sync was turned {} for project \"{}\" by \"{}\"." + ).format( + "on" if turned_on else "off", + ft_project["full_name"], + username + ) + if turned_on: + message += " Triggering syncToAvalon action." + self.log.debug(message) + + if turned_on: # Trigger sync to avalon action if auto sync was turned on - ft_project = self.cur_project - self.log.debug(( - "Auto sync was turned on for project <{}>." - " Triggering syncToAvalon action." 
- ).format(ft_project["full_name"])) selection = [{ "entityId": ft_project["id"], "entityType": "show" @@ -851,6 +860,26 @@ class SyncToAvalonEvent(BaseEvent): self.report() return True + def _get_username(self, session, event): + username = "Unknown" + event_source = event.get("source") + if not event_source: + return username + user_info = event_source.get("user") + if not user_info: + return username + user_id = user_info.get("id") + if not user_id: + return username + + user_entity = session.query( + "User where id is {}".format(user_id) + ).first() + if user_entity: + username = user_entity["username"] or username + return username + + def process_removed(self): """ Handles removed entities (not removed tasks - handle separately). diff --git a/openpype/modules/ftrack/plugins/publish/collect_ftrack_api.py b/openpype/modules/ftrack/plugins/publish/collect_ftrack_api.py index 14da188150..e13b7e65cd 100644 --- a/openpype/modules/ftrack/plugins/publish/collect_ftrack_api.py +++ b/openpype/modules/ftrack/plugins/publish/collect_ftrack_api.py @@ -105,11 +105,17 @@ class CollectFtrackApi(pyblish.api.ContextPlugin): context.data["ftrackEntity"] = asset_entity context.data["ftrackTask"] = task_entity - self.per_instance_process(context, asset_name, task_name) + self.per_instance_process(context, asset_entity, task_entity) def per_instance_process( - self, context, context_asset_name, context_task_name + self, context, context_asset_entity, context_task_entity ): + context_task_name = None + context_asset_name = None + if context_asset_entity: + context_asset_name = context_asset_entity["name"] + if context_task_entity: + context_task_name = context_task_entity["name"] instance_by_asset_and_task = {} for instance in context: self.log.debug( @@ -120,6 +126,8 @@ class CollectFtrackApi(pyblish.api.ContextPlugin): if not instance_asset_name and not instance_task_name: self.log.debug("Instance does not have set context keys.") + instance.data["ftrackEntity"] = context_asset_entity + instance.data["ftrackTask"] = context_task_entity continue elif instance_asset_name and instance_task_name: @@ -131,6 +139,8 @@ class CollectFtrackApi(pyblish.api.ContextPlugin): "Instance's context is same as in publish context." " Asset: {} | Task: {}" ).format(context_asset_name, context_task_name)) + instance.data["ftrackEntity"] = context_asset_entity + instance.data["ftrackTask"] = context_task_entity continue asset_name = instance_asset_name task_name = instance_task_name @@ -141,6 +151,8 @@ class CollectFtrackApi(pyblish.api.ContextPlugin): "Instance's context task is same as in publish" " context. Task: {}" ).format(context_task_name)) + instance.data["ftrackEntity"] = context_asset_entity + instance.data["ftrackTask"] = context_task_entity continue asset_name = context_asset_name @@ -152,6 +164,8 @@ class CollectFtrackApi(pyblish.api.ContextPlugin): "Instance's context asset is same as in publish" " context. 
Asset: {}"
                ).format(context_asset_name))
+                instance.data["ftrackEntity"] = context_asset_entity
+                instance.data["ftrackTask"] = context_task_entity
                 continue

             # Do not use context's task name
diff --git a/openpype/modules/ftrack/plugins/publish/integrate_ftrack_farm_status.py b/openpype/modules/ftrack/plugins/publish/integrate_ftrack_farm_status.py
new file mode 100644
index 0000000000..ab5738c33f
--- /dev/null
+++ b/openpype/modules/ftrack/plugins/publish/integrate_ftrack_farm_status.py
@@ -0,0 +1,150 @@
+import pyblish.api
+from openpype.lib import filter_profiles
+
+
+class IntegrateFtrackFarmStatus(pyblish.api.ContextPlugin):
+    """Change task status when the publish will happen on farm.
+
+    An instance that has "farm" set to 'True' in its data is considered to
+    be rendered on farm, thus its task status should be changed.
+    """
+
+    order = pyblish.api.IntegratorOrder + 0.48
+    label = "Integrate Ftrack Farm Status"
+
+    farm_status_profiles = []
+
+    def process(self, context):
+        # Quick end
+        if not self.farm_status_profiles:
+            project_name = context.data["projectName"]
+            self.log.info((
+                "Status profiles are not filled for project \"{}\". Skipping"
+            ).format(project_name))
+            return
+
+        filtered_instances = self.filter_instances(context)
+        instances_with_status_names = self.get_instances_with_status_names(
+            context, filtered_instances
+        )
+        if instances_with_status_names:
+            self.fill_statuses(context, instances_with_status_names)
+
+    def filter_instances(self, context):
+        filtered_instances = []
+        for instance in context:
+            # Skip disabled instances
+            if instance.data.get("publish") is False:
+                continue
+
+            subset_name = instance.data["subset"]
+            msg_start = "Skipping instance {}.".format(subset_name)
+            if not instance.data.get("farm"):
+                self.log.debug(
+                    "{} Won't be rendered on farm.".format(msg_start)
+                )
+                continue
+
+            task_entity = instance.data.get("ftrackTask")
+            if not task_entity:
+                self.log.debug(
+                    "{} Does not have a task set.".format(msg_start)
+                )
+                continue
+
+            filtered_instances.append(instance)
+        return filtered_instances
+
+    def get_instances_with_status_names(self, context, instances):
+        instances_with_status_names = []
+        for instance in instances:
+            family = instance.data["family"]
+            subset_name = instance.data["subset"]
+            task_entity = instance.data["ftrackTask"]
+            host_name = context.data["hostName"]
+            task_name = task_entity["name"]
+            task_type = task_entity["type"]["name"]
+            status_profile = filter_profiles(
+                self.farm_status_profiles,
+                {
+                    "hosts": host_name,
+                    "task_types": task_type,
+                    "task_names": task_name,
+                    "families": family,
+                    "subsets": subset_name,
+                },
+                logger=self.log
+            )
+            if not status_profile:
+                # There already is a log in 'filter_profiles'
+                continue
+
+            status_name = status_profile["status_name"]
+            if status_name:
+                instances_with_status_names.append((instance, status_name))
+        return instances_with_status_names
+
+    def fill_statuses(self, context, instances_with_status_names):
+        # Prepare available task statuses on the project
+        project_name = context.data["projectName"]
+        session = context.data["ftrackSession"]
+        project_entity = session.query((
+            "select project_schema from Project where full_name is \"{}\""
+        ).format(project_name)).one()
+        project_schema = project_entity["project_schema"]
+
+        task_type_ids = set()
+        for item in instances_with_status_names:
+            instance, _ = item
+            task_entity = instance.data["ftrackTask"]
+            task_type_ids.add(task_entity["type"]["id"])
+
+        task_statuses_by_type_id = {
+            task_type_id: project_schema.get_statuses("Task", task_type_id)
+            for task_type_id in task_type_ids
+        }
+
+        # Keep track if anything has changed
+        skipped_status_names = set()
+        status_changed = False
+        for item in instances_with_status_names:
+            instance, status_name = item
+            task_entity = instance.data["ftrackTask"]
+            task_statuses = task_statuses_by_type_id[task_entity["type"]["id"]]
+            status_name_low = status_name.lower()
+
+            status_id = None
+            found_status_name = None
+            for status in task_statuses:
+                if status["name"].lower() == status_name_low:
+                    status_id = status["id"]
+                    found_status_name = status["name"]
+                    break
+
+            if status_id is None:
+                # Warn only once per status name that was not found
+                if status_name_low not in skipped_status_names:
+                    skipped_status_names.add(status_name_low)
+                    joined_status_names = ", ".join({
+                        '"{}"'.format(status["name"])
+                        for status in task_statuses
+                    })
+                    self.log.warning((
+                        "Status \"{}\" is not available on project \"{}\"."
+                        " Available statuses are {}"
+                    ).format(status_name, project_name, joined_status_names))
+                continue
+
+            # Change task status id
+            if status_id != task_entity["status_id"]:
+                task_entity["status_id"] = status_id
+                status_changed = True
+                path = "/".join([
+                    item["name"]
+                    for item in task_entity["link"]
+                ])
+                self.log.debug("Set status \"{}\" to \"{}\"".format(
+                    found_status_name, path
+                ))
+
+        if status_changed:
+            session.commit()
diff --git a/openpype/modules/ftrack/plugins/publish/integrate_ftrack_instances.py b/openpype/modules/ftrack/plugins/publish/integrate_ftrack_instances.py
index a1e5922730..1bf4caac77 100644
--- a/openpype/modules/ftrack/plugins/publish/integrate_ftrack_instances.py
+++ b/openpype/modules/ftrack/plugins/publish/integrate_ftrack_instances.py
@@ -3,6 +3,7 @@ import json
 import copy

 import pyblish.api
+from openpype.lib.openpype_version import get_openpype_version
 from openpype.lib.transcoding import (
     get_ffprobe_streams,
     convert_ffprobe_fps_to_float,
@@ -20,6 +21,17 @@ class IntegrateFtrackInstance(pyblish.api.InstancePlugin):
     label = "Integrate Ftrack Component"
     families = ["ftrack"]

+    metadata_keys_to_label = {
+        "openpype_version": "OpenPype version",
+        "frame_start": "Frame start",
+        "frame_end": "Frame end",
+        "duration": "Duration",
+        "width": "Resolution width",
+        "height": "Resolution height",
+        "fps": "FPS",
+        "codec": "Codec"
+    }
+
     family_mapping = {
         "camera": "cam",
         "look": "look",
@@ -43,6 +55,7 @@
     }
     keep_first_subset_name_for_review = True
     asset_versions_status_profiles = {}
+    additional_metadata_keys = []

     def process(self, instance):
         self.log.debug("instance {}".format(instance))
@@ -105,7 +118,8 @@
             "component_data": None,
             "component_path": None,
             "component_location": None,
-            "component_location_name": None
+            "component_location_name": None,
+            "additional_data": {}
         }

         # Filter types of representations
@@ -152,6 +166,7 @@
                 "name": "thumbnail"
             }
             thumbnail_item["thumbnail"] = True
+            # Create copy of item before setting location
             src_components_to_add.append(copy.deepcopy(thumbnail_item))

             # Create copy of first thumbnail
@@ -248,19 +263,15 @@
                 first_thumbnail_component[
                     "asset_data"]["name"] = extended_asset_name

-            component_meta = self._prepare_component_metadata(
-                instance, repre, repre_path, True
-            )
-
             # Change location
             review_item["component_path"] = repre_path

             # Change component data
review_item["component_data"] = { # Default component name is "main". "name": "ftrackreview-mp4", - "metadata": { - "ftr_meta": json.dumps(component_meta) - } + "metadata": self._prepare_component_metadata( + instance, repre, repre_path, True + ) } if is_first_review_repre: @@ -302,13 +313,9 @@ class IntegrateFtrackInstance(pyblish.api.InstancePlugin): component_data = copy_src_item["component_data"] component_name = component_data["name"] component_data["name"] = component_name + "_src" - component_meta = self._prepare_component_metadata( + component_data["metadata"] = self._prepare_component_metadata( instance, repre, copy_src_item["component_path"], False ) - if component_meta: - component_data["metadata"] = { - "ftr_meta": json.dumps(component_meta) - } component_list.append(copy_src_item) # Add others representations as component @@ -326,16 +333,12 @@ class IntegrateFtrackInstance(pyblish.api.InstancePlugin): ): other_item["asset_data"]["name"] = extended_asset_name - component_meta = self._prepare_component_metadata( - instance, repre, published_path, False - ) component_data = { - "name": repre["name"] + "name": repre["name"], + "metadata": self._prepare_component_metadata( + instance, repre, published_path, False + ) } - if component_meta: - component_data["metadata"] = { - "ftr_meta": json.dumps(component_meta) - } other_item["component_data"] = component_data other_item["component_location_name"] = unmanaged_location_name other_item["component_path"] = published_path @@ -354,6 +357,9 @@ class IntegrateFtrackInstance(pyblish.api.InstancePlugin): )) instance.data["ftrackComponentsList"] = component_list + def _collect_additional_metadata(self, streams): + pass + def _get_repre_path(self, instance, repre, only_published): """Get representation path that can be used for integration. 
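For context, this is the shape of the review component data assembled here. The values below are illustrative; ftrack expects 'ftr_meta' to be a JSON-encoded string, which '_prepare_component_metadata' now takes care of.

```python
import json


def make_review_component_data(fps, frame_out):
    # Illustrative only: review components use the 'ftrackreview-mp4'
    # name and carry playback info as a JSON string under 'ftr_meta'.
    return {
        "name": "ftrackreview-mp4",
        "metadata": {
            "ftr_meta": json.dumps({
                "frameIn": 0,
                "frameOut": frame_out,
                "frameRate": float(fps),
            })
        },
    }


print(make_review_component_data(25, 100))
```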
@@ -423,6 +429,11 @@
     def _prepare_component_metadata(
         self, instance, repre, component_path, is_review
     ):
+        metadata = {}
+        if "openpype_version" in self.additional_metadata_keys:
+            label = self.metadata_keys_to_label["openpype_version"]
+            metadata[label] = get_openpype_version()
+
         extension = os.path.splitext(component_path)[-1]
         streams = []
         try:
@@ -442,13 +453,23 @@
         #   - exr is special case which can have issues with reading through
         #       ffmpegh but we want to set fps for it
         if not video_streams and extension not in [".exr"]:
-            return {}
+            return metadata

         stream_width = None
         stream_height = None
         stream_fps = None
         frame_out = None
+        codec_label = None
         for video_stream in video_streams:
+            codec_label = video_stream.get("codec_long_name")
+            if not codec_label:
+                codec_label = video_stream.get("codec")
+
+            if codec_label:
+                pix_fmt = video_stream.get("pix_fmt")
+                if pix_fmt:
+                    codec_label += " ({})".format(pix_fmt)
+
             tmp_width = video_stream.get("width")
             tmp_height = video_stream.get("height")
             if tmp_width and tmp_height:
@@ -456,8 +477,8 @@
                 stream_height = tmp_height

             input_framerate = video_stream.get("r_frame_rate")
-            duration = video_stream.get("duration")
-            if input_framerate is None or duration is None:
+            stream_duration = video_stream.get("duration")
+            if input_framerate is None or stream_duration is None:
                 continue
             try:
                 stream_fps = convert_ffprobe_fps_to_float(
@@ -473,9 +494,9 @@
             stream_height = tmp_height

             self.log.debug("FPS from stream is {} and duration is {}".format(
-                input_framerate, duration
+                input_framerate, stream_duration
             ))
-            frame_out = float(duration) * stream_fps
+            frame_out = float(stream_duration) * stream_fps
             break

         # Prepare FPS
@@ -483,43 +504,58 @@
         if instance_fps is None:
             instance_fps = instance.context.data["fps"]

-        if not is_review:
-            output = {}
-            fps = stream_fps or instance_fps
-            if fps:
-                output["frameRate"] = fps
-
-            if stream_width and stream_height:
-                output["width"] = int(stream_width)
-                output["height"] = int(stream_height)
-            return output
-
-        frame_start = repre.get("frameStartFtrack")
-        frame_end = repre.get("frameEndFtrack")
-        if frame_start is None or frame_end is None:
-            frame_start = instance.data["frameStart"]
-            frame_end = instance.data["frameEnd"]
-
-        fps = None
         repre_fps = repre.get("fps")
         if repre_fps is not None:
             repre_fps = float(repre_fps)

         fps = stream_fps or repre_fps or instance_fps

+        # Prepare frame ranges
+        frame_start = repre.get("frameStartFtrack")
+        frame_end = repre.get("frameEndFtrack")
+        if frame_start is None or frame_end is None:
+            frame_start = instance.data["frameStart"]
+            frame_end = instance.data["frameEnd"]
+        duration = (frame_end - frame_start) + 1
+
+        for key, value in [
+            ("fps", fps),
+            ("frame_start", frame_start),
+            ("frame_end", frame_end),
+            ("duration", duration),
+            ("width", stream_width),
+            ("height", stream_height),
+            ("codec", codec_label)
+        ]:
+            if not value or key not in self.additional_metadata_keys:
+                continue
+            label = self.metadata_keys_to_label[key]
+            metadata[label] = value
+
+        if not is_review:
+            ftr_meta = {}
+            if fps:
+                ftr_meta["frameRate"] = fps
+
+            if stream_width and stream_height:
+                ftr_meta["width"] = int(stream_width)
+                ftr_meta["height"] = int(stream_height)
+            metadata["ftr_meta"] = 
json.dumps(ftr_meta) + return metadata + # Frame end of uploaded video file should be duration in frames # - frame start is always 0 # - frame end is duration in frames if not frame_out: - frame_out = frame_end - frame_start + 1 + frame_out = duration # Ftrack documentation says that it is required to have # 'width' and 'height' in review component. But with those values # review video does not play. - component_meta = { + metadata["ftr_meta"] = json.dumps({ "frameIn": 0, "frameOut": frame_out, "frameRate": float(fps) - } - - return component_meta + }) + return metadata diff --git a/openpype/modules/ftrack/plugins/publish/integrate_hierarchy_ftrack.py b/openpype/modules/ftrack/plugins/publish/integrate_hierarchy_ftrack.py index b8855ee2bd..8d39baa8d7 100644 --- a/openpype/modules/ftrack/plugins/publish/integrate_hierarchy_ftrack.py +++ b/openpype/modules/ftrack/plugins/publish/integrate_hierarchy_ftrack.py @@ -1,9 +1,12 @@ import sys import collections import six -import pyblish.api from copy import deepcopy + +import pyblish.api + from openpype.client import get_asset_by_id +from openpype.lib import filter_profiles # Copy of constant `openpype_modules.ftrack.lib.avalon_sync.CUST_ATTR_AUTO_SYNC` @@ -73,6 +76,7 @@ class IntegrateHierarchyToFtrack(pyblish.api.ContextPlugin): "traypublisher" ] optional = False + create_task_status_profiles = [] def process(self, context): self.context = context @@ -82,14 +86,16 @@ class IntegrateHierarchyToFtrack(pyblish.api.ContextPlugin): hierarchy_context = self._get_active_assets(context) self.log.debug("__ hierarchy_context: {}".format(hierarchy_context)) - self.session = self.context.data["ftrackSession"] + session = self.context.data["ftrackSession"] project_name = self.context.data["projectEntity"]["name"] query = 'Project where full_name is "{}"'.format(project_name) - project = self.session.query(query).one() - auto_sync_state = project[ - "custom_attributes"][CUST_ATTR_AUTO_SYNC] + project = session.query(query).one() + auto_sync_state = project["custom_attributes"][CUST_ATTR_AUTO_SYNC] - self.ft_project = None + self.session = session + self.ft_project = project + self.task_types = self.get_all_task_types(project) + self.task_statuses = self.get_task_statuses(project) # disable termporarily ftrack project's autosyncing if auto_sync_state: @@ -121,10 +127,7 @@ class IntegrateHierarchyToFtrack(pyblish.api.ContextPlugin): self.log.debug(entity_type) if entity_type.lower() == 'project': - query = 'Project where full_name is "{}"'.format(entity_name) - entity = self.session.query(query).one() - self.ft_project = entity - self.task_types = self.get_all_task_types(entity) + entity = self.ft_project elif self.ft_project is None or parent is None: raise AssertionError( @@ -217,13 +220,6 @@ class IntegrateHierarchyToFtrack(pyblish.api.ContextPlugin): task_type=task_type, parent=entity ) - try: - self.session.commit() - except Exception: - tp, value, tb = sys.exc_info() - self.session.rollback() - self.session._configure_locations() - six.reraise(tp, value, tb) # Incoming links. 
        self.create_links(project_name, entity_data, entity)

@@ -303,7 +299,37 @@

         return tasks

+    def get_task_statuses(self, project_entity):
+        project_schema = project_entity["project_schema"]
+        task_workflow_statuses = project_schema["_task_workflow"]["statuses"]
+        return {
+            status["id"]: status
+            for status in task_workflow_statuses
+        }
+
     def create_task(self, name, task_type, parent):
+        filter_data = {
+            "task_names": name,
+            "task_types": task_type
+        }
+        profile = filter_profiles(
+            self.create_task_status_profiles,
+            filter_data
+        )
+        status_id = None
+        if profile:
+            status_name = profile["status_name"]
+            status_name_low = status_name.lower()
+            for _status_id, status in self.task_statuses.items():
+                if status["name"].lower() == status_name_low:
+                    status_id = _status_id
+                    break
+
+            if status_id is None:
+                self.log.warning(
+                    "Task status \"{}\" was not found".format(status_name)
+                )
+
         task = self.session.create('Task', {
             'name': name,
             'parent': parent
@@ -312,6 +338,8 @@
         self.log.info(task_type)
         self.log.info(self.task_types)
         task['type'] = self.task_types[task_type]
+        if status_id is not None:
+            task["status_id"] = status_id

         try:
             self.session.commit()
diff --git a/openpype/modules/interfaces.py b/openpype/modules/interfaces.py
index 334485cab2..14f49204ee 100644
--- a/openpype/modules/interfaces.py
+++ b/openpype/modules/interfaces.py
@@ -1,4 +1,4 @@
-from abc import abstractmethod
+from abc import abstractmethod, abstractproperty

 from openpype import resources

@@ -50,12 +50,32 @@ class IPluginPaths(OpenPypeInterface):
 class ILaunchHookPaths(OpenPypeInterface):
     """Module has launch hook paths to return.

+    Modules do not have to inherit from this interface (changed 8.11.2022).
+    A module only has to implement 'get_launch_hook_paths' to take advantage
+    of it.
+
     Expected result is list of paths.
     ["path/to/launch_hooks_dir"]
     """

     @abstractmethod
-    def get_launch_hook_paths(self):
+    def get_launch_hook_paths(self, app):
+        """Paths to directory with application launch hooks.
+
+        The method can also be defined without arguments.
+        ```python
+        def get_launch_hook_paths(self):
+            return []
+        ```
+
+        Args:
+            app (Application): Application object which can be used for
+                filtering of which launch hook paths are returned.
+
+        Returns:
+            Iterable[str]: Paths to directories where launch hooks can be
+                found.
+        """
+
         pass

@@ -66,6 +86,7 @@ class ITrayModule(OpenPypeInterface):
     The module still must be usable if is not used in tray even if would do
     nothing.
     """
+
     tray_initialized = False
     _tray_manager = None

@@ -78,16 +99,19 @@
         This is where GUIs should be loaded or tray specific parts should be
         prepared.
         """
+
         pass

     @abstractmethod
     def tray_menu(self, tray_menu):
         """Add module's action to tray menu."""
+
         pass

     @abstractmethod
     def tray_start(self):
         """Start procedure in Pype tray."""
+
         pass

     @abstractmethod
@@ -96,6 +120,7 @@
         This is place where all threads should be shut.
         """
+
         pass

     def execute_in_main_thread(self, callback):
@@ -104,6 +129,7 @@
         Some callbacks need to be processed on main thread
         (menu actions must be added on main thread or they won't get triggered
         etc.)
""" + if not self.tray_initialized: # TODO Called without initialized tray, still main thread needed try: @@ -128,6 +154,7 @@ class ITrayModule(OpenPypeInterface): msecs (int): Duration of message visibility in miliseconds. Default is 10000 msecs, may differ by Qt version. """ + if self._tray_manager: self._tray_manager.show_tray_message(title, message, icon, msecs) @@ -280,16 +307,19 @@ class ITrayService(ITrayModule): def set_service_running_icon(self): """Change icon of an QAction to green circle.""" + if self.menu_action: self.menu_action.setIcon(self.get_icon_running()) def set_service_failed_icon(self): """Change icon of an QAction to red circle.""" + if self.menu_action: self.menu_action.setIcon(self.get_icon_failed()) def set_service_idle_icon(self): """Change icon of an QAction to orange circle.""" + if self.menu_action: self.menu_action.setIcon(self.get_icon_idle()) @@ -303,6 +333,7 @@ class ISettingsChangeListener(OpenPypeInterface): "publish": ["path/to/publish_plugins"] } """ + @abstractmethod def on_system_settings_save( self, old_value, new_value, changes, new_value_metadata @@ -320,3 +351,24 @@ class ISettingsChangeListener(OpenPypeInterface): self, old_value, new_value, changes, project_name, new_value_metadata ): pass + + +class IHostModule(OpenPypeInterface): + """Module which also contain a host implementation.""" + + @abstractproperty + def host_name(self): + """Name of host which module represents.""" + + pass + + def get_workfile_extensions(self): + """Define workfile extensions for host. + + Not all hosts support workfiles thus this is optional implementation. + + Returns: + List[str]: Extensions used for workfiles with dot. + """ + + return [] diff --git a/openpype/modules/kitsu/utils/update_zou_with_op.py b/openpype/modules/kitsu/utils/update_zou_with_op.py index 57d7094e95..da924aa5ee 100644 --- a/openpype/modules/kitsu/utils/update_zou_with_op.py +++ b/openpype/modules/kitsu/utils/update_zou_with_op.py @@ -6,7 +6,11 @@ from typing import List import gazu from pymongo import UpdateOne -from openpype.client import get_project, get_assets +from openpype.client import ( + get_projects, + get_project, + get_assets, +) from openpype.pipeline import AvalonMongoDB from openpype.api import get_project_settings from openpype.modules.kitsu.utils.credentials import validate_credentials @@ -37,7 +41,7 @@ def sync_zou(login: str, password: str): dbcon = AvalonMongoDB() dbcon.install() - op_projects = [p for p in dbcon.projects()] + op_projects = list(get_projects()) for project_doc in op_projects: sync_zou_from_op_project(project_doc["name"], dbcon, project_doc) diff --git a/openpype/modules/standalonepublish_action.py b/openpype/modules/standalonepublish_action.py deleted file mode 100644 index ba53ce9b9e..0000000000 --- a/openpype/modules/standalonepublish_action.py +++ /dev/null @@ -1,49 +0,0 @@ -import os -import platform -import subprocess -from openpype.lib import get_openpype_execute_args -from openpype.modules import OpenPypeModule -from openpype_interfaces import ITrayAction - - -class StandAlonePublishAction(OpenPypeModule, ITrayAction): - label = "Publish" - name = "standalonepublish_tool" - - def initialize(self, modules_settings): - import openpype - self.enabled = modules_settings[self.name]["enabled"] - self.publish_paths = [ - os.path.join( - openpype.PACKAGE_DIR, - "hosts", - "standalonepublisher", - "plugins", - "publish" - ) - ] - - def tray_init(self): - return - - def on_action_trigger(self): - self.run_standalone_publisher() - - def 
-        """Collect publish paths from other modules."""
-        publish_paths = self.manager.collect_plugin_paths()["publish"]
-        self.publish_paths.extend(publish_paths)
-
-    def run_standalone_publisher(self):
-        args = get_openpype_execute_args("standalonepublisher")
-        kwargs = {}
-        if platform.system().lower() == "darwin":
-            new_args = ["open", "-na", args.pop(0), "--args"]
-            new_args.extend(args)
-            args = new_args
-
-        detached_process = getattr(subprocess, "DETACHED_PROCESS", None)
-        if detached_process is not None:
-            kwargs["creationflags"] = detached_process
-
-        subprocess.Popen(args, **kwargs)
diff --git a/openpype/modules/sync_server/providers/abstract_provider.py b/openpype/modules/sync_server/providers/abstract_provider.py
index 688a17f14f..8c2fe1cad9 100644
--- a/openpype/modules/sync_server/providers/abstract_provider.py
+++ b/openpype/modules/sync_server/providers/abstract_provider.py
@@ -62,7 +62,7 @@ class AbstractProvider:

     @abc.abstractmethod
     def upload_file(self, source_path, path,
-                    server, collection, file, representation, site,
+                    server, project_name, file, representation, site,
                     overwrite=False):
         """
             Copy file from 'source_path' to 'target_path' on provider.
@@ -75,7 +75,7 @@

             arguments for saving progress:
                 server (SyncServer): server instance to call update_db on
-                collection (str): name of collection
+                project_name (str): name of project
                 file (dict): info about uploaded file (matches structure from db)
                 representation (dict): complete repre containing 'file'
                 site (str): site name
@@ -87,7 +87,7 @@

     @abc.abstractmethod
     def download_file(self, source_path, local_path,
-                      server, collection, file, representation, site,
+                      server, project_name, file, representation, site,
                       overwrite=False):
         """
             Download file from provider into local system
@@ -99,7 +99,7 @@

             arguments for saving progress:
                 server (SyncServer): server instance to call update_db on
-                collection (str): name of collection
+                project_name (str): name of project
                 file (dict): info about uploaded file (matches structure from db)
                 representation (dict): complete repre containing 'file'
                 site (str): site name
diff --git a/openpype/modules/sync_server/providers/dropbox.py b/openpype/modules/sync_server/providers/dropbox.py
index dfc42fed75..89d6990841 100644
--- a/openpype/modules/sync_server/providers/dropbox.py
+++ b/openpype/modules/sync_server/providers/dropbox.py
@@ -224,7 +224,7 @@
         return False

     def upload_file(self, source_path, path,
-                    server, collection, file, representation, site,
+                    server, project_name, file, representation, site,
                     overwrite=False):
         """
             Copy file from 'source_path' to 'target_path' on provider.
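To make the rename concrete: providers now receive 'project_name' where they used to receive 'collection', and forward it to 'update_db'. A toy stand-in showing the call shape; the class and values here are hypothetical, not the real SyncServer.

```python
class DummyServer(object):
    """Hypothetical stand-in for SyncServer, only to show the keyword."""

    def update_db(self, project_name, new_file_id, file, representation,
                  site, progress=None):
        # A real server would persist progress on the representation doc.
        print("{} @ {}: progress={}".format(project_name, site, progress))


server = DummyServer()
server.update_db(
    project_name="demo_project",  # previously passed as 'collection'
    new_file_id=None,
    file={"_id": "file-id"},
    representation={"_id": "repre-id"},
    site="studio",
    progress=0.5,
)
```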
@@ -237,7 +237,7 @@ class DropboxHandler(AbstractProvider): arguments for saving progress: server (SyncServer): server instance to call update_db on - collection (str): name of collection + project_name (str): file (dict): info about uploaded file (matches structure from db) representation (dict): complete repre containing 'file' site (str): site name @@ -290,7 +290,7 @@ class DropboxHandler(AbstractProvider): cursor.offset = f.tell() server.update_db( - collection=collection, + project_name=project_name, new_file_id=None, file=file, representation=representation, @@ -301,7 +301,7 @@ class DropboxHandler(AbstractProvider): return path def download_file(self, source_path, local_path, - server, collection, file, representation, site, + server, project_name, file, representation, site, overwrite=False): """ Download file from provider into local system @@ -313,7 +313,7 @@ class DropboxHandler(AbstractProvider): arguments for saving progress: server (SyncServer): server instance to call update_db on - collection (str): name of collection + project_name (str): file (dict): info about uploaded file (matches structure from db) representation (dict): complete repre containing 'file' site (str): site name @@ -337,7 +337,7 @@ class DropboxHandler(AbstractProvider): self.dbx.files_download_to_file(local_path, source_path) server.update_db( - collection=collection, + project_name=project_name, new_file_id=None, file=file, representation=representation, diff --git a/openpype/modules/sync_server/providers/gdrive.py b/openpype/modules/sync_server/providers/gdrive.py index aa7329b104..bef707788b 100644 --- a/openpype/modules/sync_server/providers/gdrive.py +++ b/openpype/modules/sync_server/providers/gdrive.py @@ -251,7 +251,7 @@ class GDriveHandler(AbstractProvider): return folder_id def upload_file(self, source_path, path, - server, collection, file, representation, site, + server, project_name, file, representation, site, overwrite=False): """ Uploads single file from 'source_path' to destination 'path'. @@ -264,7 +264,7 @@ class GDriveHandler(AbstractProvider): arguments for saving progress: server (SyncServer): server instance to call update_db on - collection (str): name of collection + project_name (str): file (dict): info about uploaded file (matches structure from db) representation (dict): complete repre containing 'file' site (str): site name @@ -324,7 +324,7 @@ class GDriveHandler(AbstractProvider): while response is None: if server.is_representation_paused(representation['_id'], check_parents=True, - project_name=collection): + project_name=project_name): raise ValueError("Paused during process, please redo.") if status: status_val = float(status.progress()) @@ -333,7 +333,7 @@ class GDriveHandler(AbstractProvider): last_tick = time.time() log.debug("Uploaded %d%%." % int(status_val * 100)) - server.update_db(collection=collection, + server.update_db(project_name=project_name, new_file_id=None, file=file, representation=representation, @@ -358,7 +358,7 @@ class GDriveHandler(AbstractProvider): return response['id'] def download_file(self, source_path, local_path, - server, collection, file, representation, site, + server, project_name, file, representation, site, overwrite=False): """ Downloads single file from 'source_path' (remote) to 'local_path'. 
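The handlers above throttle their progress logging with a 'last_tick' timestamp so a long transfer reports at most once per second. The pattern distilled into a standalone snippet:

```python
import time


def report_progress(fraction, state):
    # Same throttling idea as the 'last_tick' checks above: emit at most
    # one progress message per second. 'state' keeps the last emit time.
    now = time.time()
    if now - state.get("last_tick", 0) >= 1:
        state["last_tick"] = now
        print("Transferred %d%%" % int(fraction * 100))


state = {}
for i in range(5):
    report_progress(i / 4.0, state)
    time.sleep(0.3)
```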
@@ -372,7 +372,7 @@ class GDriveHandler(AbstractProvider): arguments for saving progress: server (SyncServer): server instance to call update_db on - collection (str): name of collection + project_name (str): file (dict): info about uploaded file (matches structure from db) representation (dict): complete repre containing 'file' site (str): site name @@ -410,7 +410,7 @@ class GDriveHandler(AbstractProvider): while response is None: if server.is_representation_paused(representation['_id'], check_parents=True, - project_name=collection): + project_name=project_name): raise ValueError("Paused during process, please redo.") if status: status_val = float(status.progress()) @@ -419,7 +419,7 @@ class GDriveHandler(AbstractProvider): last_tick = time.time() log.debug("Downloaded %d%%." % int(status_val * 100)) - server.update_db(collection=collection, + server.update_db(project_name=project_name, new_file_id=None, file=file, representation=representation, diff --git a/openpype/modules/sync_server/providers/local_drive.py b/openpype/modules/sync_server/providers/local_drive.py index 172cb338cf..01bc891d08 100644 --- a/openpype/modules/sync_server/providers/local_drive.py +++ b/openpype/modules/sync_server/providers/local_drive.py @@ -82,7 +82,7 @@ class LocalDriveHandler(AbstractProvider): return editable def upload_file(self, source_path, target_path, - server, collection, file, representation, site, + server, project_name, file, representation, site, overwrite=False, direction="Upload"): """ Copies file from 'source_path' to 'target_path' @@ -95,7 +95,7 @@ class LocalDriveHandler(AbstractProvider): thread = threading.Thread(target=self._copy, args=(source_path, target_path)) thread.start() - self._mark_progress(collection, file, representation, server, + self._mark_progress(project_name, file, representation, server, site, source_path, target_path, direction) else: if os.path.exists(target_path): @@ -105,13 +105,14 @@ class LocalDriveHandler(AbstractProvider): return os.path.basename(target_path) def download_file(self, source_path, local_path, - server, collection, file, representation, site, + server, project_name, file, representation, site, overwrite=False): """ Download a file form 'source_path' to 'local_path' """ return self.upload_file(source_path, local_path, - server, collection, file, representation, site, + server, project_name, file, + representation, site, overwrite, direction="Download") def delete_file(self, path): @@ -188,7 +189,7 @@ class LocalDriveHandler(AbstractProvider): except shutil.SameFileError: print("same files, skipping") - def _mark_progress(self, collection, file, representation, server, site, + def _mark_progress(self, project_name, file, representation, server, site, source_path, target_path, direction): """ Updates progress field in DB by values 0-1. @@ -204,7 +205,7 @@ class LocalDriveHandler(AbstractProvider): status_val = target_file_size / source_file_size last_tick = time.time() log.debug(direction + "ed %d%%." 
% int(status_val * 100)) - server.update_db(collection=collection, + server.update_db(project_name=project_name, new_file_id=None, file=file, representation=representation, diff --git a/openpype/modules/sync_server/providers/sftp.py b/openpype/modules/sync_server/providers/sftp.py index 49b87b14ec..302ffae3e6 100644 --- a/openpype/modules/sync_server/providers/sftp.py +++ b/openpype/modules/sync_server/providers/sftp.py @@ -222,7 +222,7 @@ class SFTPHandler(AbstractProvider): return os.path.basename(path) def upload_file(self, source_path, target_path, - server, collection, file, representation, site, + server, project_name, file, representation, site, overwrite=False): """ Uploads single file from 'source_path' to destination 'path'. @@ -235,7 +235,7 @@ class SFTPHandler(AbstractProvider): arguments for saving progress: server (SyncServer): server instance to call update_db on - collection (str): name of collection + project_name (str): file (dict): info about uploaded file (matches structure from db) representation (dict): complete repre containing 'file' site (str): site name @@ -256,7 +256,7 @@ class SFTPHandler(AbstractProvider): thread = threading.Thread(target=self._upload, args=(source_path, target_path)) thread.start() - self._mark_progress(collection, file, representation, server, + self._mark_progress(project_name, file, representation, server, site, source_path, target_path, "upload") return os.path.basename(target_path) @@ -267,7 +267,7 @@ class SFTPHandler(AbstractProvider): conn.put(source_path, target_path) def download_file(self, source_path, target_path, - server, collection, file, representation, site, + server, project_name, file, representation, site, overwrite=False): """ Downloads single file from 'source_path' (remote) to 'target_path'. @@ -281,7 +281,7 @@ class SFTPHandler(AbstractProvider): arguments for saving progress: server (SyncServer): server instance to call update_db on - collection (str): name of collection + project_name (str): file (dict): info about uploaded file (matches structure from db) representation (dict): complete repre containing 'file' site (str): site name @@ -302,7 +302,7 @@ class SFTPHandler(AbstractProvider): thread = threading.Thread(target=self._download, args=(source_path, target_path)) thread.start() - self._mark_progress(collection, file, representation, server, + self._mark_progress(project_name, file, representation, server, site, source_path, target_path, "download") return os.path.basename(target_path) @@ -425,7 +425,7 @@ class SFTPHandler(AbstractProvider): pysftp.exceptions.ConnectionException): log.warning("Couldn't connect", exc_info=True) - def _mark_progress(self, collection, file, representation, server, site, + def _mark_progress(self, project_name, file, representation, server, site, source_path, target_path, direction): """ Updates progress field in DB by values 0-1. @@ -446,7 +446,7 @@ class SFTPHandler(AbstractProvider): status_val = target_file_size / source_file_size last_tick = time.time() log.debug(direction + "ed %d%%." 
% int(status_val * 100)) - server.update_db(collection=collection, + server.update_db(project_name=project_name, new_file_id=None, file=file, representation=representation, diff --git a/openpype/modules/sync_server/sync_server.py b/openpype/modules/sync_server/sync_server.py index 356a75f99d..97538fcd4e 100644 --- a/openpype/modules/sync_server/sync_server.py +++ b/openpype/modules/sync_server/sync_server.py @@ -14,7 +14,7 @@ from .utils import SyncStatus, ResumableError log = PypeLogger().get_logger("SyncServer") -async def upload(module, collection, file, representation, provider_name, +async def upload(module, project_name, file, representation, provider_name, remote_site_name, tree=None, preset=None): """ Upload single 'file' of a 'representation' to 'provider'. @@ -31,7 +31,7 @@ async def upload(module, collection, file, representation, provider_name, Args: module(SyncServerModule): object to run SyncServerModule API - collection (str): source collection + project_name (str): source db file (dictionary): of file from representation in Mongo representation (dictionary): of representation provider_name (string): gdrive, gdc etc. @@ -47,15 +47,16 @@ async def upload(module, collection, file, representation, provider_name, # thread can do that at a time, upload/download to prepared # structure should be run in parallel remote_handler = lib.factory.get_provider(provider_name, - collection, + project_name, remote_site_name, tree=tree, presets=preset) file_path = file.get("path", "") try: - local_file_path, remote_file_path = resolve_paths(module, - file_path, collection, remote_site_name, remote_handler + local_file_path, remote_file_path = resolve_paths( + module, file_path, project_name, + remote_site_name, remote_handler ) except Exception as exp: print(exp) @@ -74,27 +75,28 @@ async def upload(module, collection, file, representation, provider_name, local_file_path, remote_file_path, module, - collection, + project_name, file, representation, remote_site_name, True ) - module.handle_alternate_site(collection, representation, remote_site_name, + module.handle_alternate_site(project_name, representation, + remote_site_name, file["_id"], file_id) return file_id -async def download(module, collection, file, representation, provider_name, +async def download(module, project_name, file, representation, provider_name, remote_site_name, tree=None, preset=None): """ Downloads file to local folder denoted in representation.Context. 
Args: module(SyncServerModule): object to run SyncServerModule API - collection (str): source collection + project_name (str): source file (dictionary) : info about processed file representation (dictionary): repr that 'file' belongs to provider_name (string): 'gdrive' etc @@ -108,20 +110,20 @@ async def download(module, collection, file, representation, provider_name, """ with module.lock: remote_handler = lib.factory.get_provider(provider_name, - collection, + project_name, remote_site_name, tree=tree, presets=preset) file_path = file.get("path", "") local_file_path, remote_file_path = resolve_paths( - module, file_path, collection, remote_site_name, remote_handler + module, file_path, project_name, remote_site_name, remote_handler ) local_folder = os.path.dirname(local_file_path) os.makedirs(local_folder, exist_ok=True) - local_site = module.get_active_site(collection) + local_site = module.get_active_site(project_name) loop = asyncio.get_running_loop() file_id = await loop.run_in_executor(None, @@ -129,20 +131,20 @@ async def download(module, collection, file, representation, provider_name, remote_file_path, local_file_path, module, - collection, + project_name, file, representation, local_site, True ) - module.handle_alternate_site(collection, representation, local_site, + module.handle_alternate_site(project_name, representation, local_site, file["_id"], file_id) return file_id -def resolve_paths(module, file_path, collection, +def resolve_paths(module, file_path, project_name, remote_site_name=None, remote_handler=None): """ Returns tuple of local and remote file paths with {root} @@ -153,7 +155,7 @@ def resolve_paths(module, file_path, collection, Args: module(SyncServerModule): object to run SyncServerModule API file_path(string): path with {root} - collection(string): project name + project_name(string): project name remote_site_name(string): remote site remote_handler(AbstractProvider): implementation Returns: @@ -164,7 +166,7 @@ def resolve_paths(module, file_path, collection, remote_file_path = remote_handler.resolve_path(file_path) local_handler = lib.factory.get_provider( - 'local_drive', collection, module.get_active_site(collection)) + 'local_drive', project_name, module.get_active_site(project_name)) local_file_path = local_handler.resolve_path(file_path) return local_file_path, remote_file_path @@ -269,8 +271,8 @@ class SyncServerThread(threading.Thread): - gets list of collections in DB - gets list of active remote providers (has configuration, credentials) - - for each collection it looks for representations that should - be synced + - for each project_name it looks for representations that + should be synced - synchronize found collections - update representations - fills error messages for exceptions - waits X seconds and repeat @@ -282,17 +284,17 @@ class SyncServerThread(threading.Thread): import time start_time = time.time() self.module.set_sync_project_settings() # clean cache - collection = None + project_name = None enabled_projects = self.module.get_enabled_projects() - for collection in enabled_projects: - preset = self.module.sync_project_settings[collection] + for project_name in enabled_projects: + preset = self.module.sync_project_settings[project_name] - local_site, remote_site = self._working_sites(collection) + local_site, remote_site = self._working_sites(project_name) if not all([local_site, remote_site]): continue sync_repres = self.module.get_sync_representations( - collection, + project_name, local_site, remote_site ) @@ -310,7 +312,7 @@ class 
SyncServerThread(threading.Thread): remote_provider = \ self.module.get_provider_for_site(site=remote_site) handler = lib.factory.get_provider(remote_provider, - collection, + project_name, remote_site, presets=site_preset) limit = lib.factory.get_provider_batch_limit( @@ -341,7 +343,7 @@ class SyncServerThread(threading.Thread): limit -= 1 task = asyncio.create_task( upload(self.module, - collection, + project_name, file, sync, remote_provider, @@ -353,7 +355,7 @@ class SyncServerThread(threading.Thread): files_processed_info.append((file, sync, remote_site, - collection + project_name )) processed_file_path.add(file_path) if status == SyncStatus.DO_DOWNLOAD: @@ -361,7 +363,7 @@ class SyncServerThread(threading.Thread): limit -= 1 task = asyncio.create_task( download(self.module, - collection, + project_name, file, sync, remote_provider, @@ -373,7 +375,7 @@ class SyncServerThread(threading.Thread): files_processed_info.append((file, sync, local_site, - collection + project_name )) processed_file_path.add(file_path) @@ -384,12 +386,12 @@ class SyncServerThread(threading.Thread): return_exceptions=True) for file_id, info in zip(files_created, files_processed_info): - file, representation, site, collection = info + file, representation, site, project_name = info error = None if isinstance(file_id, BaseException): error = str(file_id) file_id = None - self.module.update_db(collection, + self.module.update_db(project_name, file_id, file, representation, @@ -399,7 +401,7 @@ class SyncServerThread(threading.Thread): duration = time.time() - start_time log.debug("One loop took {:.2f}s".format(duration)) - delay = self.module.get_loop_delay(collection) + delay = self.module.get_loop_delay(project_name) log.debug("Waiting for {} seconds to new loop".format(delay)) self.timer = asyncio.create_task(self.run_timer(delay)) await asyncio.gather(self.timer) @@ -458,19 +460,19 @@ class SyncServerThread(threading.Thread): self.timer.cancel() self.timer = None - def _working_sites(self, collection): - if self.module.is_project_paused(collection): + def _working_sites(self, project_name): + if self.module.is_project_paused(project_name): log.debug("Both sites same, skipping") return None, None - local_site = self.module.get_active_site(collection) - remote_site = self.module.get_remote_site(collection) + local_site = self.module.get_active_site(project_name) + remote_site = self.module.get_remote_site(project_name) if local_site == remote_site: log.debug("{}-{} sites same, skipping".format(local_site, remote_site)) return None, None - configured_sites = _get_configured_sites(self.module, collection) + configured_sites = _get_configured_sites(self.module, project_name) if not all([local_site in configured_sites, remote_site in configured_sites]): log.debug("Some of the sites {} - {} is not ".format(local_site, diff --git a/openpype/modules/sync_server/sync_server_module.py b/openpype/modules/sync_server/sync_server_module.py index 4027561d22..c7f9484e55 100644 --- a/openpype/modules/sync_server/sync_server_module.py +++ b/openpype/modules/sync_server/sync_server_module.py @@ -6,7 +6,7 @@ import platform import copy from collections import deque, defaultdict - +from openpype.client import get_projects from openpype.modules import OpenPypeModule from openpype_interfaces import ITrayModule from openpype.settings import ( @@ -25,6 +25,8 @@ from .providers import lib from .utils import time_function, SyncStatus, SiteAlreadyPresentError +from openpype.client import get_representations, get_representation_by_id + 
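The loop above caps the number of transfers per pass with a provider batch limit and collects results with 'return_exceptions=True', so one failed transfer does not abort the whole gather. A compact sketch of that pattern:

```python
import asyncio


async def transfer(name):
    # Placeholder for an upload/download coroutine.
    await asyncio.sleep(0.1)
    return name


async def main():
    limit = 3  # provider batch limit, as in the loop above
    tasks = []
    for name in ["a", "b", "c", "d", "e"]:
        if limit <= 0:
            break
        limit -= 1
        tasks.append(asyncio.create_task(transfer(name)))

    # 'return_exceptions=True' turns failures into exception objects in
    # the result list instead of raising out of the gather.
    results = await asyncio.gather(*tasks, return_exceptions=True)
    print(results)  # ['a', 'b', 'c']


asyncio.run(main())
```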
log = PypeLogger.get_logger("SyncServer") @@ -128,12 +130,12 @@ class SyncServerModule(OpenPypeModule, ITrayModule): self.projects_processed = set() """ Start of Public API """ - def add_site(self, collection, representation_id, site_name=None, + def add_site(self, project_name, representation_id, site_name=None, force=False): """ Adds new site to representation to be synced. - 'collection' must have synchronization enabled (globally or + 'project_name' must have synchronization enabled (globally or project only) Used as a API endpoint from outside applications (Loader etc). @@ -141,7 +143,7 @@ class SyncServerModule(OpenPypeModule, ITrayModule): Use 'force' to reset existing site. Args: - collection (string): project name (must match DB) + project_name (string): project name (must match DB) representation_id (string): MongoDB _id value site_name (string): name of configured and active site force (bool): reset site if exists @@ -151,25 +153,25 @@ class SyncServerModule(OpenPypeModule, ITrayModule): not 'force' ValueError - other errors (repre not found, misconfiguration) """ - if not self.get_sync_project_setting(collection): + if not self.get_sync_project_setting(project_name): raise ValueError("Project not configured") if not site_name: site_name = self.DEFAULT_SITE - self.reset_site_on_representation(collection, + self.reset_site_on_representation(project_name, representation_id, site_name=site_name, force=force) - def remove_site(self, collection, representation_id, site_name, + def remove_site(self, project_name, representation_id, site_name, remove_local_files=False): """ Removes 'site_name' for particular 'representation_id' on - 'collection' + 'project_name' Args: - collection (string): project name (must match DB) + project_name (string): project name (must match DB) representation_id (string): MongoDB _id value site_name (string): name of configured and active site remove_local_files (bool): remove only files for 'local_id' @@ -178,15 +180,15 @@ class SyncServerModule(OpenPypeModule, ITrayModule): Returns: throws ValueError if any issue """ - if not self.get_sync_project_setting(collection): + if not self.get_sync_project_setting(project_name): raise ValueError("Project not configured") - self.reset_site_on_representation(collection, + self.reset_site_on_representation(project_name, representation_id, site_name=site_name, remove=True) if remove_local_files: - self._remove_local_file(collection, representation_id, site_name) + self._remove_local_file(project_name, representation_id, site_name) def compute_resource_sync_sites(self, project_name): """Get available resource sync sites state for publish process. 
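Hypothetical usage of the renamed public API, assuming an initialized module instance and a valid representation id (both made up here):

```python
# 'sync_server' is assumed to be an initialized SyncServerModule instance.
sync_server.add_site(
    "demo_project",              # 'project_name', previously 'collection'
    "5f6e7d8c9b0a1b2c3d4e5f60",  # representation id (made up)
    site_name="gdrive",
    force=True,
)
sync_server.remove_site(
    "demo_project",
    "5f6e7d8c9b0a1b2c3d4e5f60",
    site_name="gdrive",
    remove_local_files=True,
)
```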
@@ -333,9 +335,9 @@ class SyncServerModule(OpenPypeModule, ITrayModule): return alt_site_pairs - def clear_project(self, collection, site_name): + def clear_project(self, project_name, site_name): """ - Clear 'collection' of 'site_name' and its local files + Clear 'project_name' of 'site_name' and its local files Works only on real local sites, not on 'studio' """ @@ -344,16 +346,17 @@ class SyncServerModule(OpenPypeModule, ITrayModule): "files.sites.name": site_name } + # TODO currently not possible to replace with get_representations representations = list( - self.connection.database[collection].find(query)) + self.connection.database[project_name].find(query)) if not representations: self.log.debug("No repre found") return for repre in representations: - self.remove_site(collection, repre.get("_id"), site_name, True) + self.remove_site(project_name, repre.get("_id"), site_name, True) - def create_validate_project_task(self, collection, site_name): + def create_validate_project_task(self, project_name, site_name): """Adds metadata about project files validation on a queue. This process will loop through all representation and check if @@ -370,33 +373,28 @@ class SyncServerModule(OpenPypeModule, ITrayModule): """ task = { "type": "validate", - "project_name": collection, - "func": lambda: self.validate_project(collection, site_name, + "project_name": project_name, + "func": lambda: self.validate_project(project_name, site_name, reset_missing=True) } - self.projects_processed.add(collection) + self.projects_processed.add(project_name) self.long_running_tasks.append(task) - def validate_project(self, collection, site_name, reset_missing=False): - """Validate 'collection' of 'site_name' and its local files + def validate_project(self, project_name, site_name, reset_missing=False): + """Validate 'project_name' of 'site_name' and its local files If file present and not marked with a 'site_name' in DB, DB is updated with site name and file modified date. Args: - collection (string): project name + project_name (string): project name site_name (string): active site name reset_missing (bool): if True reset site in DB if missing physically """ - self.log.debug("Validation of {} for {} started".format(collection, + self.log.debug("Validation of {} for {} started".format(project_name, site_name)) - query = { - "type": "representation" - } - - representations = list( - self.connection.database[collection].find(query)) + representations = list(get_representations(project_name)) if not representations: self.log.debug("No repre found") return @@ -416,7 +414,7 @@ class SyncServerModule(OpenPypeModule, ITrayModule): continue file_path = repre_file.get("path", "") - local_file_path = self.get_local_file_path(collection, + local_file_path = self.get_local_file_path(project_name, site_name, file_path) @@ -428,14 +426,11 @@ class SyncServerModule(OpenPypeModule, ITrayModule): "Adding site {} for {}".format(site_name, repre_id)) - query = { - "_id": repre_id - } created_dt = datetime.fromtimestamp( os.path.getmtime(local_file_path)) elem = {"name": site_name, "created_dt": created_dt} - self._add_site(collection, query, repre, elem, + self._add_site(project_name, repre, elem, site_name=site_name, file_id=repre_file["_id"], force=True) @@ -445,41 +440,42 @@ class SyncServerModule(OpenPypeModule, ITrayModule): self.log.debug("Resetting site {} for {}". 
                                   format(site_name, repre_id))
                     self.reset_site_on_representation(
-                        collection, repre_id, site_name=site_name,
+                        project_name, repre_id, site_name=site_name,
                         file_id=repre_file["_id"])
                     sites_reset += 1
 
             if sites_added % 100 == 0:
                 self.log.debug("Sites added {}".format(sites_added))
 
-        self.log.debug("Validation of {} for {} ended".format(collection,
+        self.log.debug("Validation of {} for {} ended".format(project_name,
                                                               site_name))
         self.log.info("Sites added {}, sites reset {}".format(sites_added,
                                                               sites_reset))
 
-    def pause_representation(self, collection, representation_id, site_name):
+    def pause_representation(self, project_name, representation_id, site_name):
         """
             Sets 'representation_id' as paused, eg. no syncing should be
             happening on it.
 
             Args:
-               collection (string): project name
+               project_name (string): project name
                representation_id (string): MongoDB objectId value
                site_name (string): 'gdrive', 'studio' etc.
         """
         log.info("Pausing SyncServer for {}".format(representation_id))
         self._paused_representations.add(representation_id)
-        self.reset_site_on_representation(collection, representation_id,
+        self.reset_site_on_representation(project_name, representation_id,
                                           site_name=site_name, pause=True)
 
-    def unpause_representation(self, collection, representation_id, site_name):
+    def unpause_representation(self, project_name,
+                               representation_id, site_name):
         """
             Sets 'representation_id' as unpaused.
 
             Does not fail or warn if repre wasn't paused.
 
             Args:
-               collection (string): project name
+               project_name (string): project name
                representation_id (string): MongoDB objectId value
                site_name (string): 'gdrive', 'studio' etc.
         """
@@ -489,7 +485,7 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
         except KeyError:
             pass
         # self.paused_representations is not persistent
-        self.reset_site_on_representation(collection, representation_id,
+        self.reset_site_on_representation(project_name, representation_id,
                                           site_name=site_name, pause=False)
 
     def is_representation_paused(self, representation_id,
@@ -520,7 +516,7 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
             happening on all representations inside.
 
             Args:
-                project_name (string): collection name
+                project_name (string): project name
         """
         log.info("Pausing SyncServer for {}".format(project_name))
         self._paused_projects.add(project_name)
@@ -532,7 +528,7 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
             Does not fail or warn if project wasn't paused.
 
             Args:
-                project_name (string): collection name
+                project_name (string): project name
         """
         log.info("Unpausing SyncServer for {}".format(project_name))
         try:
@@ -545,7 +541,7 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
             Returns if 'project_name' is paused or not.
 
             Args:
-                project_name (string): collection name
+                project_name (string): project name
                 check_parents (bool): check if server itself is not paused
 
             Returns:
@@ -917,7 +913,7 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
 
         enabled_projects = []
 
         if self.enabled:
-            for project in self.connection.projects(projection={"name": 1}):
+            for project in get_projects(fields=["name"]):
                 project_name = project["name"]
                 if self.is_project_enabled(project_name):
                     enabled_projects.append(project_name)
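The pause bookkeeping above is simply membership in an in-memory set, which is why unpausing something that was never paused must not fail. A minimal sketch of the same pattern; note that set.discard would replace the try/except KeyError idiom used in the module:

paused_projects = set()

def pause_project(project_name):
    paused_projects.add(project_name)

def unpause_project(project_name):
    # Does not fail if the project was not paused.
    paused_projects.discard(project_name)

def is_project_paused(project_name):
    return project_name in paused_projects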
@@ -944,8 +940,8 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
                 return True
         return False
 
-    def handle_alternate_site(self, collection, representation, processed_site,
-                              file_id, synced_file_id):
+    def handle_alternate_site(self, project_name, representation,
+                              processed_site, file_id, synced_file_id):
         """
             For special use cases where one site vendors another.
@@ -958,7 +954,7 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
             same location >> file is accessible on 'sftp' site right away.
 
             Args:
-                collection (str): name of project
+                project_name (str): name of project
                 representation (dict)
                 processed_site (str): real site_name of published/uploaded file
                 file_id (ObjectId): DB id of file handled
@@ -982,26 +978,112 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
             alternate_sites = set(alternate_sites)
 
         for alt_site in alternate_sites:
-            query = {
-                "_id": representation["_id"]
-            }
             elem = {"name": alt_site,
                     "created_dt": datetime.now(),
                     "id": synced_file_id}
 
             self.log.debug("Adding alternate {} to {}".format(
                 alt_site, representation["_id"]))
-            self._add_site(collection, query,
+            self._add_site(project_name, representation,
                            elem, alt_site, file_id=file_id, force=True)
 
+    def get_repre_info_for_versions(self, project_name, version_ids,
+                                    active_site, remote_site):
+        """Returns representation availability for versions and a site pair
+
+        Args:
+            project_name (str)
+            version_ids (list): of version[_id]
+            active_site (string): 'local', 'studio' etc
+            remote_site (string): ditto
+        Returns:
+            (iterable of dict): aggregation result per version id with
+                'repre_count', 'avail_repre_local' and 'avail_repre_remote'
+        """
+        self.connection.Session["AVALON_PROJECT"] = project_name
+        query = [
+            {"$match": {"parent": {"$in": version_ids},
+                        "type": "representation",
+                        "files.sites.name": {"$exists": 1}}},
+            {"$unwind": "$files"},
+            {'$addFields': {
+                'order_local': {
+                    '$filter': {
+                        'input': '$files.sites', 'as': 'p',
+                        'cond': {'$eq': ['$$p.name', active_site]}
+                    }
+                }
+            }},
+            {'$addFields': {
+                'order_remote': {
+                    '$filter': {
+                        'input': '$files.sites', 'as': 'p',
+                        'cond': {'$eq': ['$$p.name', remote_site]}
+                    }
+                }
+            }},
+            {'$addFields': {
+                'progress_local': {"$arrayElemAt": [{
+                    '$cond': [
+                        {'$size': "$order_local.progress"},
+                        "$order_local.progress",
+                        # if created_dt exists, count it as available
+                        {'$cond': [
+                            {'$size': "$order_local.created_dt"},
+                            [1],
+                            [0]
+                        ]}
+                    ]},
+                    0
+                ]}
+            }},
+            {'$addFields': {
+                'progress_remote': {"$arrayElemAt": [{
+                    '$cond': [
+                        {'$size': "$order_remote.progress"},
+                        "$order_remote.progress",
+                        # if created_dt exists, count it as available
+                        {'$cond': [
+                            {'$size': "$order_remote.created_dt"},
+                            [1],
+                            [0]
+                        ]}
+                    ]},
+                    0
+                ]}
+            }},
+            {'$group': {  # first group by repre
+                '_id': '$_id',
+                'parent': {'$first': '$parent'},
+                'avail_ratio_local': {
+                    '$first': {
+                        '$divide': [{'$sum': "$progress_local"}, {'$sum': 1}]
+                    }
+                },
+                'avail_ratio_remote': {
+                    '$first': {
+                        '$divide': [{'$sum': "$progress_remote"}, {'$sum': 1}]
+                    }
+                }
+            }},
+            {'$group': {  # second group by parent, eg version_id
+                '_id': '$parent',
+                'repre_count': {'$sum': 1},  # total representations
+                # fully available representations for site
+                'avail_repre_local': {'$sum': "$avail_ratio_local"},
+                'avail_repre_remote': {'$sum': "$avail_ratio_remote"},
+            }},
+        ]
+        return self.connection.aggregate(query)
+
     """ End of Public API """
 
-    def get_local_file_path(self, collection, site_name, file_path):
+    def get_local_file_path(self, project_name, site_name, file_path):
         """
             Externalized for app
         """
-        handler = LocalDriveHandler(collection, site_name)
+        handler = LocalDriveHandler(project_name, site_name)
         local_file_path = handler.resolve_path(file_path)
 
         return local_file_path
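The aggregation above yields, per version id, a total representation count plus how many representations are fully available on each site. A hedged sketch of how a caller might turn that into availability percentages (the doc keys match the $group stages above; the module's own consumers may post-process differently):

def availability_per_version(aggregation_docs):
    """Map version id -> (local %, remote %) availability."""
    result = {}
    for doc in aggregation_docs:
        # repre_count comes from {'$sum': 1}, so it is always >= 1,
        # the guard just keeps the sketch safe for empty input docs.
        repre_count = doc["repre_count"] or 1
        local = 100.0 * doc["avail_repre_local"] / repre_count
        remote = 100.0 * doc["avail_repre_remote"] / repre_count
        result[doc["_id"]] = (local, remote)
    return result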
@@ -1160,10 +1242,7 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
     def _prepare_sync_project_settings(self, exclude_locals):
         sync_project_settings = {}
         system_sites = self.get_all_site_configs()
-        project_docs = self.connection.projects(
-            projection={"name": 1},
-            only_active=True
-        )
+        project_docs = get_projects(fields=["name"])
         for project_doc in project_docs:
             project_name = project_doc["name"]
             sites = copy.deepcopy(system_sites)  # get all configured sites
@@ -1288,7 +1367,7 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
         return sites.get(site, 'N/A')
 
     @time_function
-    def get_sync_representations(self, collection, active_site, remote_site):
+    def get_sync_representations(self, project_name, active_site, remote_site):
         """
             Get representations that should be synced, these could be
             recognised by presence of document in 'files.sites', where key is
@@ -1299,8 +1378,7 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
             better performance. Goal is to get as few representations as
             possible.
         Args:
-            collection (string): name of collection (in most cases matches
-                project name
+            project_name (string): project name
             active_site (string): identifier of current active site (could be
                 'local_0' when working from home, 'studio' when working in
                 the studio (default)
 
         Returns:
             (list) of dictionaries
         """
-        log.debug("Check representations for : {}".format(collection))
-        self.connection.Session["AVALON_PROJECT"] = collection
+        log.debug("Check representations for : {}".format(project_name))
+        self.connection.Session["AVALON_PROJECT"] = project_name
         # retry_cnt - number of attempts to sync specific file before giving up
-        retries_arr = self._get_retries_arr(collection)
+        retries_arr = self._get_retries_arr(project_name)
         match = {
             "type": "representation",
             "$or": [
@@ -1449,14 +1527,14 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
 
         return SyncStatus.DO_NOTHING
 
-    def update_db(self, collection, new_file_id, file, representation,
+    def update_db(self, project_name, new_file_id, file, representation,
                   site, error=None, progress=None, priority=None):
         """
             Update 'provider' portion of records in DB with success (file_id)
             or error (exception)
 
             Args:
-                collection (string): name of project - force to db connection as
+                project_name (string): name of project - forces db connection as
                     each file might come from different collection
                 new_file_id (string):
                 file (dictionary): info about processed file (pulled from DB)
@@ -1499,7 +1577,7 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
         if file_id:
             arr_filter.append({'f._id': ObjectId(file_id)})
 
-        self.connection.database[collection].update_one(
+        self.connection.database[project_name].update_one(
             query,
             update,
             upsert=True,
@@ -1562,7 +1640,7 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
 
         return -1, None
 
-    def reset_site_on_representation(self, collection, representation_id,
+    def reset_site_on_representation(self, project_name, representation_id,
                                      side=None, file_id=None, site_name=None,
                                      remove=False, pause=None, force=False):
         """
@@ -1579,7 +1657,7 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
             Should be used when repre should be synced to new site.
 
             Args:
-                collection (string): name of project (eg. collection) in DB
+                project_name (string): name of project (eg.
collection) in DB representation_id(string): _id of representation file_id (string): file _id in representation side (string): local or remote side @@ -1593,20 +1671,18 @@ class SyncServerModule(OpenPypeModule, ITrayModule): not 'force' ValueError - other errors (repre not found, misconfiguration) """ - query = { - "_id": ObjectId(representation_id) - } - - representation = self.connection.database[collection].find_one(query) + representation = get_representation_by_id(project_name, + representation_id) if not representation: raise ValueError("Representation {} not found in {}". - format(representation_id, collection)) + format(representation_id, project_name)) + if side and site_name: raise ValueError("Misconfiguration, only one of side and " + "site_name arguments should be passed.") - local_site = self.get_active_site(collection) - remote_site = self.get_remote_site(collection) + local_site = self.get_active_site(project_name) + remote_site = self.get_remote_site(project_name) if side: if side == 'local': @@ -1617,37 +1693,43 @@ class SyncServerModule(OpenPypeModule, ITrayModule): elem = {"name": site_name} if file_id: # reset site for particular file - self._reset_site_for_file(collection, query, + self._reset_site_for_file(project_name, representation_id, elem, file_id, site_name) elif side: # reset site for whole representation - self._reset_site(collection, query, elem, site_name) + self._reset_site(project_name, representation_id, elem, site_name) elif remove: # remove site for whole representation - self._remove_site(collection, query, representation, site_name) + self._remove_site(project_name, + representation, site_name) elif pause is not None: - self._pause_unpause_site(collection, query, + self._pause_unpause_site(project_name, representation, site_name, pause) else: # add new site to all files for representation - self._add_site(collection, query, representation, elem, site_name, + self._add_site(project_name, representation, elem, site_name, force=force) - def _update_site(self, collection, query, update, arr_filter): + def _update_site(self, project_name, representation_id, + update, arr_filter): """ Auxiliary method to call update_one function on DB Used for refactoring ugly reset_provider_for_file """ - self.connection.database[collection].update_one( + query = { + "_id": ObjectId(representation_id) + } + + self.connection.database[project_name].update_one( query, update, upsert=True, array_filters=arr_filter ) - def _reset_site_for_file(self, collection, query, + def _reset_site_for_file(self, project_name, representation_id, elem, file_id, site_name): """ Resets 'site_name' for 'file_id' on representation in 'query' on - 'collection' + 'project_name' """ update = { "$set": {"files.$[f].sites.$[s]": elem} @@ -1660,9 +1742,9 @@ class SyncServerModule(OpenPypeModule, ITrayModule): {'f._id': file_id} ] - self._update_site(collection, query, update, arr_filter) + self._update_site(project_name, representation_id, update, arr_filter) - def _reset_site(self, collection, query, elem, site_name): + def _reset_site(self, project_name, representation_id, elem, site_name): """ Resets 'site_name' for all files of representation in 'query' """ @@ -1674,9 +1756,9 @@ class SyncServerModule(OpenPypeModule, ITrayModule): {'s.name': site_name} ] - self._update_site(collection, query, update, arr_filter) + self._update_site(project_name, representation_id, update, arr_filter) - def _remove_site(self, collection, query, representation, site_name): + def _remove_site(self, project_name, 
representation, site_name): """ Removes 'site_name' for 'representation' in 'query' @@ -1698,10 +1780,11 @@ class SyncServerModule(OpenPypeModule, ITrayModule): } arr_filter = [] - self._update_site(collection, query, update, arr_filter) + self._update_site(project_name, representation["_id"], + update, arr_filter) - def _pause_unpause_site(self, collection, query, - representation, site_name, pause): + def _pause_unpause_site(self, project_name, representation, + site_name, pause): """ Pauses/unpauses all files for 'representation' based on 'pause' @@ -1733,12 +1816,13 @@ class SyncServerModule(OpenPypeModule, ITrayModule): {'s.name': site_name} ] - self._update_site(collection, query, update, arr_filter) + self._update_site(project_name, representation["_id"], + update, arr_filter) - def _add_site(self, collection, query, representation, elem, site_name, + def _add_site(self, project_name, representation, elem, site_name, force=False, file_id=None): """ - Adds 'site_name' to 'representation' on 'collection' + Adds 'site_name' to 'representation' on 'project_name' Args: representation (dict) @@ -1746,10 +1830,11 @@ class SyncServerModule(OpenPypeModule, ITrayModule): Use 'force' to remove existing or raises ValueError """ + representation_id = representation["_id"] reset_existing = False files = representation.get("files", []) if not files: - log.debug("No files for {}".format(representation["_id"])) + log.debug("No files for {}".format(representation_id)) return for repre_file in files: @@ -1759,7 +1844,8 @@ class SyncServerModule(OpenPypeModule, ITrayModule): for site in repre_file.get("sites"): if site["name"] == site_name: if force or site.get("error"): - self._reset_site_for_file(collection, query, + self._reset_site_for_file(project_name, + representation_id, elem, repre_file["_id"], site_name) reset_existing = True @@ -1785,14 +1871,15 @@ class SyncServerModule(OpenPypeModule, ITrayModule): {'f._id': file_id} ] - self._update_site(collection, query, update, arr_filter) + self._update_site(project_name, representation_id, + update, arr_filter) - def _remove_local_file(self, collection, representation_id, site_name): + def _remove_local_file(self, project_name, representation_id, site_name): """ Removes all local files for 'site_name' of 'representation_id' Args: - collection (string): project name (must match DB) + project_name (string): project name (must match DB) representation_id (string): MongoDB _id value site_name (string): name of configured and active site @@ -1808,21 +1895,17 @@ class SyncServerModule(OpenPypeModule, ITrayModule): provider_name = self.get_provider_for_site(site=site_name) if provider_name == 'local_drive': - query = { - "_id": ObjectId(representation_id) - } - - representation = list( - self.connection.database[collection].find(query)) + representation = get_representation_by_id(project_name, + representation_id, + fields=["files"]) if not representation: self.log.debug("No repre {} found".format( representation_id)) return - representation = representation.pop() local_file_path = '' for file in representation.get("files"): - local_file_path = self.get_local_file_path(collection, + local_file_path = self.get_local_file_path(project_name, site_name, file.get("path", "") ) diff --git a/openpype/modules/sync_server/tray/models.py b/openpype/modules/sync_server/tray/models.py index 6d1e85c17a..629c4cbbf1 100644 --- a/openpype/modules/sync_server/tray/models.py +++ b/openpype/modules/sync_server/tray/models.py @@ -11,6 +11,7 @@ from 
openpype.tools.utils.delegates import pretty_timestamp
 
 from openpype.lib import PypeLogger
 from openpype.api import get_local_site_id
+from openpype.client import get_representation_by_id
 
 from . import lib
 
@@ -440,7 +441,7 @@ class SyncRepresentationSummaryModel(_SyncRepresentationModel):
         full text filtering. Allows pagination, most of heavy lifting is
         being done on DB side.
 
-        Single model matches to single collection. When project is changed,
+        Single model matches a single project. When project is changed,
         model is reset and refreshed.
 
         Args:
@@ -919,11 +920,10 @@ class SyncRepresentationSummaryModel(_SyncRepresentationModel):
 
             repre_id = self.data(index, Qt.UserRole)
 
-            representation = list(self.dbcon.find({"type": "representation",
-                                                   "_id": repre_id}))
+            representation = get_representation_by_id(self.project, repre_id)
             if representation:
                 self.sync_server.update_db(self.project, None, None,
-                                           representation.pop(),
+                                           representation,
                                            get_local_site_id(),
                                            priority=value)
                 self.is_editing = False
@@ -1357,11 +1357,10 @@ class SyncRepresentationDetailModel(_SyncRepresentationModel):
             file_id = self.data(index, Qt.UserRole)
 
             updated_file = None
-            # conversion from cursor to list
-            representations = list(self.dbcon.find({"type": "representation",
-                                                    "_id": self._id}))
+            representation = get_representation_by_id(self.project, self._id)
+            if not representation:
+                return
 
-            representation = representations.pop()
             for repre_file in representation["files"]:
                 if repre_file["_id"] == file_id:
                     updated_file = repre_file
diff --git a/openpype/pipeline/editorial.py b/openpype/pipeline/editorial.py
index f62a1842e0..564d78ea6f 100644
--- a/openpype/pipeline/editorial.py
+++ b/openpype/pipeline/editorial.py
@@ -263,16 +263,17 @@ def get_media_range_with_retimes(otio_clip, handle_start, handle_end):
             "retime": True,
             "speed": time_scalar,
             "timewarps": time_warp_nodes,
-            "handleStart": round(handle_start),
-            "handleEnd": round(handle_end)
+            "handleStart": int(round(handle_start)),
+            "handleEnd": int(round(handle_end))
         }
     }
 
     returning_dict = {
         "mediaIn": media_in_trimmed,
         "mediaOut": media_out_trimmed,
-        "handleStart": round(handle_start),
-        "handleEnd": round(handle_end)
+        "handleStart": int(round(handle_start)),
+        "handleEnd": int(round(handle_end)),
+        "speed": time_scalar
     }
 
     # add version data only if retime
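A likely motivation for the int(round(...)) wrapper above: under Python 2, round() returns a float, so handle values could leak downstream as 5.0 instead of 5; Python 3 returns an int for the one-argument form. Wrapping in int() pins the type on both:

handle_start = 5.4

# Python 2: round() -> 5.0 (float); Python 3: round() -> 5 (int).
# int(round(...)) yields a plain int either way.
print(int(round(handle_start)))  # 5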
diff --git a/openpype/pipeline/load/__init__.py b/openpype/pipeline/load/__init__.py
index e46d9f152b..b6bdd13d50 100644
--- a/openpype/pipeline/load/__init__.py
+++ b/openpype/pipeline/load/__init__.py
@@ -16,6 +16,7 @@ from .utils import (
     switch_container,
 
     get_loader_identifier,
+    get_loaders_by_name,
 
     get_representation_path_from_context,
     get_representation_path,
@@ -61,6 +62,7 @@ __all__ = (
     "switch_container",
 
     "get_loader_identifier",
+    "get_loaders_by_name",
 
     "get_representation_path_from_context",
     "get_representation_path",
diff --git a/openpype/pipeline/load/utils.py b/openpype/pipeline/load/utils.py
index fe5102353d..99d6876d4b 100644
--- a/openpype/pipeline/load/utils.py
+++ b/openpype/pipeline/load/utils.py
@@ -222,13 +222,20 @@ def get_representation_context(representation):
             project_name, representation
         )
 
+    if not representation:
+        raise AssertionError("Representation was not found in database")
+
     version, subset, asset, project = get_representation_parents(
         project_name, representation
     )
-
-    assert all([representation, version, subset, asset, project]), (
-        "This is a bug"
-    )
+    if not version:
+        raise AssertionError("Version was not found in database")
+    if not subset:
+        raise AssertionError("Subset was not found in database")
+    if not asset:
+        raise AssertionError("Asset was not found in database")
+    if not project:
+        raise AssertionError("Project was not found in database")
 
     context = {
         "project": {
@@ -369,6 +376,20 @@ def get_loader_identifier(loader):
     return loader.__name__
 
 
+def get_loaders_by_name():
+    from .plugins import discover_loader_plugins
+
+    loaders_by_name = {}
+    for loader in discover_loader_plugins():
+        loader_name = loader.__name__
+        if loader_name in loaders_by_name:
+            raise KeyError(
+                "Duplicated loader name {}!".format(loader_name)
+            )
+        loaders_by_name[loader_name] = loader
+    return loaders_by_name
+
+
 def _get_container_loader(container):
     """Return the Loader corresponding to the container"""
     from .plugins import discover_loader_plugins
diff --git a/openpype/pipeline/workfile/abstract_template_loader.py b/openpype/pipeline/workfile/abstract_template_loader.py
new file mode 100644
index 0000000000..05a98a1ddc
--- /dev/null
+++ b/openpype/pipeline/workfile/abstract_template_loader.py
@@ -0,0 +1,526 @@
+import os
+from abc import ABCMeta, abstractmethod
+
+import six
+import logging
+from functools import reduce
+
+from openpype.client import get_asset_by_name
+from openpype.settings import get_project_settings
+from openpype.lib import (
+    StringTemplate,
+    Logger,
+    filter_profiles,
+    get_linked_assets,
+)
+from openpype.pipeline import legacy_io, Anatomy
+from openpype.pipeline.load import (
+    get_loaders_by_name,
+    get_representation_context,
+    load_with_repre_context,
+)
+
+from .build_template_exceptions import (
+    TemplateAlreadyImported,
+    TemplateLoadingFailed,
+    TemplateProfileNotFound,
+    TemplateNotFound
+)
+
+log = logging.getLogger(__name__)
+
+
+def update_representations(entities, entity):
+    if entity['context']['subset'] not in entities:
+        entities[entity['context']['subset']] = entity
+    else:
+        current = entities[entity['context']['subset']]
+        incoming = entity
+        entities[entity['context']['subset']] = max(
+            current, incoming,
+            key=lambda entity: entity["context"].get("version", -1))
+
+    return entities
+
+
+def parse_loader_args(loader_args):
+    if not loader_args:
+        return dict()
+    try:
+        parsed_args = eval(loader_args)
+        if not isinstance(parsed_args, dict):
+            return dict()
+        return parsed_args
+    except Exception as err:
+        print(
+            "Error while parsing loader arguments '{}'.\n{}: {}\n\n"
+            "Continuing with default arguments. . .".format(
+                loader_args,
+                err.__class__.__name__,
+                err))
+        return dict()
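update_representations is written to be folded over a representation list with functools.reduce, keeping only the highest version per subset. A quick illustration with minimal fake documents containing only the keys the function reads, assuming the new module above is importable:

from functools import reduce

from openpype.pipeline.workfile.abstract_template_loader import (
    update_representations,
)

entities = [
    {"context": {"subset": "modelMain", "version": 1}},
    {"context": {"subset": "modelMain", "version": 3}},
    {"context": {"subset": "lookdevMain", "version": 2}},
]

# Fold the list into a dict keyed by subset, keeping the newest version.
latest_by_subset = reduce(update_representations, entities, dict())
print({k: v["context"]["version"] for k, v in latest_by_subset.items()})
# {'modelMain': 3, 'lookdevMain': 2}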
+@six.add_metaclass(ABCMeta)
+class AbstractTemplateLoader:
+    """
+    Abstraction of Template Loader.
+
+    Properties:
+        template_path : property to get current template path
+
+    Methods:
+        import_template : Abstract Method. Used to load template,
+            depending on current host
+        get_template_nodes : Abstract Method. Used to query nodes acting
+            as placeholders, depending on current host
+    """
+
+    _log = None
+
+    def __init__(self, placeholder_class):
+        # TODO template loader should expect host as an argument
+        #   - host has all responsibility for most of the code (it also
+        #       provides the placeholder class)
+        #   - it also has responsibility for current context
+        #   - this won't work in DCCs where multiple workfiles with
+        #       different contexts can be opened at a single time
+        #   - template loader should have ability to change context
+        project_name = legacy_io.active_project()
+        asset_name = legacy_io.Session["AVALON_ASSET"]
+
+        self.loaders_by_name = get_loaders_by_name()
+        self.current_asset = asset_name
+        self.project_name = project_name
+        self.host_name = legacy_io.Session["AVALON_APP"]
+        self.task_name = legacy_io.Session["AVALON_TASK"]
+        self.placeholder_class = placeholder_class
+        self.current_asset_doc = get_asset_by_name(project_name, asset_name)
+        self.task_type = (
+            self.current_asset_doc
+            .get("data", {})
+            .get("tasks", {})
+            .get(self.task_name, {})
+            .get("type")
+        )
+
+        self.log.info(
+            "BUILDING ASSET FROM TEMPLATE :\n"
+            "Starting templated build for {asset} in {project}\n\n"
+            "Asset : {asset}\n"
+            "Task : {task_name} ({task_type})\n"
+            "Host : {host}\n"
+            "Project : {project}\n".format(
+                asset=self.current_asset,
+                host=self.host_name,
+                project=self.project_name,
+                task_name=self.task_name,
+                task_type=self.task_type
+            ))
+        # Skip if there is no loader
+        if not self.loaders_by_name:
+            self.log.warning(
+                "There are no registered loaders. No assets will be loaded")
+            return
+
+    @property
+    def log(self):
+        if self._log is None:
+            self._log = Logger.get_logger(self.__class__.__name__)
+        return self._log
+
+    def template_already_imported(self, err_msg):
+        """In case template was already loaded.
+
+        Raise the error as a default action.
+        Override this method in your template loader implementation
+        to manage this case."""
+        self.log.error("{}: {}".format(
+            err_msg.__class__.__name__,
+            err_msg))
+        raise TemplateAlreadyImported(err_msg)
+
+    def template_loading_failed(self, err_msg):
+        """In case template loading failed.
+
+        Raise the error as a default action.
+        Override this method in your template loader implementation
+        to manage this case.
+        """
+        self.log.error("{}: {}".format(
+            err_msg.__class__.__name__,
+            err_msg))
+        raise TemplateLoadingFailed(err_msg)
+    @property
+    def template_path(self):
+        """
+        Property returning the template path, without a setter.
+
+        Gets the template path from OpenPype settings based on the current
+        avalon session and solves the path variables if needed.
+
+        Returns:
+            str: Solved template path
+
+        Raises:
+            TemplateProfileNotFound: No profile found from settings for
+                current avalon session
+            KeyError: Could not solve path because a key does not exist
+                in avalon context
+            TemplateNotFound: Solved path does not exist on the current
+                filesystem
+        """
+        project_name = self.project_name
+        host_name = self.host_name
+        task_name = self.task_name
+        task_type = self.task_type
+
+        anatomy = Anatomy(project_name)
+        project_settings = get_project_settings(project_name)
+
+        build_info = project_settings[host_name]["templated_workfile_build"]
+        profile = filter_profiles(
+            build_info["profiles"],
+            {
+                "task_types": task_type,
+                "tasks": task_name
+            }
+        )
+
+        if not profile:
+            raise TemplateProfileNotFound(
+                "No matching profile found for task '{}' of type '{}' "
+                "with host '{}'".format(task_name, task_type, host_name)
+            )
+
+        path = profile["path"]
+        if not path:
+            raise TemplateLoadingFailed(
+                "Template path is not set.\n"
+                "Path needs to be set in {}\\Template Workfile Build "
+                "Settings\\Profiles".format(host_name.title()))
+
+        # Try to fill path with environments and anatomy roots
+        fill_data = {
+            key: value
+            for key, value in os.environ.items()
+        }
+        fill_data["root"] = anatomy.roots
+        result = StringTemplate.format_template(path, fill_data)
+        if result.solved:
+            path = result.normalized()
+
+        if path and os.path.exists(path):
+            self.log.info("Found template at: '{}'".format(path))
+            return path
+
+        solved_path = None
+        while True:
+            try:
+                solved_path = anatomy.path_remapper(path)
+            except KeyError as missing_key:
+                raise KeyError(
+                    "Could not solve key '{}' in template path '{}'".format(
+                        missing_key, path))
+
+            if solved_path is None:
+                solved_path = path
+            if solved_path == path:
+                break
+            path = solved_path
+
+        solved_path = os.path.normpath(solved_path)
+        if not os.path.exists(solved_path):
+            raise TemplateNotFound(
+                "Template found in OpenPype settings for task '{}' with host "
+                "'{}' does not exist. (Not found: {})".format(
+                    task_name, host_name, solved_path))
+
+        self.log.info("Found template at: '{}'".format(solved_path))
+
+        return solved_path
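The property above first picks a settings profile by task type and task name, then fills the profile path with environment data and anatomy roots. A rough standalone illustration of both steps; pick_profile is a simplified stand-in for the real filter_profiles helper (which may also rank more specific matches higher), and the paths are hypothetical:

import os

def pick_profile(profiles, task_type, task_name):
    # Empty filter list means "match everything" for that key.
    for profile in profiles:
        if profile["task_types"] and task_type not in profile["task_types"]:
            continue
        if profile["tasks"] and task_name not in profile["tasks"]:
            continue
        return profile
    return None

profiles = [{
    "task_types": ["Animation"],
    "tasks": [],
    "path": "{root[work]}/{project}/templates/anim_template.ma",
}]
profile = pick_profile(profiles, "Animation", "animation")

# Fill placeholders the way the property fills env vars and roots.
fill_data = {"root": {"work": "/mnt/work"}, "project": "MyProject"}
path = os.path.normpath(profile["path"].format_map(fill_data))
print(path)  # /mnt/work/MyProject/templates/anim_template.ma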
+    def populate_template(self, ignored_ids=None):
+        """
+        Use template placeholders to load assets and parent them in hierarchy.
+
+        Arguments:
+            ignored_ids (list): representation ids that were already
+                processed and should be skipped
+
+        Returns:
+            None
+        """
+
+        loaders_by_name = self.loaders_by_name
+        current_asset_doc = self.current_asset_doc
+        linked_assets = get_linked_assets(current_asset_doc)
+
+        ignored_ids = ignored_ids or []
+        placeholders = self.get_placeholders()
+        self.log.debug("Placeholders found in template: {}".format(
+            [placeholder.name for placeholder in placeholders]
+        ))
+        for placeholder in placeholders:
+            self.log.debug("Processing placeholder {}".format(
+                placeholder.name
+            ))
+            placeholder_representations = self.get_placeholder_representations(
+                placeholder,
+                current_asset_doc,
+                linked_assets
+            )
+
+            if not placeholder_representations:
+                self.log.info(
+                    "There is no representation for this placeholder: "
+                    "{}".format(placeholder.name)
+                )
+                continue
+
+            for representation in placeholder_representations:
+                self.preload(placeholder, loaders_by_name, representation)
+
+                if self.load_data_is_incorrect(
+                        placeholder,
+                        representation,
+                        ignored_ids):
+                    continue
+
+                self.log.info(
+                    "Loading {}_{} with loader {}\n"
+                    "Loader arguments used : {}".format(
+                        representation['context']['asset'],
+                        representation['context']['subset'],
+                        placeholder.loader_name,
+                        placeholder.loader_args))
+
+                try:
+                    container = self.load(
+                        placeholder, loaders_by_name, representation)
+                except Exception:
+                    self.load_failed(placeholder, representation)
+                else:
+                    self.load_succeed(placeholder, container)
+                finally:
+                    self.postload(placeholder)
+
+    def get_placeholder_representations(
+        self, placeholder, current_asset_doc, linked_asset_docs
+    ):
+        placeholder_representations = placeholder.get_representations(
+            current_asset_doc,
+            linked_asset_docs
+        )
+        for repre_doc in reduce(
+            update_representations,
+            placeholder_representations,
+            dict()
+        ).values():
+            yield repre_doc
+
+    def load_data_is_incorrect(
+            self, placeholder, last_representation, ignored_ids):
+        if not last_representation:
+            self.log.warning(placeholder.err_message())
+            return True
+        if str(last_representation['_id']) in ignored_ids:
+            print("Ignoring : ", last_representation['_id'])
+            return True
+        return False
+
+    def preload(self, placeholder, loaders_by_name, last_representation):
+        pass
+
+    def load(self, placeholder, loaders_by_name, last_representation):
+        repre = get_representation_context(last_representation)
+        return load_with_repre_context(
+            loaders_by_name[placeholder.loader_name],
+            repre,
+            options=parse_loader_args(placeholder.loader_args))
+
+    def load_succeed(self, placeholder, container):
+        placeholder.parent_in_hierarchy(container)
+
+    def load_failed(self, placeholder, last_representation):
+        self.log.warning(
+            "Got error trying to load {}:{} with {}".format(
+                last_representation['context']['asset'],
+                last_representation['context']['subset'],
+                placeholder.loader_name
+            ),
+            exc_info=True
+        )
+
+    def postload(self, placeholder):
+        placeholder.clean()
+
+    def update_missing_containers(self):
+        loaded_containers_ids = self.get_loaded_containers_by_id()
+        self.populate_template(ignored_ids=loaded_containers_ids)
+
+    def get_placeholders(self):
+        placeholders = map(self.placeholder_class, self.get_template_nodes())
+        valid_placeholders = filter(
+            lambda i: i.is_valid,
+            placeholders
+        )
+        sorted_placeholders = list(sorted(
+            valid_placeholders,
+            key=lambda i: i.order
+        ))
+        return sorted_placeholders
+
+    @abstractmethod
+    def get_loaded_containers_by_id(self):
+        """
+        Collect already loaded containers for updating scene
+
+        Return:
+            dict (string, node): A dictionary with container id as key
+                and container as value
+        """
+        pass
+
+    @abstractmethod
+    def import_template(self, template_path):
+        """
+        Import template in current host
+
+        Args:
+            template_path (str): fullpath to current task and
+                host's template file
+
+        Return:
+            None
+        """
+        pass
+
+    @abstractmethod
+    def get_template_nodes(self):
+        """
+        Returns a list of nodes acting as host placeholders for
+        templating. The data representation is up to the host
+        implementation; AbstractTemplateLoader won't directly
+        manipulate the nodes.
+
+        Args:
+            None
+
+        Returns:
+            list(AnyNode): Placeholder nodes found in the template
+        """
+        pass
+
+
+@six.add_metaclass(ABCMeta)
+class AbstractPlaceholder:
+    """Abstraction of placeholders logic.
+
+    Properties:
+        required_keys: A list of mandatory keys to describe placeholder
+            and assets to load.
+        optional_keys: A list of optional keys to describe
+            placeholder and assets to load
+        loader_name: Name of linked loader to use while loading assets
+
+    Args:
+        identifier (str): Placeholder identifier. Should be possible to be
+            used as identifier in "a scene" (e.g. unique node name).
+    """
+
+    required_keys = {
+        "builder_type",
+        "family",
+        "representation",
+        "order",
+        "loader",
+        "loader_args"
+    }
+    optional_keys = {}
+
+    def __init__(self, identifier):
+        self._log = None
+        self._name = identifier
+        self.get_data(identifier)
+
+    @property
+    def log(self):
+        if self._log is None:
+            self._log = Logger.get_logger(repr(self))
+        return self._log
+
+    def __repr__(self):
+        return "< {} {} >".format(self.__class__.__name__, self.name)
+
+    @property
+    def name(self):
+        return self._name
+
+    @property
+    def loader_args(self):
+        return self.data["loader_args"]
+
+    @property
+    def builder_type(self):
+        return self.data["builder_type"]
+
+    @property
+    def order(self):
+        return self.data["order"]
+
+    @property
+    def loader_name(self):
+        """Return placeholder loader name.
+
+        Returns:
+            str: Loader name that will be used to load placeholder
+                representations.
+        """
+
+        return self.data["loader"]
+
+    @property
+    def is_valid(self):
+        """Test validity of placeholder.
+
+        i.e.: every required key exists in placeholder data
+
+        Returns:
+            bool: True if every key is in data
+        """
+
+        if set(self.required_keys).issubset(self.data.keys()):
+            self.log.debug("Valid placeholder : {}".format(self.name))
+            return True
+        self.log.info("Placeholder is not valid : {}".format(self.name))
+        return False
+
+    @abstractmethod
+    def parent_in_hierarchy(self, container):
+        """Place loaded container in correct hierarchy given by placeholder
+
+        Args:
+            container (Dict[str, Any]): Loaded container created by loader.
+        """
+
+        pass
+
+    @abstractmethod
+    def clean(self):
+        """Clean placeholder from hierarchy after loading assets."""
+
+        pass
+
+    @abstractmethod
+    def get_representations(self, current_asset_doc, linked_asset_docs):
+        """Query representations based on placeholder data.
+
+        Args:
+            current_asset_doc (Dict[str, Any]): Document of current
+                context asset.
+            linked_asset_docs (List[Dict[str, Any]]): Documents of assets
+                linked to current context asset.
+
+        Returns:
+            Iterable[Dict[str, Any]]: Representations that are matching
+                placeholder filters.
+        """
+
+        pass
+
+    @abstractmethod
+    def get_data(self, identifier):
+        """Collect information about placeholder by identifier.
+
+        Args:
+            identifier (str): A unique placeholder identifier defined by
+                implementation.
+        """
+
+        pass
diff --git a/openpype/pipeline/workfile/build_template.py b/openpype/pipeline/workfile/build_template.py
new file mode 100644
index 0000000000..e6396578c5
--- /dev/null
+++ b/openpype/pipeline/workfile/build_template.py
@@ -0,0 +1,68 @@
+from importlib import import_module
+from openpype.lib import classes_from_module
+from openpype.host import HostBase
+from openpype.pipeline import registered_host
+
+from .abstract_template_loader import (
+    AbstractPlaceholder,
+    AbstractTemplateLoader)
+
+from .build_template_exceptions import (
+    TemplateLoadingFailed,
+    TemplateAlreadyImported,
+    MissingHostTemplateModule,
+    MissingTemplatePlaceholderClass,
+    MissingTemplateLoaderClass
+)
+
+_module_path_format = 'openpype.hosts.{host}.api.template_loader'
+
+
+def build_workfile_template(*args):
+    template_loader = build_template_loader()
+    try:
+        template_loader.import_template(template_loader.template_path)
+    except TemplateAlreadyImported as err:
+        template_loader.template_already_imported(err)
+    except TemplateLoadingFailed as err:
+        template_loader.template_loading_failed(err)
+    else:
+        template_loader.populate_template()
+
+
+def update_workfile_template(args):
+    template_loader = build_template_loader()
+    template_loader.update_missing_containers()
+
+
+def build_template_loader():
+    # TODO refactor to use advantage of 'HostBase' and don't import dynamically
+    #   - hosts should have methods that give the option to return builders
+    host = registered_host()
+    if isinstance(host, HostBase):
+        host_name = host.name
+    else:
+        host_name = host.__name__.partition('.')[2]
+    module_path = _module_path_format.format(host=host_name)
+    module = import_module(module_path)
+    if not module:
+        raise MissingHostTemplateModule(
+            "No template loader found for host {}".format(host_name))
+
+    template_loader_class = classes_from_module(
+        AbstractTemplateLoader,
+        module
+    )
+    template_placeholder_class = classes_from_module(
+        AbstractPlaceholder,
+        module
+    )
+
+    if not template_loader_class:
+        raise MissingTemplateLoaderClass()
+    template_loader_class = template_loader_class[0]
+
+    if not template_placeholder_class:
+        raise MissingTemplatePlaceholderClass()
+    template_placeholder_class = template_placeholder_class[0]
+    return template_loader_class(template_placeholder_class)
diff --git a/openpype/pipeline/workfile/build_template_exceptions.py b/openpype/pipeline/workfile/build_template_exceptions.py
new file mode 100644
index 0000000000..7a5075e3dc
--- /dev/null
+++ b/openpype/pipeline/workfile/build_template_exceptions.py
@@ -0,0 +1,35 @@
+class MissingHostTemplateModule(Exception):
+    """Error raised when expected module does not exist"""
+    pass
+
+
+class MissingTemplatePlaceholderClass(Exception):
+    """Error raised when module doesn't implement a placeholder class"""
+    pass
+
+
+class MissingTemplateLoaderClass(Exception):
+    """Error raised when module doesn't implement a template loader class"""
+    pass
+
+
+class TemplateNotFound(Exception):
+    """Exception raised when template does not exist."""
+    pass
+
+
+class TemplateProfileNotFound(Exception):
+    """Exception raised when current profile
+    doesn't match any template profile"""
+    pass
+
+
+class TemplateAlreadyImported(Exception):
+    """Error raised when Template was already imported by host for
+    this session"""
+    pass
+
+
+class TemplateLoadingFailed(Exception):
+    """Error raised when Template loader was unable to load the template"""
+    pass
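build_template_loader above imports a per-host module by a formatted dotted path and then discovers the concrete loader/placeholder classes inside it. A hedged sketch of that discovery step; find_subclasses is a simplified stand-in for OpenPype's classes_from_module, and the Maya module path is a hypothetical example host:

import inspect
from importlib import import_module

from openpype.pipeline.workfile.abstract_template_loader import (
    AbstractTemplateLoader,
)

def find_subclasses(base_class, module):
    # Collect concrete (non-abstract) subclasses defined in the module.
    return [
        obj for _, obj in inspect.getmembers(module, inspect.isclass)
        if issubclass(obj, base_class)
        and obj is not base_class
        and not inspect.isabstract(obj)
    ]

module = import_module("openpype.hosts.maya.api.template_loader")
loader_classes = find_subclasses(AbstractTemplateLoader, module)
print(loader_classes)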
diff --git a/openpype/plugins/publish/collect_otio_subset_resources.py b/openpype/plugins/publish/collect_otio_subset_resources.py
index 9c19f8a78e..3387cd1176 100644
--- a/openpype/plugins/publish/collect_otio_subset_resources.py
+++ b/openpype/plugins/publish/collect_otio_subset_resources.py
@@ -121,10 +121,8 @@ class CollectOtioSubsetResources(pyblish.api.InstancePlugin):
                 otio.schema.ImageSequenceReference
             ):
                 is_sequence = True
-            else:
-                # for OpenTimelineIO 0.12 and older
-                if metadata.get("padding"):
-                    is_sequence = True
+            elif metadata.get("padding"):
+                is_sequence = True
 
         self.log.info(
             "frame_start-frame_end: {}-{}".format(frame_start, frame_end))
diff --git a/openpype/plugins/publish/extract_review.py b/openpype/plugins/publish/extract_review.py
index 7442d3aacb..27117510b2 100644
--- a/openpype/plugins/publish/extract_review.py
+++ b/openpype/plugins/publish/extract_review.py
@@ -1210,7 +1210,6 @@ class ExtractReview(pyblish.api.InstancePlugin):
 
         # Get instance data
         pixel_aspect = temp_data["pixel_aspect"]
-
         if reformat_in_baking:
             self.log.debug((
                 "Using resolution from input. It is already "
@@ -1230,6 +1229,10 @@ class ExtractReview(pyblish.api.InstancePlugin):
         # - settings value can't have None but has value of 0
         output_width = output_def.get("width") or output_width or None
         output_height = output_def.get("height") or output_height or None
+        # Force use of input resolution if output resolution was not defined
+        #   in settings. Resolution from instance is not used when
+        #   'use_input_res' is set to 'True'.
+        use_input_res = False
 
         # Overscan color
         overscan_color_value = "black"
@@ -1241,6 +1244,17 @@ class ExtractReview(pyblish.api.InstancePlugin):
         )
         self.log.debug("Overscan color: `{}`".format(overscan_color_value))
 
+        # Scale input to have proper pixel aspect ratio
+        # - scale width by the pixel aspect ratio
+        scale_pixel_aspect = output_def.get("scale_pixel_aspect", True)
+        if scale_pixel_aspect and pixel_aspect != 1:
+            # Change input width after pixel aspect
+            input_width = int(input_width * pixel_aspect)
+            use_input_res = True
+            filters.append((
+                "scale={}x{}:flags=lanczos".format(input_width, input_height)
+            ))
+
         # Convert overscan value video filters
         overscan_crop = output_def.get("overscan_crop")
         overscan = OverscanCrop(
@@ -1251,13 +1265,10 @@ class ExtractReview(pyblish.api.InstancePlugin):
         #   resolution by its values
         if overscan_crop_filters:
             filters.extend(overscan_crop_filters)
+            # Change input resolution after overscan crop
             input_width = overscan.width()
             input_height = overscan.height()
-        # Use output resolution as inputs after cropping to skip usage of
-        #   instance data resolution
-        if output_width is None or output_height is None:
-            output_width = input_width
-            output_height = input_height
+            use_input_res = True
 
         # Make sure input width and height are not odd numbers
         input_width_is_odd = bool(input_width % 2 != 0)
@@ -1283,8 +1294,10 @@ class ExtractReview(pyblish.api.InstancePlugin):
         self.log.debug("input_width: `{}`".format(input_width))
         self.log.debug("input_height: `{}`".format(input_height))
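The scale_pixel_aspect filter added above bakes the pixel aspect ratio into the width so downstream filters can treat the image as square-pixel. A minimal sketch of the same arithmetic (the function name is illustrative, not the plugin's API):

def pixel_aspect_scale_filter(input_width, input_height, pixel_aspect):
    # Mirror of the pre-scale above: bake PAR into the width.
    scaled_width = int(input_width * pixel_aspect)
    return "scale={}x{}:flags=lanczos".format(scaled_width, input_height)

# 1920x1080 with 2.0 anamorphic pixels becomes a 3840x1080 square-pixel image.
print(pixel_aspect_scale_filter(1920, 1080, 2.0))
# scale=3840x1080:flags=lanczos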
-        # Use instance resolution if output definition has not set it.
-        if output_width is None or output_height is None:
+        # Use instance resolution if output definition has not set it
+        # - use instance resolution only if there were no scale changes
+        #   ('use_input_res') that may massively affect the output
+        if not use_input_res and (
+            output_width is None or output_height is None
+        ):
             output_width = temp_data["resolution_width"]
             output_height = temp_data["resolution_height"]
 
@@ -1326,7 +1339,6 @@ class ExtractReview(pyblish.api.InstancePlugin):
             output_width == input_width
             and output_height == input_height
             and not letter_box_enabled
-            and pixel_aspect == 1
         ):
             self.log.debug(
                 "Output resolution is same as input's"
@@ -1336,39 +1348,8 @@ class ExtractReview(pyblish.api.InstancePlugin):
             new_repre["resolutionHeight"] = input_height
             return filters
 
-        # defining image ratios
-        input_res_ratio = (
-            (float(input_width) * pixel_aspect) / input_height
-        )
-        output_res_ratio = float(output_width) / float(output_height)
-        self.log.debug("input_res_ratio: `{}`".format(input_res_ratio))
-        self.log.debug("output_res_ratio: `{}`".format(output_res_ratio))
-
-        # Round ratios to 2 decimal places for comparing
-        input_res_ratio = round(input_res_ratio, 2)
-        output_res_ratio = round(output_res_ratio, 2)
-
-        # get scale factor
-        scale_factor_by_width = (
-            float(output_width) / (input_width * pixel_aspect)
-        )
-        scale_factor_by_height = (
-            float(output_height) / input_height
-        )
-
-        self.log.debug(
-            "scale_factor_by_width: `{}`".format(scale_factor_by_width)
-        )
-        self.log.debug(
-            "scale_factor_by_height: `{}`".format(scale_factor_by_height)
-        )
-
-        # scaling non-square pixels and 1920 width
-        if (
-            input_height != output_height
-            or input_width != output_width
-            or pixel_aspect != 1
-        ):
+        if input_height != output_height or input_width != output_width:
             filters.extend([
                 (
                     "scale={}x{}"
@@ -1478,6 +1459,8 @@ class ExtractReview(pyblish.api.InstancePlugin):
         output = -1
         regexes = self.compile_list_of_regexes(in_list)
         for regex in regexes:
+            if not value:
+                continue
             if re.match(regex, value):
                 output = 1
                 break
diff --git a/openpype/plugins/publish/integrate_subset_group.py b/openpype/plugins/publish/integrate_subset_group.py
index 910cb060a6..a24ebba3a5 100644
--- a/openpype/plugins/publish/integrate_subset_group.py
+++ b/openpype/plugins/publish/integrate_subset_group.py
@@ -93,6 +93,6 @@ class IntegrateSubsetGroup(pyblish.api.InstancePlugin):
         return {
             "families": anatomy_data["family"],
             "tasks": task.get("name"),
-            "hosts": anatomy_data["app"],
+            "hosts": instance.context.data["hostName"],
             "task_types": task.get("type")
         }
diff --git a/openpype/pype_commands.py b/openpype/pype_commands.py
index a447aa916b..66bf5e9bb4 100644
--- a/openpype/pype_commands.py
+++ b/openpype/pype_commands.py
@@ -76,11 +76,6 @@ class PypeCommands:
             import (run_webserver)
         return run_webserver(*args, **kwargs)
 
-    @staticmethod
-    def launch_standalone_publisher():
-        from openpype.tools import standalonepublish
-        standalonepublish.main()
-
     @staticmethod
     def launch_traypublisher():
         from openpype.tools import traypublisher
diff --git a/openpype/settings/defaults/project_settings/ftrack.json b/openpype/settings/defaults/project_settings/ftrack.json
index 3e86581a03..2d5f889aa5 100644
--- a/openpype/settings/defaults/project_settings/ftrack.json
+++ b/openpype/settings/defaults/project_settings/ftrack.json
@@ -434,6 +434,9 @@
         "enabled": false,
         "custom_attribute_keys": []
     },
+    "IntegrateHierarchyToFtrack": {
+        "create_task_status_profiles": []
+    },
     "IntegrateFtrackNote": {
         "enabled": true,
         "note_template": "{intent}: 
{comment}", @@ -488,7 +491,11 @@ "usd": "usd" }, "keep_first_subset_name_for_review": true, - "asset_versions_status_profiles": [] + "asset_versions_status_profiles": [], + "additional_metadata_keys": [] + }, + "IntegrateFtrackFarmStatus": { + "farm_status_profiles": [] } } } \ No newline at end of file diff --git a/openpype/settings/defaults/project_settings/global.json b/openpype/settings/defaults/project_settings/global.json index e509db2791..0ff9363ba7 100644 --- a/openpype/settings/defaults/project_settings/global.json +++ b/openpype/settings/defaults/project_settings/global.json @@ -85,6 +85,7 @@ ], "width": 0, "height": 0, + "scale_pixel_aspect": true, "bg_color": [ 0, 0, diff --git a/openpype/settings/defaults/project_settings/maya.json b/openpype/settings/defaults/project_settings/maya.json index ac0f161cf2..28f6d23e4d 100644 --- a/openpype/settings/defaults/project_settings/maya.json +++ b/openpype/settings/defaults/project_settings/maya.json @@ -33,7 +33,8 @@ }, "RenderSettings": { "apply_render_settings": true, - "default_render_image_folder": "", + "default_render_image_folder": "renders", + "enable_all_lights": false, "aov_separator": "underscore", "reset_current_frame": false, "arnold_renderer": { @@ -967,6 +968,9 @@ } ] }, + "templated_workfile_build": { + "profiles": [] + }, "filters": { "preset 1": { "ValidateNoAnimation": false, @@ -976,4 +980,4 @@ "ValidateNoAnimation": false } } -} \ No newline at end of file +} diff --git a/openpype/settings/defaults/project_settings/photoshop.json b/openpype/settings/defaults/project_settings/photoshop.json index d9b7a8083f..758ac64a35 100644 --- a/openpype/settings/defaults/project_settings/photoshop.json +++ b/openpype/settings/defaults/project_settings/photoshop.json @@ -32,6 +32,7 @@ }, "ExtractReview": { "make_image_sequence": false, + "max_downscale_size": 8192, "jpg_options": { "tags": [] }, diff --git a/openpype/settings/defaults/project_settings/shotgrid.json b/openpype/settings/defaults/project_settings/shotgrid.json index 83b6f69074..774bce714b 100644 --- a/openpype/settings/defaults/project_settings/shotgrid.json +++ b/openpype/settings/defaults/project_settings/shotgrid.json @@ -19,4 +19,4 @@ "step": "step" } } -} +} \ No newline at end of file diff --git a/openpype/settings/defaults/project_settings/unreal.json b/openpype/settings/defaults/project_settings/unreal.json index dad61cd1f0..c5f5cdf719 100644 --- a/openpype/settings/defaults/project_settings/unreal.json +++ b/openpype/settings/defaults/project_settings/unreal.json @@ -1,4 +1,5 @@ { + "level_sequences_for_layouts": false, "project_setup": { "dev_mode": true } diff --git a/openpype/settings/entities/schemas/projects_schema/schema_project_ftrack.json b/openpype/settings/entities/schemas/projects_schema/schema_project_ftrack.json index c06bec0f58..5ff8b87427 100644 --- a/openpype/settings/entities/schemas/projects_schema/schema_project_ftrack.json +++ b/openpype/settings/entities/schemas/projects_schema/schema_project_ftrack.json @@ -841,6 +841,44 @@ } ] }, + { + "type": "dict", + "key": "IntegrateHierarchyToFtrack", + "label": "Integrate Hierarchy to ftrack", + "is_group": true, + "collapsible": true, + "children": [ + { + "type": "label", + "label": "Set task status on new task creation. Ftrack's default status is used otherwise." 
+            },
+            {
+                "type": "list",
+                "key": "create_task_status_profiles",
+                "object_type": {
+                    "type": "dict",
+                    "children": [
+                        {
+                            "key": "task_types",
+                            "label": "Task types",
+                            "type": "task-types-enum"
+                        },
+                        {
+                            "key": "task_names",
+                            "label": "Task names",
+                            "type": "list",
+                            "object_type": "text"
+                        },
+                        {
+                            "type": "text",
+                            "key": "status_name",
+                            "label": "Status name"
+                        }
+                    ]
+                }
+            }
+        ]
+    },
     {
         "type": "dict",
         "collapsible": true,
@@ -1001,6 +1039,82 @@
                     }
                 ]
             }
+        },
+        {
+            "key": "additional_metadata_keys",
+            "label": "Additional metadata keys on components",
+            "type": "enum",
+            "multiselection": true,
+            "enum_items": [
+                {"openpype_version": "OpenPype version"},
+                {"frame_start": "Frame start"},
+                {"frame_end": "Frame end"},
+                {"duration": "Duration"},
+                {"width": "Resolution width"},
+                {"height": "Resolution height"},
+                {"fps": "FPS"},
+                {"code": "Codec"}
+            ]
+        }
+    ]
+},
+{
+    "type": "dict",
+    "key": "IntegrateFtrackFarmStatus",
+    "label": "Integrate Ftrack Farm Status",
+    "children": [
+        {
+            "type": "label",
+            "label": "Change status of task when its subset is submitted to farm"
+        },
+        {
+            "type": "list",
+            "collapsible": true,
+            "key": "farm_status_profiles",
+            "label": "Farm status profiles",
+            "use_label_wrap": true,
+            "object_type": {
+                "type": "dict",
+                "children": [
+                    {
+                        "key": "hosts",
+                        "label": "Host names",
+                        "type": "hosts-enum",
+                        "multiselection": true
+                    },
+                    {
+                        "key": "task_types",
+                        "label": "Task types",
+                        "type": "task-types-enum"
+                    },
+                    {
+                        "key": "task_names",
+                        "label": "Task names",
+                        "type": "list",
+                        "object_type": "text"
+                    },
+                    {
+                        "key": "families",
+                        "label": "Families",
+                        "type": "list",
+                        "object_type": "text"
+                    },
+                    {
+                        "key": "subsets",
+                        "label": "Subset names",
+                        "type": "list",
+                        "object_type": "text"
+                    },
+                    {
+                        "type": "separator"
+                    },
+                    {
+                        "key": "status_name",
+                        "label": "Status name",
+                        "type": "text"
+                    }
+                ]
+            }
+        }
     ]
 }
diff --git a/openpype/settings/entities/schemas/projects_schema/schema_project_maya.json b/openpype/settings/entities/schemas/projects_schema/schema_project_maya.json
index cb380194a7..816874779e 100644
--- a/openpype/settings/entities/schemas/projects_schema/schema_project_maya.json
+++ b/openpype/settings/entities/schemas/projects_schema/schema_project_maya.json
@@ -77,6 +77,10 @@
             "type": "schema",
             "name": "schema_workfile_build"
         },
+        {
+            "type": "schema",
+            "name": "schema_templated_workfile_build"
+        },
         {
             "type": "schema",
             "name": "schema_publish_gui_filter"
diff --git a/openpype/settings/entities/schemas/projects_schema/schema_project_photoshop.json b/openpype/settings/entities/schemas/projects_schema/schema_project_photoshop.json
index badf94229b..49860301b6 100644
--- a/openpype/settings/entities/schemas/projects_schema/schema_project_photoshop.json
+++ b/openpype/settings/entities/schemas/projects_schema/schema_project_photoshop.json
@@ -186,6 +186,15 @@
             "key": "make_image_sequence",
             "label": "Makes an image sequence instead of a flattened image"
         },
+        {
+            "type": "number",
+            "key": "max_downscale_size",
+            "label": "Maximum size of sources for review",
+            "tooltip": "FFmpeg can only handle a limited resolution for creation of review and/or thumbnail",
+            "minimum": 300,
+            "maximum": 16384,
+            "decimal": 0
+        },
         {
             "type": "dict",
             "collapsible": false,
diff --git a/openpype/settings/entities/schemas/projects_schema/schema_project_unreal.json b/openpype/settings/entities/schemas/projects_schema/schema_project_unreal.json
index 4e197e9fc8..d26b5c1ccf 100644
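The farm status profiles above pair task/family filters with a target status name. A hedged sketch of how such an entry could be resolved when a subset is submitted to the farm; the plugin's real matching may check more keys (hosts, families, subsets) and differ in detail:

def resolve_status(profiles, task_type, task_name):
    for profile in profiles:
        # Empty filter list means the filter matches everything.
        if profile["task_types"] and task_type not in profile["task_types"]:
            continue
        if profile["task_names"] and task_name not in profile["task_names"]:
            continue
        return profile["status_name"]
    return None  # keep ftrack's current status

profiles = [
    {"task_types": ["Animation"], "task_names": [], "status_name": "Rendering"},
]
print(resolve_status(profiles, "Animation", "animation"))  # Rendering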
--- a/openpype/settings/entities/schemas/projects_schema/schema_project_unreal.json
+++ b/openpype/settings/entities/schemas/projects_schema/schema_project_unreal.json
@@ -5,6 +5,11 @@
     "label": "Unreal Engine",
     "is_file": true,
     "children": [
+        {
+            "type": "boolean",
+            "key": "level_sequences_for_layouts",
+            "label": "Generate level sequences when loading layouts"
+        },
         {
             "type": "dict",
             "collapsible": true,
diff --git a/openpype/settings/entities/schemas/projects_schema/schemas/schema_global_publish.json b/openpype/settings/entities/schemas/projects_schema/schemas/schema_global_publish.json
index b9d0b7daba..e1aa230b49 100644
--- a/openpype/settings/entities/schemas/projects_schema/schemas/schema_global_publish.json
+++ b/openpype/settings/entities/schemas/projects_schema/schemas/schema_global_publish.json
@@ -319,6 +319,15 @@
             "minimum": 0,
             "maximum": 100000
         },
+        {
+            "type": "label",
+            "label": "Rescale input when its pixel aspect ratio is not 1. Useful for anamorphic reviews."
+        },
+        {
+            "key": "scale_pixel_aspect",
+            "label": "Scale pixel aspect",
+            "type": "boolean"
+        },
         {
             "type": "label",
             "label": "Background color is used only when input has transparency and Alpha is higher than 0."
diff --git a/openpype/settings/entities/schemas/projects_schema/schemas/schema_maya_render_settings.json b/openpype/settings/entities/schemas/projects_schema/schemas/schema_maya_render_settings.json
index af197604f8..6ee02ca78f 100644
--- a/openpype/settings/entities/schemas/projects_schema/schemas/schema_maya_render_settings.json
+++ b/openpype/settings/entities/schemas/projects_schema/schemas/schema_maya_render_settings.json
@@ -14,6 +14,11 @@
             "key": "default_render_image_folder",
             "label": "Default render image folder"
         },
+        {
+            "type": "boolean",
+            "key": "enable_all_lights",
+            "label": "Include all lights in Render Setup Layers by default"
+        },
         {
             "key": "aov_separator",
             "label": "AOV Separator character",
diff --git a/openpype/settings/entities/schemas/projects_schema/schemas/schema_templated_workfile_build.json b/openpype/settings/entities/schemas/projects_schema/schemas/schema_templated_workfile_build.json
new file mode 100644
index 0000000000..a591facf98
--- /dev/null
+++ b/openpype/settings/entities/schemas/projects_schema/schemas/schema_templated_workfile_build.json
@@ -0,0 +1,35 @@
+{
+    "type": "dict",
+    "collapsible": true,
+    "key": "templated_workfile_build",
+    "label": "Templated Workfile Build Settings",
+    "children": [
+        {
+            "type": "list",
+            "key": "profiles",
+            "label": "Profiles",
+            "object_type": {
+                "type": "dict",
+                "children": [
+                    {
+                        "key": "task_types",
+                        "label": "Task types",
+                        "type": "task-types-enum"
+                    },
+                    {
+                        "key": "tasks",
+                        "label": "Task names",
+                        "type": "list",
+                        "object_type": "text"
+                    },
+                    {
+                        "key": "path",
+                        "label": "Path to template",
+                        "type": "text",
+                        "object_type": "text"
+                    }
+                ]
+            }
+        }
+    ]
+}
diff --git a/openpype/settings/handlers.py b/openpype/settings/handlers.py
index 15ae2351fd..79ec6248ac 100644
--- a/openpype/settings/handlers.py
+++ b/openpype/settings/handlers.py
@@ -22,6 +22,164 @@ from .constants import (
 )
 
 
+class SettingsStateInfo:
+    """Helper state information about some settings state.
+
+    Is used to hold information about last saved and last opened UI. Keeps
+    information about the time when that happened and on which machine,
+    under which user and on which OpenPype version.
+
+    To create current machine and time information use the 'create_new'
+    method.
+ """ + + timestamp_format = "%Y-%m-%d %H:%M:%S.%f" + + def __init__( + self, + openpype_version, + settings_type, + project_name, + timestamp, + hostname, + hostip, + username, + system_name, + local_id + ): + self.openpype_version = openpype_version + self.settings_type = settings_type + self.project_name = project_name + + timestamp_obj = None + if timestamp: + timestamp_obj = datetime.datetime.strptime( + timestamp, self.timestamp_format + ) + self.timestamp = timestamp + self.timestamp_obj = timestamp_obj + self.hostname = hostname + self.hostip = hostip + self.username = username + self.system_name = system_name + self.local_id = local_id + + def copy(self): + return self.from_data(self.to_data()) + + @classmethod + def create_new( + cls, openpype_version, settings_type=None, project_name=None + ): + """Create information about this machine for current time.""" + + from openpype.lib.pype_info import get_workstation_info + + now = datetime.datetime.now() + workstation_info = get_workstation_info() + + return cls( + openpype_version, + settings_type, + project_name, + now.strftime(cls.timestamp_format), + workstation_info["hostname"], + workstation_info["hostip"], + workstation_info["username"], + workstation_info["system_name"], + workstation_info["local_id"] + ) + + @classmethod + def from_data(cls, data): + """Create object from data.""" + + return cls( + data["openpype_version"], + data["settings_type"], + data["project_name"], + data["timestamp"], + data["hostname"], + data["hostip"], + data["username"], + data["system_name"], + data["local_id"] + ) + + def to_data(self): + data = self.to_document_data() + data.update({ + "openpype_version": self.openpype_version, + "settings_type": self.settings_type, + "project_name": self.project_name + }) + return data + + @classmethod + def create_new_empty(cls, openpype_version, settings_type=None): + return cls( + openpype_version, + settings_type, + None, + None, + None, + None, + None, + None, + None + ) + + @classmethod + def from_document(cls, openpype_version, settings_type, document): + document = document or {} + project_name = document.get("project_name") + last_saved_info = document.get("last_saved_info") + if last_saved_info: + copy_last_saved_info = copy.deepcopy(last_saved_info) + copy_last_saved_info.update({ + "openpype_version": openpype_version, + "settings_type": settings_type, + "project_name": project_name, + }) + return cls.from_data(copy_last_saved_info) + return cls( + openpype_version, + settings_type, + project_name, + None, + None, + None, + None, + None, + None + ) + + def to_document_data(self): + return { + "timestamp": self.timestamp, + "hostname": self.hostname, + "hostip": self.hostip, + "username": self.username, + "system_name": self.system_name, + "local_id": self.local_id, + } + + def __eq__(self, other): + if not isinstance(other, SettingsStateInfo): + return False + + if other.timestamp_obj != self.timestamp_obj: + return False + + return ( + self.openpype_version == other.openpype_version + and self.hostname == other.hostname + and self.hostip == other.hostip + and self.username == other.username + and self.system_name == other.system_name + and self.local_id == other.local_id + ) + + @six.add_metaclass(ABCMeta) class SettingsHandler: @abstractmethod @@ -226,7 +384,7 @@ class SettingsHandler: """OpenPype versions that have any studio project anatomy overrides. Returns: - list: OpenPype versions strings. + List[str]: OpenPype versions strings. 
""" pass @@ -237,7 +395,7 @@ class SettingsHandler: """OpenPype versions that have any studio project settings overrides. Returns: - list: OpenPype versions strings. + List[str]: OpenPype versions strings. """ pass @@ -251,8 +409,87 @@ class SettingsHandler: project_name(str): Name of project. Returns: - list: OpenPype versions strings. + List[str]: OpenPype versions strings. """ + + pass + + @abstractmethod + def get_system_last_saved_info(self): + """State of last system settings overrides at the moment when called. + + This method must provide most recent data so using cached data is not + the way. + + Returns: + SettingsStateInfo: Information about system settings overrides. + """ + + pass + + @abstractmethod + def get_project_last_saved_info(self, project_name): + """State of last project settings overrides at the moment when called. + + This method must provide most recent data so using cached data is not + the way. + + Args: + project_name (Union[None, str]): Project name for which state + should be returned. + + Returns: + SettingsStateInfo: Information about project settings overrides. + """ + + pass + + # UI related calls + @abstractmethod + def get_last_opened_info(self): + """Get information about last opened UI. + + Last opened UI is empty if there is noone who would have opened UI at + the moment when called. + + Returns: + Union[None, SettingsStateInfo]: Information about machine who had + opened Settings UI. + """ + + pass + + @abstractmethod + def opened_settings_ui(self): + """Callback called when settings UI is opened. + + Information about this machine must be available when + 'get_last_opened_info' is called from anywhere until + 'closed_settings_ui' is called again. + + Returns: + SettingsStateInfo: Object representing information about this + machine. Must be passed to 'closed_settings_ui' when finished. + """ + + pass + + @abstractmethod + def closed_settings_ui(self, info_obj): + """Callback called when settings UI is closed. + + From the moment this method is called the information about this + machine is removed and no more available when 'get_last_opened_info' + is called. + + Callback should validate if this machine is still stored as opened ui + before changing any value. + + Args: + info_obj (SettingsStateInfo): Object created when + 'opened_settings_ui' was called. 
+ """ + pass @@ -285,19 +522,22 @@ class CacheValues: self.data = None self.creation_time = None self.version = None + self.last_saved_info = None def data_copy(self): if not self.data: return {} return copy.deepcopy(self.data) - def update_data(self, data, version=None): + def update_data(self, data, version): self.data = data self.creation_time = datetime.datetime.now() - if version is not None: - self.version = version + self.version = version - def update_from_document(self, document, version=None): + def update_last_saved_info(self, last_saved_info): + self.last_saved_info = last_saved_info + + def update_from_document(self, document, version): data = {} if document: if "data" in document: @@ -306,9 +546,9 @@ class CacheValues: value = document["value"] if value: data = json.loads(value) + self.data = data - if version is not None: - self.version = version + self.version = version def to_json_string(self): return json.dumps(self.data or {}) @@ -320,6 +560,9 @@ class CacheValues: delta = (datetime.datetime.now() - self.creation_time).seconds return delta > self.cache_lifetime + def set_outdated(self): + self.create_time = None + class MongoSettingsHandler(SettingsHandler): """Settings handler that use mongo for storing and loading of settings.""" @@ -509,6 +752,14 @@ class MongoSettingsHandler(SettingsHandler): # Update cache self.system_settings_cache.update_data(data, self._current_version) + last_saved_info = SettingsStateInfo.create_new( + self._current_version, + SYSTEM_SETTINGS_KEY + ) + self.system_settings_cache.update_last_saved_info( + last_saved_info + ) + # Get copy of just updated cache system_settings_data = self.system_settings_cache.data_copy() @@ -517,20 +768,29 @@ class MongoSettingsHandler(SettingsHandler): system_settings_data ) - # Store system settings - self.collection.replace_one( + system_settings_doc = self.collection.find_one( { "type": self._system_settings_key, "version": self._current_version }, - { - "type": self._system_settings_key, - "data": system_settings_data, - "version": self._current_version - }, - upsert=True + {"_id": True} ) + # Store system settings + new_system_settings_doc = { + "type": self._system_settings_key, + "version": self._current_version, + "data": system_settings_data, + "last_saved_info": last_saved_info.to_document_data() + } + if not system_settings_doc: + self.collection.insert_one(new_system_settings_doc) + else: + self.collection.update_one( + {"_id": system_settings_doc["_id"]}, + {"$set": new_system_settings_doc} + ) + # Store global settings self.collection.replace_one( { @@ -562,6 +822,14 @@ class MongoSettingsHandler(SettingsHandler): data_cache = self.project_settings_cache[project_name] data_cache.update_data(overrides, self._current_version) + last_saved_info = SettingsStateInfo.create_new( + self._current_version, + PROJECT_SETTINGS_KEY, + project_name + ) + + data_cache.update_last_saved_info(last_saved_info) + self._save_project_data( project_name, self._project_settings_key, data_cache ) @@ -665,26 +933,34 @@ class MongoSettingsHandler(SettingsHandler): def _save_project_data(self, project_name, doc_type, data_cache): is_default = bool(project_name is None) - replace_filter = { + query_filter = { "type": doc_type, "is_default": is_default, "version": self._current_version } - replace_data = { + last_saved_info = data_cache.last_saved_info + new_project_settings_doc = { "type": doc_type, "data": data_cache.data, "is_default": is_default, - "version": self._current_version + "version": self._current_version, + 
"last_saved_info": last_saved_info.to_data() } if not is_default: - replace_filter["project_name"] = project_name - replace_data["project_name"] = project_name + query_filter["project_name"] = project_name + new_project_settings_doc["project_name"] = project_name - self.collection.replace_one( - replace_filter, - replace_data, - upsert=True + project_settings_doc = self.collection.find_one( + query_filter, + {"_id": True} ) + if project_settings_doc: + self.collection.update_one( + {"_id": project_settings_doc["_id"]}, + {"$set": new_project_settings_doc} + ) + else: + self.collection.insert_one(new_project_settings_doc) def _get_versions_order_doc(self, projection=None): # TODO cache @@ -1011,19 +1287,11 @@ class MongoSettingsHandler(SettingsHandler): globals_document = self.collection.find_one({ "type": GLOBAL_SETTINGS_KEY }) - document = ( - self._get_studio_system_settings_overrides_for_version() + document, version = self._get_system_settings_overrides_doc() + + last_saved_info = SettingsStateInfo.from_document( + version, SYSTEM_SETTINGS_KEY, document ) - if document is None: - document = self._find_closest_system_settings() - - version = None - if document: - if document["type"] == self._system_settings_key: - version = document["version"] - else: - version = LEGACY_SETTINGS_VERSION - merged_document = self._apply_global_settings( document, globals_document ) @@ -1031,6 +1299,9 @@ class MongoSettingsHandler(SettingsHandler): self.system_settings_cache.update_from_document( merged_document, version ) + self.system_settings_cache.update_last_saved_info( + last_saved_info + ) cache = self.system_settings_cache data = cache.data_copy() @@ -1038,24 +1309,43 @@ class MongoSettingsHandler(SettingsHandler): return data, cache.version return data + def _get_system_settings_overrides_doc(self): + document = ( + self._get_studio_system_settings_overrides_for_version() + ) + if document is None: + document = self._find_closest_system_settings() + + version = None + if document: + if document["type"] == self._system_settings_key: + version = document["version"] + else: + version = LEGACY_SETTINGS_VERSION + + return document, version + + def get_system_last_saved_info(self): + # Make sure settings are recaches + self.system_settings_cache.set_outdated() + self.get_studio_system_settings_overrides(False) + + return self.system_settings_cache.last_saved_info.copy() + def _get_project_settings_overrides(self, project_name, return_version): if self.project_settings_cache[project_name].is_outdated: - document = self._get_project_settings_overrides_for_version( + document, version = self._get_project_settings_overrides_doc( project_name ) - if document is None: - document = self._find_closest_project_settings(project_name) - - version = None - if document: - if document["type"] == self._project_settings_key: - version = document["version"] - else: - version = LEGACY_SETTINGS_VERSION - self.project_settings_cache[project_name].update_from_document( document, version ) + last_saved_info = SettingsStateInfo.from_document( + version, PROJECT_SETTINGS_KEY, document + ) + self.project_settings_cache[project_name].update_last_saved_info( + last_saved_info + ) cache = self.project_settings_cache[project_name] data = cache.data_copy() @@ -1063,6 +1353,29 @@ class MongoSettingsHandler(SettingsHandler): return data, cache.version return data + def _get_project_settings_overrides_doc(self, project_name): + document = self._get_project_settings_overrides_for_version( + project_name + ) + if document is None: + 
document = self._find_closest_project_settings(project_name) + + version = None + if document: + if document["type"] == self._project_settings_key: + version = document["version"] + else: + version = LEGACY_SETTINGS_VERSION + + return document, version + + def get_project_last_saved_info(self, project_name): + # Make sure settings are recaches + self.project_settings_cache[project_name].set_outdated() + self._get_project_settings_overrides(project_name, False) + + return self.project_settings_cache[project_name].last_saved_info.copy() + def get_studio_project_settings_overrides(self, return_version): """Studio overrides of default project settings.""" return self._get_project_settings_overrides(None, return_version) @@ -1140,6 +1453,7 @@ class MongoSettingsHandler(SettingsHandler): self.project_anatomy_cache[project_name].update_from_document( document, version ) + else: project_doc = get_project(project_name) self.project_anatomy_cache[project_name].update_data( @@ -1359,6 +1673,64 @@ class MongoSettingsHandler(SettingsHandler): return output return self._sort_versions(output) + def get_last_opened_info(self): + doc = self.collection.find_one({ + "type": "last_opened_settings_ui", + "version": self._current_version + }) or {} + info_data = doc.get("info") + if not info_data: + return None + + # Fill not available information + info_data["openpype_version"] = self._current_version + info_data["settings_type"] = None + info_data["project_name"] = None + return SettingsStateInfo.from_data(info_data) + + def opened_settings_ui(self): + doc_filter = { + "type": "last_opened_settings_ui", + "version": self._current_version + } + + opened_info = SettingsStateInfo.create_new(self._current_version) + new_doc_data = copy.deepcopy(doc_filter) + new_doc_data["info"] = opened_info.to_document_data() + + doc = self.collection.find_one( + doc_filter, + {"_id": True} + ) + if doc: + self.collection.update_one( + {"_id": doc["_id"]}, + {"$set": new_doc_data} + ) + else: + self.collection.insert_one(new_doc_data) + return opened_info + + def closed_settings_ui(self, info_obj): + doc_filter = { + "type": "last_opened_settings_ui", + "version": self._current_version + } + doc = self.collection.find_one(doc_filter) or {} + info_data = doc.get("info") + if not info_data: + return + + info_data["openpype_version"] = self._current_version + info_data["settings_type"] = None + info_data["project_name"] = None + current_info = SettingsStateInfo.from_data(info_data) + if current_info == info_obj: + self.collection.update_one( + {"_id": doc["_id"]}, + {"$set": {"info": None}} + ) + class MongoLocalSettingsHandler(LocalSettingsHandler): """Settings handler that use mongo for store and load local settings. 
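The two `get_*_last_saved_info` methods above are what the settings UI later uses to detect a concurrent save: it snapshots the state info when a session starts and compares it again before writing. A minimal sketch of that comparison, using only the module-level wrapper added to `openpype/settings/lib.py` below; the surrounding variable names are illustrative, not part of this changeset:

```python
from openpype.settings.lib import get_system_last_saved_info

# Snapshot taken when the settings session was opened.
opened_state = get_system_last_saved_info()

# ... user edits values in the UI ...

# Before saving, re-query and compare; 'SettingsStateInfo.__eq__'
# compares timestamp, OpenPype version and machine identity.
if get_system_last_saved_info() != opened_state:
    # Someone else saved in the meantime - the UI shows the
    # 'SettingsLastSavedChanged' dialog in this situation.
    print("System settings changed since this session was opened.")
```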
@@ -1405,7 +1777,7 @@ class MongoLocalSettingsHandler(LocalSettingsHandler): """ data = data or {} - self.local_settings_cache.update_data(data) + self.local_settings_cache.update_data(data, None) self.collection.replace_one( { @@ -1428,6 +1800,6 @@ class MongoLocalSettingsHandler(LocalSettingsHandler): "site_id": self.local_site_id }) - self.local_settings_cache.update_from_document(document) + self.local_settings_cache.update_from_document(document, None) return self.local_settings_cache.data_copy() diff --git a/openpype/settings/lib.py b/openpype/settings/lib.py index 6df41112c8..5eaddf6e6e 100644 --- a/openpype/settings/lib.py +++ b/openpype/settings/lib.py @@ -91,6 +91,31 @@ def calculate_changes(old_value, new_value): return changes +@require_handler +def get_system_last_saved_info(): + return _SETTINGS_HANDLER.get_system_last_saved_info() + + +@require_handler +def get_project_last_saved_info(project_name): + return _SETTINGS_HANDLER.get_project_last_saved_info(project_name) + + +@require_handler +def get_last_opened_info(): + return _SETTINGS_HANDLER.get_last_opened_info() + + +@require_handler +def opened_settings_ui(): + return _SETTINGS_HANDLER.opened_settings_ui() + + +@require_handler +def closed_settings_ui(info_obj): + return _SETTINGS_HANDLER.closed_settings_ui(info_obj) + + @require_handler def save_studio_settings(data): """Save studio overrides of system settings. diff --git a/openpype/tools/launcher/models.py b/openpype/tools/launcher/models.py index 3f899cc05e..6d40d21f96 100644 --- a/openpype/tools/launcher/models.py +++ b/openpype/tools/launcher/models.py @@ -10,6 +10,7 @@ from Qt import QtCore, QtGui import qtawesome from openpype.client import ( + get_projects, get_project, get_assets, ) @@ -527,7 +528,7 @@ class LauncherModel(QtCore.QObject): current_project = self.project_name project_names = set() project_docs_by_name = {} - for project_doc in self._dbcon.projects(only_active=True): + for project_doc in get_projects(): project_name = project_doc["name"] project_names.add(project_name) project_docs_by_name[project_name] = project_doc diff --git a/openpype/tools/libraryloader/app.py b/openpype/tools/libraryloader/app.py index 5f4d10d796..d2af1b7151 100644 --- a/openpype/tools/libraryloader/app.py +++ b/openpype/tools/libraryloader/app.py @@ -3,7 +3,7 @@ import sys from Qt import QtWidgets, QtCore, QtGui from openpype import style -from openpype.client import get_project +from openpype.client import get_projects, get_project from openpype.pipeline import AvalonMongoDB from openpype.tools.utils import lib as tools_lib from openpype.tools.loader.widgets import ( @@ -239,7 +239,7 @@ class LibraryLoaderWindow(QtWidgets.QDialog): def get_filtered_projects(self): projects = list() - for project in self.dbcon.projects(): + for project in get_projects(fields=["name", "data.library_project"]): is_library = project.get("data", {}).get("library_project", False) if ( (is_library and self.show_libraries) or diff --git a/openpype/tools/loader/model.py b/openpype/tools/loader/model.py index a5174bd804..3ce44ea6c8 100644 --- a/openpype/tools/loader/model.py +++ b/openpype/tools/loader/model.py @@ -272,15 +272,15 @@ class SubsetsModel(TreeModel, BaseRepresentationModel): # update availability on active site when version changes if self.sync_server.enabled and version_doc: - query = self._repre_per_version_pipeline( + repre_info = self.sync_server.get_repre_info_for_versions( + project_name, [version_doc["_id"]], self.active_site, self.remote_site ) - docs = 
list(self.dbcon.aggregate(query)) - if docs: - repre = docs.pop() - version_doc["data"].update(self._get_repre_dict(repre)) + if repre_info: + version_doc["data"].update( + self._get_repre_dict(repre_info[0])) self.set_version(index, version_doc) @@ -478,16 +478,16 @@ class SubsetsModel(TreeModel, BaseRepresentationModel): for _subset_id, doc in last_versions_by_subset_id.items(): version_ids.add(doc["_id"]) - query = self._repre_per_version_pipeline( + repres = self.sync_server.get_repre_info_for_versions( + project_name, list(version_ids), self.active_site, self.remote_site ) - - for doc in self.dbcon.aggregate(query): + for repre in repres: if self._doc_fetching_stop: return doc["active_provider"] = self.active_provider doc["remote_provider"] = self.remote_provider - repre_info[doc["_id"]] = doc + repre_info[repre["_id"]] = repre self._doc_payload = { "asset_docs_by_id": asset_docs_by_id, @@ -827,83 +827,6 @@ class SubsetsModel(TreeModel, BaseRepresentationModel): return data - def _repre_per_version_pipeline(self, version_ids, - active_site, remote_site): - query = [ - {"$match": {"parent": {"$in": version_ids}, - "type": "representation", - "files.sites.name": {"$exists": 1}}}, - {"$unwind": "$files"}, - {'$addFields': { - 'order_local': { - '$filter': { - 'input': '$files.sites', 'as': 'p', - 'cond': {'$eq': ['$$p.name', active_site]} - } - } - }}, - {'$addFields': { - 'order_remote': { - '$filter': { - 'input': '$files.sites', 'as': 'p', - 'cond': {'$eq': ['$$p.name', remote_site]} - } - } - }}, - {'$addFields': { - 'progress_local': {"$arrayElemAt": [{ - '$cond': [ - {'$size': "$order_local.progress"}, - "$order_local.progress", - # if exists created_dt count is as available - {'$cond': [ - {'$size': "$order_local.created_dt"}, - [1], - [0] - ]} - ]}, - 0 - ]} - }}, - {'$addFields': { - 'progress_remote': {"$arrayElemAt": [{ - '$cond': [ - {'$size': "$order_remote.progress"}, - "$order_remote.progress", - # if exists created_dt count is as available - {'$cond': [ - {'$size': "$order_remote.created_dt"}, - [1], - [0] - ]} - ]}, - 0 - ]} - }}, - {'$group': { # first group by repre - '_id': '$_id', - 'parent': {'$first': '$parent'}, - 'avail_ratio_local': { - '$first': { - '$divide': [{'$sum': "$progress_local"}, {'$sum': 1}] - } - }, - 'avail_ratio_remote': { - '$first': { - '$divide': [{'$sum': "$progress_remote"}, {'$sum': 1}] - } - } - }}, - {'$group': { # second group by parent, eg version_id - '_id': '$parent', - 'repre_count': {'$sum': 1}, # total representations - # fully available representation for site - 'avail_repre_local': {'$sum': "$avail_ratio_local"}, - 'avail_repre_remote': {'$sum': "$avail_ratio_remote"}, - }}, - ] - return query - class GroupMemberFilterProxyModel(QtCore.QSortFilterProxyModel): """Provide the feature of filtering group by the acceptance of members diff --git a/openpype/tools/loader/widgets.py b/openpype/tools/loader/widgets.py index 48c038418a..597c35e89b 100644 --- a/openpype/tools/loader/widgets.py +++ b/openpype/tools/loader/widgets.py @@ -567,12 +567,12 @@ class SubsetWidget(QtWidgets.QWidget): # Trigger project_name = self.dbcon.active_project() - subset_names_by_version_id = collections.defaultdict(set) + subset_name_by_version_id = dict() for item in items: version_id = item["version_document"]["_id"] - subset_names_by_version_id[version_id].add(item["subset"]) + subset_name_by_version_id[version_id] = item["subset"] - version_ids = set(subset_names_by_version_id.keys()) + version_ids = set(subset_name_by_version_id.keys()) repre_docs = 
get_representations( project_name, representation_names=[representation_name], @@ -584,14 +584,15 @@ class SubsetWidget(QtWidgets.QWidget): for repre_doc in repre_docs: repre_ids.append(repre_doc["_id"]) + # keep only version ids without representation with that name version_id = repre_doc["parent"] - if version_id not in version_ids: - version_ids.remove(version_id) + version_ids.discard(version_id) - for version_id in version_ids: + if version_ids: + # report versions that didn't have valid representation joined_subset_names = ", ".join([ - '"{}"'.format(subset) - for subset in subset_names_by_version_id[version_id] + '"{}"'.format(subset_name_by_version_id[version_id]) + for version_id in version_ids ]) self.echo("Subsets {} don't have representation '{}'".format( joined_subset_names, representation_name @@ -1546,6 +1547,11 @@ def _load_representations_by_loader(loader, repre_contexts, return for repre_context in repre_contexts.values(): + version_doc = repre_context["version"] + if version_doc["type"] == "hero_version": + version_name = "Hero" + else: + version_name = version_doc.get("name") try: if data_by_repre_id: _id = repre_context["representation"]["_id"] @@ -1563,7 +1569,7 @@ def _load_representations_by_loader(loader, repre_contexts, None, repre_context["representation"]["name"], repre_context["subset"]["name"], - repre_context["version"]["name"] + version_name )) except Exception as exc: @@ -1576,7 +1582,7 @@ def _load_representations_by_loader(loader, repre_contexts, formatted_traceback, repre_context["representation"]["name"], repre_context["subset"]["name"], - repre_context["version"]["name"] + version_name )) return error_info diff --git a/openpype/tools/project_manager/project_manager/model.py b/openpype/tools/project_manager/project_manager/model.py index c5bde5aaec..3aaee75698 100644 --- a/openpype/tools/project_manager/project_manager/model.py +++ b/openpype/tools/project_manager/project_manager/model.py @@ -8,6 +8,7 @@ from pymongo import UpdateOne, DeleteOne from Qt import QtCore, QtGui from openpype.client import ( + get_projects, get_project, get_assets, get_asset_ids_with_subsets, @@ -54,12 +55,8 @@ class ProjectModel(QtGui.QStandardItemModel): self._items_by_name[None] = none_project new_project_items.append(none_project) - project_docs = self.dbcon.projects( - projection={"name": 1}, - only_active=True - ) project_names = set() - for project_doc in project_docs: + for project_doc in get_projects(fields=["name"]): project_name = project_doc.get("name") if not project_name: continue diff --git a/openpype/tools/sceneinventory/window.py b/openpype/tools/sceneinventory/window.py index 054c2a2daa..463280b71c 100644 --- a/openpype/tools/sceneinventory/window.py +++ b/openpype/tools/sceneinventory/window.py @@ -4,8 +4,9 @@ import sys from Qt import QtWidgets, QtCore import qtawesome -from openpype.pipeline import legacy_io from openpype import style +from openpype.client import get_projects +from openpype.pipeline import legacy_io from openpype.tools.utils.delegates import VersionDelegate from openpype.tools.utils.lib import ( qt_app_context, @@ -195,8 +196,7 @@ def show(root=None, debug=False, parent=None, items=None): if not os.environ.get("AVALON_PROJECT"): any_project = next( - project for project in legacy_io.projects() - if project.get("active", True) is not False + project for project in get_projects() ) project_name = any_project["name"] diff --git a/openpype/tools/settings/settings/categories.py b/openpype/tools/settings/settings/categories.py index 
f42027d9e2..f4b2c13a12 100644 --- a/openpype/tools/settings/settings/categories.py +++ b/openpype/tools/settings/settings/categories.py @@ -36,6 +36,11 @@ from openpype.settings.entities.op_version_entity import ( ) from openpype.settings import SaveWarningExc +from openpype.settings.lib import ( + get_system_last_saved_info, + get_project_last_saved_info, +) +from .dialogs import SettingsLastSavedChanged, SettingsControlTaken from .widgets import ( ProjectListWidget, VersionAction @@ -115,12 +120,19 @@ class SettingsCategoryWidget(QtWidgets.QWidget): "settings to update them to you current running OpenPype version." ) - def __init__(self, user_role, parent=None): + def __init__(self, controller, parent=None): super(SettingsCategoryWidget, self).__init__(parent) - self.user_role = user_role + self._controller = controller + controller.event_system.add_callback( + "edit.mode.changed", + self._edit_mode_changed + ) self.entity = None + self._edit_mode = None + self._last_saved_info = None + self._reset_crashed = False self._state = CategoryState.Idle @@ -191,6 +203,31 @@ class SettingsCategoryWidget(QtWidgets.QWidget): ) raise TypeError("Unknown type: {}".format(label)) + def _edit_mode_changed(self, event): + self.set_edit_mode(event["edit_mode"]) + + def set_edit_mode(self, enabled): + if enabled is self._edit_mode: + return + + was_false = self._edit_mode is False + self._edit_mode = enabled + + self.save_btn.setEnabled(enabled and not self._reset_crashed) + if enabled: + tooltip = ( + "Someone else has opened settings UI." + "\nTry hit refresh to check if settings are already available." + ) + else: + tooltip = "Save settings" + + self.save_btn.setToolTip(tooltip) + + # Reset when last saved information has changed + if was_false and not self._check_last_saved_info(): + self.reset() + @property def state(self): return self._state @@ -286,7 +323,7 @@ class SettingsCategoryWidget(QtWidgets.QWidget): footer_layout = QtWidgets.QHBoxLayout(footer_widget) footer_layout.setContentsMargins(5, 5, 5, 5) - if self.user_role == "developer": + if self._controller.user_role == "developer": self._add_developer_ui(footer_layout, footer_widget) footer_layout.addWidget(empty_label, 1) @@ -434,6 +471,9 @@ class SettingsCategoryWidget(QtWidgets.QWidget): self.set_state(CategoryState.Idle) def save(self): + if not self._edit_mode: + return + if not self.items_are_valid(): return @@ -664,14 +704,16 @@ class SettingsCategoryWidget(QtWidgets.QWidget): ) def _on_reset_crash(self): + self._reset_crashed = True self.save_btn.setEnabled(False) if self.breadcrumbs_model is not None: self.breadcrumbs_model.set_entity(None) def _on_reset_success(self): + self._reset_crashed = False if not self.save_btn.isEnabled(): - self.save_btn.setEnabled(True) + self.save_btn.setEnabled(self._edit_mode) if self.breadcrumbs_model is not None: path = self.breadcrumbs_bar.path() @@ -716,7 +758,24 @@ class SettingsCategoryWidget(QtWidgets.QWidget): """Callback on any tab widget save.""" return + def _check_last_saved_info(self): + raise NotImplementedError(( + "{} does not have implemented '_check_last_saved_info'" + ).format(self.__class__.__name__)) + def _save(self): + self._controller.update_last_opened_info() + if not self._controller.opened_info: + dialog = SettingsControlTaken(self._last_saved_info, self) + dialog.exec_() + return + + if not self._check_last_saved_info(): + dialog = SettingsLastSavedChanged(self._last_saved_info, self) + dialog.exec_() + if dialog.result() == 0: + return + # Don't trigger restart if defaults 
are modified if self.is_modifying_defaults: require_restart = False @@ -775,6 +834,13 @@ class SystemWidget(SettingsCategoryWidget): self._actions = [] super(SystemWidget, self).__init__(*args, **kwargs) + def _check_last_saved_info(self): + if self.is_modifying_defaults: + return True + + last_saved_info = get_system_last_saved_info() + return self._last_saved_info == last_saved_info + def contain_category_key(self, category): if category == "system_settings": return True @@ -789,6 +855,10 @@ class SystemWidget(SettingsCategoryWidget): ) entity.on_change_callbacks.append(self._on_entity_change) self.entity = entity + last_saved_info = None + if not self.is_modifying_defaults: + last_saved_info = get_system_last_saved_info() + self._last_saved_info = last_saved_info try: if self.is_modifying_defaults: entity.set_defaults_state() @@ -822,6 +892,13 @@ class ProjectWidget(SettingsCategoryWidget): def __init__(self, *args, **kwargs): super(ProjectWidget, self).__init__(*args, **kwargs) + def _check_last_saved_info(self): + if self.is_modifying_defaults: + return True + + last_saved_info = get_project_last_saved_info(self.project_name) + return self._last_saved_info == last_saved_info + def contain_category_key(self, category): if category in ("project_settings", "project_anatomy"): return True @@ -901,6 +978,11 @@ class ProjectWidget(SettingsCategoryWidget): entity.on_change_callbacks.append(self._on_entity_change) self.project_list_widget.set_entity(entity) self.entity = entity + + last_saved_info = None + if not self.is_modifying_defaults: + last_saved_info = get_project_last_saved_info(self.project_name) + self._last_saved_info = last_saved_info try: if self.is_modifying_defaults: self.entity.set_defaults_state() diff --git a/openpype/tools/settings/settings/dialogs.py b/openpype/tools/settings/settings/dialogs.py new file mode 100644 index 0000000000..f25374a48c --- /dev/null +++ b/openpype/tools/settings/settings/dialogs.py @@ -0,0 +1,202 @@ +from Qt import QtWidgets, QtCore + +from openpype.tools.utils.delegates import pretty_date + + +class BaseInfoDialog(QtWidgets.QDialog): + width = 600 + height = 400 + + def __init__(self, message, title, info_obj, parent=None): + super(BaseInfoDialog, self).__init__(parent) + self._result = 0 + self._info_obj = info_obj + + self.setWindowTitle(title) + + message_label = QtWidgets.QLabel(message, self) + message_label.setWordWrap(True) + + separator_widget_1 = QtWidgets.QFrame(self) + separator_widget_2 = QtWidgets.QFrame(self) + for separator_widget in ( + separator_widget_1, + separator_widget_2 + ): + separator_widget.setObjectName("Separator") + separator_widget.setMinimumHeight(1) + separator_widget.setMaximumHeight(1) + + other_information = QtWidgets.QWidget(self) + other_information_layout = QtWidgets.QFormLayout(other_information) + other_information_layout.setContentsMargins(0, 0, 0, 0) + for label, value in ( + ("Username", info_obj.username), + ("Host name", info_obj.hostname), + ("Host IP", info_obj.hostip), + ("System name", info_obj.system_name), + ("Local ID", info_obj.local_id), + ): + other_information_layout.addRow( + label, + QtWidgets.QLabel(value, other_information) + ) + + timestamp_label = QtWidgets.QLabel( + pretty_date(info_obj.timestamp_obj), other_information + ) + other_information_layout.addRow("Time", timestamp_label) + + footer_widget = QtWidgets.QWidget(self) + buttons_widget = QtWidgets.QWidget(footer_widget) + + buttons_layout = QtWidgets.QHBoxLayout(buttons_widget) + buttons_layout.setContentsMargins(0, 0, 0, 0) + 
buttons = self.get_buttons(buttons_widget) + for button in buttons: + buttons_layout.addWidget(button, 1) + + footer_layout = QtWidgets.QHBoxLayout(footer_widget) + footer_layout.setContentsMargins(0, 0, 0, 0) + footer_layout.addStretch(1) + footer_layout.addWidget(buttons_widget, 0) + + layout = QtWidgets.QVBoxLayout(self) + layout.addWidget(message_label, 0) + layout.addWidget(separator_widget_1, 0) + layout.addStretch(1) + layout.addWidget(other_information, 0, QtCore.Qt.AlignHCenter) + layout.addStretch(1) + layout.addWidget(separator_widget_2, 0) + layout.addWidget(footer_widget, 0) + + timestamp_timer = QtCore.QTimer() + timestamp_timer.setInterval(1000) + timestamp_timer.timeout.connect(self._on_timestamp_timer) + + self._timestamp_label = timestamp_label + self._timestamp_timer = timestamp_timer + + def showEvent(self, event): + super(BaseInfoDialog, self).showEvent(event) + self._timestamp_timer.start() + self.resize(self.width, self.height) + + def closeEvent(self, event): + self._timestamp_timer.stop() + super(BaseInfoDialog, self).closeEvent(event) + + def _on_timestamp_timer(self): + self._timestamp_label.setText( + pretty_date(self._info_obj.timestamp_obj) + ) + + def result(self): + return self._result + + def get_buttons(self, parent): + return [] + + +class SettingsUIOpenedElsewhere(BaseInfoDialog): + def __init__(self, info_obj, parent=None): + title = "Someone else has opened Settings UI" + message = ( + "Someone else has opened Settings UI which could cause data loss." + " Please contact the person on the other side." + "
\nYou can continue in view-only mode." + " All changes in view mode will be lost." + "\n\nYou can take control which will cause that" + " all changes of settings on the other side will be lost.\n
" + ) + super(SettingsUIOpenedElsewhere, self).__init__( + message, title, info_obj, parent + ) + + def _on_take_control(self): + self._result = 1 + self.close() + + def _on_view_mode(self): + self._result = 0 + self.close() + + def get_buttons(self, parent): + take_control_btn = QtWidgets.QPushButton( + "Take control", parent + ) + view_mode_btn = QtWidgets.QPushButton( + "View only", parent + ) + + take_control_btn.clicked.connect(self._on_take_control) + view_mode_btn.clicked.connect(self._on_view_mode) + + return [ + take_control_btn, + view_mode_btn + ] + + +class SettingsLastSavedChanged(BaseInfoDialog): + width = 500 + height = 300 + + def __init__(self, info_obj, parent=None): + title = "Settings has changed" + message = ( + "Settings has changed while you had opened this settings session." + "
\n
It is recommended to refresh settings" + " and re-apply changes in the new session." + ) + super(SettingsLastSavedChanged, self).__init__( + message, title, info_obj, parent + ) + + def _on_save(self): + self._result = 1 + self.close() + + def _on_close(self): + self._result = 0 + self.close() + + def get_buttons(self, parent): + close_btn = QtWidgets.QPushButton( + "Close", parent + ) + save_btn = QtWidgets.QPushButton( + "Save anyway", parent + ) + + close_btn.clicked.connect(self._on_close) + save_btn.clicked.connect(self._on_save) + + return [ + close_btn, + save_btn + ] + + +class SettingsControlTaken(BaseInfoDialog): + width = 500 + height = 300 + + def __init__(self, info_obj, parent=None): + title = "Settings control taken" + message = ( + "Someone took control over your settings." + "
\n
It is not possible to save changes of currently" + " opened session. Copy changes you want to keep and hit refresh." + ) + super(SettingsControlTaken, self).__init__( + message, title, info_obj, parent + ) + + def _on_confirm(self): + self.close() + + def get_buttons(self, parent): + confirm_btn = QtWidgets.QPushButton("Understand", parent) + confirm_btn.clicked.connect(self._on_confirm) + return [confirm_btn] diff --git a/openpype/tools/settings/settings/widgets.py b/openpype/tools/settings/settings/widgets.py index 88d923c16a..1a4a6877b0 100644 --- a/openpype/tools/settings/settings/widgets.py +++ b/openpype/tools/settings/settings/widgets.py @@ -3,6 +3,7 @@ import uuid from Qt import QtWidgets, QtCore, QtGui import qtawesome +from openpype.client import get_projects from openpype.pipeline import AvalonMongoDB from openpype.style import get_objected_colors from openpype.tools.utils.widgets import ImageButton @@ -783,8 +784,6 @@ class ProjectModel(QtGui.QStandardItemModel): self.setColumnCount(2) - self.dbcon = None - self._only_active = only_active self._default_item = None self._items_by_name = {} @@ -828,9 +827,6 @@ class ProjectModel(QtGui.QStandardItemModel): index = self.index(index.row(), 0, index.parent()) return super(ProjectModel, self).flags(index) - def set_dbcon(self, dbcon): - self.dbcon = dbcon - def refresh(self): # Change id of versions refresh self._version_refresh_id = uuid.uuid4() @@ -846,31 +842,30 @@ class ProjectModel(QtGui.QStandardItemModel): self._default_item.setData("", PROJECT_VERSION_ROLE) project_names = set() - if self.dbcon is not None: - for project_doc in self.dbcon.projects( - projection={"name": 1, "data.active": 1}, - only_active=self._only_active - ): - project_name = project_doc["name"] - project_names.add(project_name) - if project_name in self._items_by_name: - item = self._items_by_name[project_name] - else: - item = QtGui.QStandardItem(project_name) + for project_doc in get_projects( + inactive=not self._only_active, + fields=["name", "data.active"] + ): + project_name = project_doc["name"] + project_names.add(project_name) + if project_name in self._items_by_name: + item = self._items_by_name[project_name] + else: + item = QtGui.QStandardItem(project_name) - self._items_by_name[project_name] = item - new_items.append(item) + self._items_by_name[project_name] = item + new_items.append(item) - is_active = project_doc.get("data", {}).get("active", True) - item.setData(project_name, PROJECT_NAME_ROLE) - item.setData(is_active, PROJECT_IS_ACTIVE_ROLE) - item.setData("", PROJECT_VERSION_ROLE) - item.setData(False, PROJECT_IS_SELECTED_ROLE) + is_active = project_doc.get("data", {}).get("active", True) + item.setData(project_name, PROJECT_NAME_ROLE) + item.setData(is_active, PROJECT_IS_ACTIVE_ROLE) + item.setData("", PROJECT_VERSION_ROLE) + item.setData(False, PROJECT_IS_SELECTED_ROLE) - if not is_active: - font = item.font() - font.setItalic(True) - item.setFont(font) + if not is_active: + font = item.font() + font.setItalic(True) + item.setFont(font) root_item = self.invisibleRootItem() for project_name in tuple(self._items_by_name.keys()): @@ -1067,8 +1062,6 @@ class ProjectListWidget(QtWidgets.QWidget): self.project_model = project_model self.inactive_chk = inactive_chk - self.dbcon = None - def set_entity(self, entity): self._entity = entity @@ -1211,15 +1204,6 @@ class ProjectListWidget(QtWidgets.QWidget): selected_project = index.data(PROJECT_NAME_ROLE) break - if not self.dbcon: - try: - self.dbcon = AvalonMongoDB() - self.dbcon.install() - 
except Exception: - self.dbcon = None - self.current_project = None - - self.project_model.set_dbcon(self.dbcon) self.project_model.refresh() self.project_proxy.sort(0) diff --git a/openpype/tools/settings/settings/window.py b/openpype/tools/settings/settings/window.py index 22778e4a5b..77a2f64dac 100644 --- a/openpype/tools/settings/settings/window.py +++ b/openpype/tools/settings/settings/window.py @@ -1,4 +1,18 @@ from Qt import QtWidgets, QtGui, QtCore + +from openpype import style + +from openpype.lib import is_admin_password_required +from openpype.lib.events import EventSystem +from openpype.widgets import PasswordDialog + +from openpype.settings.lib import ( + get_last_opened_info, + opened_settings_ui, + closed_settings_ui, +) + +from .dialogs import SettingsUIOpenedElsewhere from .categories import ( CategoryState, SystemWidget, @@ -10,10 +24,80 @@ from .widgets import ( SettingsTabWidget ) from .search_dialog import SearchEntitiesDialog -from openpype import style -from openpype.lib import is_admin_password_required -from openpype.widgets import PasswordDialog + +class SettingsController: + """Controller for settings tools. + + Added when tool was finished for checks of last opened in settings + categories and being able communicated with main widget logic. + """ + + def __init__(self, user_role): + self._user_role = user_role + self._event_system = EventSystem() + + self._opened_info = None + self._last_opened_info = None + self._edit_mode = None + + @property + def user_role(self): + return self._user_role + + @property + def event_system(self): + return self._event_system + + @property + def opened_info(self): + return self._opened_info + + @property + def last_opened_info(self): + return self._last_opened_info + + @property + def edit_mode(self): + return self._edit_mode + + def ui_closed(self): + if self._opened_info is not None: + closed_settings_ui(self._opened_info) + + self._opened_info = None + self._edit_mode = None + + def set_edit_mode(self, enabled): + if self._edit_mode is enabled: + return + + opened_info = None + if enabled: + opened_info = opened_settings_ui() + self._last_opened_info = opened_info + + self._opened_info = opened_info + self._edit_mode = enabled + + self.event_system.emit( + "edit.mode.changed", + {"edit_mode": enabled}, + "controller" + ) + + def update_last_opened_info(self): + last_opened_info = get_last_opened_info() + enabled = False + if ( + last_opened_info is None + or self._opened_info == last_opened_info + ): + enabled = True + + self._last_opened_info = last_opened_info + + self.set_edit_mode(enabled) class MainWidget(QtWidgets.QWidget): @@ -21,17 +105,25 @@ class MainWidget(QtWidgets.QWidget): widget_width = 1000 widget_height = 600 + window_title = "OpenPype Settings" def __init__(self, user_role, parent=None, reset_on_show=True): super(MainWidget, self).__init__(parent) + controller = SettingsController(user_role) + + # Object referencing to this machine and time when UI was opened + # - is used on close event + self._main_reset = False + self._controller = controller + self._user_passed = False self._reset_on_show = reset_on_show self._password_dialog = None self.setObjectName("SettingsMainWidget") - self.setWindowTitle("OpenPype Settings") + self.setWindowTitle(self.window_title) self.resize(self.widget_width, self.widget_height) @@ -41,8 +133,8 @@ class MainWidget(QtWidgets.QWidget): header_tab_widget = SettingsTabWidget(parent=self) - studio_widget = SystemWidget(user_role, header_tab_widget) - project_widget = 
ProjectWidget(user_role, header_tab_widget) + studio_widget = SystemWidget(controller, header_tab_widget) + project_widget = ProjectWidget(controller, header_tab_widget) tab_widgets = [ studio_widget, @@ -64,6 +156,11 @@ class MainWidget(QtWidgets.QWidget): self._shadow_widget = ShadowWidget("Working...", self) self._shadow_widget.setVisible(False) + controller.event_system.add_callback( + "edit.mode.changed", + self._edit_mode_changed + ) + header_tab_widget.currentChanged.connect(self._on_tab_changed) search_dialog.path_clicked.connect(self._on_search_path_clicked) @@ -74,7 +171,7 @@ class MainWidget(QtWidgets.QWidget): self._on_restart_required ) tab_widget.reset_started.connect(self._on_reset_started) - tab_widget.reset_started.connect(self._on_reset_finished) + tab_widget.reset_finished.connect(self._on_reset_finished) tab_widget.full_path_requested.connect(self._on_full_path_request) header_tab_widget.context_menu_requested.connect( @@ -131,11 +228,31 @@ class MainWidget(QtWidgets.QWidget): def showEvent(self, event): super(MainWidget, self).showEvent(event) + if self._reset_on_show: self._reset_on_show = False # Trigger reset with 100ms delay QtCore.QTimer.singleShot(100, self.reset) + def closeEvent(self, event): + self._controller.ui_closed() + + super(MainWidget, self).closeEvent(event) + + def _check_on_reset(self): + self._controller.update_last_opened_info() + if self._controller.edit_mode: + return + + # if self._edit_mode is False: + # return + + dialog = SettingsUIOpenedElsewhere( + self._controller.last_opened_info, self + ) + dialog.exec_() + self._controller.set_edit_mode(dialog.result() == 1) + def _show_password_dialog(self): if self._password_dialog: self._password_dialog.open() @@ -176,8 +293,11 @@ class MainWidget(QtWidgets.QWidget): if self._reset_on_show: self._reset_on_show = False + self._main_reset = True for tab_widget in self.tab_widgets: tab_widget.reset() + self._main_reset = False + self._check_on_reset() def _update_search_dialog(self, clear=False): if self._search_dialog.isVisible(): @@ -187,6 +307,12 @@ class MainWidget(QtWidgets.QWidget): entity = widget.entity self._search_dialog.set_root_entity(entity) + def _edit_mode_changed(self, event): + title = self.window_title + if not event["edit_mode"]: + title += " [View only]" + self.setWindowTitle(title) + def _on_tab_changed(self): self._update_search_dialog() @@ -221,6 +347,9 @@ class MainWidget(QtWidgets.QWidget): if current_widget is widget: self._update_search_dialog() + if not self._main_reset: + self._check_on_reset() + def keyPressEvent(self, event): if event.matches(QtGui.QKeySequence.Find): # todo: search in all widgets (or in active)? 
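Taken together, the `SettingsController` and the dialog classes above implement a cooperative single-editor lock: `opened_settings_ui()` registers this machine in the `last_opened_settings_ui` document, `get_last_opened_info()` lets other instances see that the UI is taken, and `closed_settings_ui()` releases the record. A minimal sketch of that lifecycle using the module-level functions from `openpype.settings.lib`; the helper function and the surrounding flow are illustrative only:

```python
from openpype.settings.lib import (
    get_last_opened_info,
    opened_settings_ui,
    closed_settings_ui,
)


def try_acquire_edit_mode():
    """Return state info when edit mode was acquired, otherwise None."""

    if get_last_opened_info() is not None:
        # Another machine has the settings UI opened; the UI offers
        # view-only mode (or "Take control") in this case.
        return None
    # Register this machine as the current editor. The returned object
    # must be handed back to 'closed_settings_ui' when finished.
    return opened_settings_ui()


info_obj = try_acquire_edit_mode()
if info_obj is not None:
    try:
        pass  # ... edit and save settings ...
    finally:
        closed_settings_ui(info_obj)
```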
diff --git a/openpype/tools/standalonepublish/widgets/widget_asset.py b/openpype/tools/standalonepublish/widgets/widget_asset.py index 73114f7960..77d756a606 100644 --- a/openpype/tools/standalonepublish/widgets/widget_asset.py +++ b/openpype/tools/standalonepublish/widgets/widget_asset.py @@ -3,6 +3,7 @@ from Qt import QtWidgets, QtCore import qtawesome from openpype.client import ( + get_projects, get_project, get_asset_by_id, ) @@ -291,9 +292,7 @@ class AssetWidget(QtWidgets.QWidget): def _set_projects(self): project_names = list() - for doc in self.dbcon.projects(projection={"name": 1}, - only_active=True): - + for doc in get_projects(fields=["name"]): project_name = doc.get("name") if project_name: project_names.append(project_name) @@ -320,8 +319,7 @@ class AssetWidget(QtWidgets.QWidget): def on_project_change(self): projects = list() - for project in self.dbcon.projects(projection={"name": 1}, - only_active=True): + for project in get_projects(fields=["name"]): projects.append(project['name']) project_name = self.combo_projects.currentText() if project_name in projects: diff --git a/openpype/tools/tray/pype_tray.py b/openpype/tools/tray/pype_tray.py index 4e5db06a92..85bc00ead6 100644 --- a/openpype/tools/tray/pype_tray.py +++ b/openpype/tools/tray/pype_tray.py @@ -10,19 +10,19 @@ from Qt import QtCore, QtGui, QtWidgets import openpype.version from openpype.api import ( - Logger, resources, get_system_settings ) -from openpype.lib import ( - get_openpype_execute_args, +from openpype.lib import get_openpype_execute_args, Logger +from openpype.lib.openpype_version import ( op_version_control_available, + get_expected_version, + get_installed_version, is_current_version_studio_latest, is_current_version_higher_than_expected, is_running_from_build, is_running_staging, - get_expected_version, - get_openpype_version + get_openpype_version, ) from openpype.modules import TrayModulesManager from openpype import style @@ -329,6 +329,25 @@ class TrayManager: self._version_dialog.close() return + installed_version = get_installed_version() + expected_version = get_expected_version() + + # Request new build if is needed + if ( + # Backwards compatibility + not hasattr(expected_version, "is_compatible") + or not expected_version.is_compatible(installed_version) + ): + if ( + self._version_dialog is not None + and self._version_dialog.isVisible() + ): + self._version_dialog.close() + + dialog = BuildVersionDialog() + dialog.exec_() + return + if self._version_dialog is None: self._version_dialog = VersionUpdateDialog() self._version_dialog.restart_requested.connect( @@ -338,7 +357,6 @@ class TrayManager: self._outdated_version_ignored ) - expected_version = get_expected_version() current_version = get_openpype_version() current_is_higher = is_current_version_higher_than_expected() diff --git a/openpype/tools/utils/lib.py b/openpype/tools/utils/lib.py index 2169cf8ef1..99d8c75ab4 100644 --- a/openpype/tools/utils/lib.py +++ b/openpype/tools/utils/lib.py @@ -443,10 +443,6 @@ class FamilyConfigCache: if profiles: # Make sure connection is installed # - accessing attribute which does not have auto-install - self.dbcon.install() - database = getattr(self.dbcon, "database", None) - if database is None: - database = self.dbcon._database asset_doc = get_asset_by_name( project_name, asset_name, fields=["data.tasks"] ) or {} diff --git a/openpype/tools/utils/models.py b/openpype/tools/utils/models.py index 8991614fe1..1faccef4dd 100644 --- a/openpype/tools/utils/models.py +++ 
b/openpype/tools/utils/models.py @@ -3,6 +3,7 @@ import logging import Qt from Qt import QtCore, QtGui +from openpype.client import get_projects from .constants import ( PROJECT_IS_ACTIVE_ROLE, PROJECT_NAME_ROLE, @@ -296,29 +297,29 @@ class ProjectModel(QtGui.QStandardItemModel): self._default_item = item project_names = set() - if self.dbcon is not None: - for project_doc in self.dbcon.projects( - projection={"name": 1, "data.active": 1}, - only_active=self._only_active - ): - project_name = project_doc["name"] - project_names.add(project_name) - if project_name in self._items_by_name: - item = self._items_by_name[project_name] - else: - item = QtGui.QStandardItem(project_name) + project_docs = get_projects( + inactive=not self._only_active, + fields=["name", "data.active"] + ) + for project_doc in project_docs: + project_name = project_doc["name"] + project_names.add(project_name) + if project_name in self._items_by_name: + item = self._items_by_name[project_name] + else: + item = QtGui.QStandardItem(project_name) - self._items_by_name[project_name] = item - new_items.append(item) + self._items_by_name[project_name] = item + new_items.append(item) - is_active = project_doc.get("data", {}).get("active", True) - item.setData(project_name, PROJECT_NAME_ROLE) - item.setData(is_active, PROJECT_IS_ACTIVE_ROLE) + is_active = project_doc.get("data", {}).get("active", True) + item.setData(project_name, PROJECT_NAME_ROLE) + item.setData(is_active, PROJECT_IS_ACTIVE_ROLE) - if not is_active: - font = item.font() - font.setItalic(True) - item.setFont(font) + if not is_active: + font = item.font() + font.setItalic(True) + item.setFont(font) root_item = self.invisibleRootItem() for project_name in tuple(self._items_by_name.keys()): diff --git a/openpype/tools/workfiles/model.py b/openpype/tools/workfiles/model.py index d5b7cef339..9a7fd659a9 100644 --- a/openpype/tools/workfiles/model.py +++ b/openpype/tools/workfiles/model.py @@ -299,7 +299,6 @@ class PublishFilesModel(QtGui.QStandardItemModel): self.project_name, asset_ids=[self._asset_id], fields=["_id", "name"] - ) subset_ids = [subset_doc["_id"] for subset_doc in subset_docs] @@ -329,7 +328,9 @@ class PublishFilesModel(QtGui.QStandardItemModel): # extension extensions = [ext.replace(".", "") for ext in self._file_extensions] repre_docs = get_representations( - self.project_name, version_ids, extensions + self.project_name, + version_ids=version_ids, + context_filters={"ext": extensions} ) # Filter queried representations by task name if task is set diff --git a/openpype/vendor/python/python_2/attr/__init__.py b/openpype/vendor/python/python_2/attr/__init__.py new file mode 100644 index 0000000000..f95c96dd57 --- /dev/null +++ b/openpype/vendor/python/python_2/attr/__init__.py @@ -0,0 +1,80 @@ +# SPDX-License-Identifier: MIT + +from __future__ import absolute_import, division, print_function + +import sys + +from functools import partial + +from . 
import converters, exceptions, filters, setters, validators +from ._cmp import cmp_using +from ._config import get_run_validators, set_run_validators +from ._funcs import asdict, assoc, astuple, evolve, has, resolve_types +from ._make import ( + NOTHING, + Attribute, + Factory, + attrib, + attrs, + fields, + fields_dict, + make_class, + validate, +) +from ._version_info import VersionInfo + + +__version__ = "21.4.0" +__version_info__ = VersionInfo._from_version_string(__version__) + +__title__ = "attrs" +__description__ = "Classes Without Boilerplate" +__url__ = "https://www.attrs.org/" +__uri__ = __url__ +__doc__ = __description__ + " <" + __uri__ + ">" + +__author__ = "Hynek Schlawack" +__email__ = "hs@ox.cx" + +__license__ = "MIT" +__copyright__ = "Copyright (c) 2015 Hynek Schlawack" + + +s = attributes = attrs +ib = attr = attrib +dataclass = partial(attrs, auto_attribs=True) # happy Easter ;) + +__all__ = [ + "Attribute", + "Factory", + "NOTHING", + "asdict", + "assoc", + "astuple", + "attr", + "attrib", + "attributes", + "attrs", + "cmp_using", + "converters", + "evolve", + "exceptions", + "fields", + "fields_dict", + "filters", + "get_run_validators", + "has", + "ib", + "make_class", + "resolve_types", + "s", + "set_run_validators", + "setters", + "validate", + "validators", +] + +if sys.version_info[:2] >= (3, 6): + from ._next_gen import define, field, frozen, mutable # noqa: F401 + + __all__.extend(("define", "field", "frozen", "mutable")) diff --git a/openpype/vendor/python/python_2/attr/__init__.pyi b/openpype/vendor/python/python_2/attr/__init__.pyi new file mode 100644 index 0000000000..c0a2126503 --- /dev/null +++ b/openpype/vendor/python/python_2/attr/__init__.pyi @@ -0,0 +1,484 @@ +import sys + +from typing import ( + Any, + Callable, + Dict, + Generic, + List, + Mapping, + Optional, + Sequence, + Tuple, + Type, + TypeVar, + Union, + overload, +) + +# `import X as X` is required to make these public +from . import converters as converters +from . import exceptions as exceptions +from . import filters as filters +from . import setters as setters +from . import validators as validators +from ._version_info import VersionInfo + +__version__: str +__version_info__: VersionInfo +__title__: str +__description__: str +__url__: str +__uri__: str +__author__: str +__email__: str +__license__: str +__copyright__: str + +_T = TypeVar("_T") +_C = TypeVar("_C", bound=type) + +_EqOrderType = Union[bool, Callable[[Any], Any]] +_ValidatorType = Callable[[Any, Attribute[_T], _T], Any] +_ConverterType = Callable[[Any], Any] +_FilterType = Callable[[Attribute[_T], _T], bool] +_ReprType = Callable[[Any], str] +_ReprArgType = Union[bool, _ReprType] +_OnSetAttrType = Callable[[Any, Attribute[Any], Any], Any] +_OnSetAttrArgType = Union[ + _OnSetAttrType, List[_OnSetAttrType], setters._NoOpType +] +_FieldTransformer = Callable[ + [type, List[Attribute[Any]]], List[Attribute[Any]] +] +_CompareWithType = Callable[[Any, Any], bool] +# FIXME: in reality, if multiple validators are passed they must be in a list +# or tuple, but those are invariant and so would prevent subtypes of +# _ValidatorType from working when passed in a list or tuple. +_ValidatorArgType = Union[_ValidatorType[_T], Sequence[_ValidatorType[_T]]] + +# _make -- + +NOTHING: object + +# NOTE: Factory lies about its return type to make this possible: +# `x: List[int] # = Factory(list)` +# Work around mypy issue #4554 in the common case by using an overload. 
+if sys.version_info >= (3, 8): + from typing import Literal + @overload + def Factory(factory: Callable[[], _T]) -> _T: ... + @overload + def Factory( + factory: Callable[[Any], _T], + takes_self: Literal[True], + ) -> _T: ... + @overload + def Factory( + factory: Callable[[], _T], + takes_self: Literal[False], + ) -> _T: ... + +else: + @overload + def Factory(factory: Callable[[], _T]) -> _T: ... + @overload + def Factory( + factory: Union[Callable[[Any], _T], Callable[[], _T]], + takes_self: bool = ..., + ) -> _T: ... + +# Static type inference support via __dataclass_transform__ implemented as per: +# https://github.com/microsoft/pyright/blob/1.1.135/specs/dataclass_transforms.md +# This annotation must be applied to all overloads of "define" and "attrs" +# +# NOTE: This is a typing construct and does not exist at runtime. Extensions +# wrapping attrs decorators should declare a separate __dataclass_transform__ +# signature in the extension module using the specification linked above to +# provide pyright support. +def __dataclass_transform__( + *, + eq_default: bool = True, + order_default: bool = False, + kw_only_default: bool = False, + field_descriptors: Tuple[Union[type, Callable[..., Any]], ...] = (()), +) -> Callable[[_T], _T]: ... + +class Attribute(Generic[_T]): + name: str + default: Optional[_T] + validator: Optional[_ValidatorType[_T]] + repr: _ReprArgType + cmp: _EqOrderType + eq: _EqOrderType + order: _EqOrderType + hash: Optional[bool] + init: bool + converter: Optional[_ConverterType] + metadata: Dict[Any, Any] + type: Optional[Type[_T]] + kw_only: bool + on_setattr: _OnSetAttrType + def evolve(self, **changes: Any) -> "Attribute[Any]": ... + +# NOTE: We had several choices for the annotation to use for type arg: +# 1) Type[_T] +# - Pros: Handles simple cases correctly +# - Cons: Might produce less informative errors in the case of conflicting +# TypeVars e.g. `attr.ib(default='bad', type=int)` +# 2) Callable[..., _T] +# - Pros: Better error messages than #1 for conflicting TypeVars +# - Cons: Terrible error messages for validator checks. +# e.g. attr.ib(type=int, validator=validate_str) +# -> error: Cannot infer function type argument +# 3) type (and do all of the work in the mypy plugin) +# - Pros: Simple here, and we could customize the plugin with our own errors. +# - Cons: Would need to write mypy plugin code to handle all the cases. +# We chose option #1. + +# `attr` lies about its return type to make the following possible: +# attr() -> Any +# attr(8) -> int +# attr(validator=) -> Whatever the callable expects. +# This makes this type of assignments possible: +# x: int = attr(8) +# +# This form catches explicit None or no default but with no other arguments +# returns Any. +@overload +def attrib( + default: None = ..., + validator: None = ..., + repr: _ReprArgType = ..., + cmp: Optional[_EqOrderType] = ..., + hash: Optional[bool] = ..., + init: bool = ..., + metadata: Optional[Mapping[Any, Any]] = ..., + type: None = ..., + converter: None = ..., + factory: None = ..., + kw_only: bool = ..., + eq: Optional[_EqOrderType] = ..., + order: Optional[_EqOrderType] = ..., + on_setattr: Optional[_OnSetAttrArgType] = ..., +) -> Any: ... + +# This form catches an explicit None or no default and infers the type from the +# other arguments. 
+@overload +def attrib( + default: None = ..., + validator: Optional[_ValidatorArgType[_T]] = ..., + repr: _ReprArgType = ..., + cmp: Optional[_EqOrderType] = ..., + hash: Optional[bool] = ..., + init: bool = ..., + metadata: Optional[Mapping[Any, Any]] = ..., + type: Optional[Type[_T]] = ..., + converter: Optional[_ConverterType] = ..., + factory: Optional[Callable[[], _T]] = ..., + kw_only: bool = ..., + eq: Optional[_EqOrderType] = ..., + order: Optional[_EqOrderType] = ..., + on_setattr: Optional[_OnSetAttrArgType] = ..., +) -> _T: ... + +# This form catches an explicit default argument. +@overload +def attrib( + default: _T, + validator: Optional[_ValidatorArgType[_T]] = ..., + repr: _ReprArgType = ..., + cmp: Optional[_EqOrderType] = ..., + hash: Optional[bool] = ..., + init: bool = ..., + metadata: Optional[Mapping[Any, Any]] = ..., + type: Optional[Type[_T]] = ..., + converter: Optional[_ConverterType] = ..., + factory: Optional[Callable[[], _T]] = ..., + kw_only: bool = ..., + eq: Optional[_EqOrderType] = ..., + order: Optional[_EqOrderType] = ..., + on_setattr: Optional[_OnSetAttrArgType] = ..., +) -> _T: ... + +# This form covers type=non-Type: e.g. forward references (str), Any +@overload +def attrib( + default: Optional[_T] = ..., + validator: Optional[_ValidatorArgType[_T]] = ..., + repr: _ReprArgType = ..., + cmp: Optional[_EqOrderType] = ..., + hash: Optional[bool] = ..., + init: bool = ..., + metadata: Optional[Mapping[Any, Any]] = ..., + type: object = ..., + converter: Optional[_ConverterType] = ..., + factory: Optional[Callable[[], _T]] = ..., + kw_only: bool = ..., + eq: Optional[_EqOrderType] = ..., + order: Optional[_EqOrderType] = ..., + on_setattr: Optional[_OnSetAttrArgType] = ..., +) -> Any: ... +@overload +def field( + *, + default: None = ..., + validator: None = ..., + repr: _ReprArgType = ..., + hash: Optional[bool] = ..., + init: bool = ..., + metadata: Optional[Mapping[Any, Any]] = ..., + converter: None = ..., + factory: None = ..., + kw_only: bool = ..., + eq: Optional[bool] = ..., + order: Optional[bool] = ..., + on_setattr: Optional[_OnSetAttrArgType] = ..., +) -> Any: ... + +# This form catches an explicit None or no default and infers the type from the +# other arguments. +@overload +def field( + *, + default: None = ..., + validator: Optional[_ValidatorArgType[_T]] = ..., + repr: _ReprArgType = ..., + hash: Optional[bool] = ..., + init: bool = ..., + metadata: Optional[Mapping[Any, Any]] = ..., + converter: Optional[_ConverterType] = ..., + factory: Optional[Callable[[], _T]] = ..., + kw_only: bool = ..., + eq: Optional[_EqOrderType] = ..., + order: Optional[_EqOrderType] = ..., + on_setattr: Optional[_OnSetAttrArgType] = ..., +) -> _T: ... + +# This form catches an explicit default argument. +@overload +def field( + *, + default: _T, + validator: Optional[_ValidatorArgType[_T]] = ..., + repr: _ReprArgType = ..., + hash: Optional[bool] = ..., + init: bool = ..., + metadata: Optional[Mapping[Any, Any]] = ..., + converter: Optional[_ConverterType] = ..., + factory: Optional[Callable[[], _T]] = ..., + kw_only: bool = ..., + eq: Optional[_EqOrderType] = ..., + order: Optional[_EqOrderType] = ..., + on_setattr: Optional[_OnSetAttrArgType] = ..., +) -> _T: ... + +# This form covers type=non-Type: e.g. 
forward references (str), Any +@overload +def field( + *, + default: Optional[_T] = ..., + validator: Optional[_ValidatorArgType[_T]] = ..., + repr: _ReprArgType = ..., + hash: Optional[bool] = ..., + init: bool = ..., + metadata: Optional[Mapping[Any, Any]] = ..., + converter: Optional[_ConverterType] = ..., + factory: Optional[Callable[[], _T]] = ..., + kw_only: bool = ..., + eq: Optional[_EqOrderType] = ..., + order: Optional[_EqOrderType] = ..., + on_setattr: Optional[_OnSetAttrArgType] = ..., +) -> Any: ... +@overload +@__dataclass_transform__(order_default=True, field_descriptors=(attrib, field)) +def attrs( + maybe_cls: _C, + these: Optional[Dict[str, Any]] = ..., + repr_ns: Optional[str] = ..., + repr: bool = ..., + cmp: Optional[_EqOrderType] = ..., + hash: Optional[bool] = ..., + init: bool = ..., + slots: bool = ..., + frozen: bool = ..., + weakref_slot: bool = ..., + str: bool = ..., + auto_attribs: bool = ..., + kw_only: bool = ..., + cache_hash: bool = ..., + auto_exc: bool = ..., + eq: Optional[_EqOrderType] = ..., + order: Optional[_EqOrderType] = ..., + auto_detect: bool = ..., + collect_by_mro: bool = ..., + getstate_setstate: Optional[bool] = ..., + on_setattr: Optional[_OnSetAttrArgType] = ..., + field_transformer: Optional[_FieldTransformer] = ..., + match_args: bool = ..., +) -> _C: ... +@overload +@__dataclass_transform__(order_default=True, field_descriptors=(attrib, field)) +def attrs( + maybe_cls: None = ..., + these: Optional[Dict[str, Any]] = ..., + repr_ns: Optional[str] = ..., + repr: bool = ..., + cmp: Optional[_EqOrderType] = ..., + hash: Optional[bool] = ..., + init: bool = ..., + slots: bool = ..., + frozen: bool = ..., + weakref_slot: bool = ..., + str: bool = ..., + auto_attribs: bool = ..., + kw_only: bool = ..., + cache_hash: bool = ..., + auto_exc: bool = ..., + eq: Optional[_EqOrderType] = ..., + order: Optional[_EqOrderType] = ..., + auto_detect: bool = ..., + collect_by_mro: bool = ..., + getstate_setstate: Optional[bool] = ..., + on_setattr: Optional[_OnSetAttrArgType] = ..., + field_transformer: Optional[_FieldTransformer] = ..., + match_args: bool = ..., +) -> Callable[[_C], _C]: ... +@overload +@__dataclass_transform__(field_descriptors=(attrib, field)) +def define( + maybe_cls: _C, + *, + these: Optional[Dict[str, Any]] = ..., + repr: bool = ..., + hash: Optional[bool] = ..., + init: bool = ..., + slots: bool = ..., + frozen: bool = ..., + weakref_slot: bool = ..., + str: bool = ..., + auto_attribs: bool = ..., + kw_only: bool = ..., + cache_hash: bool = ..., + auto_exc: bool = ..., + eq: Optional[bool] = ..., + order: Optional[bool] = ..., + auto_detect: bool = ..., + getstate_setstate: Optional[bool] = ..., + on_setattr: Optional[_OnSetAttrArgType] = ..., + field_transformer: Optional[_FieldTransformer] = ..., + match_args: bool = ..., +) -> _C: ... 
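# --- Illustrative aside (not part of the vendored stub) ---------------------
# The paired overloads of ``attrs``/``define`` model both decorator
# spellings: the bare form receives the class directly (maybe_cls: _C),
# while the called form returns a Callable[[_C], _C]. Class names are
# hypothetical examples.

import attr

@attr.s  # bare form: the class itself is passed as maybe_cls
class Plain(object):
    x = attr.ib(default=0)

@attr.s(frozen=True)  # called form: returns a decorator
class Locked(object):
    y = attr.ib(default=1)

assert Plain().x == 0 and Locked().y == 1
# -----------------------------------------------------------------------------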
+@overload +@__dataclass_transform__(field_descriptors=(attrib, field)) +def define( + maybe_cls: None = ..., + *, + these: Optional[Dict[str, Any]] = ..., + repr: bool = ..., + hash: Optional[bool] = ..., + init: bool = ..., + slots: bool = ..., + frozen: bool = ..., + weakref_slot: bool = ..., + str: bool = ..., + auto_attribs: bool = ..., + kw_only: bool = ..., + cache_hash: bool = ..., + auto_exc: bool = ..., + eq: Optional[bool] = ..., + order: Optional[bool] = ..., + auto_detect: bool = ..., + getstate_setstate: Optional[bool] = ..., + on_setattr: Optional[_OnSetAttrArgType] = ..., + field_transformer: Optional[_FieldTransformer] = ..., + match_args: bool = ..., +) -> Callable[[_C], _C]: ... + +mutable = define +frozen = define # they differ only in their defaults + +# TODO: add support for returning NamedTuple from the mypy plugin +class _Fields(Tuple[Attribute[Any], ...]): + def __getattr__(self, name: str) -> Attribute[Any]: ... + +def fields(cls: type) -> _Fields: ... +def fields_dict(cls: type) -> Dict[str, Attribute[Any]]: ... +def validate(inst: Any) -> None: ... +def resolve_types( + cls: _C, + globalns: Optional[Dict[str, Any]] = ..., + localns: Optional[Dict[str, Any]] = ..., + attribs: Optional[List[Attribute[Any]]] = ..., +) -> _C: ... + +# TODO: add support for returning a proper attrs class from the mypy plugin +# we use Any instead of _CountingAttr so that e.g. `make_class('Foo', +# [attr.ib()])` is valid +def make_class( + name: str, + attrs: Union[List[str], Tuple[str, ...], Dict[str, Any]], + bases: Tuple[type, ...] = ..., + repr_ns: Optional[str] = ..., + repr: bool = ..., + cmp: Optional[_EqOrderType] = ..., + hash: Optional[bool] = ..., + init: bool = ..., + slots: bool = ..., + frozen: bool = ..., + weakref_slot: bool = ..., + str: bool = ..., + auto_attribs: bool = ..., + kw_only: bool = ..., + cache_hash: bool = ..., + auto_exc: bool = ..., + eq: Optional[_EqOrderType] = ..., + order: Optional[_EqOrderType] = ..., + collect_by_mro: bool = ..., + on_setattr: Optional[_OnSetAttrArgType] = ..., + field_transformer: Optional[_FieldTransformer] = ..., +) -> type: ... + +# _funcs -- + +# TODO: add support for returning TypedDict from the mypy plugin +# FIXME: asdict/astuple do not honor their factory args. Waiting on one of +# these: +# https://github.com/python/mypy/issues/4236 +# https://github.com/python/typing/issues/253 +# XXX: remember to fix attrs.asdict/astuple too! +def asdict( + inst: Any, + recurse: bool = ..., + filter: Optional[_FilterType[Any]] = ..., + dict_factory: Type[Mapping[Any, Any]] = ..., + retain_collection_types: bool = ..., + value_serializer: Optional[ + Callable[[type, Attribute[Any], Any], Any] + ] = ..., + tuple_keys: Optional[bool] = ..., +) -> Dict[str, Any]: ... + +# TODO: add support for returning NamedTuple from the mypy plugin +def astuple( + inst: Any, + recurse: bool = ..., + filter: Optional[_FilterType[Any]] = ..., + tuple_factory: Type[Sequence[Any]] = ..., + retain_collection_types: bool = ..., +) -> Tuple[Any, ...]: ... +def has(cls: type) -> bool: ... +def assoc(inst: _T, **changes: Any) -> _T: ... +def evolve(inst: _T, **changes: Any) -> _T: ... + +# _config -- + +def set_run_validators(run: bool) -> None: ... +def get_run_validators() -> bool: ... 
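# --- Illustrative aside (not part of the vendored stub) ---------------------
# set_run_validators()/get_run_validators() toggle validator execution
# globally, e.g. to skip re-validating already-trusted data during bulk
# loads. The class name is a hypothetical example.

import attr

@attr.s
class Port(object):
    number = attr.ib(validator=attr.validators.instance_of(int))

assert attr.get_run_validators() is True
attr.set_run_validators(False)
try:
    Port("8080")  # accepted: validators are globally disabled
finally:
    attr.set_run_validators(True)  # restore normal behavior
# -----------------------------------------------------------------------------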
+
+# aliases --
+
+s = attributes = attrs
+ib = attr = attrib
+dataclass = attrs  # Technically, partial(attrs, auto_attribs=True) ;)
diff --git a/openpype/vendor/python/python_2/attr/_cmp.py b/openpype/vendor/python/python_2/attr/_cmp.py
new file mode 100644
index 0000000000..6cffa4dbab
--- /dev/null
+++ b/openpype/vendor/python/python_2/attr/_cmp.py
@@ -0,0 +1,154 @@
+# SPDX-License-Identifier: MIT
+
+from __future__ import absolute_import, division, print_function
+
+import functools
+
+from ._compat import new_class
+from ._make import _make_ne
+
+
+_operation_names = {"eq": "==", "lt": "<", "le": "<=", "gt": ">", "ge": ">="}
+
+
+def cmp_using(
+    eq=None,
+    lt=None,
+    le=None,
+    gt=None,
+    ge=None,
+    require_same_type=True,
+    class_name="Comparable",
+):
+    """
+    Create a class that can be passed into `attr.ib`'s ``eq``, ``order``, and
+    ``cmp`` arguments to customize field comparison.
+
+    The resulting class will have a full set of ordering methods if
+    at least one of ``{lt, le, gt, ge}`` and ``eq`` are provided.
+
+    :param Optional[callable] eq: `callable` used to evaluate equality
+        of two objects.
+    :param Optional[callable] lt: `callable` used to evaluate whether
+        one object is less than another object.
+    :param Optional[callable] le: `callable` used to evaluate whether
+        one object is less than or equal to another object.
+    :param Optional[callable] gt: `callable` used to evaluate whether
+        one object is greater than another object.
+    :param Optional[callable] ge: `callable` used to evaluate whether
+        one object is greater than or equal to another object.
+
+    :param bool require_same_type: When `True`, equality and ordering methods
+        will return `NotImplemented` if objects are not of the same type.
+
+    :param Optional[str] class_name: Name of class. Defaults to 'Comparable'.
+
+    See `comparison` for more details.
+
+    .. versionadded:: 21.1.0
+    """
+
+    body = {
+        "__slots__": ["value"],
+        "__init__": _make_init(),
+        "_requirements": [],
+        "_is_comparable_to": _is_comparable_to,
+    }
+
+    # Add operations.
+    num_order_functions = 0
+    has_eq_function = False
+
+    if eq is not None:
+        has_eq_function = True
+        body["__eq__"] = _make_operator("eq", eq)
+        body["__ne__"] = _make_ne()
+
+    if lt is not None:
+        num_order_functions += 1
+        body["__lt__"] = _make_operator("lt", lt)
+
+    if le is not None:
+        num_order_functions += 1
+        body["__le__"] = _make_operator("le", le)
+
+    if gt is not None:
+        num_order_functions += 1
+        body["__gt__"] = _make_operator("gt", gt)
+
+    if ge is not None:
+        num_order_functions += 1
+        body["__ge__"] = _make_operator("ge", ge)
+
+    type_ = new_class(class_name, (object,), {}, lambda ns: ns.update(body))
+
+    # Add same type requirement.
+    if require_same_type:
+        type_._requirements.append(_check_same_type)
+
+    # Add total ordering if at least one operation was defined.
+    if 0 < num_order_functions < 4:
+        if not has_eq_function:
+            # functools.total_ordering requires __eq__ to be defined,
+            # so raise an early error here to keep a nice stack.
+            raise ValueError(
+                "eq must be defined in order to complete ordering from "
+                "lt, le, gt, ge."
+            )
+        type_ = functools.total_ordering(type_)
+
+    return type_
+
+
+def _make_init():
+    """
+    Create __init__ method.
+    """

+    def __init__(self, value):
+        """
+        Initialize object with *value*.
+        """
+        self.value = value
+
+    return __init__
+
+
+def _make_operator(name, func):
+    """
+    Create operator method.
+ """ + + def method(self, other): + if not self._is_comparable_to(other): + return NotImplemented + + result = func(self.value, other.value) + if result is NotImplemented: + return NotImplemented + + return result + + method.__name__ = "__%s__" % (name,) + method.__doc__ = "Return a %s b. Computed by attrs." % ( + _operation_names[name], + ) + + return method + + +def _is_comparable_to(self, other): + """ + Check whether `other` is comparable to `self`. + """ + for func in self._requirements: + if not func(self, other): + return False + return True + + +def _check_same_type(self, other): + """ + Return True if *self* and *other* are of the same type, False otherwise. + """ + return other.value.__class__ is self.value.__class__ diff --git a/openpype/vendor/python/python_2/attr/_cmp.pyi b/openpype/vendor/python/python_2/attr/_cmp.pyi new file mode 100644 index 0000000000..e71aaff7a1 --- /dev/null +++ b/openpype/vendor/python/python_2/attr/_cmp.pyi @@ -0,0 +1,13 @@ +from typing import Type + +from . import _CompareWithType + +def cmp_using( + eq: Optional[_CompareWithType], + lt: Optional[_CompareWithType], + le: Optional[_CompareWithType], + gt: Optional[_CompareWithType], + ge: Optional[_CompareWithType], + require_same_type: bool, + class_name: str, +) -> Type: ... diff --git a/openpype/vendor/python/python_2/attr/_compat.py b/openpype/vendor/python/python_2/attr/_compat.py new file mode 100644 index 0000000000..dc0cb02b64 --- /dev/null +++ b/openpype/vendor/python/python_2/attr/_compat.py @@ -0,0 +1,261 @@ +# SPDX-License-Identifier: MIT + +from __future__ import absolute_import, division, print_function + +import platform +import sys +import threading +import types +import warnings + + +PY2 = sys.version_info[0] == 2 +PYPY = platform.python_implementation() == "PyPy" +PY36 = sys.version_info[:2] >= (3, 6) +HAS_F_STRINGS = PY36 +PY310 = sys.version_info[:2] >= (3, 10) + + +if PYPY or PY36: + ordered_dict = dict +else: + from collections import OrderedDict + + ordered_dict = OrderedDict + + +if PY2: + from collections import Mapping, Sequence + + from UserDict import IterableUserDict + + # We 'bundle' isclass instead of using inspect as importing inspect is + # fairly expensive (order of 10-15 ms for a modern machine in 2016) + def isclass(klass): + return isinstance(klass, (type, types.ClassType)) + + def new_class(name, bases, kwds, exec_body): + """ + A minimal stub of types.new_class that we need for make_class. + """ + ns = {} + exec_body(ns) + + return type(name, bases, ns) + + # TYPE is used in exceptions, repr(int) is different on Python 2 and 3. + TYPE = "type" + + def iteritems(d): + return d.iteritems() + + # Python 2 is bereft of a read-only dict proxy, so we make one! + class ReadOnlyDict(IterableUserDict): + """ + Best-effort read-only dict wrapper. + """ + + def __setitem__(self, key, val): + # We gently pretend we're a Python 3 mappingproxy. + raise TypeError( + "'mappingproxy' object does not support item assignment" + ) + + def update(self, _): + # We gently pretend we're a Python 3 mappingproxy. + raise AttributeError( + "'mappingproxy' object has no attribute 'update'" + ) + + def __delitem__(self, _): + # We gently pretend we're a Python 3 mappingproxy. + raise TypeError( + "'mappingproxy' object does not support item deletion" + ) + + def clear(self): + # We gently pretend we're a Python 3 mappingproxy. 
+ raise AttributeError( + "'mappingproxy' object has no attribute 'clear'" + ) + + def pop(self, key, default=None): + # We gently pretend we're a Python 3 mappingproxy. + raise AttributeError( + "'mappingproxy' object has no attribute 'pop'" + ) + + def popitem(self): + # We gently pretend we're a Python 3 mappingproxy. + raise AttributeError( + "'mappingproxy' object has no attribute 'popitem'" + ) + + def setdefault(self, key, default=None): + # We gently pretend we're a Python 3 mappingproxy. + raise AttributeError( + "'mappingproxy' object has no attribute 'setdefault'" + ) + + def __repr__(self): + # Override to be identical to the Python 3 version. + return "mappingproxy(" + repr(self.data) + ")" + + def metadata_proxy(d): + res = ReadOnlyDict() + res.data.update(d) # We blocked update, so we have to do it like this. + return res + + def just_warn(*args, **kw): # pragma: no cover + """ + We only warn on Python 3 because we are not aware of any concrete + consequences of not setting the cell on Python 2. + """ + +else: # Python 3 and later. + from collections.abc import Mapping, Sequence # noqa + + def just_warn(*args, **kw): + """ + We only warn on Python 3 because we are not aware of any concrete + consequences of not setting the cell on Python 2. + """ + warnings.warn( + "Running interpreter doesn't sufficiently support code object " + "introspection. Some features like bare super() or accessing " + "__class__ will not work with slotted classes.", + RuntimeWarning, + stacklevel=2, + ) + + def isclass(klass): + return isinstance(klass, type) + + TYPE = "class" + + def iteritems(d): + return d.items() + + new_class = types.new_class + + def metadata_proxy(d): + return types.MappingProxyType(dict(d)) + + +def make_set_closure_cell(): + """Return a function of two arguments (cell, value) which sets + the value stored in the closure cell `cell` to `value`. + """ + # pypy makes this easy. (It also supports the logic below, but + # why not do the easy/fast thing?) + if PYPY: + + def set_closure_cell(cell, value): + cell.__setstate__((value,)) + + return set_closure_cell + + # Otherwise gotta do it the hard way. + + # Create a function that will set its first cellvar to `value`. + def set_first_cellvar_to(value): + x = value + return + + # This function will be eliminated as dead code, but + # not before its reference to `x` forces `x` to be + # represented as a closure cell rather than a local. + def force_x_to_be_a_cell(): # pragma: no cover + return x + + try: + # Extract the code object and make sure our assumptions about + # the closure behavior are correct. + if PY2: + co = set_first_cellvar_to.func_code + else: + co = set_first_cellvar_to.__code__ + if co.co_cellvars != ("x",) or co.co_freevars != (): + raise AssertionError # pragma: no cover + + # Convert this code object to a code object that sets the + # function's first _freevar_ (not cellvar) to the argument. + if sys.version_info >= (3, 8): + # CPython 3.8+ has an incompatible CodeType signature + # (added a posonlyargcount argument) but also added + # CodeType.replace() to do this without counting parameters. 
+ set_first_freevar_code = co.replace( + co_cellvars=co.co_freevars, co_freevars=co.co_cellvars + ) + else: + args = [co.co_argcount] + if not PY2: + args.append(co.co_kwonlyargcount) + args.extend( + [ + co.co_nlocals, + co.co_stacksize, + co.co_flags, + co.co_code, + co.co_consts, + co.co_names, + co.co_varnames, + co.co_filename, + co.co_name, + co.co_firstlineno, + co.co_lnotab, + # These two arguments are reversed: + co.co_cellvars, + co.co_freevars, + ] + ) + set_first_freevar_code = types.CodeType(*args) + + def set_closure_cell(cell, value): + # Create a function using the set_first_freevar_code, + # whose first closure cell is `cell`. Calling it will + # change the value of that cell. + setter = types.FunctionType( + set_first_freevar_code, {}, "setter", (), (cell,) + ) + # And call it to set the cell. + setter(value) + + # Make sure it works on this interpreter: + def make_func_with_cell(): + x = None + + def func(): + return x # pragma: no cover + + return func + + if PY2: + cell = make_func_with_cell().func_closure[0] + else: + cell = make_func_with_cell().__closure__[0] + set_closure_cell(cell, 100) + if cell.cell_contents != 100: + raise AssertionError # pragma: no cover + + except Exception: + return just_warn + else: + return set_closure_cell + + +set_closure_cell = make_set_closure_cell() + +# Thread-local global to track attrs instances which are already being repr'd. +# This is needed because there is no other (thread-safe) way to pass info +# about the instances that are already being repr'd through the call stack +# in order to ensure we don't perform infinite recursion. +# +# For instance, if an instance contains a dict which contains that instance, +# we need to know that we're already repr'ing the outside instance from within +# the dict's repr() call. +# +# This lives here rather than in _make.py so that the functions in _make.py +# don't have a direct reference to the thread-local in their globals dict. +# If they have such a reference, it breaks cloudpickle. +repr_context = threading.local() diff --git a/openpype/vendor/python/python_2/attr/_config.py b/openpype/vendor/python/python_2/attr/_config.py new file mode 100644 index 0000000000..fc9be29d00 --- /dev/null +++ b/openpype/vendor/python/python_2/attr/_config.py @@ -0,0 +1,33 @@ +# SPDX-License-Identifier: MIT + +from __future__ import absolute_import, division, print_function + + +__all__ = ["set_run_validators", "get_run_validators"] + +_run_validators = True + + +def set_run_validators(run): + """ + Set whether or not validators are run. By default, they are run. + + .. deprecated:: 21.3.0 It will not be removed, but it also will not be + moved to new ``attrs`` namespace. Use `attrs.validators.set_disabled()` + instead. + """ + if not isinstance(run, bool): + raise TypeError("'run' must be bool.") + global _run_validators + _run_validators = run + + +def get_run_validators(): + """ + Return whether or not validators are run. + + .. deprecated:: 21.3.0 It will not be removed, but it also will not be + moved to new ``attrs`` namespace. Use `attrs.validators.get_disabled()` + instead. 
+ """ + return _run_validators diff --git a/openpype/vendor/python/python_2/attr/_funcs.py b/openpype/vendor/python/python_2/attr/_funcs.py new file mode 100644 index 0000000000..4c90085a40 --- /dev/null +++ b/openpype/vendor/python/python_2/attr/_funcs.py @@ -0,0 +1,422 @@ +# SPDX-License-Identifier: MIT + +from __future__ import absolute_import, division, print_function + +import copy + +from ._compat import iteritems +from ._make import NOTHING, _obj_setattr, fields +from .exceptions import AttrsAttributeNotFoundError + + +def asdict( + inst, + recurse=True, + filter=None, + dict_factory=dict, + retain_collection_types=False, + value_serializer=None, +): + """ + Return the ``attrs`` attribute values of *inst* as a dict. + + Optionally recurse into other ``attrs``-decorated classes. + + :param inst: Instance of an ``attrs``-decorated class. + :param bool recurse: Recurse into classes that are also + ``attrs``-decorated. + :param callable filter: A callable whose return code determines whether an + attribute or element is included (``True``) or dropped (``False``). Is + called with the `attrs.Attribute` as the first argument and the + value as the second argument. + :param callable dict_factory: A callable to produce dictionaries from. For + example, to produce ordered dictionaries instead of normal Python + dictionaries, pass in ``collections.OrderedDict``. + :param bool retain_collection_types: Do not convert to ``list`` when + encountering an attribute whose type is ``tuple`` or ``set``. Only + meaningful if ``recurse`` is ``True``. + :param Optional[callable] value_serializer: A hook that is called for every + attribute or dict key/value. It receives the current instance, field + and value and must return the (updated) value. The hook is run *after* + the optional *filter* has been applied. + + :rtype: return type of *dict_factory* + + :raise attr.exceptions.NotAnAttrsClassError: If *cls* is not an ``attrs`` + class. + + .. versionadded:: 16.0.0 *dict_factory* + .. versionadded:: 16.1.0 *retain_collection_types* + .. versionadded:: 20.3.0 *value_serializer* + .. versionadded:: 21.3.0 If a dict has a collection for a key, it is + serialized as a tuple. 
+ """ + attrs = fields(inst.__class__) + rv = dict_factory() + for a in attrs: + v = getattr(inst, a.name) + if filter is not None and not filter(a, v): + continue + + if value_serializer is not None: + v = value_serializer(inst, a, v) + + if recurse is True: + if has(v.__class__): + rv[a.name] = asdict( + v, + recurse=True, + filter=filter, + dict_factory=dict_factory, + retain_collection_types=retain_collection_types, + value_serializer=value_serializer, + ) + elif isinstance(v, (tuple, list, set, frozenset)): + cf = v.__class__ if retain_collection_types is True else list + rv[a.name] = cf( + [ + _asdict_anything( + i, + is_key=False, + filter=filter, + dict_factory=dict_factory, + retain_collection_types=retain_collection_types, + value_serializer=value_serializer, + ) + for i in v + ] + ) + elif isinstance(v, dict): + df = dict_factory + rv[a.name] = df( + ( + _asdict_anything( + kk, + is_key=True, + filter=filter, + dict_factory=df, + retain_collection_types=retain_collection_types, + value_serializer=value_serializer, + ), + _asdict_anything( + vv, + is_key=False, + filter=filter, + dict_factory=df, + retain_collection_types=retain_collection_types, + value_serializer=value_serializer, + ), + ) + for kk, vv in iteritems(v) + ) + else: + rv[a.name] = v + else: + rv[a.name] = v + return rv + + +def _asdict_anything( + val, + is_key, + filter, + dict_factory, + retain_collection_types, + value_serializer, +): + """ + ``asdict`` only works on attrs instances, this works on anything. + """ + if getattr(val.__class__, "__attrs_attrs__", None) is not None: + # Attrs class. + rv = asdict( + val, + recurse=True, + filter=filter, + dict_factory=dict_factory, + retain_collection_types=retain_collection_types, + value_serializer=value_serializer, + ) + elif isinstance(val, (tuple, list, set, frozenset)): + if retain_collection_types is True: + cf = val.__class__ + elif is_key: + cf = tuple + else: + cf = list + + rv = cf( + [ + _asdict_anything( + i, + is_key=False, + filter=filter, + dict_factory=dict_factory, + retain_collection_types=retain_collection_types, + value_serializer=value_serializer, + ) + for i in val + ] + ) + elif isinstance(val, dict): + df = dict_factory + rv = df( + ( + _asdict_anything( + kk, + is_key=True, + filter=filter, + dict_factory=df, + retain_collection_types=retain_collection_types, + value_serializer=value_serializer, + ), + _asdict_anything( + vv, + is_key=False, + filter=filter, + dict_factory=df, + retain_collection_types=retain_collection_types, + value_serializer=value_serializer, + ), + ) + for kk, vv in iteritems(val) + ) + else: + rv = val + if value_serializer is not None: + rv = value_serializer(None, None, rv) + + return rv + + +def astuple( + inst, + recurse=True, + filter=None, + tuple_factory=tuple, + retain_collection_types=False, +): + """ + Return the ``attrs`` attribute values of *inst* as a tuple. + + Optionally recurse into other ``attrs``-decorated classes. + + :param inst: Instance of an ``attrs``-decorated class. + :param bool recurse: Recurse into classes that are also + ``attrs``-decorated. + :param callable filter: A callable whose return code determines whether an + attribute or element is included (``True``) or dropped (``False``). Is + called with the `attrs.Attribute` as the first argument and the + value as the second argument. + :param callable tuple_factory: A callable to produce tuples from. For + example, to produce lists instead of tuples. 
+    :param bool retain_collection_types: Do not convert to ``list``
+        or ``dict`` when encountering an attribute whose type is
+        ``tuple``, ``dict`` or ``set``. Only meaningful if ``recurse`` is
+        ``True``.
+
+    :rtype: return type of *tuple_factory*
+
+    :raise attr.exceptions.NotAnAttrsClassError: If *cls* is not an ``attrs``
+        class.
+
+    .. versionadded:: 16.2.0
+    """
+    attrs = fields(inst.__class__)
+    rv = []
+    retain = retain_collection_types  # Very long. :/
+    for a in attrs:
+        v = getattr(inst, a.name)
+        if filter is not None and not filter(a, v):
+            continue
+        if recurse is True:
+            if has(v.__class__):
+                rv.append(
+                    astuple(
+                        v,
+                        recurse=True,
+                        filter=filter,
+                        tuple_factory=tuple_factory,
+                        retain_collection_types=retain,
+                    )
+                )
+            elif isinstance(v, (tuple, list, set, frozenset)):
+                cf = v.__class__ if retain is True else list
+                rv.append(
+                    cf(
+                        [
+                            astuple(
+                                j,
+                                recurse=True,
+                                filter=filter,
+                                tuple_factory=tuple_factory,
+                                retain_collection_types=retain,
+                            )
+                            if has(j.__class__)
+                            else j
+                            for j in v
+                        ]
+                    )
+                )
+            elif isinstance(v, dict):
+                df = v.__class__ if retain is True else dict
+                rv.append(
+                    df(
+                        (
+                            astuple(
+                                kk,
+                                tuple_factory=tuple_factory,
+                                retain_collection_types=retain,
+                            )
+                            if has(kk.__class__)
+                            else kk,
+                            astuple(
+                                vv,
+                                tuple_factory=tuple_factory,
+                                retain_collection_types=retain,
+                            )
+                            if has(vv.__class__)
+                            else vv,
+                        )
+                        for kk, vv in iteritems(v)
+                    )
+                )
+            else:
+                rv.append(v)
+        else:
+            rv.append(v)
+
+    return rv if tuple_factory is list else tuple_factory(rv)
+
+
+def has(cls):
+    """
+    Check whether *cls* is a class with ``attrs`` attributes.
+
+    :param type cls: Class to introspect.
+    :raise TypeError: If *cls* is not a class.
+
+    :rtype: bool
+    """
+    return getattr(cls, "__attrs_attrs__", None) is not None
+
+
+def assoc(inst, **changes):
+    """
+    Copy *inst* and apply *changes*.
+
+    :param inst: Instance of a class with ``attrs`` attributes.
+    :param changes: Keyword changes in the new copy.
+
+    :return: A copy of inst with *changes* incorporated.
+
+    :raise attr.exceptions.AttrsAttributeNotFoundError: If *attr_name* couldn't
+        be found on *cls*.
+    :raise attr.exceptions.NotAnAttrsClassError: If *cls* is not an ``attrs``
+        class.
+
+    .. deprecated:: 17.1.0
+        Use `attrs.evolve` instead if you can.
+        This function will not be removed due to the slightly different
+        approach compared to `attrs.evolve`.
+    """
+    import warnings
+
+    warnings.warn(
+        "assoc is deprecated and will be removed after 2018/01.",
+        DeprecationWarning,
+        stacklevel=2,
+    )
+    new = copy.copy(inst)
+    attrs = fields(inst.__class__)
+    for k, v in iteritems(changes):
+        a = getattr(attrs, k, NOTHING)
+        if a is NOTHING:
+            raise AttrsAttributeNotFoundError(
+                "{k} is not an attrs attribute on {cl}.".format(
+                    k=k, cl=new.__class__
+                )
+            )
+        _obj_setattr(new, k, v)
+    return new
+
+
+def evolve(inst, **changes):
+    """
+    Create a new instance, based on *inst* with *changes* applied.
+
+    :param inst: Instance of a class with ``attrs`` attributes.
+    :param changes: Keyword changes in the new copy.
+
+    :return: A copy of inst with *changes* incorporated.
+
+    :raise TypeError: If *attr_name* couldn't be found in the class
+        ``__init__``.
+    :raise attr.exceptions.NotAnAttrsClassError: If *cls* is not an ``attrs``
+        class.
+
+    .. versionadded:: 17.1.0
+    """
+    cls = inst.__class__
+    attrs = fields(cls)
+    for a in attrs:
+        if not a.init:
+            continue
+        attr_name = a.name  # To deal with private attributes.
+ init_name = attr_name if attr_name[0] != "_" else attr_name[1:] + if init_name not in changes: + changes[init_name] = getattr(inst, attr_name) + + return cls(**changes) + + +def resolve_types(cls, globalns=None, localns=None, attribs=None): + """ + Resolve any strings and forward annotations in type annotations. + + This is only required if you need concrete types in `Attribute`'s *type* + field. In other words, you don't need to resolve your types if you only + use them for static type checking. + + With no arguments, names will be looked up in the module in which the class + was created. If this is not what you want, e.g. if the name only exists + inside a method, you may pass *globalns* or *localns* to specify other + dictionaries in which to look up these names. See the docs of + `typing.get_type_hints` for more details. + + :param type cls: Class to resolve. + :param Optional[dict] globalns: Dictionary containing global variables. + :param Optional[dict] localns: Dictionary containing local variables. + :param Optional[list] attribs: List of attribs for the given class. + This is necessary when calling from inside a ``field_transformer`` + since *cls* is not an ``attrs`` class yet. + + :raise TypeError: If *cls* is not a class. + :raise attr.exceptions.NotAnAttrsClassError: If *cls* is not an ``attrs`` + class and you didn't pass any attribs. + :raise NameError: If types cannot be resolved because of missing variables. + + :returns: *cls* so you can use this function also as a class decorator. + Please note that you have to apply it **after** `attrs.define`. That + means the decorator has to come in the line **before** `attrs.define`. + + .. versionadded:: 20.1.0 + .. versionadded:: 21.1.0 *attribs* + + """ + # Since calling get_type_hints is expensive we cache whether we've + # done it already. + if getattr(cls, "__attrs_types_resolved__", None) != cls: + import typing + + hints = typing.get_type_hints(cls, globalns=globalns, localns=localns) + for field in fields(cls) if attribs is None else attribs: + if field.name in hints: + # Since fields have been frozen we must work around it. + _obj_setattr(field, "type", hints[field.name]) + # We store the class we resolved so that subclasses know they haven't + # been resolved. + cls.__attrs_types_resolved__ = cls + + # Return the class so you can use it as a decorator too. + return cls diff --git a/openpype/vendor/python/python_2/attr/_make.py b/openpype/vendor/python/python_2/attr/_make.py new file mode 100644 index 0000000000..d46f8a3e7a --- /dev/null +++ b/openpype/vendor/python/python_2/attr/_make.py @@ -0,0 +1,3173 @@ +# SPDX-License-Identifier: MIT + +from __future__ import absolute_import, division, print_function + +import copy +import inspect +import linecache +import sys +import warnings + +from operator import itemgetter + +# We need to import _compat itself in addition to the _compat members to avoid +# having the thread-local in the globals here. +from . import _compat, _config, setters +from ._compat import ( + HAS_F_STRINGS, + PY2, + PY310, + PYPY, + isclass, + iteritems, + metadata_proxy, + new_class, + ordered_dict, + set_closure_cell, +) +from .exceptions import ( + DefaultAlreadySetError, + FrozenInstanceError, + NotAnAttrsClassError, + PythonTooOldError, + UnannotatedAttributeError, +) + + +if not PY2: + import typing + + +# This is used at least twice, so cache it here. 
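# --- Illustrative aside (not part of the vendored sources) ------------------
# How the helpers from _funcs above compose; the class name is a
# hypothetical example.

import attr

@attr.s(frozen=True)
class Config(object):
    host = attr.ib(default="localhost")
    port = attr.ib(default=8080)

base = Config()
prod = attr.evolve(base, host="example.com")  # new instance, one field changed
assert attr.asdict(prod) == {"host": "example.com", "port": 8080}
assert attr.has(Config) and not attr.has(dict)
# -----------------------------------------------------------------------------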
+_obj_setattr = object.__setattr__ +_init_converter_pat = "__attr_converter_%s" +_init_factory_pat = "__attr_factory_{}" +_tuple_property_pat = ( + " {attr_name} = _attrs_property(_attrs_itemgetter({index}))" +) +_classvar_prefixes = ( + "typing.ClassVar", + "t.ClassVar", + "ClassVar", + "typing_extensions.ClassVar", +) +# we don't use a double-underscore prefix because that triggers +# name mangling when trying to create a slot for the field +# (when slots=True) +_hash_cache_field = "_attrs_cached_hash" + +_empty_metadata_singleton = metadata_proxy({}) + +# Unique object for unequivocal getattr() defaults. +_sentinel = object() + +_ng_default_on_setattr = setters.pipe(setters.convert, setters.validate) + + +class _Nothing(object): + """ + Sentinel class to indicate the lack of a value when ``None`` is ambiguous. + + ``_Nothing`` is a singleton. There is only ever one of it. + + .. versionchanged:: 21.1.0 ``bool(NOTHING)`` is now False. + """ + + _singleton = None + + def __new__(cls): + if _Nothing._singleton is None: + _Nothing._singleton = super(_Nothing, cls).__new__(cls) + return _Nothing._singleton + + def __repr__(self): + return "NOTHING" + + def __bool__(self): + return False + + def __len__(self): + return 0 # __bool__ for Python 2 + + +NOTHING = _Nothing() +""" +Sentinel to indicate the lack of a value when ``None`` is ambiguous. +""" + + +class _CacheHashWrapper(int): + """ + An integer subclass that pickles / copies as None + + This is used for non-slots classes with ``cache_hash=True``, to avoid + serializing a potentially (even likely) invalid hash value. Since ``None`` + is the default value for uncalculated hashes, whenever this is copied, + the copy's value for the hash should automatically reset. + + See GH #613 for more details. + """ + + if PY2: + # For some reason `type(None)` isn't callable in Python 2, but we don't + # actually need a constructor for None objects, we just need any + # available function that returns None. + def __reduce__(self, _none_constructor=getattr, _args=(0, "", None)): + return _none_constructor, _args + + else: + + def __reduce__(self, _none_constructor=type(None), _args=()): + return _none_constructor, _args + + +def attrib( + default=NOTHING, + validator=None, + repr=True, + cmp=None, + hash=None, + init=True, + metadata=None, + type=None, + converter=None, + factory=None, + kw_only=False, + eq=None, + order=None, + on_setattr=None, +): + """ + Create a new attribute on a class. + + .. warning:: + + Does *not* do anything unless the class is also decorated with + `attr.s`! + + :param default: A value that is used if an ``attrs``-generated ``__init__`` + is used and no value is passed while instantiating or the attribute is + excluded using ``init=False``. + + If the value is an instance of `attrs.Factory`, its callable will be + used to construct a new value (useful for mutable data types like lists + or dicts). + + If a default is not set (or set manually to `attrs.NOTHING`), a value + *must* be supplied when instantiating; otherwise a `TypeError` + will be raised. + + The default can also be set using decorator notation as shown below. + + :type default: Any value + + :param callable factory: Syntactic sugar for + ``default=attr.Factory(factory)``. + + :param validator: `callable` that is called by ``attrs``-generated + ``__init__`` methods after the instance has been initialized. They + receive the initialized instance, the :func:`~attrs.Attribute`, and the + passed value. 
+
+        The return value is *not* inspected so the validator has to throw an
+        exception itself.
+
+        If a `list` is passed, its items are treated as validators and must
+        all pass.
+
+        Validators can be globally disabled and re-enabled using
+        `get_run_validators`.
+
+        The validator can also be set using decorator notation as shown below.
+
+    :type validator: `callable` or a `list` of `callable`\\ s.
+
+    :param repr: Include this attribute in the generated ``__repr__``
+        method. If ``True``, include the attribute; if ``False``, omit it. By
+        default, the built-in ``repr()`` function is used. To override how the
+        attribute value is formatted, pass a ``callable`` that takes a single
+        value and returns a string. Note that the resulting string is used
+        as-is, i.e. it will be used directly *instead* of calling ``repr()``
+        (the default).
+    :type repr: a `bool` or a `callable` to use a custom function.
+
+    :param eq: If ``True`` (default), include this attribute in the
+        generated ``__eq__`` and ``__ne__`` methods that check two instances
+        for equality. To override how the attribute value is compared,
+        pass a ``callable`` that takes a single value and returns the value
+        to be compared.
+    :type eq: a `bool` or a `callable`.
+
+    :param order: If ``True`` (default), include this attribute in the
+        generated ``__lt__``, ``__le__``, ``__gt__`` and ``__ge__`` methods.
+        To override how the attribute value is ordered,
+        pass a ``callable`` that takes a single value and returns the value
+        to be ordered.
+    :type order: a `bool` or a `callable`.
+
+    :param cmp: Setting *cmp* is equivalent to setting *eq* and *order* to the
+        same value. Must not be mixed with *eq* or *order*.
+    :type cmp: a `bool` or a `callable`.
+
+    :param Optional[bool] hash: Include this attribute in the generated
+        ``__hash__`` method. If ``None`` (default), mirror *eq*'s value. This
+        is the correct behavior according to the Python spec. Setting this
+        value to anything other than ``None`` is *discouraged*.
+    :param bool init: Include this attribute in the generated ``__init__``
+        method. It is possible to set this to ``False`` and set a default
+        value. In that case this attribute is unconditionally initialized
+        with the specified default value or factory.
+    :param callable converter: `callable` that is called by
+        ``attrs``-generated ``__init__`` methods to convert the attribute's
+        value to the desired format. It is given the passed-in value, and the
+        returned value will be used as the new value of the attribute. The
+        value is converted before being passed to the validator, if any.
+    :param metadata: An arbitrary mapping, to be used by third-party
+        components. See `extending_metadata`.
+    :param type: The type of the attribute. In Python 3.6 or greater, the
+        preferred method to specify the type is using a variable annotation
+        (see `PEP 526 <https://www.python.org/dev/peps/pep-0526/>`_).
+        This argument is provided for backward compatibility.
+        Regardless of the approach used, the type will be stored on
+        ``Attribute.type``.
+
+        Please note that ``attrs`` doesn't do anything with this metadata by
+        itself. You can use it as part of your own code or for
+        `static type checking <types>`.
+    :param kw_only: Make this attribute keyword-only (Python 3+)
+        in the generated ``__init__`` (if ``init`` is ``False``, this
+        parameter is ignored).
+    :param on_setattr: Allows overwriting the *on_setattr* setting from
+        `attr.s`. If left `None`, the *on_setattr* value from `attr.s` is used.
+ Set to `attrs.setters.NO_OP` to run **no** `setattr` hooks for this + attribute -- regardless of the setting in `attr.s`. + :type on_setattr: `callable`, or a list of callables, or `None`, or + `attrs.setters.NO_OP` + + .. versionadded:: 15.2.0 *convert* + .. versionadded:: 16.3.0 *metadata* + .. versionchanged:: 17.1.0 *validator* can be a ``list`` now. + .. versionchanged:: 17.1.0 + *hash* is ``None`` and therefore mirrors *eq* by default. + .. versionadded:: 17.3.0 *type* + .. deprecated:: 17.4.0 *convert* + .. versionadded:: 17.4.0 *converter* as a replacement for the deprecated + *convert* to achieve consistency with other noun-based arguments. + .. versionadded:: 18.1.0 + ``factory=f`` is syntactic sugar for ``default=attr.Factory(f)``. + .. versionadded:: 18.2.0 *kw_only* + .. versionchanged:: 19.2.0 *convert* keyword argument removed. + .. versionchanged:: 19.2.0 *repr* also accepts a custom callable. + .. deprecated:: 19.2.0 *cmp* Removal on or after 2021-06-01. + .. versionadded:: 19.2.0 *eq* and *order* + .. versionadded:: 20.1.0 *on_setattr* + .. versionchanged:: 20.3.0 *kw_only* backported to Python 2 + .. versionchanged:: 21.1.0 + *eq*, *order*, and *cmp* also accept a custom callable + .. versionchanged:: 21.1.0 *cmp* undeprecated + """ + eq, eq_key, order, order_key = _determine_attrib_eq_order( + cmp, eq, order, True + ) + + if hash is not None and hash is not True and hash is not False: + raise TypeError( + "Invalid value for hash. Must be True, False, or None." + ) + + if factory is not None: + if default is not NOTHING: + raise ValueError( + "The `default` and `factory` arguments are mutually " + "exclusive." + ) + if not callable(factory): + raise ValueError("The `factory` argument must be a callable.") + default = Factory(factory) + + if metadata is None: + metadata = {} + + # Apply syntactic sugar by auto-wrapping. + if isinstance(on_setattr, (list, tuple)): + on_setattr = setters.pipe(*on_setattr) + + if validator and isinstance(validator, (list, tuple)): + validator = and_(*validator) + + if converter and isinstance(converter, (list, tuple)): + converter = pipe(*converter) + + return _CountingAttr( + default=default, + validator=validator, + repr=repr, + cmp=None, + hash=hash, + init=init, + converter=converter, + metadata=metadata, + type=type, + kw_only=kw_only, + eq=eq, + eq_key=eq_key, + order=order, + order_key=order_key, + on_setattr=on_setattr, + ) + + +def _compile_and_eval(script, globs, locs=None, filename=""): + """ + "Exec" the script with the given global (globs) and local (locs) variables. + """ + bytecode = compile(script, filename, "exec") + eval(bytecode, globs, locs) + + +def _make_method(name, script, filename, globs=None): + """ + Create the method with the script given and return the method object. + """ + locs = {} + if globs is None: + globs = {} + + # In order of debuggers like PDB being able to step through the code, + # we add a fake linecache entry. + count = 1 + base_filename = filename + while True: + linecache_tuple = ( + len(script), + None, + script.splitlines(True), + filename, + ) + old_val = linecache.cache.setdefault(filename, linecache_tuple) + if old_val == linecache_tuple: + break + else: + filename = "{}-{}>".format(base_filename[:-1], count) + count += 1 + + _compile_and_eval(script, globs, locs, filename) + + return locs[name] + + +def _make_attr_tuple_class(cls_name, attr_names): + """ + Create a tuple subclass to hold `Attribute`s for an `attrs` class. + + The subclass is a bare tuple with properties for names. 
+ + class MyClassAttributes(tuple): + __slots__ = () + x = property(itemgetter(0)) + """ + attr_class_name = "{}Attributes".format(cls_name) + attr_class_template = [ + "class {}(tuple):".format(attr_class_name), + " __slots__ = ()", + ] + if attr_names: + for i, attr_name in enumerate(attr_names): + attr_class_template.append( + _tuple_property_pat.format(index=i, attr_name=attr_name) + ) + else: + attr_class_template.append(" pass") + globs = {"_attrs_itemgetter": itemgetter, "_attrs_property": property} + _compile_and_eval("\n".join(attr_class_template), globs) + return globs[attr_class_name] + + +# Tuple class for extracted attributes from a class definition. +# `base_attrs` is a subset of `attrs`. +_Attributes = _make_attr_tuple_class( + "_Attributes", + [ + # all attributes to build dunder methods for + "attrs", + # attributes that have been inherited + "base_attrs", + # map inherited attributes to their originating classes + "base_attrs_map", + ], +) + + +def _is_class_var(annot): + """ + Check whether *annot* is a typing.ClassVar. + + The string comparison hack is used to avoid evaluating all string + annotations which would put attrs-based classes at a performance + disadvantage compared to plain old classes. + """ + annot = str(annot) + + # Annotation can be quoted. + if annot.startswith(("'", '"')) and annot.endswith(("'", '"')): + annot = annot[1:-1] + + return annot.startswith(_classvar_prefixes) + + +def _has_own_attribute(cls, attrib_name): + """ + Check whether *cls* defines *attrib_name* (and doesn't just inherit it). + + Requires Python 3. + """ + attr = getattr(cls, attrib_name, _sentinel) + if attr is _sentinel: + return False + + for base_cls in cls.__mro__[1:]: + a = getattr(base_cls, attrib_name, None) + if attr is a: + return False + + return True + + +def _get_annotations(cls): + """ + Get annotations for *cls*. + """ + if _has_own_attribute(cls, "__annotations__"): + return cls.__annotations__ + + return {} + + +def _counter_getter(e): + """ + Key function for sorting to avoid re-creating a lambda for every class. + """ + return e[1].counter + + +def _collect_base_attrs(cls, taken_attr_names): + """ + Collect attr.ibs from base classes of *cls*, except *taken_attr_names*. + """ + base_attrs = [] + base_attr_map = {} # A dictionary of base attrs to their classes. + + # Traverse the MRO and collect attributes. + for base_cls in reversed(cls.__mro__[1:-1]): + for a in getattr(base_cls, "__attrs_attrs__", []): + if a.inherited or a.name in taken_attr_names: + continue + + a = a.evolve(inherited=True) + base_attrs.append(a) + base_attr_map[a.name] = base_cls + + # For each name, only keep the freshest definition i.e. the furthest at the + # back. base_attr_map is fine because it gets overwritten with every new + # instance. + filtered = [] + seen = set() + for a in reversed(base_attrs): + if a.name in seen: + continue + filtered.insert(0, a) + seen.add(a.name) + + return filtered, base_attr_map + + +def _collect_base_attrs_broken(cls, taken_attr_names): + """ + Collect attr.ibs from base classes of *cls*, except *taken_attr_names*. + + N.B. *taken_attr_names* will be mutated. + + Adhere to the old incorrect behavior. + + Notably it collects from the front and considers inherited attributes which + leads to the buggy behavior reported in #428. + """ + base_attrs = [] + base_attr_map = {} # A dictionary of base attrs to their classes. + + # Traverse the MRO and collect attributes. 
+ for base_cls in cls.__mro__[1:-1]: + for a in getattr(base_cls, "__attrs_attrs__", []): + if a.name in taken_attr_names: + continue + + a = a.evolve(inherited=True) + taken_attr_names.add(a.name) + base_attrs.append(a) + base_attr_map[a.name] = base_cls + + return base_attrs, base_attr_map + + +def _transform_attrs( + cls, these, auto_attribs, kw_only, collect_by_mro, field_transformer +): + """ + Transform all `_CountingAttr`s on a class into `Attribute`s. + + If *these* is passed, use that and don't look for them on the class. + + *collect_by_mro* is True, collect them in the correct MRO order, otherwise + use the old -- incorrect -- order. See #428. + + Return an `_Attributes`. + """ + cd = cls.__dict__ + anns = _get_annotations(cls) + + if these is not None: + ca_list = [(name, ca) for name, ca in iteritems(these)] + + if not isinstance(these, ordered_dict): + ca_list.sort(key=_counter_getter) + elif auto_attribs is True: + ca_names = { + name + for name, attr in cd.items() + if isinstance(attr, _CountingAttr) + } + ca_list = [] + annot_names = set() + for attr_name, type in anns.items(): + if _is_class_var(type): + continue + annot_names.add(attr_name) + a = cd.get(attr_name, NOTHING) + + if not isinstance(a, _CountingAttr): + if a is NOTHING: + a = attrib() + else: + a = attrib(default=a) + ca_list.append((attr_name, a)) + + unannotated = ca_names - annot_names + if len(unannotated) > 0: + raise UnannotatedAttributeError( + "The following `attr.ib`s lack a type annotation: " + + ", ".join( + sorted(unannotated, key=lambda n: cd.get(n).counter) + ) + + "." + ) + else: + ca_list = sorted( + ( + (name, attr) + for name, attr in cd.items() + if isinstance(attr, _CountingAttr) + ), + key=lambda e: e[1].counter, + ) + + own_attrs = [ + Attribute.from_counting_attr( + name=attr_name, ca=ca, type=anns.get(attr_name) + ) + for attr_name, ca in ca_list + ] + + if collect_by_mro: + base_attrs, base_attr_map = _collect_base_attrs( + cls, {a.name for a in own_attrs} + ) + else: + base_attrs, base_attr_map = _collect_base_attrs_broken( + cls, {a.name for a in own_attrs} + ) + + if kw_only: + own_attrs = [a.evolve(kw_only=True) for a in own_attrs] + base_attrs = [a.evolve(kw_only=True) for a in base_attrs] + + attrs = base_attrs + own_attrs + + # Mandatory vs non-mandatory attr order only matters when they are part of + # the __init__ signature and when they aren't kw_only (which are moved to + # the end and can be mandatory or non-mandatory in any order, as they will + # be specified as keyword args anyway). Check the order of those attrs: + had_default = False + for a in (a for a in attrs if a.init is not False and a.kw_only is False): + if had_default is True and a.default is NOTHING: + raise ValueError( + "No mandatory attributes allowed after an attribute with a " + "default value or factory. Attribute in question: %r" % (a,) + ) + + if had_default is False and a.default is not NOTHING: + had_default = True + + if field_transformer is not None: + attrs = field_transformer(cls, attrs) + + # Create AttrsClass *after* applying the field_transformer since it may + # add or remove attributes! + attr_names = [a.name for a in attrs] + AttrsClass = _make_attr_tuple_class(cls.__name__, attr_names) + + return _Attributes((AttrsClass(attrs), base_attrs, base_attr_map)) + + +if PYPY: + + def _frozen_setattrs(self, name, value): + """ + Attached to frozen classes as __setattr__. 
+ """ + if isinstance(self, BaseException) and name in ( + "__cause__", + "__context__", + ): + BaseException.__setattr__(self, name, value) + return + + raise FrozenInstanceError() + +else: + + def _frozen_setattrs(self, name, value): + """ + Attached to frozen classes as __setattr__. + """ + raise FrozenInstanceError() + + +def _frozen_delattrs(self, name): + """ + Attached to frozen classes as __delattr__. + """ + raise FrozenInstanceError() + + +class _ClassBuilder(object): + """ + Iteratively build *one* class. + """ + + __slots__ = ( + "_attr_names", + "_attrs", + "_base_attr_map", + "_base_names", + "_cache_hash", + "_cls", + "_cls_dict", + "_delete_attribs", + "_frozen", + "_has_pre_init", + "_has_post_init", + "_is_exc", + "_on_setattr", + "_slots", + "_weakref_slot", + "_wrote_own_setattr", + "_has_custom_setattr", + ) + + def __init__( + self, + cls, + these, + slots, + frozen, + weakref_slot, + getstate_setstate, + auto_attribs, + kw_only, + cache_hash, + is_exc, + collect_by_mro, + on_setattr, + has_custom_setattr, + field_transformer, + ): + attrs, base_attrs, base_map = _transform_attrs( + cls, + these, + auto_attribs, + kw_only, + collect_by_mro, + field_transformer, + ) + + self._cls = cls + self._cls_dict = dict(cls.__dict__) if slots else {} + self._attrs = attrs + self._base_names = set(a.name for a in base_attrs) + self._base_attr_map = base_map + self._attr_names = tuple(a.name for a in attrs) + self._slots = slots + self._frozen = frozen + self._weakref_slot = weakref_slot + self._cache_hash = cache_hash + self._has_pre_init = bool(getattr(cls, "__attrs_pre_init__", False)) + self._has_post_init = bool(getattr(cls, "__attrs_post_init__", False)) + self._delete_attribs = not bool(these) + self._is_exc = is_exc + self._on_setattr = on_setattr + + self._has_custom_setattr = has_custom_setattr + self._wrote_own_setattr = False + + self._cls_dict["__attrs_attrs__"] = self._attrs + + if frozen: + self._cls_dict["__setattr__"] = _frozen_setattrs + self._cls_dict["__delattr__"] = _frozen_delattrs + + self._wrote_own_setattr = True + elif on_setattr in ( + _ng_default_on_setattr, + setters.validate, + setters.convert, + ): + has_validator = has_converter = False + for a in attrs: + if a.validator is not None: + has_validator = True + if a.converter is not None: + has_converter = True + + if has_validator and has_converter: + break + if ( + ( + on_setattr == _ng_default_on_setattr + and not (has_validator or has_converter) + ) + or (on_setattr == setters.validate and not has_validator) + or (on_setattr == setters.convert and not has_converter) + ): + # If class-level on_setattr is set to convert + validate, but + # there's no field to convert or validate, pretend like there's + # no on_setattr. + self._on_setattr = None + + if getstate_setstate: + ( + self._cls_dict["__getstate__"], + self._cls_dict["__setstate__"], + ) = self._make_getstate_setstate() + + def __repr__(self): + return "<_ClassBuilder(cls={cls})>".format(cls=self._cls.__name__) + + def build_class(self): + """ + Finalize class based on the accumulated configuration. + + Builder cannot be used after calling this method. + """ + if self._slots is True: + return self._create_slots_class() + else: + return self._patch_original_class() + + def _patch_original_class(self): + """ + Apply accumulated methods and return the class. + """ + cls = self._cls + base_names = self._base_names + + # Clean class of attribute definitions (`attr.ib()`s). 
+        if self._delete_attribs:
+            for name in self._attr_names:
+                if (
+                    name not in base_names
+                    and getattr(cls, name, _sentinel) is not _sentinel
+                ):
+                    try:
+                        delattr(cls, name)
+                    except AttributeError:
+                        # This can happen if a base class defines a class
+                        # variable and we want to set an attribute with the
+                        # same name by using only a type annotation.
+                        pass
+
+        # Attach our dunder methods.
+        for name, value in self._cls_dict.items():
+            setattr(cls, name, value)
+
+        # If we've inherited an attrs __setattr__ and don't write our own,
+        # reset it to object's.
+        if not self._wrote_own_setattr and getattr(
+            cls, "__attrs_own_setattr__", False
+        ):
+            cls.__attrs_own_setattr__ = False
+
+            if not self._has_custom_setattr:
+                cls.__setattr__ = object.__setattr__
+
+        return cls
+
+    def _create_slots_class(self):
+        """
+        Build and return a new class with a `__slots__` attribute.
+        """
+        cd = {
+            k: v
+            for k, v in iteritems(self._cls_dict)
+            if k not in tuple(self._attr_names) + ("__dict__", "__weakref__")
+        }
+
+        # If our class doesn't have its own implementation of __setattr__
+        # (either from the user or by us), check the bases, if one of them has
+        # an attrs-made __setattr__, that needs to be reset. We don't walk the
+        # MRO because we only care about our immediate base classes.
+        # XXX: This can be confused by subclassing a slotted attrs class with
+        # XXX: a non-attrs class and subclass the resulting class with an attrs
+        # XXX: class. See `test_slotted_confused` for details. For now that's
+        # XXX: OK with us.
+        if not self._wrote_own_setattr:
+            cd["__attrs_own_setattr__"] = False
+
+            if not self._has_custom_setattr:
+                for base_cls in self._cls.__bases__:
+                    if base_cls.__dict__.get("__attrs_own_setattr__", False):
+                        cd["__setattr__"] = object.__setattr__
+                        break
+
+        # Traverse the MRO to collect existing slots
+        # and check for an existing __weakref__.
+        existing_slots = dict()
+        weakref_inherited = False
+        for base_cls in self._cls.__mro__[1:-1]:
+            if base_cls.__dict__.get("__weakref__", None) is not None:
+                weakref_inherited = True
+            existing_slots.update(
+                {
+                    name: getattr(base_cls, name)
+                    for name in getattr(base_cls, "__slots__", [])
+                }
+            )
+
+        base_names = set(self._base_names)
+
+        names = self._attr_names
+        if (
+            self._weakref_slot
+            and "__weakref__" not in getattr(self._cls, "__slots__", ())
+            and "__weakref__" not in names
+            and not weakref_inherited
+        ):
+            names += ("__weakref__",)
+
+        # We only add the names of attributes that aren't inherited.
+        # Setting __slots__ to inherited attributes wastes memory.
+        slot_names = [name for name in names if name not in base_names]
+        # There are slots for attributes from current class
+        # that are defined in parent classes.
+        # As their descriptors may be overridden by a child class,
+        # we collect them here and update the class dict.
+        reused_slots = {
+            slot: slot_descriptor
+            for slot, slot_descriptor in iteritems(existing_slots)
+            if slot in slot_names
+        }
+        slot_names = [name for name in slot_names if name not in reused_slots]
+        cd.update(reused_slots)
+        if self._cache_hash:
+            slot_names.append(_hash_cache_field)
+        cd["__slots__"] = tuple(slot_names)
+
+        qualname = getattr(self._cls, "__qualname__", None)
+        if qualname is not None:
+            cd["__qualname__"] = qualname
+
+        # Create new class based on old class and our methods.
+        cls = type(self._cls)(self._cls.__name__, self._cls.__bases__, cd)
+
+        # The following is a fix for
+        # <https://github.com/python-attrs/attrs/issues/102>.
On Python 3, + # if a method mentions `__class__` or uses the no-arg super(), the + # compiler will bake a reference to the class in the method itself + # as `method.__closure__`. Since we replace the class with a + # clone, we rewrite these references so it keeps working. + for item in cls.__dict__.values(): + if isinstance(item, (classmethod, staticmethod)): + # Class- and staticmethods hide their functions inside. + # These might need to be rewritten as well. + closure_cells = getattr(item.__func__, "__closure__", None) + elif isinstance(item, property): + # Workaround for property `super()` shortcut (PY3-only). + # There is no universal way for other descriptors. + closure_cells = getattr(item.fget, "__closure__", None) + else: + closure_cells = getattr(item, "__closure__", None) + + if not closure_cells: # Catch None or the empty list. + continue + for cell in closure_cells: + try: + match = cell.cell_contents is self._cls + except ValueError: # ValueError: Cell is empty + pass + else: + if match: + set_closure_cell(cell, cls) + + return cls + + def add_repr(self, ns): + self._cls_dict["__repr__"] = self._add_method_dunders( + _make_repr(self._attrs, ns, self._cls) + ) + return self + + def add_str(self): + repr = self._cls_dict.get("__repr__") + if repr is None: + raise ValueError( + "__str__ can only be generated if a __repr__ exists." + ) + + def __str__(self): + return self.__repr__() + + self._cls_dict["__str__"] = self._add_method_dunders(__str__) + return self + + def _make_getstate_setstate(self): + """ + Create custom __setstate__ and __getstate__ methods. + """ + # __weakref__ is not writable. + state_attr_names = tuple( + an for an in self._attr_names if an != "__weakref__" + ) + + def slots_getstate(self): + """ + Automatically created by attrs. + """ + return tuple(getattr(self, name) for name in state_attr_names) + + hash_caching_enabled = self._cache_hash + + def slots_setstate(self, state): + """ + Automatically created by attrs. + """ + __bound_setattr = _obj_setattr.__get__(self, Attribute) + for name, value in zip(state_attr_names, state): + __bound_setattr(name, value) + + # The hash code cache is not included when the object is + # serialized, but it still needs to be initialized to None to + # indicate that the first call to __hash__ should be a cache + # miss. 
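
# A minimal illustrative sketch (editor's aside, not part of the vendored
# attrs source): the generated __getstate__/__setstate__ above are what
# make slotted classes picklable. `C` is a hypothetical example class.
#
#     import attr
#     import pickle
#
#     @attr.s(slots=True)
#     class C(object):
#         x = attr.ib()
#
#     restored = pickle.loads(pickle.dumps(C(42)))
#     assert restored.x == 42
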
+ if hash_caching_enabled: + __bound_setattr(_hash_cache_field, None) + + return slots_getstate, slots_setstate + + def make_unhashable(self): + self._cls_dict["__hash__"] = None + return self + + def add_hash(self): + self._cls_dict["__hash__"] = self._add_method_dunders( + _make_hash( + self._cls, + self._attrs, + frozen=self._frozen, + cache_hash=self._cache_hash, + ) + ) + + return self + + def add_init(self): + self._cls_dict["__init__"] = self._add_method_dunders( + _make_init( + self._cls, + self._attrs, + self._has_pre_init, + self._has_post_init, + self._frozen, + self._slots, + self._cache_hash, + self._base_attr_map, + self._is_exc, + self._on_setattr, + attrs_init=False, + ) + ) + + return self + + def add_match_args(self): + self._cls_dict["__match_args__"] = tuple( + field.name + for field in self._attrs + if field.init and not field.kw_only + ) + + def add_attrs_init(self): + self._cls_dict["__attrs_init__"] = self._add_method_dunders( + _make_init( + self._cls, + self._attrs, + self._has_pre_init, + self._has_post_init, + self._frozen, + self._slots, + self._cache_hash, + self._base_attr_map, + self._is_exc, + self._on_setattr, + attrs_init=True, + ) + ) + + return self + + def add_eq(self): + cd = self._cls_dict + + cd["__eq__"] = self._add_method_dunders( + _make_eq(self._cls, self._attrs) + ) + cd["__ne__"] = self._add_method_dunders(_make_ne()) + + return self + + def add_order(self): + cd = self._cls_dict + + cd["__lt__"], cd["__le__"], cd["__gt__"], cd["__ge__"] = ( + self._add_method_dunders(meth) + for meth in _make_order(self._cls, self._attrs) + ) + + return self + + def add_setattr(self): + if self._frozen: + return self + + sa_attrs = {} + for a in self._attrs: + on_setattr = a.on_setattr or self._on_setattr + if on_setattr and on_setattr is not setters.NO_OP: + sa_attrs[a.name] = a, on_setattr + + if not sa_attrs: + return self + + if self._has_custom_setattr: + # We need to write a __setattr__ but there already is one! + raise ValueError( + "Can't combine custom __setattr__ with on_setattr hooks." + ) + + # docstring comes from _add_method_dunders + def __setattr__(self, name, val): + try: + a, hook = sa_attrs[name] + except KeyError: + nval = val + else: + nval = hook(self, a, val) + + _obj_setattr(self, name, nval) + + self._cls_dict["__attrs_own_setattr__"] = True + self._cls_dict["__setattr__"] = self._add_method_dunders(__setattr__) + self._wrote_own_setattr = True + + return self + + def _add_method_dunders(self, method): + """ + Add __module__ and __qualname__ to a *method* if possible. + """ + try: + method.__module__ = self._cls.__module__ + except AttributeError: + pass + + try: + method.__qualname__ = ".".join( + (self._cls.__qualname__, method.__name__) + ) + except AttributeError: + pass + + try: + method.__doc__ = "Method generated by attrs for class %s." % ( + self._cls.__qualname__, + ) + except AttributeError: + pass + + return method + + +_CMP_DEPRECATION = ( + "The usage of `cmp` is deprecated and will be removed on or after " + "2021-06-01. Please use `eq` and `order` instead." +) + + +def _determine_attrs_eq_order(cmp, eq, order, default_eq): + """ + Validate the combination of *cmp*, *eq*, and *order*. Derive the effective + values of eq and order. If *eq* is None, set it to *default_eq*. + """ + if cmp is not None and any((eq is not None, order is not None)): + raise ValueError("Don't mix `cmp` with `eq' and `order`.") + + # cmp takes precedence due to bw-compatibility. 
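
# Illustrative expectations for this helper (an editor's sketch, not part
# of the vendored attrs source):
#
#     _determine_attrs_eq_order(cmp=None, eq=None, order=None, default_eq=True)
#     # -> (True, True): eq falls back to the default, order mirrors eq.
#     _determine_attrs_eq_order(cmp=True, eq=None, order=None, default_eq=True)
#     # -> (True, True): cmp wins for backwards compatibility.
#     _determine_attrs_eq_order(cmp=None, eq=False, order=True, default_eq=True)
#     # -> raises ValueError: ordering without equality is rejected.
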
+    if cmp is not None:
+        return cmp, cmp
+
+    # If left None, equality is set to the specified default and ordering
+    # mirrors equality.
+    if eq is None:
+        eq = default_eq
+
+    if order is None:
+        order = eq
+
+    if eq is False and order is True:
+        raise ValueError("`order` can only be True if `eq` is True too.")
+
+    return eq, order
+
+
+def _determine_attrib_eq_order(cmp, eq, order, default_eq):
+    """
+    Validate the combination of *cmp*, *eq*, and *order*. Derive the effective
+    values of eq and order. If *eq* is None, set it to *default_eq*.
+    """
+    if cmp is not None and any((eq is not None, order is not None)):
+        raise ValueError("Don't mix `cmp` with `eq` and `order`.")
+
+    def decide_callable_or_boolean(value):
+        """
+        Decide whether a key function is used.
+        """
+        if callable(value):
+            value, key = True, value
+        else:
+            key = None
+        return value, key
+
+    # cmp takes precedence due to bw-compatibility.
+    if cmp is not None:
+        cmp, cmp_key = decide_callable_or_boolean(cmp)
+        return cmp, cmp_key, cmp, cmp_key
+
+    # If left None, equality is set to the specified default and ordering
+    # mirrors equality.
+    if eq is None:
+        eq, eq_key = default_eq, None
+    else:
+        eq, eq_key = decide_callable_or_boolean(eq)
+
+    if order is None:
+        order, order_key = eq, eq_key
+    else:
+        order, order_key = decide_callable_or_boolean(order)
+
+    if eq is False and order is True:
+        raise ValueError("`order` can only be True if `eq` is True too.")
+
+    return eq, eq_key, order, order_key
+
+
+def _determine_whether_to_implement(
+    cls, flag, auto_detect, dunders, default=True
+):
+    """
+    Check whether we should implement a set of methods for *cls*.
+
+    *flag* is the argument passed into @attr.s like 'init', *auto_detect* the
+    same as passed into @attr.s and *dunders* is a tuple of attribute names
+    whose presence signals that the user has implemented it themselves.
+
+    Return *default* if no reason either for or against is found.
+
+    auto_detect must be False on Python 2.
+    """
+    if flag is True or flag is False:
+        return flag
+
+    if flag is None and auto_detect is False:
+        return default
+
+    # Logically, flag is None and auto_detect is True here.
+    for dunder in dunders:
+        if _has_own_attribute(cls, dunder):
+            return False
+
+    return default
+
+
+def attrs(
+    maybe_cls=None,
+    these=None,
+    repr_ns=None,
+    repr=None,
+    cmp=None,
+    hash=None,
+    init=None,
+    slots=False,
+    frozen=False,
+    weakref_slot=True,
+    str=False,
+    auto_attribs=False,
+    kw_only=False,
+    cache_hash=False,
+    auto_exc=False,
+    eq=None,
+    order=None,
+    auto_detect=False,
+    collect_by_mro=False,
+    getstate_setstate=None,
+    on_setattr=None,
+    field_transformer=None,
+    match_args=True,
+):
+    r"""
+    A class decorator that adds `dunder
+    <https://wiki.python.org/moin/DunderAlias>`_\ -methods according to the
+    specified attributes using `attr.ib` or the *these* argument.
+
+    :param these: A dictionary of name to `attr.ib` mappings. This is
+        useful to avoid the definition of your attributes within the class body
+        because you can't (e.g. if you want to add ``__repr__`` methods to
+        Django models) or don't want to.
+
+        If *these* is not ``None``, ``attrs`` will *not* search the class body
+        for attributes and will *not* remove any attributes from it.
+
+        If *these* is an ordered dict (`dict` on Python 3.6+,
+        `collections.OrderedDict` otherwise), the order is deduced from
+        the order of the attributes inside *these*. Otherwise the order
+        of the definition of the attributes is used.
+
+    :type these: `dict` of `str` to `attr.ib`
+
+    :param str repr_ns: When using nested classes, there's no way in Python 2
+        to automatically detect that. Therefore it's possible to set the
+        namespace explicitly for a more meaningful ``repr`` output.
+    :param bool auto_detect: Instead of setting the *init*, *repr*, *eq*,
+        *order*, and *hash* arguments explicitly, assume they are set to
+        ``True`` **unless any** of the involved methods for one of the
+        arguments is implemented in the *current* class (i.e. it is *not*
+        inherited from some base class).
+
+        So for example by implementing ``__eq__`` on a class yourself,
+        ``attrs`` will deduce ``eq=False`` and will create *neither*
+        ``__eq__`` *nor* ``__ne__`` (but Python classes come with a sensible
+        ``__ne__`` by default, so it *should* be enough to only implement
+        ``__eq__`` in most cases).
+
+        .. warning::
+
+           If you prevent ``attrs`` from creating the ordering methods for you
+           (``order=False``, e.g. by implementing ``__le__``), it becomes
+           *your* responsibility to make sure its ordering is sound. The best
+           way is to use the `functools.total_ordering` decorator.
+
+
+        Passing ``True`` or ``False`` to *init*, *repr*, *eq*, *order*,
+        *cmp*, or *hash* overrides whatever *auto_detect* would determine.
+
+        *auto_detect* requires Python 3. Setting it ``True`` on Python 2 raises
+        an `attrs.exceptions.PythonTooOldError`.
+
+    :param bool repr: Create a ``__repr__`` method with a human readable
+        representation of ``attrs`` attributes.
+    :param bool str: Create a ``__str__`` method that is identical to
+        ``__repr__``. This is usually not necessary except for
+        `Exception`\ s.
+    :param Optional[bool] eq: If ``True`` or ``None`` (default), add ``__eq__``
+        and ``__ne__`` methods that check two instances for equality.
+
+        They compare the instances as if they were tuples of their ``attrs``
+        attributes if and only if the types of both classes are *identical*!
+    :param Optional[bool] order: If ``True``, add ``__lt__``, ``__le__``,
+        ``__gt__``, and ``__ge__`` methods that behave like *eq* above and
+        allow instances to be ordered. If ``None`` (default) mirror value of
+        *eq*.
+    :param Optional[bool] cmp: Setting *cmp* is equivalent to setting *eq*
+        and *order* to the same value. Must not be mixed with *eq* or *order*.
+    :param Optional[bool] hash: If ``None`` (default), the ``__hash__`` method
+        is generated according to how *eq* and *frozen* are set.
+
+        1. If *both* are True, ``attrs`` will generate a ``__hash__`` for you.
+        2. If *eq* is True and *frozen* is False, ``__hash__`` will be set to
+           None, marking it unhashable (which it is).
+        3. If *eq* is False, ``__hash__`` will be left untouched meaning the
+           ``__hash__`` method of the base class will be used (if base class is
+           ``object``, this means it will fall back to id-based hashing.).
+
+        Although not recommended, you can decide for yourself and force
+        ``attrs`` to create one (e.g. if the class is immutable even though you
+        didn't freeze it programmatically) by passing ``True`` or not. Both of
+        these cases are rather special and should be used carefully.
+
+        See our documentation on `hashing`, Python's documentation on
+        `object.__hash__`, and the `GitHub issue that led to the default \
+        behavior <https://github.com/python-attrs/attrs/issues/136>`_ for more
+        details.
+    :param bool init: Create a ``__init__`` method that initializes the
+        ``attrs`` attributes. Leading underscores are stripped for the argument
+        name. If a ``__attrs_pre_init__`` method exists on the class, it will
+        be called before the class is initialized. If a ``__attrs_post_init__``
+        method exists on the class, it will be called after the class is fully
+        initialized.
+
+        If ``init`` is ``False``, an ``__attrs_init__`` method will be
+        injected instead. This allows you to define a custom ``__init__``
+        method that can do pre-init work such as ``super().__init__()``,
+        and then call ``__attrs_init__()`` and ``__attrs_post_init__()``.
+    :param bool slots: Create a `slotted class <slotted classes>` that's more
+        memory-efficient. Slotted classes are generally superior to the default
+        dict classes, but have some gotchas you should know about, so we
+        encourage you to read the `glossary entry <slotted classes>`.
+    :param bool frozen: Make instances immutable after initialization. If
+        someone attempts to modify a frozen instance,
+        `attr.exceptions.FrozenInstanceError` is raised.
+
+        .. note::
+
+            1. This is achieved by installing a custom ``__setattr__`` method
+               on your class, so you can't implement your own.
+
+            2. True immutability is impossible in Python.
+
+            3. This *does* have a minor runtime performance `impact
+               <how-frozen>` when initializing new instances. In other words:
+               ``__init__`` is slightly slower with ``frozen=True``.
+
+            4. If a class is frozen, you cannot modify ``self`` in
+               ``__attrs_post_init__`` or a self-written ``__init__``. You can
+               circumvent that limitation by using
+               ``object.__setattr__(self, "attribute_name", value)``.
+
+            5. Subclasses of a frozen class are frozen too.
+
+    :param bool weakref_slot: Make instances weak-referenceable. This has no
+        effect unless ``slots`` is also enabled.
+    :param bool auto_attribs: If ``True``, collect `PEP 526`_-annotated
+        attributes (Python 3.6 and later only) from the class body.
+
+        In this case, you **must** annotate every field. If ``attrs``
+        encounters a field that is set to an `attr.ib` but lacks a type
+        annotation, an `attr.exceptions.UnannotatedAttributeError` is
+        raised. Use ``field_name: typing.Any = attr.ib(...)`` if you don't
+        want to set a type.
+
+        If you assign a value to those attributes (e.g. ``x: int = 42``), that
+        value becomes the default value as if it were passed using
+        ``attr.ib(default=42)``. Passing an instance of `attrs.Factory` also
+        works as expected in most cases (see warning below).
+
+        Attributes annotated as `typing.ClassVar`, and attributes that are
+        neither annotated nor set to an `attr.ib` are **ignored**.
+
+        .. warning::
+           For features that use the attribute name to create decorators (e.g.
+           `validators <examples_validators>`), you still *must* assign
+           `attr.ib` to them. Otherwise Python will either not find the name
+           or try to use the default value to call e.g. ``validator`` on it.
+
+           These errors can be quite confusing and are probably the most common
+           bug report on our bug tracker.
+
+        .. _`PEP 526`: https://www.python.org/dev/peps/pep-0526/
+    :param bool kw_only: Make all attributes keyword-only (Python 3+)
+        in the generated ``__init__`` (if ``init`` is ``False``, this
+        parameter is ignored).
+    :param bool cache_hash: Ensure that the object's hash code is computed
+        only once and stored on the object. If this is set to ``True``,
+        hashing must be either explicitly or implicitly enabled for this
+        class. If the hash code is cached, avoid any reassignments of
+        fields involved in hash code computation or mutations of the objects
+        those fields point to after object creation. If such changes occur,
+        the behavior of the object's hash code is undefined.
+    :param bool auto_exc: If the class subclasses `BaseException`
+        (which implicitly includes any subclass of any exception), the
+        following happens to make it behave like a well-behaved Python
+        exception class:
+
+        - the values for *eq*, *order*, and *hash* are ignored and the
+          instances compare and hash by the instance's ids (N.B. ``attrs`` will
+          *not* remove existing implementations of ``__hash__`` or the equality
+          methods. It just won't add its own.),
+        - all attributes that are either passed into ``__init__`` or have a
+          default value are additionally available as a tuple in the ``args``
+          attribute,
+        - the value of *str* is ignored leaving ``__str__`` to base classes.
+    :param bool collect_by_mro: Setting this to `True` fixes the way ``attrs``
+        collects attributes from base classes. The default behavior is
+        incorrect in certain cases of multiple inheritance. It should be on by
+        default but is kept off for backward-compatibility.
+
+        See issue `#428 <https://github.com/python-attrs/attrs/issues/428>`_
+        for more details.
+
+    :param Optional[bool] getstate_setstate:
+        .. note::
+            This is usually only interesting for slotted classes and you should
+            probably just set *auto_detect* to `True`.
+
+        If `True`, ``__getstate__`` and
+        ``__setstate__`` are generated and attached to the class. This is
+        necessary for slotted classes to be pickleable. If left `None`, it's
+        `True` by default for slotted classes and ``False`` for dict classes.
+
+        If *auto_detect* is `True`, and *getstate_setstate* is left `None`,
+        and **either** ``__getstate__`` or ``__setstate__`` is detected directly
+        on the class (i.e. not inherited), it is set to `False` (this is usually
+        what you want).
+
+    :param on_setattr: A callable that is run whenever the user attempts to set
+        an attribute (either by assignment like ``i.x = 42`` or by using
+        `setattr` like ``setattr(i, "x", 42)``). It receives the same arguments
+        as validators: the instance, the attribute that is being modified, and
+        the new value.
+
+        If no exception is raised, the attribute is set to the return value of
+        the callable.
+
+        If a list of callables is passed, they're automatically wrapped in an
+        `attrs.setters.pipe`.
+
+    :param Optional[callable] field_transformer:
+        A function that is called with the original class object and all
+        fields right before ``attrs`` finalizes the class. You can use
+        this, e.g., to automatically add converters or validators to
+        fields based on their types. See `transform-fields` for more details.
+
+    :param bool match_args:
+        If `True` (default), set ``__match_args__`` on the class to support
+        `PEP 634 <https://www.python.org/dev/peps/pep-0634/>`_ (Structural
+        Pattern Matching). It is a tuple of all non-keyword-only ``__init__``
+        parameter names on Python 3.10 and later. Ignored on older Python
+        versions.
+
+    .. versionadded:: 16.0.0 *slots*
+    .. versionadded:: 16.1.0 *frozen*
+    .. versionadded:: 16.3.0 *str*
+    .. versionadded:: 16.3.0 Support for ``__attrs_post_init__``.
+    .. versionchanged:: 17.1.0
+       *hash* supports ``None`` as value which is also the default now.
+    .. versionadded:: 17.3.0 *auto_attribs*
+    .. versionchanged:: 18.1.0
+       If *these* is passed, no attributes are deleted from the class body.
+    .. versionchanged:: 18.1.0 If *these* is ordered, the order is retained.
+    .. versionadded:: 18.2.0 *weakref_slot*
+    .. deprecated:: 18.2.0
+       ``__lt__``, ``__le__``, ``__gt__``, and ``__ge__`` now raise a
+       `DeprecationWarning` if the classes compared are subclasses of
+       each other. ``__eq__`` and ``__ne__`` never tried to compare subclasses
+       to each other.
+    .. 
versionchanged:: 19.2.0 + ``__lt__``, ``__le__``, ``__gt__``, and ``__ge__`` now do not consider + subclasses comparable anymore. + .. versionadded:: 18.2.0 *kw_only* + .. versionadded:: 18.2.0 *cache_hash* + .. versionadded:: 19.1.0 *auto_exc* + .. deprecated:: 19.2.0 *cmp* Removal on or after 2021-06-01. + .. versionadded:: 19.2.0 *eq* and *order* + .. versionadded:: 20.1.0 *auto_detect* + .. versionadded:: 20.1.0 *collect_by_mro* + .. versionadded:: 20.1.0 *getstate_setstate* + .. versionadded:: 20.1.0 *on_setattr* + .. versionadded:: 20.3.0 *field_transformer* + .. versionchanged:: 21.1.0 + ``init=False`` injects ``__attrs_init__`` + .. versionchanged:: 21.1.0 Support for ``__attrs_pre_init__`` + .. versionchanged:: 21.1.0 *cmp* undeprecated + .. versionadded:: 21.3.0 *match_args* + """ + if auto_detect and PY2: + raise PythonTooOldError( + "auto_detect only works on Python 3 and later." + ) + + eq_, order_ = _determine_attrs_eq_order(cmp, eq, order, None) + hash_ = hash # work around the lack of nonlocal + + if isinstance(on_setattr, (list, tuple)): + on_setattr = setters.pipe(*on_setattr) + + def wrap(cls): + + if getattr(cls, "__class__", None) is None: + raise TypeError("attrs only works with new-style classes.") + + is_frozen = frozen or _has_frozen_base_class(cls) + is_exc = auto_exc is True and issubclass(cls, BaseException) + has_own_setattr = auto_detect and _has_own_attribute( + cls, "__setattr__" + ) + + if has_own_setattr and is_frozen: + raise ValueError("Can't freeze a class with a custom __setattr__.") + + builder = _ClassBuilder( + cls, + these, + slots, + is_frozen, + weakref_slot, + _determine_whether_to_implement( + cls, + getstate_setstate, + auto_detect, + ("__getstate__", "__setstate__"), + default=slots, + ), + auto_attribs, + kw_only, + cache_hash, + is_exc, + collect_by_mro, + on_setattr, + has_own_setattr, + field_transformer, + ) + if _determine_whether_to_implement( + cls, repr, auto_detect, ("__repr__",) + ): + builder.add_repr(repr_ns) + if str is True: + builder.add_str() + + eq = _determine_whether_to_implement( + cls, eq_, auto_detect, ("__eq__", "__ne__") + ) + if not is_exc and eq is True: + builder.add_eq() + if not is_exc and _determine_whether_to_implement( + cls, order_, auto_detect, ("__lt__", "__le__", "__gt__", "__ge__") + ): + builder.add_order() + + builder.add_setattr() + + if ( + hash_ is None + and auto_detect is True + and _has_own_attribute(cls, "__hash__") + ): + hash = False + else: + hash = hash_ + if hash is not True and hash is not False and hash is not None: + # Can't use `hash in` because 1 == True for example. + raise TypeError( + "Invalid value for hash. Must be True, False, or None." + ) + elif hash is False or (hash is None and eq is False) or is_exc: + # Don't do anything. Should fall back to __object__'s __hash__ + # which is by id. + if cache_hash: + raise TypeError( + "Invalid value for cache_hash. To use hash caching," + " hashing must be either explicitly or implicitly " + "enabled." + ) + elif hash is True or ( + hash is None and eq is True and is_frozen is True + ): + # Build a __hash__ if told so, or if it's safe. + builder.add_hash() + else: + # Raise TypeError on attempts to hash. + if cache_hash: + raise TypeError( + "Invalid value for cache_hash. To use hash caching," + " hashing must be either explicitly or implicitly " + "enabled." 
+                )
+            builder.make_unhashable()
+
+        if _determine_whether_to_implement(
+            cls, init, auto_detect, ("__init__",)
+        ):
+            builder.add_init()
+        else:
+            builder.add_attrs_init()
+            if cache_hash:
+                raise TypeError(
+                    "Invalid value for cache_hash. To use hash caching,"
+                    " init must be True."
+                )
+
+        if (
+            PY310
+            and match_args
+            and not _has_own_attribute(cls, "__match_args__")
+        ):
+            builder.add_match_args()
+
+        return builder.build_class()
+
+    # maybe_cls's type depends on the usage of the decorator. It's a class
+    # if it's used as `@attrs` but ``None`` if used as `@attrs()`.
+    if maybe_cls is None:
+        return wrap
+    else:
+        return wrap(maybe_cls)
+
+
+_attrs = attrs
+"""
+Internal alias so we can use it in functions that take an argument called
+*attrs*.
+"""
+
+
+if PY2:
+
+    def _has_frozen_base_class(cls):
+        """
+        Check whether *cls* has a frozen ancestor by looking at its
+        __setattr__.
+        """
+        return (
+            getattr(cls.__setattr__, "__module__", None)
+            == _frozen_setattrs.__module__
+            and cls.__setattr__.__name__ == _frozen_setattrs.__name__
+        )
+
+else:
+
+    def _has_frozen_base_class(cls):
+        """
+        Check whether *cls* has a frozen ancestor by looking at its
+        __setattr__.
+        """
+        return cls.__setattr__ == _frozen_setattrs
+
+
+def _generate_unique_filename(cls, func_name):
+    """
+    Create a "filename" suitable for a function being generated.
+    """
+    unique_filename = "<attrs generated {0} {1}.{2}>".format(
+        func_name,
+        cls.__module__,
+        getattr(cls, "__qualname__", cls.__name__),
+    )
+    return unique_filename
+
+
+def _make_hash(cls, attrs, frozen, cache_hash):
+    attrs = tuple(
+        a for a in attrs if a.hash is True or (a.hash is None and a.eq is True)
+    )
+
+    tab = "        "
+
+    unique_filename = _generate_unique_filename(cls, "hash")
+    type_hash = hash(unique_filename)
+
+    hash_def = "def __hash__(self"
+    hash_func = "hash(("
+    closing_braces = "))"
+    if not cache_hash:
+        hash_def += "):"
+    else:
+        if not PY2:
+            hash_def += ", *"
+
+        hash_def += (
+            ", _cache_wrapper="
+            + "__import__('attr._make')._make._CacheHashWrapper):"
+        )
+        hash_func = "_cache_wrapper(" + hash_func
+        closing_braces += ")"
+
+    method_lines = [hash_def]
+
+    def append_hash_computation_lines(prefix, indent):
+        """
+        Generate the code for actually computing the hash code.
+        Below this will either be returned directly or used to compute
+        a value which is then cached, depending on the value of cache_hash
+        """
+
+        method_lines.extend(
+            [
+                indent + prefix + hash_func,
+                indent + "  %d," % (type_hash,),
+            ]
+        )
+
+        for a in attrs:
+            method_lines.append(indent + "  self.%s," % a.name)
+
+        method_lines.append(indent + "  " + closing_braces)
+
+    if cache_hash:
+        method_lines.append(tab + "if self.%s is None:" % _hash_cache_field)
+        if frozen:
+            append_hash_computation_lines(
+                "object.__setattr__(self, '%s', " % _hash_cache_field, tab * 2
+            )
+            method_lines.append(tab * 2 + ")")  # close __setattr__
+        else:
+            append_hash_computation_lines(
+                "self.%s = " % _hash_cache_field, tab * 2
+            )
+        method_lines.append(tab + "return self.%s" % _hash_cache_field)
+    else:
+        append_hash_computation_lines("return ", tab)
+
+    script = "\n".join(method_lines)
+    return _make_method("__hash__", script, unique_filename)
+
+
+def _add_hash(cls, attrs):
+    """
+    Add a hash method to *cls*.
+    """
+    cls.__hash__ = _make_hash(cls, attrs, frozen=False, cache_hash=False)
+    return cls
+
+
+def _make_ne():
+    """
+    Create __ne__ method.
+    """
+
+    def __ne__(self, other):
+        """
+        Check equality and either forward a NotImplemented or
+        return the result negated.
+ """ + result = self.__eq__(other) + if result is NotImplemented: + return NotImplemented + + return not result + + return __ne__ + + +def _make_eq(cls, attrs): + """ + Create __eq__ method for *cls* with *attrs*. + """ + attrs = [a for a in attrs if a.eq] + + unique_filename = _generate_unique_filename(cls, "eq") + lines = [ + "def __eq__(self, other):", + " if other.__class__ is not self.__class__:", + " return NotImplemented", + ] + + # We can't just do a big self.x = other.x and... clause due to + # irregularities like nan == nan is false but (nan,) == (nan,) is true. + globs = {} + if attrs: + lines.append(" return (") + others = [" ) == ("] + for a in attrs: + if a.eq_key: + cmp_name = "_%s_key" % (a.name,) + # Add the key function to the global namespace + # of the evaluated function. + globs[cmp_name] = a.eq_key + lines.append( + " %s(self.%s)," + % ( + cmp_name, + a.name, + ) + ) + others.append( + " %s(other.%s)," + % ( + cmp_name, + a.name, + ) + ) + else: + lines.append(" self.%s," % (a.name,)) + others.append(" other.%s," % (a.name,)) + + lines += others + [" )"] + else: + lines.append(" return True") + + script = "\n".join(lines) + + return _make_method("__eq__", script, unique_filename, globs) + + +def _make_order(cls, attrs): + """ + Create ordering methods for *cls* with *attrs*. + """ + attrs = [a for a in attrs if a.order] + + def attrs_to_tuple(obj): + """ + Save us some typing. + """ + return tuple( + key(value) if key else value + for value, key in ( + (getattr(obj, a.name), a.order_key) for a in attrs + ) + ) + + def __lt__(self, other): + """ + Automatically created by attrs. + """ + if other.__class__ is self.__class__: + return attrs_to_tuple(self) < attrs_to_tuple(other) + + return NotImplemented + + def __le__(self, other): + """ + Automatically created by attrs. + """ + if other.__class__ is self.__class__: + return attrs_to_tuple(self) <= attrs_to_tuple(other) + + return NotImplemented + + def __gt__(self, other): + """ + Automatically created by attrs. + """ + if other.__class__ is self.__class__: + return attrs_to_tuple(self) > attrs_to_tuple(other) + + return NotImplemented + + def __ge__(self, other): + """ + Automatically created by attrs. + """ + if other.__class__ is self.__class__: + return attrs_to_tuple(self) >= attrs_to_tuple(other) + + return NotImplemented + + return __lt__, __le__, __gt__, __ge__ + + +def _add_eq(cls, attrs=None): + """ + Add equality methods to *cls* with *attrs*. + """ + if attrs is None: + attrs = cls.__attrs_attrs__ + + cls.__eq__ = _make_eq(cls, attrs) + cls.__ne__ = _make_ne() + + return cls + + +if HAS_F_STRINGS: + + def _make_repr(attrs, ns, cls): + unique_filename = _generate_unique_filename(cls, "repr") + # Figure out which attributes to include, and which function to use to + # format them. The a.repr value can be either bool or a custom + # callable. + attr_names_with_reprs = tuple( + (a.name, (repr if a.repr is True else a.repr), a.init) + for a in attrs + if a.repr is not False + ) + globs = { + name + "_repr": r + for name, r, _ in attr_names_with_reprs + if r != repr + } + globs["_compat"] = _compat + globs["AttributeError"] = AttributeError + globs["NOTHING"] = NOTHING + attribute_fragments = [] + for name, r, i in attr_names_with_reprs: + accessor = ( + "self." 
+ name + if i + else 'getattr(self, "' + name + '", NOTHING)' + ) + fragment = ( + "%s={%s!r}" % (name, accessor) + if r == repr + else "%s={%s_repr(%s)}" % (name, name, accessor) + ) + attribute_fragments.append(fragment) + repr_fragment = ", ".join(attribute_fragments) + + if ns is None: + cls_name_fragment = ( + '{self.__class__.__qualname__.rsplit(">.", 1)[-1]}' + ) + else: + cls_name_fragment = ns + ".{self.__class__.__name__}" + + lines = [ + "def __repr__(self):", + " try:", + " already_repring = _compat.repr_context.already_repring", + " except AttributeError:", + " already_repring = {id(self),}", + " _compat.repr_context.already_repring = already_repring", + " else:", + " if id(self) in already_repring:", + " return '...'", + " else:", + " already_repring.add(id(self))", + " try:", + " return f'%s(%s)'" % (cls_name_fragment, repr_fragment), + " finally:", + " already_repring.remove(id(self))", + ] + + return _make_method( + "__repr__", "\n".join(lines), unique_filename, globs=globs + ) + +else: + + def _make_repr(attrs, ns, _): + """ + Make a repr method that includes relevant *attrs*, adding *ns* to the + full name. + """ + + # Figure out which attributes to include, and which function to use to + # format them. The a.repr value can be either bool or a custom + # callable. + attr_names_with_reprs = tuple( + (a.name, repr if a.repr is True else a.repr) + for a in attrs + if a.repr is not False + ) + + def __repr__(self): + """ + Automatically created by attrs. + """ + try: + already_repring = _compat.repr_context.already_repring + except AttributeError: + already_repring = set() + _compat.repr_context.already_repring = already_repring + + if id(self) in already_repring: + return "..." + real_cls = self.__class__ + if ns is None: + qualname = getattr(real_cls, "__qualname__", None) + if qualname is not None: # pragma: no cover + # This case only happens on Python 3.5 and 3.6. We exclude + # it from coverage, because we don't want to slow down our + # test suite by running them under coverage too for this + # one line. + class_name = qualname.rsplit(">.", 1)[-1] + else: + class_name = real_cls.__name__ + else: + class_name = ns + "." + real_cls.__name__ + + # Since 'self' remains on the stack (i.e.: strongly referenced) + # for the duration of this call, it's safe to depend on id(...) + # stability, and not need to track the instance and therefore + # worry about properties like weakref- or hash-ability. + already_repring.add(id(self)) + try: + result = [class_name, "("] + first = True + for name, attr_repr in attr_names_with_reprs: + if first: + first = False + else: + result.append(", ") + result.extend( + (name, "=", attr_repr(getattr(self, name, NOTHING))) + ) + return "".join(result) + ")" + finally: + already_repring.remove(id(self)) + + return __repr__ + + +def _add_repr(cls, ns=None, attrs=None): + """ + Add a repr method to *cls*. + """ + if attrs is None: + attrs = cls.__attrs_attrs__ + + cls.__repr__ = _make_repr(attrs, ns, cls) + return cls + + +def fields(cls): + """ + Return the tuple of ``attrs`` attributes for a class. + + The tuple also allows accessing the fields by their names (see below for + examples). + + :param type cls: Class to introspect. + + :raise TypeError: If *cls* is not a class. + :raise attr.exceptions.NotAnAttrsClassError: If *cls* is not an ``attrs`` + class. + + :rtype: tuple (with name accessors) of `attrs.Attribute` + + .. versionchanged:: 16.2.0 Returned tuple allows accessing the fields + by name. 
+ """ + if not isclass(cls): + raise TypeError("Passed object must be a class.") + attrs = getattr(cls, "__attrs_attrs__", None) + if attrs is None: + raise NotAnAttrsClassError( + "{cls!r} is not an attrs-decorated class.".format(cls=cls) + ) + return attrs + + +def fields_dict(cls): + """ + Return an ordered dictionary of ``attrs`` attributes for a class, whose + keys are the attribute names. + + :param type cls: Class to introspect. + + :raise TypeError: If *cls* is not a class. + :raise attr.exceptions.NotAnAttrsClassError: If *cls* is not an ``attrs`` + class. + + :rtype: an ordered dict where keys are attribute names and values are + `attrs.Attribute`\\ s. This will be a `dict` if it's + naturally ordered like on Python 3.6+ or an + :class:`~collections.OrderedDict` otherwise. + + .. versionadded:: 18.1.0 + """ + if not isclass(cls): + raise TypeError("Passed object must be a class.") + attrs = getattr(cls, "__attrs_attrs__", None) + if attrs is None: + raise NotAnAttrsClassError( + "{cls!r} is not an attrs-decorated class.".format(cls=cls) + ) + return ordered_dict(((a.name, a) for a in attrs)) + + +def validate(inst): + """ + Validate all attributes on *inst* that have a validator. + + Leaves all exceptions through. + + :param inst: Instance of a class with ``attrs`` attributes. + """ + if _config._run_validators is False: + return + + for a in fields(inst.__class__): + v = a.validator + if v is not None: + v(inst, a, getattr(inst, a.name)) + + +def _is_slot_cls(cls): + return "__slots__" in cls.__dict__ + + +def _is_slot_attr(a_name, base_attr_map): + """ + Check if the attribute name comes from a slot class. + """ + return a_name in base_attr_map and _is_slot_cls(base_attr_map[a_name]) + + +def _make_init( + cls, + attrs, + pre_init, + post_init, + frozen, + slots, + cache_hash, + base_attr_map, + is_exc, + cls_on_setattr, + attrs_init, +): + has_cls_on_setattr = ( + cls_on_setattr is not None and cls_on_setattr is not setters.NO_OP + ) + + if frozen and has_cls_on_setattr: + raise ValueError("Frozen classes can't use on_setattr.") + + needs_cached_setattr = cache_hash or frozen + filtered_attrs = [] + attr_dict = {} + for a in attrs: + if not a.init and a.default is NOTHING: + continue + + filtered_attrs.append(a) + attr_dict[a.name] = a + + if a.on_setattr is not None: + if frozen is True: + raise ValueError("Frozen classes can't use on_setattr.") + + needs_cached_setattr = True + elif has_cls_on_setattr and a.on_setattr is not setters.NO_OP: + needs_cached_setattr = True + + unique_filename = _generate_unique_filename(cls, "init") + + script, globs, annotations = _attrs_to_init_script( + filtered_attrs, + frozen, + slots, + pre_init, + post_init, + cache_hash, + base_attr_map, + is_exc, + needs_cached_setattr, + has_cls_on_setattr, + attrs_init, + ) + if cls.__module__ in sys.modules: + # This makes typing.get_type_hints(CLS.__init__) resolve string types. + globs.update(sys.modules[cls.__module__].__dict__) + + globs.update({"NOTHING": NOTHING, "attr_dict": attr_dict}) + + if needs_cached_setattr: + # Save the lookup overhead in __init__ if we need to circumvent + # setattr hooks. + globs["_cached_setattr"] = _obj_setattr + + init = _make_method( + "__attrs_init__" if attrs_init else "__init__", + script, + unique_filename, + globs, + ) + init.__annotations__ = annotations + + return init + + +def _setattr(attr_name, value_var, has_on_setattr): + """ + Use the cached object.setattr to set *attr_name* to *value_var*. 
+ """ + return "_setattr('%s', %s)" % (attr_name, value_var) + + +def _setattr_with_converter(attr_name, value_var, has_on_setattr): + """ + Use the cached object.setattr to set *attr_name* to *value_var*, but run + its converter first. + """ + return "_setattr('%s', %s(%s))" % ( + attr_name, + _init_converter_pat % (attr_name,), + value_var, + ) + + +def _assign(attr_name, value, has_on_setattr): + """ + Unless *attr_name* has an on_setattr hook, use normal assignment. Otherwise + relegate to _setattr. + """ + if has_on_setattr: + return _setattr(attr_name, value, True) + + return "self.%s = %s" % (attr_name, value) + + +def _assign_with_converter(attr_name, value_var, has_on_setattr): + """ + Unless *attr_name* has an on_setattr hook, use normal assignment after + conversion. Otherwise relegate to _setattr_with_converter. + """ + if has_on_setattr: + return _setattr_with_converter(attr_name, value_var, True) + + return "self.%s = %s(%s)" % ( + attr_name, + _init_converter_pat % (attr_name,), + value_var, + ) + + +if PY2: + + def _unpack_kw_only_py2(attr_name, default=None): + """ + Unpack *attr_name* from _kw_only dict. + """ + if default is not None: + arg_default = ", %s" % default + else: + arg_default = "" + return "%s = _kw_only.pop('%s'%s)" % ( + attr_name, + attr_name, + arg_default, + ) + + def _unpack_kw_only_lines_py2(kw_only_args): + """ + Unpack all *kw_only_args* from _kw_only dict and handle errors. + + Given a list of strings "{attr_name}" and "{attr_name}={default}" + generates list of lines of code that pop attrs from _kw_only dict and + raise TypeError similar to builtin if required attr is missing or + extra key is passed. + + >>> print("\n".join(_unpack_kw_only_lines_py2(["a", "b=42"]))) + try: + a = _kw_only.pop('a') + b = _kw_only.pop('b', 42) + except KeyError as _key_error: + raise TypeError( + ... + if _kw_only: + raise TypeError( + ... + """ + lines = ["try:"] + lines.extend( + " " + _unpack_kw_only_py2(*arg.split("=")) + for arg in kw_only_args + ) + lines += """\ +except KeyError as _key_error: + raise TypeError( + '__init__() missing required keyword-only argument: %s' % _key_error + ) +if _kw_only: + raise TypeError( + '__init__() got an unexpected keyword argument %r' + % next(iter(_kw_only)) + ) +""".split( + "\n" + ) + return lines + + +def _attrs_to_init_script( + attrs, + frozen, + slots, + pre_init, + post_init, + cache_hash, + base_attr_map, + is_exc, + needs_cached_setattr, + has_cls_on_setattr, + attrs_init, +): + """ + Return a script of an initializer for *attrs* and a dict of globals. + + The globals are expected by the generated script. + + If *frozen* is True, we cannot set the attributes directly so we use + a cached ``object.__setattr__``. + """ + lines = [] + if pre_init: + lines.append("self.__attrs_pre_init__()") + + if needs_cached_setattr: + lines.append( + # Circumvent the __setattr__ descriptor to save one lookup per + # assignment. + # Note _setattr will be used again below if cache_hash is True + "_setattr = _cached_setattr.__get__(self, self.__class__)" + ) + + if frozen is True: + if slots is True: + fmt_setter = _setattr + fmt_setter_with_converter = _setattr_with_converter + else: + # Dict frozen classes assign directly to __dict__. + # But only if the attribute doesn't come from an ancestor slot + # class. 
+ # Note _inst_dict will be used again below if cache_hash is True + lines.append("_inst_dict = self.__dict__") + + def fmt_setter(attr_name, value_var, has_on_setattr): + if _is_slot_attr(attr_name, base_attr_map): + return _setattr(attr_name, value_var, has_on_setattr) + + return "_inst_dict['%s'] = %s" % (attr_name, value_var) + + def fmt_setter_with_converter( + attr_name, value_var, has_on_setattr + ): + if has_on_setattr or _is_slot_attr(attr_name, base_attr_map): + return _setattr_with_converter( + attr_name, value_var, has_on_setattr + ) + + return "_inst_dict['%s'] = %s(%s)" % ( + attr_name, + _init_converter_pat % (attr_name,), + value_var, + ) + + else: + # Not frozen. + fmt_setter = _assign + fmt_setter_with_converter = _assign_with_converter + + args = [] + kw_only_args = [] + attrs_to_validate = [] + + # This is a dictionary of names to validator and converter callables. + # Injecting this into __init__ globals lets us avoid lookups. + names_for_globals = {} + annotations = {"return": None} + + for a in attrs: + if a.validator: + attrs_to_validate.append(a) + + attr_name = a.name + has_on_setattr = a.on_setattr is not None or ( + a.on_setattr is not setters.NO_OP and has_cls_on_setattr + ) + arg_name = a.name.lstrip("_") + + has_factory = isinstance(a.default, Factory) + if has_factory and a.default.takes_self: + maybe_self = "self" + else: + maybe_self = "" + + if a.init is False: + if has_factory: + init_factory_name = _init_factory_pat.format(a.name) + if a.converter is not None: + lines.append( + fmt_setter_with_converter( + attr_name, + init_factory_name + "(%s)" % (maybe_self,), + has_on_setattr, + ) + ) + conv_name = _init_converter_pat % (a.name,) + names_for_globals[conv_name] = a.converter + else: + lines.append( + fmt_setter( + attr_name, + init_factory_name + "(%s)" % (maybe_self,), + has_on_setattr, + ) + ) + names_for_globals[init_factory_name] = a.default.factory + else: + if a.converter is not None: + lines.append( + fmt_setter_with_converter( + attr_name, + "attr_dict['%s'].default" % (attr_name,), + has_on_setattr, + ) + ) + conv_name = _init_converter_pat % (a.name,) + names_for_globals[conv_name] = a.converter + else: + lines.append( + fmt_setter( + attr_name, + "attr_dict['%s'].default" % (attr_name,), + has_on_setattr, + ) + ) + elif a.default is not NOTHING and not has_factory: + arg = "%s=attr_dict['%s'].default" % (arg_name, attr_name) + if a.kw_only: + kw_only_args.append(arg) + else: + args.append(arg) + + if a.converter is not None: + lines.append( + fmt_setter_with_converter( + attr_name, arg_name, has_on_setattr + ) + ) + names_for_globals[ + _init_converter_pat % (a.name,) + ] = a.converter + else: + lines.append(fmt_setter(attr_name, arg_name, has_on_setattr)) + + elif has_factory: + arg = "%s=NOTHING" % (arg_name,) + if a.kw_only: + kw_only_args.append(arg) + else: + args.append(arg) + lines.append("if %s is not NOTHING:" % (arg_name,)) + + init_factory_name = _init_factory_pat.format(a.name) + if a.converter is not None: + lines.append( + " " + + fmt_setter_with_converter( + attr_name, arg_name, has_on_setattr + ) + ) + lines.append("else:") + lines.append( + " " + + fmt_setter_with_converter( + attr_name, + init_factory_name + "(" + maybe_self + ")", + has_on_setattr, + ) + ) + names_for_globals[ + _init_converter_pat % (a.name,) + ] = a.converter + else: + lines.append( + " " + fmt_setter(attr_name, arg_name, has_on_setattr) + ) + lines.append("else:") + lines.append( + " " + + fmt_setter( + attr_name, + init_factory_name + "(" + maybe_self 
+ ")", + has_on_setattr, + ) + ) + names_for_globals[init_factory_name] = a.default.factory + else: + if a.kw_only: + kw_only_args.append(arg_name) + else: + args.append(arg_name) + + if a.converter is not None: + lines.append( + fmt_setter_with_converter( + attr_name, arg_name, has_on_setattr + ) + ) + names_for_globals[ + _init_converter_pat % (a.name,) + ] = a.converter + else: + lines.append(fmt_setter(attr_name, arg_name, has_on_setattr)) + + if a.init is True: + if a.type is not None and a.converter is None: + annotations[arg_name] = a.type + elif a.converter is not None and not PY2: + # Try to get the type from the converter. + sig = None + try: + sig = inspect.signature(a.converter) + except (ValueError, TypeError): # inspect failed + pass + if sig: + sig_params = list(sig.parameters.values()) + if ( + sig_params + and sig_params[0].annotation + is not inspect.Parameter.empty + ): + annotations[arg_name] = sig_params[0].annotation + + if attrs_to_validate: # we can skip this if there are no validators. + names_for_globals["_config"] = _config + lines.append("if _config._run_validators is True:") + for a in attrs_to_validate: + val_name = "__attr_validator_" + a.name + attr_name = "__attr_" + a.name + lines.append( + " %s(self, %s, self.%s)" % (val_name, attr_name, a.name) + ) + names_for_globals[val_name] = a.validator + names_for_globals[attr_name] = a + + if post_init: + lines.append("self.__attrs_post_init__()") + + # because this is set only after __attrs_post_init is called, a crash + # will result if post-init tries to access the hash code. This seemed + # preferable to setting this beforehand, in which case alteration to + # field values during post-init combined with post-init accessing the + # hash code would result in silent bugs. + if cache_hash: + if frozen: + if slots: + # if frozen and slots, then _setattr defined above + init_hash_cache = "_setattr('%s', %s)" + else: + # if frozen and not slots, then _inst_dict defined above + init_hash_cache = "_inst_dict['%s'] = %s" + else: + init_hash_cache = "self.%s = %s" + lines.append(init_hash_cache % (_hash_cache_field, "None")) + + # For exceptions we rely on BaseException.__init__ for proper + # initialization. + if is_exc: + vals = ",".join("self." + a.name for a in attrs if a.init) + + lines.append("BaseException.__init__(self, %s)" % (vals,)) + + args = ", ".join(args) + if kw_only_args: + if PY2: + lines = _unpack_kw_only_lines_py2(kw_only_args) + lines + + args += "%s**_kw_only" % (", " if args else "",) # leading comma + else: + args += "%s*, %s" % ( + ", " if args else "", # leading comma + ", ".join(kw_only_args), # kw_only args + ) + return ( + """\ +def {init_name}(self, {args}): + {lines} +""".format( + init_name=("__attrs_init__" if attrs_init else "__init__"), + args=args, + lines="\n ".join(lines) if lines else "pass", + ), + names_for_globals, + annotations, + ) + + +class Attribute(object): + """ + *Read-only* representation of an attribute. + + The class has *all* arguments of `attr.ib` (except for ``factory`` + which is only syntactic sugar for ``default=Factory(...)`` plus the + following: + + - ``name`` (`str`): The name of the attribute. + - ``inherited`` (`bool`): Whether or not that attribute has been inherited + from a base class. + - ``eq_key`` and ``order_key`` (`typing.Callable` or `None`): The callables + that are used for comparing and ordering objects by this attribute, + respectively. These are set by passing a callable to `attr.ib`'s ``eq``, + ``order``, or ``cmp`` arguments. 
See also :ref:`comparison customization + `. + + Instances of this class are frequently used for introspection purposes + like: + + - `fields` returns a tuple of them. + - Validators get them passed as the first argument. + - The :ref:`field transformer ` hook receives a list of + them. + + .. versionadded:: 20.1.0 *inherited* + .. versionadded:: 20.1.0 *on_setattr* + .. versionchanged:: 20.2.0 *inherited* is not taken into account for + equality checks and hashing anymore. + .. versionadded:: 21.1.0 *eq_key* and *order_key* + + For the full version history of the fields, see `attr.ib`. + """ + + __slots__ = ( + "name", + "default", + "validator", + "repr", + "eq", + "eq_key", + "order", + "order_key", + "hash", + "init", + "metadata", + "type", + "converter", + "kw_only", + "inherited", + "on_setattr", + ) + + def __init__( + self, + name, + default, + validator, + repr, + cmp, # XXX: unused, remove along with other cmp code. + hash, + init, + inherited, + metadata=None, + type=None, + converter=None, + kw_only=False, + eq=None, + eq_key=None, + order=None, + order_key=None, + on_setattr=None, + ): + eq, eq_key, order, order_key = _determine_attrib_eq_order( + cmp, eq_key or eq, order_key or order, True + ) + + # Cache this descriptor here to speed things up later. + bound_setattr = _obj_setattr.__get__(self, Attribute) + + # Despite the big red warning, people *do* instantiate `Attribute` + # themselves. + bound_setattr("name", name) + bound_setattr("default", default) + bound_setattr("validator", validator) + bound_setattr("repr", repr) + bound_setattr("eq", eq) + bound_setattr("eq_key", eq_key) + bound_setattr("order", order) + bound_setattr("order_key", order_key) + bound_setattr("hash", hash) + bound_setattr("init", init) + bound_setattr("converter", converter) + bound_setattr( + "metadata", + ( + metadata_proxy(metadata) + if metadata + else _empty_metadata_singleton + ), + ) + bound_setattr("type", type) + bound_setattr("kw_only", kw_only) + bound_setattr("inherited", inherited) + bound_setattr("on_setattr", on_setattr) + + def __setattr__(self, name, value): + raise FrozenInstanceError() + + @classmethod + def from_counting_attr(cls, name, ca, type=None): + # type holds the annotated value. deal with conflicts: + if type is None: + type = ca.type + elif ca.type is not None: + raise ValueError( + "Type annotation and type argument cannot both be present" + ) + inst_dict = { + k: getattr(ca, k) + for k in Attribute.__slots__ + if k + not in ( + "name", + "validator", + "default", + "type", + "inherited", + ) # exclude methods and deprecated alias + } + return cls( + name=name, + validator=ca._validator, + default=ca._default, + type=type, + cmp=None, + inherited=False, + **inst_dict + ) + + @property + def cmp(self): + """ + Simulate the presence of a cmp attribute and warn. + """ + warnings.warn(_CMP_DEPRECATION, DeprecationWarning, stacklevel=2) + + return self.eq and self.order + + # Don't use attr.evolve since fields(Attribute) doesn't work + def evolve(self, **changes): + """ + Copy *self* and apply *changes*. + + This works similarly to `attr.evolve` but that function does not work + with ``Attribute``. + + It is mainly meant to be used for `transform-fields`. + + .. versionadded:: 20.3.0 + """ + new = copy.copy(self) + + new._setattrs(changes.items()) + + return new + + # Don't use _add_pickle since fields(Attribute) doesn't work + def __getstate__(self): + """ + Play nice with pickle. 
+ """ + return tuple( + getattr(self, name) if name != "metadata" else dict(self.metadata) + for name in self.__slots__ + ) + + def __setstate__(self, state): + """ + Play nice with pickle. + """ + self._setattrs(zip(self.__slots__, state)) + + def _setattrs(self, name_values_pairs): + bound_setattr = _obj_setattr.__get__(self, Attribute) + for name, value in name_values_pairs: + if name != "metadata": + bound_setattr(name, value) + else: + bound_setattr( + name, + metadata_proxy(value) + if value + else _empty_metadata_singleton, + ) + + +_a = [ + Attribute( + name=name, + default=NOTHING, + validator=None, + repr=True, + cmp=None, + eq=True, + order=False, + hash=(name != "metadata"), + init=True, + inherited=False, + ) + for name in Attribute.__slots__ +] + +Attribute = _add_hash( + _add_eq( + _add_repr(Attribute, attrs=_a), + attrs=[a for a in _a if a.name != "inherited"], + ), + attrs=[a for a in _a if a.hash and a.name != "inherited"], +) + + +class _CountingAttr(object): + """ + Intermediate representation of attributes that uses a counter to preserve + the order in which the attributes have been defined. + + *Internal* data structure of the attrs library. Running into is most + likely the result of a bug like a forgotten `@attr.s` decorator. + """ + + __slots__ = ( + "counter", + "_default", + "repr", + "eq", + "eq_key", + "order", + "order_key", + "hash", + "init", + "metadata", + "_validator", + "converter", + "type", + "kw_only", + "on_setattr", + ) + __attrs_attrs__ = tuple( + Attribute( + name=name, + default=NOTHING, + validator=None, + repr=True, + cmp=None, + hash=True, + init=True, + kw_only=False, + eq=True, + eq_key=None, + order=False, + order_key=None, + inherited=False, + on_setattr=None, + ) + for name in ( + "counter", + "_default", + "repr", + "eq", + "order", + "hash", + "init", + "on_setattr", + ) + ) + ( + Attribute( + name="metadata", + default=None, + validator=None, + repr=True, + cmp=None, + hash=False, + init=True, + kw_only=False, + eq=True, + eq_key=None, + order=False, + order_key=None, + inherited=False, + on_setattr=None, + ), + ) + cls_counter = 0 + + def __init__( + self, + default, + validator, + repr, + cmp, + hash, + init, + converter, + metadata, + type, + kw_only, + eq, + eq_key, + order, + order_key, + on_setattr, + ): + _CountingAttr.cls_counter += 1 + self.counter = _CountingAttr.cls_counter + self._default = default + self._validator = validator + self.converter = converter + self.repr = repr + self.eq = eq + self.eq_key = eq_key + self.order = order + self.order_key = order_key + self.hash = hash + self.init = init + self.metadata = metadata + self.type = type + self.kw_only = kw_only + self.on_setattr = on_setattr + + def validator(self, meth): + """ + Decorator that adds *meth* to the list of validators. + + Returns *meth* unchanged. + + .. versionadded:: 17.1.0 + """ + if self._validator is None: + self._validator = meth + else: + self._validator = and_(self._validator, meth) + return meth + + def default(self, meth): + """ + Decorator that allows to set the default for an attribute. + + Returns *meth* unchanged. + + :raises DefaultAlreadySetError: If default has been set before. + + .. versionadded:: 17.1.0 + """ + if self._default is not NOTHING: + raise DefaultAlreadySetError() + + self._default = Factory(meth, takes_self=True) + + return meth + + +_CountingAttr = _add_eq(_add_repr(_CountingAttr)) + + +class Factory(object): + """ + Stores a factory callable. 
+ + If passed as the default value to `attrs.field`, the factory is used to + generate a new value. + + :param callable factory: A callable that takes either none or exactly one + mandatory positional argument depending on *takes_self*. + :param bool takes_self: Pass the partially initialized instance that is + being initialized as a positional argument. + + .. versionadded:: 17.1.0 *takes_self* + """ + + __slots__ = ("factory", "takes_self") + + def __init__(self, factory, takes_self=False): + """ + `Factory` is part of the default machinery so if we want a default + value here, we have to implement it ourselves. + """ + self.factory = factory + self.takes_self = takes_self + + def __getstate__(self): + """ + Play nice with pickle. + """ + return tuple(getattr(self, name) for name in self.__slots__) + + def __setstate__(self, state): + """ + Play nice with pickle. + """ + for name, value in zip(self.__slots__, state): + setattr(self, name, value) + + +_f = [ + Attribute( + name=name, + default=NOTHING, + validator=None, + repr=True, + cmp=None, + eq=True, + order=False, + hash=True, + init=True, + inherited=False, + ) + for name in Factory.__slots__ +] + +Factory = _add_hash(_add_eq(_add_repr(Factory, attrs=_f), attrs=_f), attrs=_f) + + +def make_class(name, attrs, bases=(object,), **attributes_arguments): + """ + A quick way to create a new class called *name* with *attrs*. + + :param str name: The name for the new class. + + :param attrs: A list of names or a dictionary of mappings of names to + attributes. + + If *attrs* is a list or an ordered dict (`dict` on Python 3.6+, + `collections.OrderedDict` otherwise), the order is deduced from + the order of the names or attributes inside *attrs*. Otherwise the + order of the definition of the attributes is used. + :type attrs: `list` or `dict` + + :param tuple bases: Classes that the new class will subclass. + + :param attributes_arguments: Passed unmodified to `attr.s`. + + :return: A new class with *attrs*. + :rtype: type + + .. versionadded:: 17.1.0 *bases* + .. versionchanged:: 18.1.0 If *attrs* is ordered, the order is retained. + """ + if isinstance(attrs, dict): + cls_dict = attrs + elif isinstance(attrs, (list, tuple)): + cls_dict = dict((a, attrib()) for a in attrs) + else: + raise TypeError("attrs argument must be a dict or a list.") + + pre_init = cls_dict.pop("__attrs_pre_init__", None) + post_init = cls_dict.pop("__attrs_post_init__", None) + user_init = cls_dict.pop("__init__", None) + + body = {} + if pre_init is not None: + body["__attrs_pre_init__"] = pre_init + if post_init is not None: + body["__attrs_post_init__"] = post_init + if user_init is not None: + body["__init__"] = user_init + + type_ = new_class(name, bases, {}, lambda ns: ns.update(body)) + + # For pickling to work, the __module__ variable needs to be set to the + # frame where the class is created. Bypass this step in environments where + # sys._getframe is not defined (Jython for example) or sys._getframe is not + # defined for arguments greater than 0 (IronPython). + try: + type_.__module__ = sys._getframe(1).f_globals.get( + "__name__", "__main__" + ) + except (AttributeError, ValueError): + pass + + # We do it here for proper warnings with meaningful stacklevel. 
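
# A minimal usage sketch for make_class above (editor's aside, not part of
# the vendored attrs source); `C1` and `C2` are hypothetical names:
#
#     import attr
#
#     C1 = attr.make_class("C1", ["x", "y"])
#     C2 = attr.make_class("C2", {"x": attr.ib(default=42)})
#     assert C1(1, 2).y == 2
#     assert C2().x == 42
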
+ cmp = attributes_arguments.pop("cmp", None) + ( + attributes_arguments["eq"], + attributes_arguments["order"], + ) = _determine_attrs_eq_order( + cmp, + attributes_arguments.get("eq"), + attributes_arguments.get("order"), + True, + ) + + return _attrs(these=cls_dict, **attributes_arguments)(type_) + + +# These are required by within this module so we define them here and merely +# import into .validators / .converters. + + +@attrs(slots=True, hash=True) +class _AndValidator(object): + """ + Compose many validators to a single one. + """ + + _validators = attrib() + + def __call__(self, inst, attr, value): + for v in self._validators: + v(inst, attr, value) + + +def and_(*validators): + """ + A validator that composes multiple validators into one. + + When called on a value, it runs all wrapped validators. + + :param callables validators: Arbitrary number of validators. + + .. versionadded:: 17.1.0 + """ + vals = [] + for validator in validators: + vals.extend( + validator._validators + if isinstance(validator, _AndValidator) + else [validator] + ) + + return _AndValidator(tuple(vals)) + + +def pipe(*converters): + """ + A converter that composes multiple converters into one. + + When called on a value, it runs all wrapped converters, returning the + *last* value. + + Type annotations will be inferred from the wrapped converters', if + they have any. + + :param callables converters: Arbitrary number of converters. + + .. versionadded:: 20.1.0 + """ + + def pipe_converter(val): + for converter in converters: + val = converter(val) + + return val + + if not PY2: + if not converters: + # If the converter list is empty, pipe_converter is the identity. + A = typing.TypeVar("A") + pipe_converter.__annotations__ = {"val": A, "return": A} + else: + # Get parameter type. + sig = None + try: + sig = inspect.signature(converters[0]) + except (ValueError, TypeError): # inspect failed + pass + if sig: + params = list(sig.parameters.values()) + if ( + params + and params[0].annotation is not inspect.Parameter.empty + ): + pipe_converter.__annotations__["val"] = params[ + 0 + ].annotation + # Get return type. + sig = None + try: + sig = inspect.signature(converters[-1]) + except (ValueError, TypeError): # inspect failed + pass + if sig and sig.return_annotation is not inspect.Signature().empty: + pipe_converter.__annotations__[ + "return" + ] = sig.return_annotation + + return pipe_converter diff --git a/openpype/vendor/python/python_2/attr/_next_gen.py b/openpype/vendor/python/python_2/attr/_next_gen.py new file mode 100644 index 0000000000..068253688c --- /dev/null +++ b/openpype/vendor/python/python_2/attr/_next_gen.py @@ -0,0 +1,216 @@ +# SPDX-License-Identifier: MIT + +""" +These are Python 3.6+-only and keyword-only APIs that call `attr.s` and +`attr.ib` with different default values. +""" + + +from functools import partial + +from . import setters +from ._funcs import asdict as _asdict +from ._funcs import astuple as _astuple +from ._make import ( + NOTHING, + _frozen_setattrs, + _ng_default_on_setattr, + attrib, + attrs, +) +from .exceptions import UnannotatedAttributeError + + +def define( + maybe_cls=None, + *, + these=None, + repr=None, + hash=None, + init=None, + slots=True, + frozen=False, + weakref_slot=True, + str=False, + auto_attribs=None, + kw_only=False, + cache_hash=False, + auto_exc=True, + eq=None, + order=False, + auto_detect=True, + getstate_setstate=None, + on_setattr=None, + field_transformer=None, + match_args=True, +): + r""" + Define an ``attrs`` class. 
+ + Differences to the classic `attr.s` that it uses underneath: + + - Automatically detect whether or not *auto_attribs* should be `True` + (c.f. *auto_attribs* parameter). + - If *frozen* is `False`, run converters and validators when setting an + attribute by default. + - *slots=True* (see :term:`slotted classes` for potentially surprising + behaviors) + - *auto_exc=True* + - *auto_detect=True* + - *order=False* + - *match_args=True* + - Some options that were only relevant on Python 2 or were kept around for + backwards-compatibility have been removed. + + Please note that these are all defaults and you can change them as you + wish. + + :param Optional[bool] auto_attribs: If set to `True` or `False`, it behaves + exactly like `attr.s`. If left `None`, `attr.s` will try to guess: + + 1. If any attributes are annotated and no unannotated `attrs.fields`\ s + are found, it assumes *auto_attribs=True*. + 2. Otherwise it assumes *auto_attribs=False* and tries to collect + `attrs.fields`\ s. + + For now, please refer to `attr.s` for the rest of the parameters. + + .. versionadded:: 20.1.0 + .. versionchanged:: 21.3.0 Converters are also run ``on_setattr``. + """ + + def do_it(cls, auto_attribs): + return attrs( + maybe_cls=cls, + these=these, + repr=repr, + hash=hash, + init=init, + slots=slots, + frozen=frozen, + weakref_slot=weakref_slot, + str=str, + auto_attribs=auto_attribs, + kw_only=kw_only, + cache_hash=cache_hash, + auto_exc=auto_exc, + eq=eq, + order=order, + auto_detect=auto_detect, + collect_by_mro=True, + getstate_setstate=getstate_setstate, + on_setattr=on_setattr, + field_transformer=field_transformer, + match_args=match_args, + ) + + def wrap(cls): + """ + Making this a wrapper ensures this code runs during class creation. + + We also ensure that frozen-ness of classes is inherited. + """ + nonlocal frozen, on_setattr + + had_on_setattr = on_setattr not in (None, setters.NO_OP) + + # By default, mutable classes convert & validate on setattr. + if frozen is False and on_setattr is None: + on_setattr = _ng_default_on_setattr + + # However, if we subclass a frozen class, we inherit the immutability + # and disable on_setattr. + for base_cls in cls.__bases__: + if base_cls.__setattr__ is _frozen_setattrs: + if had_on_setattr: + raise ValueError( + "Frozen classes can't use on_setattr " + "(frozen-ness was inherited)." + ) + + on_setattr = setters.NO_OP + break + + if auto_attribs is not None: + return do_it(cls, auto_attribs) + + try: + return do_it(cls, True) + except UnannotatedAttributeError: + return do_it(cls, False) + + # maybe_cls's type depends on the usage of the decorator. It's a class + # if it's used as `@attrs` but ``None`` if used as `@attrs()`. + if maybe_cls is None: + return wrap + else: + return wrap(maybe_cls) + + +mutable = define +frozen = partial(define, frozen=True, on_setattr=None) + + +def field( + *, + default=NOTHING, + validator=None, + repr=True, + hash=None, + init=True, + metadata=None, + converter=None, + factory=None, + kw_only=False, + eq=None, + order=None, + on_setattr=None, +): + """ + Identical to `attr.ib`, except keyword-only and with some arguments + removed. + + .. 
versionadded:: 20.1.0 + """ + return attrib( + default=default, + validator=validator, + repr=repr, + hash=hash, + init=init, + metadata=metadata, + converter=converter, + factory=factory, + kw_only=kw_only, + eq=eq, + order=order, + on_setattr=on_setattr, + ) + + +def asdict(inst, *, recurse=True, filter=None, value_serializer=None): + """ + Same as `attr.asdict`, except that collections types are always retained + and dict is always used as *dict_factory*. + + .. versionadded:: 21.3.0 + """ + return _asdict( + inst=inst, + recurse=recurse, + filter=filter, + value_serializer=value_serializer, + retain_collection_types=True, + ) + + +def astuple(inst, *, recurse=True, filter=None): + """ + Same as `attr.astuple`, except that collections types are always retained + and `tuple` is always used as the *tuple_factory*. + + .. versionadded:: 21.3.0 + """ + return _astuple( + inst=inst, recurse=recurse, filter=filter, retain_collection_types=True + ) diff --git a/openpype/vendor/python/python_2/attr/_version_info.py b/openpype/vendor/python/python_2/attr/_version_info.py new file mode 100644 index 0000000000..cdaeec37a1 --- /dev/null +++ b/openpype/vendor/python/python_2/attr/_version_info.py @@ -0,0 +1,87 @@ +# SPDX-License-Identifier: MIT + +from __future__ import absolute_import, division, print_function + +from functools import total_ordering + +from ._funcs import astuple +from ._make import attrib, attrs + + +@total_ordering +@attrs(eq=False, order=False, slots=True, frozen=True) +class VersionInfo(object): + """ + A version object that can be compared to tuple of length 1--4: + + >>> attr.VersionInfo(19, 1, 0, "final") <= (19, 2) + True + >>> attr.VersionInfo(19, 1, 0, "final") < (19, 1, 1) + True + >>> vi = attr.VersionInfo(19, 2, 0, "final") + >>> vi < (19, 1, 1) + False + >>> vi < (19,) + False + >>> vi == (19, 2,) + True + >>> vi == (19, 2, 1) + False + + .. versionadded:: 19.2 + """ + + year = attrib(type=int) + minor = attrib(type=int) + micro = attrib(type=int) + releaselevel = attrib(type=str) + + @classmethod + def _from_version_string(cls, s): + """ + Parse *s* and return a _VersionInfo. + """ + v = s.split(".") + if len(v) == 3: + v.append("final") + + return cls( + year=int(v[0]), minor=int(v[1]), micro=int(v[2]), releaselevel=v[3] + ) + + def _ensure_tuple(self, other): + """ + Ensure *other* is a tuple of a valid length. + + Returns a possibly transformed *other* and ourselves as a tuple of + the same length as *other*. + """ + + if self.__class__ is other.__class__: + other = astuple(other) + + if not isinstance(other, tuple): + raise NotImplementedError + + if not (1 <= len(other) <= 4): + raise NotImplementedError + + return astuple(self)[: len(other)], other + + def __eq__(self, other): + try: + us, them = self._ensure_tuple(other) + except NotImplementedError: + return NotImplemented + + return us == them + + def __lt__(self, other): + try: + us, them = self._ensure_tuple(other) + except NotImplementedError: + return NotImplemented + + # Since alphabetically "dev0" < "final" < "post1" < "post2", we don't + # have to do anything special with releaselevel for now. + return us < them diff --git a/openpype/vendor/python/python_2/attr/_version_info.pyi b/openpype/vendor/python/python_2/attr/_version_info.pyi new file mode 100644 index 0000000000..45ced08633 --- /dev/null +++ b/openpype/vendor/python/python_2/attr/_version_info.pyi @@ -0,0 +1,9 @@ +class VersionInfo: + @property + def year(self) -> int: ... + @property + def minor(self) -> int: ... 
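# Hypothetical sketch of the next-gen define()/field() API above, combined
# with the pipe() converter from _make.py (Python 3.6+ only, per the module
# docstring). `Env` and `port` are illustrative names.
from attr import converters, define, field

@define
class Env:
    # pipe() chains converters left to right: strip the string, then int().
    port: int = field(converter=converters.pipe(str.strip, int))

env = Env(" 8080 ")
assert env.port == 8080
# Mutable define()d classes also convert on assignment by default
# (_ng_default_on_setattr), so this re-runs the converter chain:
env.port = "9090 "
assert env.port == 9090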
+ @property + def micro(self) -> int: ... + @property + def releaselevel(self) -> str: ... diff --git a/openpype/vendor/python/python_2/attr/converters.py b/openpype/vendor/python/python_2/attr/converters.py new file mode 100644 index 0000000000..1fb6c05d7b --- /dev/null +++ b/openpype/vendor/python/python_2/attr/converters.py @@ -0,0 +1,155 @@ +# SPDX-License-Identifier: MIT + +""" +Commonly useful converters. +""" + +from __future__ import absolute_import, division, print_function + +from ._compat import PY2 +from ._make import NOTHING, Factory, pipe + + +if not PY2: + import inspect + import typing + + +__all__ = [ + "default_if_none", + "optional", + "pipe", + "to_bool", +] + + +def optional(converter): + """ + A converter that allows an attribute to be optional. An optional attribute + is one which can be set to ``None``. + + Type annotations will be inferred from the wrapped converter's, if it + has any. + + :param callable converter: the converter that is used for non-``None`` + values. + + .. versionadded:: 17.1.0 + """ + + def optional_converter(val): + if val is None: + return None + return converter(val) + + if not PY2: + sig = None + try: + sig = inspect.signature(converter) + except (ValueError, TypeError): # inspect failed + pass + if sig: + params = list(sig.parameters.values()) + if params and params[0].annotation is not inspect.Parameter.empty: + optional_converter.__annotations__["val"] = typing.Optional[ + params[0].annotation + ] + if sig.return_annotation is not inspect.Signature.empty: + optional_converter.__annotations__["return"] = typing.Optional[ + sig.return_annotation + ] + + return optional_converter + + +def default_if_none(default=NOTHING, factory=None): + """ + A converter that allows to replace ``None`` values by *default* or the + result of *factory*. + + :param default: Value to be used if ``None`` is passed. Passing an instance + of `attrs.Factory` is supported, however the ``takes_self`` option + is *not*. + :param callable factory: A callable that takes no parameters whose result + is used if ``None`` is passed. + + :raises TypeError: If **neither** *default* or *factory* is passed. + :raises TypeError: If **both** *default* and *factory* are passed. + :raises ValueError: If an instance of `attrs.Factory` is passed with + ``takes_self=True``. + + .. versionadded:: 18.2.0 + """ + if default is NOTHING and factory is None: + raise TypeError("Must pass either `default` or `factory`.") + + if default is not NOTHING and factory is not None: + raise TypeError( + "Must pass either `default` or `factory` but not both." + ) + + if factory is not None: + default = Factory(factory) + + if isinstance(default, Factory): + if default.takes_self: + raise ValueError( + "`takes_self` is not supported by default_if_none." + ) + + def default_if_none_converter(val): + if val is not None: + return val + + return default.factory() + + else: + + def default_if_none_converter(val): + if val is not None: + return val + + return default + + return default_if_none_converter + + +def to_bool(val): + """ + Convert "boolean" strings (e.g., from env. vars.) to real booleans. + + Values mapping to :code:`True`: + + - :code:`True` + - :code:`"true"` / :code:`"t"` + - :code:`"yes"` / :code:`"y"` + - :code:`"on"` + - :code:`"1"` + - :code:`1` + + Values mapping to :code:`False`: + + - :code:`False` + - :code:`"false"` / :code:`"f"` + - :code:`"no"` / :code:`"n"` + - :code:`"off"` + - :code:`"0"` + - :code:`0` + + :raises ValueError: for any other value. + + .. 
versionadded:: 21.3.0 + """ + if isinstance(val, str): + val = val.lower() + truthy = {True, "true", "t", "yes", "y", "on", "1", 1} + falsy = {False, "false", "f", "no", "n", "off", "0", 0} + try: + if val in truthy: + return True + if val in falsy: + return False + except TypeError: + # Raised when "val" is not hashable (e.g., lists) + pass + raise ValueError("Cannot convert value to bool: {}".format(val)) diff --git a/openpype/vendor/python/python_2/attr/converters.pyi b/openpype/vendor/python/python_2/attr/converters.pyi new file mode 100644 index 0000000000..0f58088a37 --- /dev/null +++ b/openpype/vendor/python/python_2/attr/converters.pyi @@ -0,0 +1,13 @@ +from typing import Callable, Optional, TypeVar, overload + +from . import _ConverterType + +_T = TypeVar("_T") + +def pipe(*validators: _ConverterType) -> _ConverterType: ... +def optional(converter: _ConverterType) -> _ConverterType: ... +@overload +def default_if_none(default: _T) -> _ConverterType: ... +@overload +def default_if_none(*, factory: Callable[[], _T]) -> _ConverterType: ... +def to_bool(val: str) -> bool: ... diff --git a/openpype/vendor/python/python_2/attr/exceptions.py b/openpype/vendor/python/python_2/attr/exceptions.py new file mode 100644 index 0000000000..b2f1edc32a --- /dev/null +++ b/openpype/vendor/python/python_2/attr/exceptions.py @@ -0,0 +1,94 @@ +# SPDX-License-Identifier: MIT + +from __future__ import absolute_import, division, print_function + + +class FrozenError(AttributeError): + """ + A frozen/immutable instance or attribute have been attempted to be + modified. + + It mirrors the behavior of ``namedtuples`` by using the same error message + and subclassing `AttributeError`. + + .. versionadded:: 20.1.0 + """ + + msg = "can't set attribute" + args = [msg] + + +class FrozenInstanceError(FrozenError): + """ + A frozen instance has been attempted to be modified. + + .. versionadded:: 16.1.0 + """ + + +class FrozenAttributeError(FrozenError): + """ + A frozen attribute has been attempted to be modified. + + .. versionadded:: 20.1.0 + """ + + +class AttrsAttributeNotFoundError(ValueError): + """ + An ``attrs`` function couldn't find an attribute that the user asked for. + + .. versionadded:: 16.2.0 + """ + + +class NotAnAttrsClassError(ValueError): + """ + A non-``attrs`` class has been passed into an ``attrs`` function. + + .. versionadded:: 16.2.0 + """ + + +class DefaultAlreadySetError(RuntimeError): + """ + A default has been set using ``attr.ib()`` and is attempted to be reset + using the decorator. + + .. versionadded:: 17.1.0 + """ + + +class UnannotatedAttributeError(RuntimeError): + """ + A class with ``auto_attribs=True`` has an ``attr.ib()`` without a type + annotation. + + .. versionadded:: 17.3.0 + """ + + +class PythonTooOldError(RuntimeError): + """ + It was attempted to use an ``attrs`` feature that requires a newer Python + version. + + .. versionadded:: 18.2.0 + """ + + +class NotCallableError(TypeError): + """ + A ``attr.ib()`` requiring a callable has been set with a value + that is not callable. + + .. 
versionadded:: 19.2.0 + """ + + def __init__(self, msg, value): + super(TypeError, self).__init__(msg, value) + self.msg = msg + self.value = value + + def __str__(self): + return str(self.msg) diff --git a/openpype/vendor/python/python_2/attr/exceptions.pyi b/openpype/vendor/python/python_2/attr/exceptions.pyi new file mode 100644 index 0000000000..f2680118b4 --- /dev/null +++ b/openpype/vendor/python/python_2/attr/exceptions.pyi @@ -0,0 +1,17 @@ +from typing import Any + +class FrozenError(AttributeError): + msg: str = ... + +class FrozenInstanceError(FrozenError): ... +class FrozenAttributeError(FrozenError): ... +class AttrsAttributeNotFoundError(ValueError): ... +class NotAnAttrsClassError(ValueError): ... +class DefaultAlreadySetError(RuntimeError): ... +class UnannotatedAttributeError(RuntimeError): ... +class PythonTooOldError(RuntimeError): ... + +class NotCallableError(TypeError): + msg: str = ... + value: Any = ... + def __init__(self, msg: str, value: Any) -> None: ... diff --git a/openpype/vendor/python/python_2/attr/filters.py b/openpype/vendor/python/python_2/attr/filters.py new file mode 100644 index 0000000000..a1978a8775 --- /dev/null +++ b/openpype/vendor/python/python_2/attr/filters.py @@ -0,0 +1,54 @@ +# SPDX-License-Identifier: MIT + +""" +Commonly useful filters for `attr.asdict`. +""" + +from __future__ import absolute_import, division, print_function + +from ._compat import isclass +from ._make import Attribute + + +def _split_what(what): + """ + Returns a tuple of `frozenset`s of classes and attributes. + """ + return ( + frozenset(cls for cls in what if isclass(cls)), + frozenset(cls for cls in what if isinstance(cls, Attribute)), + ) + + +def include(*what): + """ + Include *what*. + + :param what: What to include. + :type what: `list` of `type` or `attrs.Attribute`\\ s + + :rtype: `callable` + """ + cls, attrs = _split_what(what) + + def include_(attribute, value): + return value.__class__ in cls or attribute in attrs + + return include_ + + +def exclude(*what): + """ + Exclude *what*. + + :param what: What to exclude. + :type what: `list` of classes or `attrs.Attribute`\\ s. + + :rtype: `callable` + """ + cls, attrs = _split_what(what) + + def exclude_(attribute, value): + return value.__class__ not in cls and attribute not in attrs + + return exclude_ diff --git a/openpype/vendor/python/python_2/attr/filters.pyi b/openpype/vendor/python/python_2/attr/filters.pyi new file mode 100644 index 0000000000..993866865e --- /dev/null +++ b/openpype/vendor/python/python_2/attr/filters.pyi @@ -0,0 +1,6 @@ +from typing import Any, Union + +from . import Attribute, _FilterType + +def include(*what: Union[type, Attribute[Any]]) -> _FilterType[Any]: ... +def exclude(*what: Union[type, Attribute[Any]]) -> _FilterType[Any]: ... diff --git a/openpype/vendor/python/python_2/attr/py.typed b/openpype/vendor/python/python_2/attr/py.typed new file mode 100644 index 0000000000..e69de29bb2 diff --git a/openpype/vendor/python/python_2/attr/setters.py b/openpype/vendor/python/python_2/attr/setters.py new file mode 100644 index 0000000000..b1cbb5d83e --- /dev/null +++ b/openpype/vendor/python/python_2/attr/setters.py @@ -0,0 +1,79 @@ +# SPDX-License-Identifier: MIT + +""" +Commonly used hooks for on_setattr. +""" + +from __future__ import absolute_import, division, print_function + +from . import _config +from .exceptions import FrozenAttributeError + + +def pipe(*setters): + """ + Run all *setters* and return the return value of the last one. + + .. 
versionadded:: 20.1.0 + """ + + def wrapped_pipe(instance, attrib, new_value): + rv = new_value + + for setter in setters: + rv = setter(instance, attrib, rv) + + return rv + + return wrapped_pipe + + +def frozen(_, __, ___): + """ + Prevent an attribute to be modified. + + .. versionadded:: 20.1.0 + """ + raise FrozenAttributeError() + + +def validate(instance, attrib, new_value): + """ + Run *attrib*'s validator on *new_value* if it has one. + + .. versionadded:: 20.1.0 + """ + if _config._run_validators is False: + return new_value + + v = attrib.validator + if not v: + return new_value + + v(instance, attrib, new_value) + + return new_value + + +def convert(instance, attrib, new_value): + """ + Run *attrib*'s converter -- if it has one -- on *new_value* and return the + result. + + .. versionadded:: 20.1.0 + """ + c = attrib.converter + if c: + return c(new_value) + + return new_value + + +NO_OP = object() +""" +Sentinel for disabling class-wide *on_setattr* hooks for certain attributes. + +Does not work in `pipe` or within lists. + +.. versionadded:: 20.1.0 +""" diff --git a/openpype/vendor/python/python_2/attr/setters.pyi b/openpype/vendor/python/python_2/attr/setters.pyi new file mode 100644 index 0000000000..3f5603c2b0 --- /dev/null +++ b/openpype/vendor/python/python_2/attr/setters.pyi @@ -0,0 +1,19 @@ +from typing import Any, NewType, NoReturn, TypeVar, cast + +from . import Attribute, _OnSetAttrType + +_T = TypeVar("_T") + +def frozen( + instance: Any, attribute: Attribute[Any], new_value: Any +) -> NoReturn: ... +def pipe(*setters: _OnSetAttrType) -> _OnSetAttrType: ... +def validate(instance: Any, attribute: Attribute[_T], new_value: _T) -> _T: ... + +# convert is allowed to return Any, because they can be chained using pipe. +def convert( + instance: Any, attribute: Attribute[Any], new_value: Any +) -> Any: ... + +_NoOpType = NewType("_NoOpType", object) +NO_OP: _NoOpType diff --git a/openpype/vendor/python/python_2/attr/validators.py b/openpype/vendor/python/python_2/attr/validators.py new file mode 100644 index 0000000000..0b0c8342f2 --- /dev/null +++ b/openpype/vendor/python/python_2/attr/validators.py @@ -0,0 +1,561 @@ +# SPDX-License-Identifier: MIT + +""" +Commonly useful validators. +""" + +from __future__ import absolute_import, division, print_function + +import operator +import re + +from contextlib import contextmanager + +from ._config import get_run_validators, set_run_validators +from ._make import _AndValidator, and_, attrib, attrs +from .exceptions import NotCallableError + + +try: + Pattern = re.Pattern +except AttributeError: # Python <3.7 lacks a Pattern type. + Pattern = type(re.compile("")) + + +__all__ = [ + "and_", + "deep_iterable", + "deep_mapping", + "disabled", + "ge", + "get_disabled", + "gt", + "in_", + "instance_of", + "is_callable", + "le", + "lt", + "matches_re", + "max_len", + "optional", + "provides", + "set_disabled", +] + + +def set_disabled(disabled): + """ + Globally disable or enable running validators. + + By default, they are run. + + :param disabled: If ``True``, disable running all validators. + :type disabled: bool + + .. warning:: + + This function is not thread-safe! + + .. versionadded:: 21.3.0 + """ + set_run_validators(not disabled) + + +def get_disabled(): + """ + Return a bool indicating whether validators are currently disabled or not. + + :return: ``True`` if validators are currently disabled. + :rtype: bool + + .. 
versionadded:: 21.3.0 + """ + return not get_run_validators() + + +@contextmanager +def disabled(): + """ + Context manager that disables running validators within its context. + + .. warning:: + + This context manager is not thread-safe! + + .. versionadded:: 21.3.0 + """ + set_run_validators(False) + try: + yield + finally: + set_run_validators(True) + + +@attrs(repr=False, slots=True, hash=True) +class _InstanceOfValidator(object): + type = attrib() + + def __call__(self, inst, attr, value): + """ + We use a callable class to be able to change the ``__repr__``. + """ + if not isinstance(value, self.type): + raise TypeError( + "'{name}' must be {type!r} (got {value!r} that is a " + "{actual!r}).".format( + name=attr.name, + type=self.type, + actual=value.__class__, + value=value, + ), + attr, + self.type, + value, + ) + + def __repr__(self): + return "".format( + type=self.type + ) + + +def instance_of(type): + """ + A validator that raises a `TypeError` if the initializer is called + with a wrong type for this particular attribute (checks are performed using + `isinstance` therefore it's also valid to pass a tuple of types). + + :param type: The type to check for. + :type type: type or tuple of types + + :raises TypeError: With a human readable error message, the attribute + (of type `attrs.Attribute`), the expected type, and the value it + got. + """ + return _InstanceOfValidator(type) + + +@attrs(repr=False, frozen=True, slots=True) +class _MatchesReValidator(object): + pattern = attrib() + match_func = attrib() + + def __call__(self, inst, attr, value): + """ + We use a callable class to be able to change the ``__repr__``. + """ + if not self.match_func(value): + raise ValueError( + "'{name}' must match regex {pattern!r}" + " ({value!r} doesn't)".format( + name=attr.name, pattern=self.pattern.pattern, value=value + ), + attr, + self.pattern, + value, + ) + + def __repr__(self): + return "".format( + pattern=self.pattern + ) + + +def matches_re(regex, flags=0, func=None): + r""" + A validator that raises `ValueError` if the initializer is called + with a string that doesn't match *regex*. + + :param regex: a regex string or precompiled pattern to match against + :param int flags: flags that will be passed to the underlying re function + (default 0) + :param callable func: which underlying `re` function to call (options + are `re.fullmatch`, `re.search`, `re.match`, default + is ``None`` which means either `re.fullmatch` or an emulation of + it on Python 2). For performance reasons, they won't be used directly + but on a pre-`re.compile`\ ed pattern. + + .. versionadded:: 19.2.0 + .. versionchanged:: 21.3.0 *regex* can be a pre-compiled pattern. 
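# Hypothetical sketch of the two validators above; `Handle` and its fields
# are illustrative names.
import attr
from attr import validators

@attr.s
class Handle(object):
    name = attr.ib(validator=validators.matches_re(r"[a-z]+[0-9]*"))
    count = attr.ib(validator=validators.instance_of(int))

Handle("render1", 3)      # passes: fullmatch by default, int instance
# Handle("Render1", 3)    # ValueError: 'name' must match regex ...
# Handle("render1", "3")  # TypeError from instance_of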
+ """ + fullmatch = getattr(re, "fullmatch", None) + valid_funcs = (fullmatch, None, re.search, re.match) + if func not in valid_funcs: + raise ValueError( + "'func' must be one of {}.".format( + ", ".join( + sorted( + e and e.__name__ or "None" for e in set(valid_funcs) + ) + ) + ) + ) + + if isinstance(regex, Pattern): + if flags: + raise TypeError( + "'flags' can only be used with a string pattern; " + "pass flags to re.compile() instead" + ) + pattern = regex + else: + pattern = re.compile(regex, flags) + + if func is re.match: + match_func = pattern.match + elif func is re.search: + match_func = pattern.search + elif fullmatch: + match_func = pattern.fullmatch + else: # Python 2 fullmatch emulation (https://bugs.python.org/issue16203) + pattern = re.compile( + r"(?:{})\Z".format(pattern.pattern), pattern.flags + ) + match_func = pattern.match + + return _MatchesReValidator(pattern, match_func) + + +@attrs(repr=False, slots=True, hash=True) +class _ProvidesValidator(object): + interface = attrib() + + def __call__(self, inst, attr, value): + """ + We use a callable class to be able to change the ``__repr__``. + """ + if not self.interface.providedBy(value): + raise TypeError( + "'{name}' must provide {interface!r} which {value!r} " + "doesn't.".format( + name=attr.name, interface=self.interface, value=value + ), + attr, + self.interface, + value, + ) + + def __repr__(self): + return "".format( + interface=self.interface + ) + + +def provides(interface): + """ + A validator that raises a `TypeError` if the initializer is called + with an object that does not provide the requested *interface* (checks are + performed using ``interface.providedBy(value)`` (see `zope.interface + `_). + + :param interface: The interface to check for. + :type interface: ``zope.interface.Interface`` + + :raises TypeError: With a human readable error message, the attribute + (of type `attrs.Attribute`), the expected interface, and the + value it got. + """ + return _ProvidesValidator(interface) + + +@attrs(repr=False, slots=True, hash=True) +class _OptionalValidator(object): + validator = attrib() + + def __call__(self, inst, attr, value): + if value is None: + return + + self.validator(inst, attr, value) + + def __repr__(self): + return "".format( + what=repr(self.validator) + ) + + +def optional(validator): + """ + A validator that makes an attribute optional. An optional attribute is one + which can be set to ``None`` in addition to satisfying the requirements of + the sub-validator. + + :param validator: A validator (or a list of validators) that is used for + non-``None`` values. + :type validator: callable or `list` of callables. + + .. versionadded:: 15.1.0 + .. versionchanged:: 17.1.0 *validator* can be a list of validators. + """ + if isinstance(validator, list): + return _OptionalValidator(_AndValidator(validator)) + return _OptionalValidator(validator) + + +@attrs(repr=False, slots=True, hash=True) +class _InValidator(object): + options = attrib() + + def __call__(self, inst, attr, value): + try: + in_options = value in self.options + except TypeError: # e.g. `1 in "abc"` + in_options = False + + if not in_options: + raise ValueError( + "'{name}' must be in {options!r} (got {value!r})".format( + name=attr.name, options=self.options, value=value + ) + ) + + def __repr__(self): + return "".format( + options=self.options + ) + + +def in_(options): + """ + A validator that raises a `ValueError` if the initializer is called + with a value that does not belong in the options provided. 
The check is + performed using ``value in options``. + + :param options: Allowed options. + :type options: list, tuple, `enum.Enum`, ... + + :raises ValueError: With a human readable error message, the attribute (of + type `attrs.Attribute`), the expected options, and the value it + got. + + .. versionadded:: 17.1.0 + """ + return _InValidator(options) + + +@attrs(repr=False, slots=False, hash=True) +class _IsCallableValidator(object): + def __call__(self, inst, attr, value): + """ + We use a callable class to be able to change the ``__repr__``. + """ + if not callable(value): + message = ( + "'{name}' must be callable " + "(got {value!r} that is a {actual!r})." + ) + raise NotCallableError( + msg=message.format( + name=attr.name, value=value, actual=value.__class__ + ), + value=value, + ) + + def __repr__(self): + return "" + + +def is_callable(): + """ + A validator that raises a `attr.exceptions.NotCallableError` if the + initializer is called with a value for this particular attribute + that is not callable. + + .. versionadded:: 19.1.0 + + :raises `attr.exceptions.NotCallableError`: With a human readable error + message containing the attribute (`attrs.Attribute`) name, + and the value it got. + """ + return _IsCallableValidator() + + +@attrs(repr=False, slots=True, hash=True) +class _DeepIterable(object): + member_validator = attrib(validator=is_callable()) + iterable_validator = attrib( + default=None, validator=optional(is_callable()) + ) + + def __call__(self, inst, attr, value): + """ + We use a callable class to be able to change the ``__repr__``. + """ + if self.iterable_validator is not None: + self.iterable_validator(inst, attr, value) + + for member in value: + self.member_validator(inst, attr, member) + + def __repr__(self): + iterable_identifier = ( + "" + if self.iterable_validator is None + else " {iterable!r}".format(iterable=self.iterable_validator) + ) + return ( + "" + ).format( + iterable_identifier=iterable_identifier, + member=self.member_validator, + ) + + +def deep_iterable(member_validator, iterable_validator=None): + """ + A validator that performs deep validation of an iterable. + + :param member_validator: Validator to apply to iterable members + :param iterable_validator: Validator to apply to iterable itself + (optional) + + .. versionadded:: 19.1.0 + + :raises TypeError: if any sub-validators fail + """ + return _DeepIterable(member_validator, iterable_validator) + + +@attrs(repr=False, slots=True, hash=True) +class _DeepMapping(object): + key_validator = attrib(validator=is_callable()) + value_validator = attrib(validator=is_callable()) + mapping_validator = attrib(default=None, validator=optional(is_callable())) + + def __call__(self, inst, attr, value): + """ + We use a callable class to be able to change the ``__repr__``. + """ + if self.mapping_validator is not None: + self.mapping_validator(inst, attr, value) + + for key in value: + self.key_validator(inst, attr, key) + self.value_validator(inst, attr, value[key]) + + def __repr__(self): + return ( + "" + ).format(key=self.key_validator, value=self.value_validator) + + +def deep_mapping(key_validator, value_validator, mapping_validator=None): + """ + A validator that performs deep validation of a dictionary. + + :param key_validator: Validator to apply to dictionary keys + :param value_validator: Validator to apply to dictionary values + :param mapping_validator: Validator to apply to top-level mapping + attribute (optional) + + .. 
versionadded:: 19.1.0 + + :raises TypeError: if any sub-validators fail + """ + return _DeepMapping(key_validator, value_validator, mapping_validator) + + +@attrs(repr=False, frozen=True, slots=True) +class _NumberValidator(object): + bound = attrib() + compare_op = attrib() + compare_func = attrib() + + def __call__(self, inst, attr, value): + """ + We use a callable class to be able to change the ``__repr__``. + """ + if not self.compare_func(value, self.bound): + raise ValueError( + "'{name}' must be {op} {bound}: {value}".format( + name=attr.name, + op=self.compare_op, + bound=self.bound, + value=value, + ) + ) + + def __repr__(self): + return "".format( + op=self.compare_op, bound=self.bound + ) + + +def lt(val): + """ + A validator that raises `ValueError` if the initializer is called + with a number larger or equal to *val*. + + :param val: Exclusive upper bound for values + + .. versionadded:: 21.3.0 + """ + return _NumberValidator(val, "<", operator.lt) + + +def le(val): + """ + A validator that raises `ValueError` if the initializer is called + with a number greater than *val*. + + :param val: Inclusive upper bound for values + + .. versionadded:: 21.3.0 + """ + return _NumberValidator(val, "<=", operator.le) + + +def ge(val): + """ + A validator that raises `ValueError` if the initializer is called + with a number smaller than *val*. + + :param val: Inclusive lower bound for values + + .. versionadded:: 21.3.0 + """ + return _NumberValidator(val, ">=", operator.ge) + + +def gt(val): + """ + A validator that raises `ValueError` if the initializer is called + with a number smaller or equal to *val*. + + :param val: Exclusive lower bound for values + + .. versionadded:: 21.3.0 + """ + return _NumberValidator(val, ">", operator.gt) + + +@attrs(repr=False, frozen=True, slots=True) +class _MaxLengthValidator(object): + max_length = attrib() + + def __call__(self, inst, attr, value): + """ + We use a callable class to be able to change the ``__repr__``. + """ + if len(value) > self.max_length: + raise ValueError( + "Length of '{name}' must be <= {max}: {len}".format( + name=attr.name, max=self.max_length, len=len(value) + ) + ) + + def __repr__(self): + return "".format(max=self.max_length) + + +def max_len(length): + """ + A validator that raises `ValueError` if the initializer is called + with a string or iterable that is longer than *length*. + + :param int length: Maximum length of the string or iterable + + .. versionadded:: 21.3.0 + """ + return _MaxLengthValidator(length) diff --git a/openpype/vendor/python/python_2/attr/validators.pyi b/openpype/vendor/python/python_2/attr/validators.pyi new file mode 100644 index 0000000000..5e00b85433 --- /dev/null +++ b/openpype/vendor/python/python_2/attr/validators.pyi @@ -0,0 +1,78 @@ +from typing import ( + Any, + AnyStr, + Callable, + Container, + ContextManager, + Iterable, + List, + Mapping, + Match, + Optional, + Pattern, + Tuple, + Type, + TypeVar, + Union, + overload, +) + +from . import _ValidatorType + +_T = TypeVar("_T") +_T1 = TypeVar("_T1") +_T2 = TypeVar("_T2") +_T3 = TypeVar("_T3") +_I = TypeVar("_I", bound=Iterable) +_K = TypeVar("_K") +_V = TypeVar("_V") +_M = TypeVar("_M", bound=Mapping) + +def set_disabled(run: bool) -> None: ... +def get_disabled() -> bool: ... +def disabled() -> ContextManager[None]: ... + +# To be more precise on instance_of use some overloads. +# If there are more than 3 items in the tuple then we fall back to Any +@overload +def instance_of(type: Type[_T]) -> _ValidatorType[_T]: ... 
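# Hypothetical sketch combining the container and bound validators defined
# above; `Render` and its fields are illustrative names.
import attr
from attr import validators as v

@attr.s
class Render(object):
    fps = attr.ib(validator=v.gt(0))            # exclusive lower bound
    comment = attr.ib(validator=v.max_len(140))  # len(value) <= 140
    layers = attr.ib(
        validator=v.deep_iterable(
            member_validator=v.instance_of(str),
            iterable_validator=v.instance_of(list),
        )
    )

Render(fps=24, comment="first pass", layers=["beauty", "ao"])
# Render(fps=0, ...) would raise ValueError: 'fps' must be > 0: 0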
+@overload +def instance_of(type: Tuple[Type[_T]]) -> _ValidatorType[_T]: ... +@overload +def instance_of( + type: Tuple[Type[_T1], Type[_T2]] +) -> _ValidatorType[Union[_T1, _T2]]: ... +@overload +def instance_of( + type: Tuple[Type[_T1], Type[_T2], Type[_T3]] +) -> _ValidatorType[Union[_T1, _T2, _T3]]: ... +@overload +def instance_of(type: Tuple[type, ...]) -> _ValidatorType[Any]: ... +def provides(interface: Any) -> _ValidatorType[Any]: ... +def optional( + validator: Union[_ValidatorType[_T], List[_ValidatorType[_T]]] +) -> _ValidatorType[Optional[_T]]: ... +def in_(options: Container[_T]) -> _ValidatorType[_T]: ... +def and_(*validators: _ValidatorType[_T]) -> _ValidatorType[_T]: ... +def matches_re( + regex: Union[Pattern[AnyStr], AnyStr], + flags: int = ..., + func: Optional[ + Callable[[AnyStr, AnyStr, int], Optional[Match[AnyStr]]] + ] = ..., +) -> _ValidatorType[AnyStr]: ... +def deep_iterable( + member_validator: _ValidatorType[_T], + iterable_validator: Optional[_ValidatorType[_I]] = ..., +) -> _ValidatorType[_I]: ... +def deep_mapping( + key_validator: _ValidatorType[_K], + value_validator: _ValidatorType[_V], + mapping_validator: Optional[_ValidatorType[_M]] = ..., +) -> _ValidatorType[_M]: ... +def is_callable() -> _ValidatorType[_T]: ... +def lt(val: _T) -> _ValidatorType[_T]: ... +def le(val: _T) -> _ValidatorType[_T]: ... +def ge(val: _T) -> _ValidatorType[_T]: ... +def gt(val: _T) -> _ValidatorType[_T]: ... +def max_len(length: int) -> _ValidatorType[_T]: ... diff --git a/openpype/vendor/python/python_2/attrs/__init__.py b/openpype/vendor/python/python_2/attrs/__init__.py new file mode 100644 index 0000000000..a704b8b56b --- /dev/null +++ b/openpype/vendor/python/python_2/attrs/__init__.py @@ -0,0 +1,70 @@ +# SPDX-License-Identifier: MIT + +from attr import ( + NOTHING, + Attribute, + Factory, + __author__, + __copyright__, + __description__, + __doc__, + __email__, + __license__, + __title__, + __url__, + __version__, + __version_info__, + assoc, + cmp_using, + define, + evolve, + field, + fields, + fields_dict, + frozen, + has, + make_class, + mutable, + resolve_types, + validate, +) +from attr._next_gen import asdict, astuple + +from . import converters, exceptions, filters, setters, validators + + +__all__ = [ + "__author__", + "__copyright__", + "__description__", + "__doc__", + "__email__", + "__license__", + "__title__", + "__url__", + "__version__", + "__version_info__", + "asdict", + "assoc", + "astuple", + "Attribute", + "cmp_using", + "converters", + "define", + "evolve", + "exceptions", + "Factory", + "field", + "fields_dict", + "fields", + "filters", + "frozen", + "has", + "make_class", + "mutable", + "NOTHING", + "resolve_types", + "setters", + "validate", + "validators", +] diff --git a/openpype/vendor/python/python_2/attrs/__init__.pyi b/openpype/vendor/python/python_2/attrs/__init__.pyi new file mode 100644 index 0000000000..7426fa5ddb --- /dev/null +++ b/openpype/vendor/python/python_2/attrs/__init__.pyi @@ -0,0 +1,63 @@ +from typing import ( + Any, + Callable, + Dict, + Mapping, + Optional, + Sequence, + Tuple, + Type, +) + +# Because we need to type our own stuff, we have to make everything from +# attr explicitly public too. 
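# Hypothetical sketch of the new `attrs` package this diff introduces: it
# simply re-exports `attr` with the next-gen names as the defaults. `Point`
# is an illustrative name.
import attrs

@attrs.define
class Point:
    x: int
    y: int

# attrs.asdict is the _next_gen variant: collection types are retained and
# a plain dict is always the factory.
assert attrs.asdict(Point(1, 2)) == {"x": 1, "y": 2}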
+from attr import __author__ as __author__ +from attr import __copyright__ as __copyright__ +from attr import __description__ as __description__ +from attr import __email__ as __email__ +from attr import __license__ as __license__ +from attr import __title__ as __title__ +from attr import __url__ as __url__ +from attr import __version__ as __version__ +from attr import __version_info__ as __version_info__ +from attr import _FilterType +from attr import assoc as assoc +from attr import Attribute as Attribute +from attr import define as define +from attr import evolve as evolve +from attr import Factory as Factory +from attr import exceptions as exceptions +from attr import field as field +from attr import fields as fields +from attr import fields_dict as fields_dict +from attr import frozen as frozen +from attr import has as has +from attr import make_class as make_class +from attr import mutable as mutable +from attr import NOTHING as NOTHING +from attr import resolve_types as resolve_types +from attr import setters as setters +from attr import validate as validate +from attr import validators as validators + +# TODO: see definition of attr.asdict/astuple +def asdict( + inst: Any, + recurse: bool = ..., + filter: Optional[_FilterType[Any]] = ..., + dict_factory: Type[Mapping[Any, Any]] = ..., + retain_collection_types: bool = ..., + value_serializer: Optional[ + Callable[[type, Attribute[Any], Any], Any] + ] = ..., + tuple_keys: bool = ..., +) -> Dict[str, Any]: ... + +# TODO: add support for returning NamedTuple from the mypy plugin +def astuple( + inst: Any, + recurse: bool = ..., + filter: Optional[_FilterType[Any]] = ..., + tuple_factory: Type[Sequence[Any]] = ..., + retain_collection_types: bool = ..., +) -> Tuple[Any, ...]: ... diff --git a/openpype/vendor/python/python_2/attrs/converters.py b/openpype/vendor/python/python_2/attrs/converters.py new file mode 100644 index 0000000000..edfa8d3c16 --- /dev/null +++ b/openpype/vendor/python/python_2/attrs/converters.py @@ -0,0 +1,3 @@ +# SPDX-License-Identifier: MIT + +from attr.converters import * # noqa diff --git a/openpype/vendor/python/python_2/attrs/exceptions.py b/openpype/vendor/python/python_2/attrs/exceptions.py new file mode 100644 index 0000000000..bd9efed202 --- /dev/null +++ b/openpype/vendor/python/python_2/attrs/exceptions.py @@ -0,0 +1,3 @@ +# SPDX-License-Identifier: MIT + +from attr.exceptions import * # noqa diff --git a/openpype/vendor/python/python_2/attrs/filters.py b/openpype/vendor/python/python_2/attrs/filters.py new file mode 100644 index 0000000000..52959005b0 --- /dev/null +++ b/openpype/vendor/python/python_2/attrs/filters.py @@ -0,0 +1,3 @@ +# SPDX-License-Identifier: MIT + +from attr.filters import * # noqa diff --git a/openpype/vendor/python/python_2/attrs/py.typed b/openpype/vendor/python/python_2/attrs/py.typed new file mode 100644 index 0000000000..e69de29bb2 diff --git a/openpype/vendor/python/python_2/attrs/setters.py b/openpype/vendor/python/python_2/attrs/setters.py new file mode 100644 index 0000000000..9b50770804 --- /dev/null +++ b/openpype/vendor/python/python_2/attrs/setters.py @@ -0,0 +1,3 @@ +# SPDX-License-Identifier: MIT + +from attr.setters import * # noqa diff --git a/openpype/vendor/python/python_2/attrs/validators.py b/openpype/vendor/python/python_2/attrs/validators.py new file mode 100644 index 0000000000..ab2c9b3024 --- /dev/null +++ b/openpype/vendor/python/python_2/attrs/validators.py @@ -0,0 +1,3 @@ +# SPDX-License-Identifier: MIT + +from attr.validators import * # noqa diff --git 
a/openpype/version.py b/openpype/version.py index 6ff5dfb7b5..174aca1e6c 100644 --- a/openpype/version.py +++ b/openpype/version.py @@ -1,3 +1,3 @@ # -*- coding: utf-8 -*- """Package declaring Pype version.""" -__version__ = "3.13.1-nightly.2" +__version__ = "3.14.1-nightly.1" diff --git a/poetry.lock b/poetry.lock index 919a352505..21b6bda880 100644 --- a/poetry.lock +++ b/poetry.lock @@ -48,7 +48,7 @@ aiohttp = ">=3,<4" [[package]] name = "aiohttp-middlewares" -version = "2.0.0" +version = "2.1.0" description = "Collection of useful middlewares for aiohttp applications." category = "main" optional = false @@ -114,7 +114,7 @@ python-dateutil = ">=2.7.0" [[package]] name = "astroid" -version = "2.11.5" +version = "2.11.7" description = "An abstract syntax tree for Python with inference support." category = "dev" optional = false @@ -147,7 +147,7 @@ python-versions = ">=3.5" [[package]] name = "atomicwrites" -version = "1.4.0" +version = "1.4.1" description = "Atomic file writes." category = "dev" optional = false @@ -155,17 +155,17 @@ python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*" [[package]] name = "attrs" -version = "21.4.0" +version = "22.1.0" description = "Classes Without Boilerplate" category = "main" optional = false -python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*" +python-versions = ">=3.5" [package.extras] -dev = ["coverage[toml] (>=5.0.2)", "hypothesis", "pympler", "pytest (>=4.3.0)", "six", "mypy", "pytest-mypy-plugins", "zope.interface", "furo", "sphinx", "sphinx-notfound-page", "pre-commit", "cloudpickle"] +dev = ["coverage[toml] (>=5.0.2)", "hypothesis", "pympler", "pytest (>=4.3.0)", "mypy (>=0.900,!=0.940)", "pytest-mypy-plugins", "zope.interface", "furo", "sphinx", "sphinx-notfound-page", "pre-commit", "cloudpickle"] docs = ["furo", "sphinx", "zope.interface", "sphinx-notfound-page"] -tests = ["coverage[toml] (>=5.0.2)", "hypothesis", "pympler", "pytest (>=4.3.0)", "six", "mypy", "pytest-mypy-plugins", "zope.interface", "cloudpickle"] -tests_no_zope = ["coverage[toml] (>=5.0.2)", "hypothesis", "pympler", "pytest (>=4.3.0)", "six", "mypy", "pytest-mypy-plugins", "cloudpickle"] +tests = ["coverage[toml] (>=5.0.2)", "hypothesis", "pympler", "pytest (>=4.3.0)", "mypy (>=0.900,!=0.940)", "pytest-mypy-plugins", "zope.interface", "cloudpickle"] +tests_no_zope = ["coverage[toml] (>=5.0.2)", "hypothesis", "pympler", "pytest (>=4.3.0)", "mypy (>=0.900,!=0.940)", "pytest-mypy-plugins", "cloudpickle"] [[package]] name = "autopep8" @@ -181,11 +181,11 @@ toml = "*" [[package]] name = "babel" -version = "2.9.1" +version = "2.10.3" description = "Internationalization utilities" category = "dev" optional = false -python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*" +python-versions = ">=3.6" [package.dependencies] pytz = ">=2015.7" @@ -236,7 +236,7 @@ python-versions = ">=3.6" [[package]] name = "cffi" -version = "1.15.0" +version = "1.15.1" description = "Foreign Function Interface for Python calling C code." category = "main" optional = false @@ -279,7 +279,7 @@ test = ["pytest-runner (>=2.7,<3)", "pytest (>=2.3.5,<5)", "pytest-cov (>=2,<3)" [[package]] name = "colorama" -version = "0.4.4" +version = "0.4.5" description = "Cross-platform colored terminal text." 
category = "dev" optional = false @@ -306,7 +306,7 @@ python-versions = "*" [[package]] name = "coverage" -version = "6.4.1" +version = "6.4.3" description = "Code coverage measurement for Python" category = "dev" optional = false @@ -320,7 +320,7 @@ toml = ["tomli"] [[package]] name = "cryptography" -version = "37.0.2" +version = "37.0.4" description = "cryptography is a package which provides cryptographic recipes and primitives to Python developers." category = "main" optional = false @@ -408,7 +408,7 @@ python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*" [[package]] name = "dropbox" -version = "11.31.0" +version = "11.33.0" description = "Official Dropbox API Client" category = "main" optional = false @@ -433,7 +433,7 @@ prefixed = ">=0.3.2" [[package]] name = "evdev" -version = "1.5.0" +version = "1.6.0" description = "Bindings to the Linux input handling subsystem" category = "main" optional = false @@ -455,7 +455,7 @@ pyflakes = ">=2.3.0,<2.4.0" [[package]] name = "frozenlist" -version = "1.3.0" +version = "1.3.1" description = "A list-like structure which implements collections.abc.MutableSequence" category = "main" optional = false @@ -490,7 +490,7 @@ python-versions = ">=2.6, !=3.0.*, !=3.1.*, !=3.2.*" [[package]] name = "gazu" -version = "0.8.28" +version = "0.8.30" description = "Gazu is a client for Zou, the API to store the data of your CG production." category = "main" optional = false @@ -530,7 +530,7 @@ typing-extensions = {version = ">=3.7.4.3", markers = "python_version < \"3.8\"" [[package]] name = "google-api-core" -version = "2.8.1" +version = "2.8.2" description = "Google API client core library" category = "main" optional = false @@ -539,13 +539,11 @@ python-versions = ">=3.6" [package.dependencies] google-auth = ">=1.25.0,<3.0dev" googleapis-common-protos = ">=1.56.2,<2.0dev" -protobuf = ">=3.15.0,<4.0.0dev" +protobuf = ">=3.15.0,<5.0.0dev" requests = ">=2.18.0,<3.0.0dev" [package.extras] grpc = ["grpcio (>=1.33.2,<2.0dev)", "grpcio-status (>=1.33.2,<2.0dev)"] -grpcgcp = ["grpcio-gcp (>=0.2.2,<1.0dev)"] -grpcio-gcp = ["grpcio-gcp (>=0.2.2,<1.0dev)"] [[package]] name = "google-api-python-client" @@ -565,7 +563,7 @@ uritemplate = ">=3.0.0,<4dev" [[package]] name = "google-auth" -version = "2.7.0" +version = "2.10.0" description = "Google Authentication Library" category = "main" optional = false @@ -598,14 +596,14 @@ six = "*" [[package]] name = "googleapis-common-protos" -version = "1.56.2" +version = "1.56.4" description = "Common protobufs used in Google APIs" category = "main" optional = false -python-versions = ">=3.6" +python-versions = ">=3.7" [package.dependencies] -protobuf = ">=3.15.0,<4.0.0dev" +protobuf = ">=3.15.0,<5.0.0dev" [package.extras] grpc = ["grpcio (>=1.0.0,<2.0.0dev)"] @@ -631,7 +629,7 @@ python-versions = ">=3.5" [[package]] name = "imagesize" -version = "1.3.0" +version = "1.4.1" description = "Getting image size from png/jpeg/jpeg2000/gif file" category = "dev" optional = false @@ -639,7 +637,7 @@ python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*" [[package]] name = "importlib-metadata" -version = "4.11.4" +version = "4.12.0" description = "Read metadata from Python packages" category = "main" optional = false @@ -652,7 +650,7 @@ zipp = ">=0.5" [package.extras] docs = ["sphinx", "jaraco.packaging (>=9)", "rst.linker (>=1.9)"] perf = ["ipython"] -testing = ["pytest (>=6)", "pytest-checkdocs (>=2.4)", "pytest-flake8", "pytest-cov", "pytest-enabler (>=1.0.1)", "packaging", "pyfakefs", "flufl.flake8", "pytest-perf 
(>=0.9.2)", "pytest-black (>=0.3.7)", "pytest-mypy (>=0.9.1)", "importlib-resources (>=1.3)"] +testing = ["pytest (>=6)", "pytest-checkdocs (>=2.4)", "pytest-flake8", "pytest-cov", "pytest-enabler (>=1.3)", "packaging", "pyfakefs", "flufl.flake8", "pytest-perf (>=0.9.2)", "pytest-black (>=0.3.7)", "pytest-mypy (>=0.9.1)", "importlib-resources (>=1.3)"] [[package]] name = "iniconfig" @@ -692,15 +690,15 @@ testing = ["colorama", "docopt", "pytest (>=3.1.0)"] [[package]] name = "jeepney" -version = "0.7.1" +version = "0.8.0" description = "Low-level, pure Python DBus protocol wrapper." category = "main" optional = false -python-versions = ">=3.6" +python-versions = ">=3.7" [package.extras] -test = ["pytest", "pytest-trio", "pytest-asyncio", "testpath", "trio", "async-timeout"] -trio = ["trio", "async-generator"] +trio = ["async-generator", "trio"] +test = ["async-timeout", "trio", "testpath", "pytest-asyncio (>=0.17)", "pytest-trio", "pytest"] [[package]] name = "jinja2" @@ -875,14 +873,14 @@ six = "*" [[package]] name = "pillow" -version = "9.1.1" +version = "9.2.0" description = "Python Imaging Library (Fork)" category = "main" optional = false python-versions = ">=3.7" [package.extras] -docs = ["olefile", "sphinx (>=2.4)", "sphinx-copybutton", "sphinx-issues (>=3.0.1)", "sphinx-removed-in", "sphinx-rtd-theme (>=1.0)", "sphinxext-opengraph"] +docs = ["furo", "olefile", "sphinx (>=2.4)", "sphinx-copybutton", "sphinx-issues (>=3.0.1)", "sphinx-removed-in", "sphinxext-opengraph"] tests = ["check-manifest", "coverage", "defusedxml", "markdown2", "olefile", "packaging", "pyroma", "pytest", "pytest-cov", "pytest-timeout"] [[package]] @@ -930,11 +928,11 @@ python-versions = "*" [[package]] name = "protobuf" -version = "3.19.4" -description = "Protocol Buffers" +version = "4.21.5" +description = "" category = "main" optional = false -python-versions = ">=3.5" +python-versions = ">=3.7" [[package]] name = "py" @@ -1271,7 +1269,7 @@ python-versions = "*" [[package]] name = "pytz" -version = "2022.1" +version = "2022.2" description = "World timezone definitions, modern and historical" category = "dev" optional = false @@ -1354,7 +1352,7 @@ use_chardet_on_py3 = ["chardet (>=3.0.2,<5)"] [[package]] name = "rsa" -version = "4.8" +version = "4.9" description = "Pure-Python RSA implementation" category = "main" optional = false @@ -1408,7 +1406,7 @@ python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*" [[package]] name = "slack-sdk" -version = "3.17.0" +version = "3.18.1" description = "The Slack API Platform SDK for Python" category = "main" optional = false @@ -1487,9 +1485,9 @@ docutils = "*" sphinx = "*" [package.extras] +test = ["pytest-cov", "pytest (>=3.0.0)"] +lint = ["pylint", "flake8", "black"] dev = ["pre-commit"] -lint = ["black", "flake8", "pylint"] -test = ["pytest (>=3.0.0)", "pytest-cov"] [[package]] name = "sphinx-rtd-theme" @@ -1638,11 +1636,11 @@ python-versions = ">=3.6" [[package]] name = "typing-extensions" -version = "4.0.1" -description = "Backported and Experimental Type Hints for Python 3.6+" +version = "4.3.0" +description = "Backported and Experimental Type Hints for Python 3.7+" category = "main" optional = false -python-versions = ">=3.6" +python-versions = ">=3.7" [[package]] name = "uritemplate" @@ -1654,11 +1652,11 @@ python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*" [[package]] name = "urllib3" -version = "1.26.9" +version = "1.26.11" description = "HTTP library with thread-safe connection pooling, file post, and more." 
category = "main" optional = false -python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, <4" +python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, !=3.5.*, <4" [package.extras] brotli = ["brotlicffi (>=0.8.0)", "brotli (>=1.0.9)", "brotlipy (>=0.6.0)"] @@ -1711,11 +1709,11 @@ ujson = ["ujson"] [[package]] name = "yarl" -version = "1.7.2" +version = "1.8.1" description = "Yet another URL library" category = "main" optional = false -python-versions = ">=3.6" +python-versions = ">=3.7" [package.dependencies] idna = ">=2.0" @@ -1724,20 +1722,20 @@ typing-extensions = {version = ">=3.7.4", markers = "python_version < \"3.8\""} [[package]] name = "zipp" -version = "3.8.0" +version = "3.8.1" description = "Backport of pathlib-compatible object wrapper for zip files" category = "main" optional = false python-versions = ">=3.7" [package.extras] -docs = ["sphinx", "jaraco.packaging (>=9)", "rst.linker (>=1.9)"] -testing = ["pytest (>=6)", "pytest-checkdocs (>=2.4)", "pytest-flake8", "pytest-cov", "pytest-enabler (>=1.0.1)", "jaraco.itertools", "func-timeout", "pytest-black (>=0.3.7)", "pytest-mypy (>=0.9.1)"] +docs = ["sphinx", "jaraco.packaging (>=9)", "rst.linker (>=1.9)", "jaraco.tidelift (>=1.4)"] +testing = ["pytest (>=6)", "pytest-checkdocs (>=2.4)", "pytest-flake8", "pytest-cov", "pytest-enabler (>=1.3)", "jaraco.itertools", "func-timeout", "pytest-black (>=0.3.7)", "pytest-mypy (>=0.9.1)"] [metadata] lock-version = "1.1" python-versions = "3.7.*" -content-hash = "bd8e0a03668c380c6e76c8cd6c71020692f4ea9f32de7a4f09433564faa9dad0" +content-hash = "de7422afb6aed02f75e1696afdda9ad6c7bf32da76b5022ee3e8f71a1ac4bae2" [metadata.files] acre = [] @@ -1819,10 +1817,7 @@ aiohttp-json-rpc = [ {file = "aiohttp-json-rpc-0.13.3.tar.gz", hash = "sha256:6237a104478c22c6ef96c7227a01d6832597b414e4b79a52d85593356a169e99"}, {file = "aiohttp_json_rpc-0.13.3-py3-none-any.whl", hash = "sha256:4fbd197aced61bd2df7ae3237ead7d3e08833c2ccf48b8581e1828c95ebee680"}, ] -aiohttp-middlewares = [ - {file = "aiohttp-middlewares-2.0.0.tar.gz", hash = "sha256:e08ba04dc0e8fe379aa5e9444a68485c275677ee1e18c55cbb855de0c3629502"}, - {file = "aiohttp_middlewares-2.0.0-py3-none-any.whl", hash = "sha256:29cf1513176b4013844711975ff520e26a8a5d8f9fefbbddb5e91224a86b043e"}, -] +aiohttp-middlewares = [] aiosignal = [ {file = "aiosignal-1.2.0-py3-none-any.whl", hash = "sha256:26e62109036cd181df6e6ad646f91f0dcfd05fe16d0cb924138ff2ab75d64e3a"}, {file = "aiosignal-1.2.0.tar.gz", hash = "sha256:78ed67db6c7b7ced4f98e495e572106d5c432a93e1ddd1bf475e1dc05f5b7df2"}, @@ -1840,10 +1835,7 @@ arrow = [ {file = "arrow-0.17.0-py2.py3-none-any.whl", hash = "sha256:e098abbd9af3665aea81bdd6c869e93af4feb078e98468dd351c383af187aac5"}, {file = "arrow-0.17.0.tar.gz", hash = "sha256:ff08d10cda1d36c68657d6ad20d74fbea493d980f8b2d45344e00d6ed2bf6ed4"}, ] -astroid = [ - {file = "astroid-2.11.5-py3-none-any.whl", hash = "sha256:14ffbb4f6aa2cf474a0834014005487f7ecd8924996083ab411e7fa0b508ce0b"}, - {file = "astroid-2.11.5.tar.gz", hash = "sha256:f4e4ec5294c4b07ac38bab9ca5ddd3914d4bf46f9006eb5c0ae755755061044e"}, -] +astroid = [] async-timeout = [ {file = "async-timeout-4.0.2.tar.gz", hash = "sha256:2163e1640ddb52b7a8c80d0a67a08587e5d245cc9c553a74a847056bc2976b15"}, {file = "async_timeout-4.0.2-py3-none-any.whl", hash = "sha256:8ca1e4fcf50d07413d66d1a5e416e42cfdf5851c981d679a09851a6853383b3c"}, @@ -1852,99 +1844,21 @@ asynctest = [ {file = "asynctest-0.13.0-py3-none-any.whl", hash = 
"sha256:5da6118a7e6d6b54d83a8f7197769d046922a44d2a99c21382f0a6e4fadae676"}, {file = "asynctest-0.13.0.tar.gz", hash = "sha256:c27862842d15d83e6a34eb0b2866c323880eb3a75e4485b079ea11748fd77fac"}, ] -atomicwrites = [ - {file = "atomicwrites-1.4.0-py2.py3-none-any.whl", hash = "sha256:6d1784dea7c0c8d4a5172b6c620f40b6e4cbfdf96d783691f2e1302a7b88e197"}, - {file = "atomicwrites-1.4.0.tar.gz", hash = "sha256:ae70396ad1a434f9c7046fd2dd196fc04b12f9e91ffb859164193be8b6168a7a"}, -] -attrs = [ - {file = "attrs-21.4.0-py2.py3-none-any.whl", hash = "sha256:2d27e3784d7a565d36ab851fe94887c5eccd6a463168875832a1be79c82828b4"}, - {file = "attrs-21.4.0.tar.gz", hash = "sha256:626ba8234211db98e869df76230a137c4c40a12d72445c45d5f5b716f076e2fd"}, -] +atomicwrites = [] +attrs = [] autopep8 = [ {file = "autopep8-1.5.7-py2.py3-none-any.whl", hash = "sha256:aa213493c30dcdac99537249ee65b24af0b2c29f2e83cd8b3f68760441ed0db9"}, {file = "autopep8-1.5.7.tar.gz", hash = "sha256:276ced7e9e3cb22e5d7c14748384a5cf5d9002257c0ed50c0e075b68011bb6d0"}, ] -babel = [ - {file = "Babel-2.9.1-py2.py3-none-any.whl", hash = "sha256:ab49e12b91d937cd11f0b67cb259a57ab4ad2b59ac7a3b41d6c06c0ac5b0def9"}, - {file = "Babel-2.9.1.tar.gz", hash = "sha256:bc0c176f9f6a994582230df350aa6e05ba2ebe4b3ac317eab29d9be5d2768da0"}, -] -bcrypt = [ - {file = "bcrypt-3.2.2-cp36-abi3-macosx_10_10_universal2.whl", hash = "sha256:7180d98a96f00b1050e93f5b0f556e658605dd9f524d0b0e68ae7944673f525e"}, - {file = "bcrypt-3.2.2-cp36-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.manylinux_2_24_aarch64.whl", hash = "sha256:61bae49580dce88095d669226d5076d0b9d927754cedbdf76c6c9f5099ad6f26"}, - {file = "bcrypt-3.2.2-cp36-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:88273d806ab3a50d06bc6a2fc7c87d737dd669b76ad955f449c43095389bc8fb"}, - {file = "bcrypt-3.2.2-cp36-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl", hash = "sha256:6d2cb9d969bfca5bc08e45864137276e4c3d3d7de2b162171def3d188bf9d34a"}, - {file = "bcrypt-3.2.2-cp36-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2b02d6bfc6336d1094276f3f588aa1225a598e27f8e3388f4db9948cb707b521"}, - {file = "bcrypt-3.2.2-cp36-abi3-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:a2c46100e315c3a5b90fdc53e429c006c5f962529bc27e1dfd656292c20ccc40"}, - {file = "bcrypt-3.2.2-cp36-abi3-musllinux_1_1_aarch64.whl", hash = "sha256:7d9ba2e41e330d2af4af6b1b6ec9e6128e91343d0b4afb9282e54e5508f31baa"}, - {file = "bcrypt-3.2.2-cp36-abi3-musllinux_1_1_x86_64.whl", hash = "sha256:cd43303d6b8a165c29ec6756afd169faba9396a9472cdff753fe9f19b96ce2fa"}, - {file = "bcrypt-3.2.2-cp36-abi3-win32.whl", hash = "sha256:4e029cef560967fb0cf4a802bcf4d562d3d6b4b1bf81de5ec1abbe0f1adb027e"}, - {file = "bcrypt-3.2.2-cp36-abi3-win_amd64.whl", hash = "sha256:7ff2069240c6bbe49109fe84ca80508773a904f5a8cb960e02a977f7f519b129"}, - {file = "bcrypt-3.2.2.tar.gz", hash = "sha256:433c410c2177057705da2a9f2cd01dd157493b2a7ac14c8593a16b3dab6b6bfb"}, -] +babel = [] +bcrypt = [] blessed = [ {file = "blessed-1.19.1-py2.py3-none-any.whl", hash = "sha256:63b8554ae2e0e7f43749b6715c734cc8f3883010a809bf16790102563e6cf25b"}, {file = "blessed-1.19.1.tar.gz", hash = "sha256:9a0d099695bf621d4680dd6c73f6ad547f6a3442fbdbe80c4b1daa1edbc492fc"}, ] -cachetools = [ - {file = "cachetools-5.2.0-py3-none-any.whl", hash = "sha256:f9f17d2aec496a9aa6b76f53e3b614c965223c061982d434d160f930c698a9db"}, - {file = "cachetools-5.2.0.tar.gz", hash = 
"sha256:6a94c6402995a99c3970cc7e4884bb60b4a8639938157eeed436098bf9831757"}, -] -certifi = [ - {file = "certifi-2022.6.15-py3-none-any.whl", hash = "sha256:fe86415d55e84719d75f8b69414f6438ac3547d2078ab91b67e779ef69378412"}, - {file = "certifi-2022.6.15.tar.gz", hash = "sha256:84c85a9078b11105f04f3036a9482ae10e4621616db313fe045dd24743a0820d"}, -] -cffi = [ - {file = "cffi-1.15.0-cp27-cp27m-macosx_10_9_x86_64.whl", hash = "sha256:c2502a1a03b6312837279c8c1bd3ebedf6c12c4228ddbad40912d671ccc8a962"}, - {file = "cffi-1.15.0-cp27-cp27m-manylinux1_i686.whl", hash = "sha256:23cfe892bd5dd8941608f93348c0737e369e51c100d03718f108bf1add7bd6d0"}, - {file = "cffi-1.15.0-cp27-cp27m-manylinux1_x86_64.whl", hash = "sha256:41d45de54cd277a7878919867c0f08b0cf817605e4eb94093e7516505d3c8d14"}, - {file = "cffi-1.15.0-cp27-cp27m-win32.whl", hash = "sha256:4a306fa632e8f0928956a41fa8e1d6243c71e7eb59ffbd165fc0b41e316b2474"}, - {file = "cffi-1.15.0-cp27-cp27m-win_amd64.whl", hash = "sha256:e7022a66d9b55e93e1a845d8c9eba2a1bebd4966cd8bfc25d9cd07d515b33fa6"}, - {file = "cffi-1.15.0-cp27-cp27mu-manylinux1_i686.whl", hash = "sha256:14cd121ea63ecdae71efa69c15c5543a4b5fbcd0bbe2aad864baca0063cecf27"}, - {file = "cffi-1.15.0-cp27-cp27mu-manylinux1_x86_64.whl", hash = "sha256:d4d692a89c5cf08a8557fdeb329b82e7bf609aadfaed6c0d79f5a449a3c7c023"}, - {file = "cffi-1.15.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:0104fb5ae2391d46a4cb082abdd5c69ea4eab79d8d44eaaf79f1b1fd806ee4c2"}, - {file = "cffi-1.15.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:91ec59c33514b7c7559a6acda53bbfe1b283949c34fe7440bcf917f96ac0723e"}, - {file = "cffi-1.15.0-cp310-cp310-manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:f5c7150ad32ba43a07c4479f40241756145a1f03b43480e058cfd862bf5041c7"}, - {file = "cffi-1.15.0-cp310-cp310-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:00c878c90cb53ccfaae6b8bc18ad05d2036553e6d9d1d9dbcf323bbe83854ca3"}, - {file = "cffi-1.15.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:abb9a20a72ac4e0fdb50dae135ba5e77880518e742077ced47eb1499e29a443c"}, - {file = "cffi-1.15.0-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a5263e363c27b653a90078143adb3d076c1a748ec9ecc78ea2fb916f9b861962"}, - {file = "cffi-1.15.0-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f54a64f8b0c8ff0b64d18aa76675262e1700f3995182267998c31ae974fbc382"}, - {file = "cffi-1.15.0-cp310-cp310-win32.whl", hash = "sha256:c21c9e3896c23007803a875460fb786118f0cdd4434359577ea25eb556e34c55"}, - {file = "cffi-1.15.0-cp310-cp310-win_amd64.whl", hash = "sha256:5e069f72d497312b24fcc02073d70cb989045d1c91cbd53979366077959933e0"}, - {file = "cffi-1.15.0-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:64d4ec9f448dfe041705426000cc13e34e6e5bb13736e9fd62e34a0b0c41566e"}, - {file = "cffi-1.15.0-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2756c88cbb94231c7a147402476be2c4df2f6078099a6f4a480d239a8817ae39"}, - {file = "cffi-1.15.0-cp36-cp36m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3b96a311ac60a3f6be21d2572e46ce67f09abcf4d09344c49274eb9e0bf345fc"}, - {file = "cffi-1.15.0-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:75e4024375654472cc27e91cbe9eaa08567f7fbdf822638be2814ce059f58032"}, - {file = "cffi-1.15.0-cp36-cp36m-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:59888172256cac5629e60e72e86598027aca6bf01fa2465bdb676d37636573e8"}, - {file = 
"cffi-1.15.0-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:27c219baf94952ae9d50ec19651a687b826792055353d07648a5695413e0c605"}, - {file = "cffi-1.15.0-cp36-cp36m-win32.whl", hash = "sha256:4958391dbd6249d7ad855b9ca88fae690783a6be9e86df65865058ed81fc860e"}, - {file = "cffi-1.15.0-cp36-cp36m-win_amd64.whl", hash = "sha256:f6f824dc3bce0edab5f427efcfb1d63ee75b6fcb7282900ccaf925be84efb0fc"}, - {file = "cffi-1.15.0-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:06c48159c1abed75c2e721b1715c379fa3200c7784271b3c46df01383b593636"}, - {file = "cffi-1.15.0-cp37-cp37m-manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:c2051981a968d7de9dd2d7b87bcb9c939c74a34626a6e2f8181455dd49ed69e4"}, - {file = "cffi-1.15.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:fd8a250edc26254fe5b33be00402e6d287f562b6a5b2152dec302fa15bb3e997"}, - {file = "cffi-1.15.0-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:91d77d2a782be4274da750752bb1650a97bfd8f291022b379bb8e01c66b4e96b"}, - {file = "cffi-1.15.0-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:45db3a33139e9c8f7c09234b5784a5e33d31fd6907800b316decad50af323ff2"}, - {file = "cffi-1.15.0-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:263cc3d821c4ab2213cbe8cd8b355a7f72a8324577dc865ef98487c1aeee2bc7"}, - {file = "cffi-1.15.0-cp37-cp37m-win32.whl", hash = "sha256:17771976e82e9f94976180f76468546834d22a7cc404b17c22df2a2c81db0c66"}, - {file = "cffi-1.15.0-cp37-cp37m-win_amd64.whl", hash = "sha256:3415c89f9204ee60cd09b235810be700e993e343a408693e80ce7f6a40108029"}, - {file = "cffi-1.15.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:4238e6dab5d6a8ba812de994bbb0a79bddbdf80994e4ce802b6f6f3142fcc880"}, - {file = "cffi-1.15.0-cp38-cp38-manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:0808014eb713677ec1292301ea4c81ad277b6cdf2fdd90fd540af98c0b101d20"}, - {file = "cffi-1.15.0-cp38-cp38-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:57e9ac9ccc3101fac9d6014fba037473e4358ef4e89f8e181f8951a2c0162024"}, - {file = "cffi-1.15.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8b6c2ea03845c9f501ed1313e78de148cd3f6cad741a75d43a29b43da27f2e1e"}, - {file = "cffi-1.15.0-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:10dffb601ccfb65262a27233ac273d552ddc4d8ae1bf93b21c94b8511bffe728"}, - {file = "cffi-1.15.0-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:786902fb9ba7433aae840e0ed609f45c7bcd4e225ebb9c753aa39725bb3e6ad6"}, - {file = "cffi-1.15.0-cp38-cp38-win32.whl", hash = "sha256:da5db4e883f1ce37f55c667e5c0de439df76ac4cb55964655906306918e7363c"}, - {file = "cffi-1.15.0-cp38-cp38-win_amd64.whl", hash = "sha256:181dee03b1170ff1969489acf1c26533710231c58f95534e3edac87fff06c443"}, - {file = "cffi-1.15.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:45e8636704eacc432a206ac7345a5d3d2c62d95a507ec70d62f23cd91770482a"}, - {file = "cffi-1.15.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:31fb708d9d7c3f49a60f04cf5b119aeefe5644daba1cd2a0fe389b674fd1de37"}, - {file = "cffi-1.15.0-cp39-cp39-manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:6dc2737a3674b3e344847c8686cf29e500584ccad76204efea14f451d4cc669a"}, - {file = "cffi-1.15.0-cp39-cp39-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:74fdfdbfdc48d3f47148976f49fab3251e550a8720bebc99bf1483f5bfb5db3e"}, - {file = 
"cffi-1.15.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ffaa5c925128e29efbde7301d8ecaf35c8c60ffbcd6a1ffd3a552177c8e5e796"}, - {file = "cffi-1.15.0-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3f7d084648d77af029acb79a0ff49a0ad7e9d09057a9bf46596dac9514dc07df"}, - {file = "cffi-1.15.0-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:ef1f279350da2c586a69d32fc8733092fd32cc8ac95139a00377841f59a3f8d8"}, - {file = "cffi-1.15.0-cp39-cp39-win32.whl", hash = "sha256:2a23af14f408d53d5e6cd4e3d9a24ff9e05906ad574822a10563efcef137979a"}, - {file = "cffi-1.15.0-cp39-cp39-win_amd64.whl", hash = "sha256:3773c4d81e6e818df2efbc7dd77325ca0dcb688116050fb2b3011218eda36139"}, - {file = "cffi-1.15.0.tar.gz", hash = "sha256:920f0d66a896c2d99f0adbb391f990a84091179542c205fa53ce5787aff87954"}, -] +cachetools = [] +certifi = [] +cffi = [] charset-normalizer = [ {file = "charset-normalizer-2.0.12.tar.gz", hash = "sha256:2857e29ff0d34db842cd7ca3230549d1a697f96ee6d3fb071cfa6c7393832597"}, {file = "charset_normalizer-2.0.12-py3-none-any.whl", hash = "sha256:6881edbebdb17b39b4eaaa821b438bf6eddffb4468cf344f09f89def34a8b1df"}, @@ -1957,10 +1871,7 @@ clique = [ {file = "clique-1.6.1-py2.py3-none-any.whl", hash = "sha256:8619774fa035661928dd8c93cd805acf2d42533ccea1b536c09815ed426c9858"}, {file = "clique-1.6.1.tar.gz", hash = "sha256:90165c1cf162d4dd1baef83ceaa1afc886b453e379094fa5b60ea470d1733e66"}, ] -colorama = [ - {file = "colorama-0.4.4-py2.py3-none-any.whl", hash = "sha256:9f47eda37229f68eee03b24b9748937c7dc3868f906e8ba69fbcbdd3bc5dc3e2"}, - {file = "colorama-0.4.4.tar.gz", hash = "sha256:5941b2b48a20143d2267e95b1c2a7603ce057ee39fd88e7329b0c292aa16869b"}, -] +colorama = [] commonmark = [ {file = "commonmark-0.9.1-py2.py3-none-any.whl", hash = "sha256:da2f38c92590f83de410ba1a3cbceafbc74fee9def35f9251ba9a971d6d66fd9"}, {file = "commonmark-0.9.1.tar.gz", hash = "sha256:452f9dc859be7f06631ddcb328b6919c67984aca654e5fefb3914d54691aed60"}, @@ -1969,73 +1880,8 @@ coolname = [ {file = "coolname-1.1.0-py2.py3-none-any.whl", hash = "sha256:e6a83a0ac88640f4f3d2070438dbe112fe80cfebc119c93bd402976ec84c0978"}, {file = "coolname-1.1.0.tar.gz", hash = "sha256:410fe6ea9999bf96f2856ef0c726d5f38782bbefb7bb1aca0e91e0dc98ed09e3"}, ] -coverage = [ - {file = "coverage-6.4.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:f1d5aa2703e1dab4ae6cf416eb0095304f49d004c39e9db1d86f57924f43006b"}, - {file = "coverage-6.4.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:4ce1b258493cbf8aec43e9b50d89982346b98e9ffdfaae8ae5793bc112fb0068"}, - {file = "coverage-6.4.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:83c4e737f60c6936460c5be330d296dd5b48b3963f48634c53b3f7deb0f34ec4"}, - {file = "coverage-6.4.1-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:84e65ef149028516c6d64461b95a8dbcfce95cfd5b9eb634320596173332ea84"}, - {file = "coverage-6.4.1-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f69718750eaae75efe506406c490d6fc5a6161d047206cc63ce25527e8a3adad"}, - {file = "coverage-6.4.1-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:e57816f8ffe46b1df8f12e1b348f06d164fd5219beba7d9433ba79608ef011cc"}, - {file = "coverage-6.4.1-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:01c5615d13f3dd3aa8543afc069e5319cfa0c7d712f6e04b920431e5c564a749"}, - {file = "coverage-6.4.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash 
= "sha256:75ab269400706fab15981fd4bd5080c56bd5cc07c3bccb86aab5e1d5a88dc8f4"}, - {file = "coverage-6.4.1-cp310-cp310-win32.whl", hash = "sha256:a7f3049243783df2e6cc6deafc49ea123522b59f464831476d3d1448e30d72df"}, - {file = "coverage-6.4.1-cp310-cp310-win_amd64.whl", hash = "sha256:ee2ddcac99b2d2aec413e36d7a429ae9ebcadf912946b13ffa88e7d4c9b712d6"}, - {file = "coverage-6.4.1-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:fb73e0011b8793c053bfa85e53129ba5f0250fdc0392c1591fd35d915ec75c46"}, - {file = "coverage-6.4.1-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:106c16dfe494de3193ec55cac9640dd039b66e196e4641fa8ac396181578b982"}, - {file = "coverage-6.4.1-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:87f4f3df85aa39da00fd3ec4b5abeb7407e82b68c7c5ad181308b0e2526da5d4"}, - {file = "coverage-6.4.1-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:961e2fb0680b4f5ad63234e0bf55dfb90d302740ae9c7ed0120677a94a1590cb"}, - {file = "coverage-6.4.1-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:cec3a0f75c8f1031825e19cd86ee787e87cf03e4fd2865c79c057092e69e3a3b"}, - {file = "coverage-6.4.1-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:129cd05ba6f0d08a766d942a9ed4b29283aff7b2cccf5b7ce279d50796860bb3"}, - {file = "coverage-6.4.1-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:bf5601c33213d3cb19d17a796f8a14a9eaa5e87629a53979a5981e3e3ae166f6"}, - {file = "coverage-6.4.1-cp37-cp37m-win32.whl", hash = "sha256:269eaa2c20a13a5bf17558d4dc91a8d078c4fa1872f25303dddcbba3a813085e"}, - {file = "coverage-6.4.1-cp37-cp37m-win_amd64.whl", hash = "sha256:f02cbbf8119db68455b9d763f2f8737bb7db7e43720afa07d8eb1604e5c5ae28"}, - {file = "coverage-6.4.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:ffa9297c3a453fba4717d06df579af42ab9a28022444cae7fa605af4df612d54"}, - {file = "coverage-6.4.1-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:145f296d00441ca703a659e8f3eb48ae39fb083baba2d7ce4482fb2723e050d9"}, - {file = "coverage-6.4.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d67d44996140af8b84284e5e7d398e589574b376fb4de8ccd28d82ad8e3bea13"}, - {file = "coverage-6.4.1-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:2bd9a6fc18aab8d2e18f89b7ff91c0f34ff4d5e0ba0b33e989b3cd4194c81fd9"}, - {file = "coverage-6.4.1-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3384f2a3652cef289e38100f2d037956194a837221edd520a7ee5b42d00cc605"}, - {file = "coverage-6.4.1-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:9b3e07152b4563722be523e8cd0b209e0d1a373022cfbde395ebb6575bf6790d"}, - {file = "coverage-6.4.1-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:1480ff858b4113db2718848d7b2d1b75bc79895a9c22e76a221b9d8d62496428"}, - {file = "coverage-6.4.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:865d69ae811a392f4d06bde506d531f6a28a00af36f5c8649684a9e5e4a85c83"}, - {file = "coverage-6.4.1-cp38-cp38-win32.whl", hash = "sha256:664a47ce62fe4bef9e2d2c430306e1428ecea207ffd68649e3b942fa8ea83b0b"}, - {file = "coverage-6.4.1-cp38-cp38-win_amd64.whl", hash = "sha256:26dff09fb0d82693ba9e6231248641d60ba606150d02ed45110f9ec26404ed1c"}, - {file = "coverage-6.4.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:d9c80df769f5ec05ad21ea34be7458d1dc51ff1fb4b2219e77fe24edf462d6df"}, - {file = "coverage-6.4.1-cp39-cp39-macosx_11_0_arm64.whl", hash 
= "sha256:39ee53946bf009788108b4dd2894bf1349b4e0ca18c2016ffa7d26ce46b8f10d"}, - {file = "coverage-6.4.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f5b66caa62922531059bc5ac04f836860412f7f88d38a476eda0a6f11d4724f4"}, - {file = "coverage-6.4.1-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:fd180ed867e289964404051a958f7cccabdeed423f91a899829264bb7974d3d3"}, - {file = "coverage-6.4.1-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:84631e81dd053e8a0d4967cedab6db94345f1c36107c71698f746cb2636c63e3"}, - {file = "coverage-6.4.1-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:8c08da0bd238f2970230c2a0d28ff0e99961598cb2e810245d7fc5afcf1254e8"}, - {file = "coverage-6.4.1-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:d42c549a8f41dc103a8004b9f0c433e2086add8a719da00e246e17cbe4056f72"}, - {file = "coverage-6.4.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:309ce4a522ed5fca432af4ebe0f32b21d6d7ccbb0f5fcc99290e71feba67c264"}, - {file = "coverage-6.4.1-cp39-cp39-win32.whl", hash = "sha256:fdb6f7bd51c2d1714cea40718f6149ad9be6a2ee7d93b19e9f00934c0f2a74d9"}, - {file = "coverage-6.4.1-cp39-cp39-win_amd64.whl", hash = "sha256:342d4aefd1c3e7f620a13f4fe563154d808b69cccef415415aece4c786665397"}, - {file = "coverage-6.4.1-pp36.pp37.pp38-none-any.whl", hash = "sha256:4803e7ccf93230accb928f3a68f00ffa80a88213af98ed338a57ad021ef06815"}, - {file = "coverage-6.4.1.tar.gz", hash = "sha256:4321f075095a096e70aff1d002030ee612b65a205a0a0f5b815280d5dc58100c"}, -] -cryptography = [ - {file = "cryptography-37.0.2-cp36-abi3-macosx_10_10_universal2.whl", hash = "sha256:ef15c2df7656763b4ff20a9bc4381d8352e6640cfeb95c2972c38ef508e75181"}, - {file = "cryptography-37.0.2-cp36-abi3-macosx_10_10_x86_64.whl", hash = "sha256:3c81599befb4d4f3d7648ed3217e00d21a9341a9a688ecdd615ff72ffbed7336"}, - {file = "cryptography-37.0.2-cp36-abi3-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:2bd1096476aaac820426239ab534b636c77d71af66c547b9ddcd76eb9c79e004"}, - {file = "cryptography-37.0.2-cp36-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.manylinux_2_24_aarch64.whl", hash = "sha256:31fe38d14d2e5f787e0aecef831457da6cec68e0bb09a35835b0b44ae8b988fe"}, - {file = "cryptography-37.0.2-cp36-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:093cb351031656d3ee2f4fa1be579a8c69c754cf874206be1d4cf3b542042804"}, - {file = "cryptography-37.0.2-cp36-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:59b281eab51e1b6b6afa525af2bd93c16d49358404f814fe2c2410058623928c"}, - {file = "cryptography-37.0.2-cp36-abi3-manylinux_2_24_x86_64.whl", hash = "sha256:0cc20f655157d4cfc7bada909dc5cc228211b075ba8407c46467f63597c78178"}, - {file = "cryptography-37.0.2-cp36-abi3-musllinux_1_1_aarch64.whl", hash = "sha256:f8ec91983e638a9bcd75b39f1396e5c0dc2330cbd9ce4accefe68717e6779e0a"}, - {file = "cryptography-37.0.2-cp36-abi3-musllinux_1_1_x86_64.whl", hash = "sha256:46f4c544f6557a2fefa7ac8ac7d1b17bf9b647bd20b16decc8fbcab7117fbc15"}, - {file = "cryptography-37.0.2-cp36-abi3-win32.whl", hash = "sha256:731c8abd27693323b348518ed0e0705713a36d79fdbd969ad968fbef0979a7e0"}, - {file = "cryptography-37.0.2-cp36-abi3-win_amd64.whl", hash = "sha256:471e0d70201c069f74c837983189949aa0d24bb2d751b57e26e3761f2f782b8d"}, - {file = "cryptography-37.0.2-pp37-pypy37_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = 
"sha256:a68254dd88021f24a68b613d8c51d5c5e74d735878b9e32cc0adf19d1f10aaf9"}, - {file = "cryptography-37.0.2-pp37-pypy37_pp73-manylinux_2_24_x86_64.whl", hash = "sha256:a7d5137e556cc0ea418dca6186deabe9129cee318618eb1ffecbd35bee55ddc1"}, - {file = "cryptography-37.0.2-pp38-pypy38_pp73-macosx_10_10_x86_64.whl", hash = "sha256:aeaba7b5e756ea52c8861c133c596afe93dd716cbcacae23b80bc238202dc023"}, - {file = "cryptography-37.0.2-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:95e590dd70642eb2079d280420a888190aa040ad20f19ec8c6e097e38aa29e06"}, - {file = "cryptography-37.0.2-pp38-pypy38_pp73-manylinux_2_24_x86_64.whl", hash = "sha256:1b9362d34363f2c71b7853f6251219298124aa4cc2075ae2932e64c91a3e2717"}, - {file = "cryptography-37.0.2-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:e53258e69874a306fcecb88b7534d61820db8a98655662a3dd2ec7f1afd9132f"}, - {file = "cryptography-37.0.2-pp39-pypy39_pp73-macosx_10_10_x86_64.whl", hash = "sha256:1f3bfbd611db5cb58ca82f3deb35e83af34bb8cf06043fa61500157d50a70982"}, - {file = "cryptography-37.0.2-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:419c57d7b63f5ec38b1199a9521d77d7d1754eb97827bbb773162073ccd8c8d4"}, - {file = "cryptography-37.0.2-pp39-pypy39_pp73-manylinux_2_24_x86_64.whl", hash = "sha256:dc26bb134452081859aa21d4990474ddb7e863aa39e60d1592800a8865a702de"}, - {file = "cryptography-37.0.2-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:3b8398b3d0efc420e777c40c16764d6870bcef2eb383df9c6dbb9ffe12c64452"}, - {file = "cryptography-37.0.2.tar.gz", hash = "sha256:f224ad253cc9cea7568f49077007d2263efa57396a2f2f78114066fd54b5c68e"}, -] +coverage = [] +cryptography = [] cx-freeze = [ {file = "cx_Freeze-6.9-cp310-cp310-win32.whl", hash = "sha256:776d4fb68a4831691acbd3c374362b9b48ce2e568514a73c3d4cb14d5dcf1470"}, {file = "cx_Freeze-6.9-cp310-cp310-win_amd64.whl", hash = "sha256:243f36d35a034a409cd6247d8cb5d1fbfd7374e3e668e813d0811f64d6bd5ed3"}, @@ -2064,14 +1910,8 @@ cx-logging = [ {file = "cx_Logging-3.0-cp39-cp39-win_amd64.whl", hash = "sha256:302e9c4f65a936c288a4fa59a90e7e142d9ef994aa29676731acafdcccdbb3f5"}, {file = "cx_Logging-3.0.tar.gz", hash = "sha256:ba8a7465facf7b98d8f494030fb481a2e8aeee29dc191e10383bb54ed42bdb34"}, ] -deprecated = [ - {file = "Deprecated-1.2.13-py2.py3-none-any.whl", hash = "sha256:64756e3e14c8c5eea9795d93c524551432a0be75629f8f29e67ab8caf076c76d"}, - {file = "Deprecated-1.2.13.tar.gz", hash = "sha256:43ac5335da90c31c24ba028af536a91d41d53f9e6901ddb021bcc572ce44e38d"}, -] -dill = [ - {file = "dill-0.3.5.1-py2.py3-none-any.whl", hash = "sha256:33501d03270bbe410c72639b350e941882a8b0fd55357580fbc873fba0c59302"}, - {file = "dill-0.3.5.1.tar.gz", hash = "sha256:d75e41f3eff1eee599d738e76ba8f4ad98ea229db8b085318aa2b3333a208c86"}, -] +deprecated = [] +dill = [] dnspython = [ {file = "dnspython-2.2.1-py3-none-any.whl", hash = "sha256:a851e51367fb93e9e1361732c1d60dab63eff98712e503ea7d92e6eccb109b4f"}, {file = "dnspython-2.2.1.tar.gz", hash = "sha256:0f7569a4a6ff151958b64304071d370daa3243d15941a7beedf0c9fe5105603e"}, @@ -2080,90 +1920,22 @@ docutils = [ {file = "docutils-0.17.1-py2.py3-none-any.whl", hash = "sha256:cf316c8370a737a022b72b56874f6602acf974a37a9fba42ec2876387549fc61"}, {file = "docutils-0.17.1.tar.gz", hash = "sha256:686577d2e4c32380bb50cbb22f575ed742d58168cee37e99117a854bcd88f125"}, ] -dropbox = [ - {file = "dropbox-11.31.0-py2-none-any.whl", hash = "sha256:393a99dfe30d42fd73c265b9b7d24bb21c9a961739cd097c3541e709eb2a209c"}, - {file = "dropbox-11.31.0-py3-none-any.whl", hash = 
"sha256:5f924102fd6464def81573320c6aa4ea9cd3368e1b1c13d838403dd4c9ffc919"}, - {file = "dropbox-11.31.0.tar.gz", hash = "sha256:f483d65b702775b9abf7b9328f702c68c6397fc01770477c6ddbfb1d858a5bcf"}, -] +dropbox = [] enlighten = [ {file = "enlighten-1.10.2-py2.py3-none-any.whl", hash = "sha256:b237fe562b320bf9f1d4bb76d0c98e0daf914372a76ab87c35cd02f57aa9d8c1"}, {file = "enlighten-1.10.2.tar.gz", hash = "sha256:7a5b83cd0f4d095e59d80c648ebb5f7ffca0cd8bcf7ae6639828ee1ad000632a"}, ] -evdev = [ - {file = "evdev-1.5.0.tar.gz", hash = "sha256:5b33b174f7c84576e7dd6071e438bf5ad227da95efd4356a39fe4c8355412fe6"}, -] +evdev = [] flake8 = [ {file = "flake8-3.9.2-py2.py3-none-any.whl", hash = "sha256:bf8fd333346d844f616e8d47905ef3a3384edae6b4e9beb0c5101e25e3110907"}, {file = "flake8-3.9.2.tar.gz", hash = "sha256:07528381786f2a6237b061f6e96610a4167b226cb926e2aa2b6b1d78057c576b"}, ] -frozenlist = [ - {file = "frozenlist-1.3.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:d2257aaba9660f78c7b1d8fea963b68f3feffb1a9d5d05a18401ca9eb3e8d0a3"}, - {file = "frozenlist-1.3.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:4a44ebbf601d7bac77976d429e9bdb5a4614f9f4027777f9e54fd765196e9d3b"}, - {file = "frozenlist-1.3.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:45334234ec30fc4ea677f43171b18a27505bfb2dba9aca4398a62692c0ea8868"}, - {file = "frozenlist-1.3.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:47be22dc27ed933d55ee55845d34a3e4e9f6fee93039e7f8ebadb0c2f60d403f"}, - {file = "frozenlist-1.3.0-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:03a7dd1bfce30216a3f51a84e6dd0e4a573d23ca50f0346634916ff105ba6e6b"}, - {file = "frozenlist-1.3.0-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:691ddf6dc50480ce49f68441f1d16a4c3325887453837036e0fb94736eae1e58"}, - {file = "frozenlist-1.3.0-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:bde99812f237f79eaf3f04ebffd74f6718bbd216101b35ac7955c2d47c17da02"}, - {file = "frozenlist-1.3.0-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6a202458d1298ced3768f5a7d44301e7c86defac162ace0ab7434c2e961166e8"}, - {file = "frozenlist-1.3.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:b9e3e9e365991f8cc5f5edc1fd65b58b41d0514a6a7ad95ef5c7f34eb49b3d3e"}, - {file = "frozenlist-1.3.0-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:04cb491c4b1c051734d41ea2552fde292f5f3a9c911363f74f39c23659c4af78"}, - {file = "frozenlist-1.3.0-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:436496321dad302b8b27ca955364a439ed1f0999311c393dccb243e451ff66aa"}, - {file = "frozenlist-1.3.0-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:754728d65f1acc61e0f4df784456106e35afb7bf39cfe37227ab00436fb38676"}, - {file = "frozenlist-1.3.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:6eb275c6385dd72594758cbe96c07cdb9bd6becf84235f4a594bdf21e3596c9d"}, - {file = "frozenlist-1.3.0-cp310-cp310-win32.whl", hash = "sha256:e30b2f9683812eb30cf3f0a8e9f79f8d590a7999f731cf39f9105a7c4a39489d"}, - {file = "frozenlist-1.3.0-cp310-cp310-win_amd64.whl", hash = "sha256:f7353ba3367473d1d616ee727945f439e027f0bb16ac1a750219a8344d1d5d3c"}, - {file = "frozenlist-1.3.0-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:88aafd445a233dbbf8a65a62bc3249a0acd0d81ab18f6feb461cc5a938610d24"}, - {file = "frozenlist-1.3.0-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = 
"sha256:4406cfabef8f07b3b3af0f50f70938ec06d9f0fc26cbdeaab431cbc3ca3caeaa"}, - {file = "frozenlist-1.3.0-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:8cf829bd2e2956066dd4de43fd8ec881d87842a06708c035b37ef632930505a2"}, - {file = "frozenlist-1.3.0-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:603b9091bd70fae7be28bdb8aa5c9990f4241aa33abb673390a7f7329296695f"}, - {file = "frozenlist-1.3.0-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:25af28b560e0c76fa41f550eacb389905633e7ac02d6eb3c09017fa1c8cdfde1"}, - {file = "frozenlist-1.3.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:94c7a8a9fc9383b52c410a2ec952521906d355d18fccc927fca52ab575ee8b93"}, - {file = "frozenlist-1.3.0-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:65bc6e2fece04e2145ab6e3c47428d1bbc05aede61ae365b2c1bddd94906e478"}, - {file = "frozenlist-1.3.0-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:3f7c935c7b58b0d78c0beea0c7358e165f95f1fd8a7e98baa40d22a05b4a8141"}, - {file = "frozenlist-1.3.0-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:bd89acd1b8bb4f31b47072615d72e7f53a948d302b7c1d1455e42622de180eae"}, - {file = "frozenlist-1.3.0-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:6983a31698490825171be44ffbafeaa930ddf590d3f051e397143a5045513b01"}, - {file = "frozenlist-1.3.0-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:adac9700675cf99e3615eb6a0eb5e9f5a4143c7d42c05cea2e7f71c27a3d0846"}, - {file = "frozenlist-1.3.0-cp37-cp37m-win32.whl", hash = "sha256:0c36e78b9509e97042ef869c0e1e6ef6429e55817c12d78245eb915e1cca7468"}, - {file = "frozenlist-1.3.0-cp37-cp37m-win_amd64.whl", hash = "sha256:57f4d3f03a18facacb2a6bcd21bccd011e3b75d463dc49f838fd699d074fabd1"}, - {file = "frozenlist-1.3.0-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:8c905a5186d77111f02144fab5b849ab524f1e876a1e75205cd1386a9be4b00a"}, - {file = "frozenlist-1.3.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:b5009062d78a8c6890d50b4e53b0ddda31841b3935c1937e2ed8c1bda1c7fb9d"}, - {file = "frozenlist-1.3.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:2fdc3cd845e5a1f71a0c3518528bfdbfe2efaf9886d6f49eacc5ee4fd9a10953"}, - {file = "frozenlist-1.3.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:92e650bd09b5dda929523b9f8e7f99b24deac61240ecc1a32aeba487afcd970f"}, - {file = "frozenlist-1.3.0-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:40dff8962b8eba91fd3848d857203f0bd704b5f1fa2b3fc9af64901a190bba08"}, - {file = "frozenlist-1.3.0-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:768efd082074bb203c934e83a61654ed4931ef02412c2fbdecea0cff7ecd0274"}, - {file = "frozenlist-1.3.0-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:006d3595e7d4108a12025ddf415ae0f6c9e736e726a5db0183326fd191b14c5e"}, - {file = "frozenlist-1.3.0-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:871d42623ae15eb0b0e9df65baeee6976b2e161d0ba93155411d58ff27483ad8"}, - {file = "frozenlist-1.3.0-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:aff388be97ef2677ae185e72dc500d19ecaf31b698986800d3fc4f399a5e30a5"}, - {file = "frozenlist-1.3.0-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:9f892d6a94ec5c7b785e548e42722e6f3a52f5f32a8461e82ac3e67a3bd073f1"}, - {file = 
"frozenlist-1.3.0-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:e982878792c971cbd60ee510c4ee5bf089a8246226dea1f2138aa0bb67aff148"}, - {file = "frozenlist-1.3.0-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:c6c321dd013e8fc20735b92cb4892c115f5cdb82c817b1e5b07f6b95d952b2f0"}, - {file = "frozenlist-1.3.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:30530930410855c451bea83f7b272fb1c495ed9d5cc72895ac29e91279401db3"}, - {file = "frozenlist-1.3.0-cp38-cp38-win32.whl", hash = "sha256:40ec383bc194accba825fbb7d0ef3dda5736ceab2375462f1d8672d9f6b68d07"}, - {file = "frozenlist-1.3.0-cp38-cp38-win_amd64.whl", hash = "sha256:f20baa05eaa2bcd5404c445ec51aed1c268d62600362dc6cfe04fae34a424bd9"}, - {file = "frozenlist-1.3.0-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:0437fe763fb5d4adad1756050cbf855bbb2bf0d9385c7bb13d7a10b0dd550486"}, - {file = "frozenlist-1.3.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:b684c68077b84522b5c7eafc1dc735bfa5b341fb011d5552ebe0968e22ed641c"}, - {file = "frozenlist-1.3.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:93641a51f89473837333b2f8100f3f89795295b858cd4c7d4a1f18e299dc0a4f"}, - {file = "frozenlist-1.3.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d6d32ff213aef0fd0bcf803bffe15cfa2d4fde237d1d4838e62aec242a8362fa"}, - {file = "frozenlist-1.3.0-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:31977f84828b5bb856ca1eb07bf7e3a34f33a5cddce981d880240ba06639b94d"}, - {file = "frozenlist-1.3.0-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3c62964192a1c0c30b49f403495911298810bada64e4f03249ca35a33ca0417a"}, - {file = "frozenlist-1.3.0-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4eda49bea3602812518765810af732229b4291d2695ed24a0a20e098c45a707b"}, - {file = "frozenlist-1.3.0-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:acb267b09a509c1df5a4ca04140da96016f40d2ed183cdc356d237286c971b51"}, - {file = "frozenlist-1.3.0-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:e1e26ac0a253a2907d654a37e390904426d5ae5483150ce3adedb35c8c06614a"}, - {file = "frozenlist-1.3.0-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:f96293d6f982c58ebebb428c50163d010c2f05de0cde99fd681bfdc18d4b2dc2"}, - {file = "frozenlist-1.3.0-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:e84cb61b0ac40a0c3e0e8b79c575161c5300d1d89e13c0e02f76193982f066ed"}, - {file = "frozenlist-1.3.0-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:ff9310f05b9d9c5c4dd472983dc956901ee6cb2c3ec1ab116ecdde25f3ce4951"}, - {file = "frozenlist-1.3.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:d26b650b71fdc88065b7a21f8ace70175bcf3b5bdba5ea22df4bfd893e795a3b"}, - {file = "frozenlist-1.3.0-cp39-cp39-win32.whl", hash = "sha256:01a73627448b1f2145bddb6e6c2259988bb8aee0fb361776ff8604b99616cd08"}, - {file = "frozenlist-1.3.0-cp39-cp39-win_amd64.whl", hash = "sha256:772965f773757a6026dea111a15e6e2678fbd6216180f82a48a40b27de1ee2ab"}, - {file = "frozenlist-1.3.0.tar.gz", hash = "sha256:ce6f2ba0edb7b0c1d8976565298ad2deba6f8064d2bebb6ffce2ca896eb35b0b"}, -] +frozenlist = [] ftrack-python-api = [] future = [ {file = "future-0.18.2.tar.gz", hash = "sha256:b1bead90b70cf6ec3f0710ae53a525360fa360d306a86583adc6bf83a4db537d"}, ] -gazu = [ - {file = "gazu-0.8.28-py2.py3-none-any.whl", hash = "sha256:ec4f7c2688a2b37ee8a77737e4e30565ad362428c3ade9046136a998c043e51c"}, -] +gazu = [] gitdb = [ {file = 
"gitdb-4.0.9-py3-none-any.whl", hash = "sha256:8033ad4e853066ba6ca92050b9df2f89301b8fc8bf7e9324d412a63f8bf1a8fd"}, {file = "gitdb-4.0.9.tar.gz", hash = "sha256:bac2fd45c0a1c9cf619e63a90d62bdc63892ef92387424b855792a6cabe789aa"}, @@ -2172,42 +1944,24 @@ gitpython = [ {file = "GitPython-3.1.27-py3-none-any.whl", hash = "sha256:5b68b000463593e05ff2b261acff0ff0972df8ab1b70d3cdbd41b546c8b8fc3d"}, {file = "GitPython-3.1.27.tar.gz", hash = "sha256:1c885ce809e8ba2d88a29befeb385fcea06338d3640712b59ca623c220bb5704"}, ] -google-api-core = [ - {file = "google-api-core-2.8.1.tar.gz", hash = "sha256:958024c6aa3460b08f35741231076a4dd9a4c819a6a39d44da9627febe8b28f0"}, - {file = "google_api_core-2.8.1-py3-none-any.whl", hash = "sha256:ce1daa49644b50398093d2a9ad886501aa845e2602af70c3001b9f402a9d7359"}, -] +google-api-core = [] google-api-python-client = [ {file = "google-api-python-client-1.12.11.tar.gz", hash = "sha256:1b4bd42a46321e13c0542a9e4d96fa05d73626f07b39f83a73a947d70ca706a9"}, {file = "google_api_python_client-1.12.11-py2.py3-none-any.whl", hash = "sha256:7e0a1a265c8d3088ee1987778c72683fcb376e32bada8d7767162bd9c503fd9b"}, ] -google-auth = [ - {file = "google-auth-2.7.0.tar.gz", hash = "sha256:8a954960f852d5f19e6af14dd8e75c20159609e85d8db37e4013cc8c3824a7e1"}, - {file = "google_auth-2.7.0-py2.py3-none-any.whl", hash = "sha256:df549a1433108801b11bdcc0e312eaf0d5f0500db42f0523e4d65c78722e8475"}, -] +google-auth = [] google-auth-httplib2 = [ {file = "google-auth-httplib2-0.1.0.tar.gz", hash = "sha256:a07c39fd632becacd3f07718dfd6021bf396978f03ad3ce4321d060015cc30ac"}, {file = "google_auth_httplib2-0.1.0-py2.py3-none-any.whl", hash = "sha256:31e49c36c6b5643b57e82617cb3e021e3e1d2df9da63af67252c02fa9c1f4a10"}, ] -googleapis-common-protos = [ - {file = "googleapis-common-protos-1.56.2.tar.gz", hash = "sha256:b09b56f5463070c2153753ef123f07d2e49235e89148e9b2459ec8ed2f68d7d3"}, - {file = "googleapis_common_protos-1.56.2-py2.py3-none-any.whl", hash = "sha256:023eaea9d8c1cceccd9587c6af6c20f33eeeb05d4148670f2b0322dc1511700c"}, -] +googleapis-common-protos = [] httplib2 = [ {file = "httplib2-0.20.4-py3-none-any.whl", hash = "sha256:8b6a905cb1c79eefd03f8669fd993c36dc341f7c558f056cb5a33b5c2f458543"}, {file = "httplib2-0.20.4.tar.gz", hash = "sha256:58a98e45b4b1a48273073f905d2961666ecf0fbac4250ea5b47aef259eb5c585"}, ] -idna = [ - {file = "idna-3.3-py3-none-any.whl", hash = "sha256:84d9dd047ffa80596e0f246e2eab0b391788b0503584e8945f2368256d2735ff"}, - {file = "idna-3.3.tar.gz", hash = "sha256:9d643ff0a55b762d5cdb124b8eaa99c66322e2157b69160bc32796e824360e6d"}, -] -imagesize = [ - {file = "imagesize-1.3.0-py2.py3-none-any.whl", hash = "sha256:1db2f82529e53c3e929e8926a1fa9235aa82d0bd0c580359c67ec31b2fddaa8c"}, - {file = "imagesize-1.3.0.tar.gz", hash = "sha256:cd1750d452385ca327479d45b64d9c7729ecf0b3969a58148298c77092261f9d"}, -] -importlib-metadata = [ - {file = "importlib_metadata-4.11.4-py3-none-any.whl", hash = "sha256:c58c8eb8a762858f49e18436ff552e83914778e50e9d2f1660535ffb364552ec"}, - {file = "importlib_metadata-4.11.4.tar.gz", hash = "sha256:5d26852efe48c0a32b0509ffbc583fda1a2266545a78d104a6f4aff3db17d700"}, -] +idna = [] +imagesize = [] +importlib-metadata = [] iniconfig = [ {file = "iniconfig-1.1.1-py2.py3-none-any.whl", hash = "sha256:011e24c64b7f47f6ebd835bb12a743f2fbe9a26d4cecaa7f53bc4f35ee9da8b3"}, {file = "iniconfig-1.1.1.tar.gz", hash = "sha256:bc3af051d7d14b2ee5ef9969666def0cd1a000e121eaea580d4a313df4b37f32"}, @@ -2220,18 +1974,12 @@ jedi = [ {file = "jedi-0.13.3-py2.py3-none-any.whl", hash = 
"sha256:2c6bcd9545c7d6440951b12b44d373479bf18123a401a52025cf98563fbd826c"}, {file = "jedi-0.13.3.tar.gz", hash = "sha256:2bb0603e3506f708e792c7f4ad8fc2a7a9d9c2d292a358fbbd58da531695595b"}, ] -jeepney = [ - {file = "jeepney-0.7.1-py3-none-any.whl", hash = "sha256:1b5a0ea5c0e7b166b2f5895b91a08c14de8915afda4407fb5022a195224958ac"}, - {file = "jeepney-0.7.1.tar.gz", hash = "sha256:fa9e232dfa0c498bd0b8a3a73b8d8a31978304dcef0515adc859d4e096f96f4f"}, -] +jeepney = [] jinja2 = [ {file = "Jinja2-2.11.3-py2.py3-none-any.whl", hash = "sha256:03e47ad063331dd6a3f04a43eddca8a966a26ba0c5b7207a9a9e4e08f1b29419"}, {file = "Jinja2-2.11.3.tar.gz", hash = "sha256:a6d58433de0ae800347cab1fa3043cebbabe8baa9d29e668f1c768cb87a333c6"}, ] -jinxed = [ - {file = "jinxed-1.2.0-py2.py3-none-any.whl", hash = "sha256:cfc2b2e4e3b4326954d546ba6d6b9a7a796ddcb0aef8d03161d005177eb0d48b"}, - {file = "jinxed-1.2.0.tar.gz", hash = "sha256:032acda92d5c57cd216033cbbd53de731e6ed50deb63eb4781336ca55f72cda5"}, -] +jinxed = [] jsonschema = [ {file = "jsonschema-2.6.0-py2.py3-none-any.whl", hash = "sha256:000e68abd33c972a5248544925a0cae7d1125f9bf6c58280d37546b946769a08"}, {file = "jsonschema-2.6.0.tar.gz", hash = "sha256:6ff5f3180870836cae40f06fa10419f557208175f13ad7bc26caa77beb1f6e02"}, @@ -2283,28 +2031,12 @@ log4mongo = [ {file = "log4mongo-1.7.0.tar.gz", hash = "sha256:dc374617206162a0b14167fbb5feac01dbef587539a235dadba6200362984a68"}, ] markupsafe = [ - {file = "MarkupSafe-2.0.1-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:d8446c54dc28c01e5a2dbac5a25f071f6653e6e40f3a8818e8b45d790fe6ef53"}, - {file = "MarkupSafe-2.0.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:36bc903cbb393720fad60fc28c10de6acf10dc6cc883f3e24ee4012371399a38"}, - {file = "MarkupSafe-2.0.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2d7d807855b419fc2ed3e631034685db6079889a1f01d5d9dac950f764da3dad"}, - {file = "MarkupSafe-2.0.1-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:add36cb2dbb8b736611303cd3bfcee00afd96471b09cda130da3581cbdc56a6d"}, - {file = "MarkupSafe-2.0.1-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:168cd0a3642de83558a5153c8bd34f175a9a6e7f6dc6384b9655d2697312a646"}, - {file = "MarkupSafe-2.0.1-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:4dc8f9fb58f7364b63fd9f85013b780ef83c11857ae79f2feda41e270468dd9b"}, - {file = "MarkupSafe-2.0.1-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:20dca64a3ef2d6e4d5d615a3fd418ad3bde77a47ec8a23d984a12b5b4c74491a"}, - {file = "MarkupSafe-2.0.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:cdfba22ea2f0029c9261a4bd07e830a8da012291fbe44dc794e488b6c9bb353a"}, - {file = "MarkupSafe-2.0.1-cp310-cp310-win32.whl", hash = "sha256:99df47edb6bda1249d3e80fdabb1dab8c08ef3975f69aed437cb69d0a5de1e28"}, - {file = "MarkupSafe-2.0.1-cp310-cp310-win_amd64.whl", hash = "sha256:e0f138900af21926a02425cf736db95be9f4af72ba1bb21453432a07f6082134"}, {file = "MarkupSafe-2.0.1-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:f9081981fe268bd86831e5c75f7de206ef275defcb82bc70740ae6dc507aee51"}, {file = "MarkupSafe-2.0.1-cp36-cp36m-manylinux1_i686.whl", hash = "sha256:0955295dd5eec6cb6cc2fe1698f4c6d84af2e92de33fbcac4111913cd100a6ff"}, {file = "MarkupSafe-2.0.1-cp36-cp36m-manylinux1_x86_64.whl", hash = "sha256:0446679737af14f45767963a1a9ef7620189912317d095f2d9ffa183a4d25d2b"}, {file = "MarkupSafe-2.0.1-cp36-cp36m-manylinux2010_i686.whl", hash = 
"sha256:f826e31d18b516f653fe296d967d700fddad5901ae07c622bb3705955e1faa94"}, {file = "MarkupSafe-2.0.1-cp36-cp36m-manylinux2010_x86_64.whl", hash = "sha256:fa130dd50c57d53368c9d59395cb5526eda596d3ffe36666cd81a44d56e48872"}, {file = "MarkupSafe-2.0.1-cp36-cp36m-manylinux2014_aarch64.whl", hash = "sha256:905fec760bd2fa1388bb5b489ee8ee5f7291d692638ea5f67982d968366bef9f"}, - {file = "MarkupSafe-2.0.1-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bf5d821ffabf0ef3533c39c518f3357b171a1651c1ff6827325e4489b0e46c3c"}, - {file = "MarkupSafe-2.0.1-cp36-cp36m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:0d4b31cc67ab36e3392bbf3862cfbadac3db12bdd8b02a2731f509ed5b829724"}, - {file = "MarkupSafe-2.0.1-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:baa1a4e8f868845af802979fcdbf0bb11f94f1cb7ced4c4b8a351bb60d108145"}, - {file = "MarkupSafe-2.0.1-cp36-cp36m-musllinux_1_1_aarch64.whl", hash = "sha256:deb993cacb280823246a026e3b2d81c493c53de6acfd5e6bfe31ab3402bb37dd"}, - {file = "MarkupSafe-2.0.1-cp36-cp36m-musllinux_1_1_i686.whl", hash = "sha256:63f3268ba69ace99cab4e3e3b5840b03340efed0948ab8f78d2fd87ee5442a4f"}, - {file = "MarkupSafe-2.0.1-cp36-cp36m-musllinux_1_1_x86_64.whl", hash = "sha256:8d206346619592c6200148b01a2142798c989edcb9c896f9ac9722a99d4e77e6"}, {file = "MarkupSafe-2.0.1-cp36-cp36m-win32.whl", hash = "sha256:6c4ca60fa24e85fe25b912b01e62cb969d69a23a5d5867682dd3e80b5b02581d"}, {file = "MarkupSafe-2.0.1-cp36-cp36m-win_amd64.whl", hash = "sha256:b2f4bf27480f5e5e8ce285a8c8fd176c0b03e93dcc6646477d4630e83440c6a9"}, {file = "MarkupSafe-2.0.1-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:0717a7390a68be14b8c793ba258e075c6f4ca819f15edfc2a3a027c823718567"}, @@ -2313,27 +2045,14 @@ markupsafe = [ {file = "MarkupSafe-2.0.1-cp37-cp37m-manylinux2010_i686.whl", hash = "sha256:d7f9850398e85aba693bb640262d3611788b1f29a79f0c93c565694658f4071f"}, {file = "MarkupSafe-2.0.1-cp37-cp37m-manylinux2010_x86_64.whl", hash = "sha256:6a7fae0dd14cf60ad5ff42baa2e95727c3d81ded453457771d02b7d2b3f9c0c2"}, {file = "MarkupSafe-2.0.1-cp37-cp37m-manylinux2014_aarch64.whl", hash = "sha256:b7f2d075102dc8c794cbde1947378051c4e5180d52d276987b8d28a3bd58c17d"}, - {file = "MarkupSafe-2.0.1-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e9936f0b261d4df76ad22f8fee3ae83b60d7c3e871292cd42f40b81b70afae85"}, - {file = "MarkupSafe-2.0.1-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:2a7d351cbd8cfeb19ca00de495e224dea7e7d919659c2841bbb7f420ad03e2d6"}, - {file = "MarkupSafe-2.0.1-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:60bf42e36abfaf9aff1f50f52644b336d4f0a3fd6d8a60ca0d054ac9f713a864"}, - {file = "MarkupSafe-2.0.1-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:d6c7ebd4e944c85e2c3421e612a7057a2f48d478d79e61800d81468a8d842207"}, - {file = "MarkupSafe-2.0.1-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:f0567c4dc99f264f49fe27da5f735f414c4e7e7dd850cfd8e69f0862d7c74ea9"}, - {file = "MarkupSafe-2.0.1-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:89c687013cb1cd489a0f0ac24febe8c7a666e6e221b783e53ac50ebf68e45d86"}, {file = "MarkupSafe-2.0.1-cp37-cp37m-win32.whl", hash = "sha256:a30e67a65b53ea0a5e62fe23682cfe22712e01f453b95233b25502f7c61cb415"}, {file = "MarkupSafe-2.0.1-cp37-cp37m-win_amd64.whl", hash = 
"sha256:611d1ad9a4288cf3e3c16014564df047fe08410e628f89805e475368bd304914"}, - {file = "MarkupSafe-2.0.1-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:5bb28c636d87e840583ee3adeb78172efc47c8b26127267f54a9c0ec251d41a9"}, {file = "MarkupSafe-2.0.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:be98f628055368795d818ebf93da628541e10b75b41c559fdf36d104c5787066"}, {file = "MarkupSafe-2.0.1-cp38-cp38-manylinux1_i686.whl", hash = "sha256:1d609f577dc6e1aa17d746f8bd3c31aa4d258f4070d61b2aa5c4166c1539de35"}, {file = "MarkupSafe-2.0.1-cp38-cp38-manylinux1_x86_64.whl", hash = "sha256:7d91275b0245b1da4d4cfa07e0faedd5b0812efc15b702576d103293e252af1b"}, {file = "MarkupSafe-2.0.1-cp38-cp38-manylinux2010_i686.whl", hash = "sha256:01a9b8ea66f1658938f65b93a85ebe8bc016e6769611be228d797c9d998dd298"}, {file = "MarkupSafe-2.0.1-cp38-cp38-manylinux2010_x86_64.whl", hash = "sha256:47ab1e7b91c098ab893b828deafa1203de86d0bc6ab587b160f78fe6c4011f75"}, {file = "MarkupSafe-2.0.1-cp38-cp38-manylinux2014_aarch64.whl", hash = "sha256:97383d78eb34da7e1fa37dd273c20ad4320929af65d156e35a5e2d89566d9dfb"}, - {file = "MarkupSafe-2.0.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6fcf051089389abe060c9cd7caa212c707e58153afa2c649f00346ce6d260f1b"}, - {file = "MarkupSafe-2.0.1-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:5855f8438a7d1d458206a2466bf82b0f104a3724bf96a1c781ab731e4201731a"}, - {file = "MarkupSafe-2.0.1-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:3dd007d54ee88b46be476e293f48c85048603f5f516008bee124ddd891398ed6"}, - {file = "MarkupSafe-2.0.1-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:aca6377c0cb8a8253e493c6b451565ac77e98c2951c45f913e0b52facdcff83f"}, - {file = "MarkupSafe-2.0.1-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:04635854b943835a6ea959e948d19dcd311762c5c0c6e1f0e16ee57022669194"}, - {file = "MarkupSafe-2.0.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:6300b8454aa6930a24b9618fbb54b5a68135092bc666f7b06901f897fa5c2fee"}, {file = "MarkupSafe-2.0.1-cp38-cp38-win32.whl", hash = "sha256:023cb26ec21ece8dc3907c0e8320058b2e0cb3c55cf9564da612bc325bed5e64"}, {file = "MarkupSafe-2.0.1-cp38-cp38-win_amd64.whl", hash = "sha256:984d76483eb32f1bcb536dc27e4ad56bba4baa70be32fa87152832cdd9db0833"}, {file = "MarkupSafe-2.0.1-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:2ef54abee730b502252bcdf31b10dacb0a416229b72c18b19e24a4509f273d26"}, @@ -2343,12 +2062,6 @@ markupsafe = [ {file = "MarkupSafe-2.0.1-cp39-cp39-manylinux2010_i686.whl", hash = "sha256:4efca8f86c54b22348a5467704e3fec767b2db12fc39c6d963168ab1d3fc9135"}, {file = "MarkupSafe-2.0.1-cp39-cp39-manylinux2010_x86_64.whl", hash = "sha256:ab3ef638ace319fa26553db0624c4699e31a28bb2a835c5faca8f8acf6a5a902"}, {file = "MarkupSafe-2.0.1-cp39-cp39-manylinux2014_aarch64.whl", hash = "sha256:f8ba0e8349a38d3001fae7eadded3f6606f0da5d748ee53cc1dab1d6527b9509"}, - {file = "MarkupSafe-2.0.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c47adbc92fc1bb2b3274c4b3a43ae0e4573d9fbff4f54cd484555edbf030baf1"}, - {file = "MarkupSafe-2.0.1-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:37205cac2a79194e3750b0af2a5720d95f786a55ce7df90c3af697bfa100eaac"}, - {file = "MarkupSafe-2.0.1-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = 
"sha256:1f2ade76b9903f39aa442b4aadd2177decb66525062db244b35d71d0ee8599b6"}, - {file = "MarkupSafe-2.0.1-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:4296f2b1ce8c86a6aea78613c34bb1a672ea0e3de9c6ba08a960efe0b0a09047"}, - {file = "MarkupSafe-2.0.1-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:9f02365d4e99430a12647f09b6cc8bab61a6564363f313126f775eb4f6ef798e"}, - {file = "MarkupSafe-2.0.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:5b6d930f030f8ed98e3e6c98ffa0652bdb82601e7a016ec2ab5d7ff23baa78d1"}, {file = "MarkupSafe-2.0.1-cp39-cp39-win32.whl", hash = "sha256:10f82115e21dc0dfec9ab5c0223652f7197feb168c940f3ef61563fc2d6beb74"}, {file = "MarkupSafe-2.0.1-cp39-cp39-win_amd64.whl", hash = "sha256:693ce3f9e70a6cf7d2fb9e6c9d8b204b6b39897a2c4a1aa65728d5ac97dcc1d8"}, {file = "MarkupSafe-2.0.1.tar.gz", hash = "sha256:594c67807fb16238b30c44bdf74f36c02cdf22d1c8cda91ef8a0ed8dabf5620a"}, @@ -2423,10 +2136,7 @@ packaging = [ {file = "packaging-21.3-py3-none-any.whl", hash = "sha256:ef103e05f519cdc783ae24ea4e2e0f508a9c99b2d4969652eed6a2e1ea5bd522"}, {file = "packaging-21.3.tar.gz", hash = "sha256:dd47c42927d89ab911e606518907cc2d3a1f38bbd026385970643f9c5b8ecfeb"}, ] -paramiko = [ - {file = "paramiko-2.11.0-py2.py3-none-any.whl", hash = "sha256:655f25dc8baf763277b933dfcea101d636581df8d6b9774d1fb653426b72c270"}, - {file = "paramiko-2.11.0.tar.gz", hash = "sha256:003e6bee7c034c21fbb051bf83dc0a9ee4106204dd3c53054c71452cc4ec3938"}, -] +paramiko = [] parso = [ {file = "parso-0.8.3-py2.py3-none-any.whl", hash = "sha256:c001d4636cd3aecdaf33cbb40aebb59b094be2a74c556778ef5576c175e19e75"}, {file = "parso-0.8.3.tar.gz", hash = "sha256:8c07be290bb59f03588915921e29e8a50002acaf2cdc5fa0e0114f91709fafa0"}, @@ -2435,50 +2145,8 @@ pathlib2 = [ {file = "pathlib2-2.3.7.post1-py2.py3-none-any.whl", hash = "sha256:5266a0fd000452f1b3467d782f079a4343c63aaa119221fbdc4e39577489ca5b"}, {file = "pathlib2-2.3.7.post1.tar.gz", hash = "sha256:9fe0edad898b83c0c3e199c842b27ed216645d2e177757b2dd67384d4113c641"}, ] -pillow = [ - {file = "Pillow-9.1.1-cp310-cp310-macosx_10_10_x86_64.whl", hash = "sha256:42dfefbef90eb67c10c45a73a9bc1599d4dac920f7dfcbf4ec6b80cb620757fe"}, - {file = "Pillow-9.1.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:ffde4c6fabb52891d81606411cbfaf77756e3b561b566efd270b3ed3791fde4e"}, - {file = "Pillow-9.1.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9c857532c719fb30fafabd2371ce9b7031812ff3889d75273827633bca0c4602"}, - {file = "Pillow-9.1.1-cp310-cp310-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:59789a7d06c742e9d13b883d5e3569188c16acb02eeed2510fd3bfdbc1bd1530"}, - {file = "Pillow-9.1.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4d45dbe4b21a9679c3e8b3f7f4f42a45a7d3ddff8a4a16109dff0e1da30a35b2"}, - {file = "Pillow-9.1.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:e9ed59d1b6ee837f4515b9584f3d26cf0388b742a11ecdae0d9237a94505d03a"}, - {file = "Pillow-9.1.1-cp310-cp310-win32.whl", hash = "sha256:b3fe2ff1e1715d4475d7e2c3e8dabd7c025f4410f79513b4ff2de3d51ce0fa9c"}, - {file = "Pillow-9.1.1-cp310-cp310-win_amd64.whl", hash = "sha256:5b650dbbc0969a4e226d98a0b440c2f07a850896aed9266b6fedc0f7e7834108"}, - {file = "Pillow-9.1.1-cp37-cp37m-macosx_10_10_x86_64.whl", hash = "sha256:0b4d5ad2cd3a1f0d1df882d926b37dbb2ab6c823ae21d041b46910c8f8cd844b"}, - {file = "Pillow-9.1.1-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9370d6744d379f2de5d7fa95cdbd3a4d92f0b0ef29609b4b1687f16bc197063d"}, - {file = 
"Pillow-9.1.1-cp37-cp37m-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b761727ed7d593e49671d1827044b942dd2f4caae6e51bab144d4accf8244a84"}, - {file = "Pillow-9.1.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8a66fe50386162df2da701b3722781cbe90ce043e7d53c1fd6bd801bca6b48d4"}, - {file = "Pillow-9.1.1-cp37-cp37m-win32.whl", hash = "sha256:2b291cab8a888658d72b575a03e340509b6b050b62db1f5539dd5cd18fd50578"}, - {file = "Pillow-9.1.1-cp37-cp37m-win_amd64.whl", hash = "sha256:1d4331aeb12f6b3791911a6da82de72257a99ad99726ed6b63f481c0184b6fb9"}, - {file = "Pillow-9.1.1-cp38-cp38-macosx_10_10_x86_64.whl", hash = "sha256:8844217cdf66eabe39567118f229e275f0727e9195635a15e0e4b9227458daaf"}, - {file = "Pillow-9.1.1-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:b6617221ff08fbd3b7a811950b5c3f9367f6e941b86259843eab77c8e3d2b56b"}, - {file = "Pillow-9.1.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:20d514c989fa28e73a5adbddd7a171afa5824710d0ab06d4e1234195d2a2e546"}, - {file = "Pillow-9.1.1-cp38-cp38-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:088df396b047477dd1bbc7de6e22f58400dae2f21310d9e2ec2933b2ef7dfa4f"}, - {file = "Pillow-9.1.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:53c27bd452e0f1bc4bfed07ceb235663a1df7c74df08e37fd6b03eb89454946a"}, - {file = "Pillow-9.1.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:3f6c1716c473ebd1649663bf3b42702d0d53e27af8b64642be0dd3598c761fb1"}, - {file = "Pillow-9.1.1-cp38-cp38-win32.whl", hash = "sha256:c67db410508b9de9c4694c57ed754b65a460e4812126e87f5052ecf23a011a54"}, - {file = "Pillow-9.1.1-cp38-cp38-win_amd64.whl", hash = "sha256:f054b020c4d7e9786ae0404278ea318768eb123403b18453e28e47cdb7a0a4bf"}, - {file = "Pillow-9.1.1-cp39-cp39-macosx_10_10_x86_64.whl", hash = "sha256:c17770a62a71718a74b7548098a74cd6880be16bcfff5f937f900ead90ca8e92"}, - {file = "Pillow-9.1.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:f3f6a6034140e9e17e9abc175fc7a266a6e63652028e157750bd98e804a8ed9a"}, - {file = "Pillow-9.1.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f372d0f08eff1475ef426344efe42493f71f377ec52237bf153c5713de987251"}, - {file = "Pillow-9.1.1-cp39-cp39-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:09e67ef6e430f90caa093528bd758b0616f8165e57ed8d8ce014ae32df6a831d"}, - {file = "Pillow-9.1.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:66daa16952d5bf0c9d5389c5e9df562922a59bd16d77e2a276e575d32e38afd1"}, - {file = "Pillow-9.1.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:d78ca526a559fb84faaaf84da2dd4addef5edb109db8b81677c0bb1aad342601"}, - {file = "Pillow-9.1.1-cp39-cp39-win32.whl", hash = "sha256:55e74faf8359ddda43fee01bffbc5bd99d96ea508d8a08c527099e84eb708f45"}, - {file = "Pillow-9.1.1-cp39-cp39-win_amd64.whl", hash = "sha256:7c150dbbb4a94ea4825d1e5f2c5501af7141ea95825fadd7829f9b11c97aaf6c"}, - {file = "Pillow-9.1.1-pp37-pypy37_pp73-macosx_10_10_x86_64.whl", hash = "sha256:769a7f131a2f43752455cc72f9f7a093c3ff3856bf976c5fb53a59d0ccc704f6"}, - {file = "Pillow-9.1.1-pp37-pypy37_pp73-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:488f3383cf5159907d48d32957ac6f9ea85ccdcc296c14eca1a4e396ecc32098"}, - {file = "Pillow-9.1.1-pp37-pypy37_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0b525a356680022b0af53385944026d3486fc8c013638cf9900eb87c866afb4c"}, - {file = "Pillow-9.1.1-pp38-pypy38_pp73-macosx_10_10_x86_64.whl", hash = 
"sha256:6e760cf01259a1c0a50f3c845f9cad1af30577fd8b670339b1659c6d0e7a41dd"}, - {file = "Pillow-9.1.1-pp38-pypy38_pp73-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a4165205a13b16a29e1ac57efeee6be2dfd5b5408122d59ef2145bc3239fa340"}, - {file = "Pillow-9.1.1-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:937a54e5694684f74dcbf6e24cc453bfc5b33940216ddd8f4cd8f0f79167f765"}, - {file = "Pillow-9.1.1-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:baf3be0b9446a4083cc0c5bb9f9c964034be5374b5bc09757be89f5d2fa247b8"}, - {file = "Pillow-9.1.1.tar.gz", hash = "sha256:7502539939b53d7565f3d11d87c78e7ec900d3c72945d4ee0e2f250d598309a0"}, -] -platformdirs = [ - {file = "platformdirs-2.5.2-py3-none-any.whl", hash = "sha256:027d8e83a2d7de06bbac4e5ef7e023c02b863d7ea5d079477e722bb41ab25788"}, - {file = "platformdirs-2.5.2.tar.gz", hash = "sha256:58c8abb07dcb441e6ee4b11d8df0ac856038f944ab98b7be6b27b2a3c7feef19"}, -] +pillow = [] +platformdirs = [] pluggy = [ {file = "pluggy-1.0.0-py2.py3-none-any.whl", hash = "sha256:74134bbf457f031a36d68416e1509f34bd5ccc019f0bcc952c7b909d06b37bd3"}, {file = "pluggy-1.0.0.tar.gz", hash = "sha256:4224373bacce55f955a878bf9cfa763c1e360858e330072059e10bad68531159"}, @@ -2491,34 +2159,7 @@ prefixed = [ {file = "prefixed-0.3.2-py2.py3-none-any.whl", hash = "sha256:5e107306462d63f2f03c529dbf11b0026fdfec621a9a008ca639d71de22995c3"}, {file = "prefixed-0.3.2.tar.gz", hash = "sha256:ca48277ba5fa8346dd4b760847da930c7b84416387c39e93affef086add2c029"}, ] -protobuf = [ - {file = "protobuf-3.19.4-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:f51d5a9f137f7a2cec2d326a74b6e3fc79d635d69ffe1b036d39fc7d75430d37"}, - {file = "protobuf-3.19.4-cp310-cp310-manylinux2014_aarch64.whl", hash = "sha256:09297b7972da685ce269ec52af761743714996b4381c085205914c41fcab59fb"}, - {file = "protobuf-3.19.4-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:072fbc78d705d3edc7ccac58a62c4c8e0cec856987da7df8aca86e647be4e35c"}, - {file = "protobuf-3.19.4-cp310-cp310-win32.whl", hash = "sha256:7bb03bc2873a2842e5ebb4801f5c7ff1bfbdf426f85d0172f7644fcda0671ae0"}, - {file = "protobuf-3.19.4-cp310-cp310-win_amd64.whl", hash = "sha256:f358aa33e03b7a84e0d91270a4d4d8f5df6921abe99a377828839e8ed0c04e07"}, - {file = "protobuf-3.19.4-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:1c91ef4110fdd2c590effb5dca8fdbdcb3bf563eece99287019c4204f53d81a4"}, - {file = "protobuf-3.19.4-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c438268eebb8cf039552897d78f402d734a404f1360592fef55297285f7f953f"}, - {file = "protobuf-3.19.4-cp36-cp36m-win32.whl", hash = "sha256:835a9c949dc193953c319603b2961c5c8f4327957fe23d914ca80d982665e8ee"}, - {file = "protobuf-3.19.4-cp36-cp36m-win_amd64.whl", hash = "sha256:4276cdec4447bd5015453e41bdc0c0c1234eda08420b7c9a18b8d647add51e4b"}, - {file = "protobuf-3.19.4-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:6cbc312be5e71869d9d5ea25147cdf652a6781cf4d906497ca7690b7b9b5df13"}, - {file = "protobuf-3.19.4-cp37-cp37m-manylinux2014_aarch64.whl", hash = "sha256:54a1473077f3b616779ce31f477351a45b4fef8c9fd7892d6d87e287a38df368"}, - {file = "protobuf-3.19.4-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:435bb78b37fc386f9275a7035fe4fb1364484e38980d0dd91bc834a02c5ec909"}, - {file = "protobuf-3.19.4-cp37-cp37m-win32.whl", hash = "sha256:16f519de1313f1b7139ad70772e7db515b1420d208cb16c6d7858ea989fc64a9"}, - {file = "protobuf-3.19.4-cp37-cp37m-win_amd64.whl", hash = 
"sha256:cdc076c03381f5c1d9bb1abdcc5503d9ca8b53cf0a9d31a9f6754ec9e6c8af0f"}, - {file = "protobuf-3.19.4-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:69da7d39e39942bd52848438462674c463e23963a1fdaa84d88df7fbd7e749b2"}, - {file = "protobuf-3.19.4-cp38-cp38-manylinux2014_aarch64.whl", hash = "sha256:48ed3877fa43e22bcacc852ca76d4775741f9709dd9575881a373bd3e85e54b2"}, - {file = "protobuf-3.19.4-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bd95d1dfb9c4f4563e6093a9aa19d9c186bf98fa54da5252531cc0d3a07977e7"}, - {file = "protobuf-3.19.4-cp38-cp38-win32.whl", hash = "sha256:b38057450a0c566cbd04890a40edf916db890f2818e8682221611d78dc32ae26"}, - {file = "protobuf-3.19.4-cp38-cp38-win_amd64.whl", hash = "sha256:7ca7da9c339ca8890d66958f5462beabd611eca6c958691a8fe6eccbd1eb0c6e"}, - {file = "protobuf-3.19.4-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:36cecbabbda242915529b8ff364f2263cd4de7c46bbe361418b5ed859677ba58"}, - {file = "protobuf-3.19.4-cp39-cp39-manylinux2014_aarch64.whl", hash = "sha256:c1068287025f8ea025103e37d62ffd63fec8e9e636246b89c341aeda8a67c934"}, - {file = "protobuf-3.19.4-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:96bd766831596d6014ca88d86dc8fe0fb2e428c0b02432fd9db3943202bf8c5e"}, - {file = "protobuf-3.19.4-cp39-cp39-win32.whl", hash = "sha256:84123274d982b9e248a143dadd1b9815049f4477dc783bf84efe6250eb4b836a"}, - {file = "protobuf-3.19.4-cp39-cp39-win_amd64.whl", hash = "sha256:3112b58aac3bac9c8be2b60a9daf6b558ca3f7681c130dcdd788ade7c9ffbdca"}, - {file = "protobuf-3.19.4-py2.py3-none-any.whl", hash = "sha256:8961c3a78ebfcd000920c9060a262f082f29838682b1f7201889300c1fbe0616"}, - {file = "protobuf-3.19.4.tar.gz", hash = "sha256:9df0c10adf3e83015ced42a9a7bd64e13d06c4cf45c340d2c63020ea04499d0a"}, -] +protobuf = [] py = [ {file = "py-1.11.0-py2.py3-none-any.whl", hash = "sha256:607c53218732647dff4acdfcd50cb62615cedf612e72d1724fb1a0cc6405b378"}, {file = "py-1.11.0.tar.gz", hash = "sha256:51c75c4126074b472f746a24399ad32f6053d1b34b68d2fa41e558e6f4a98719"}, @@ -2577,14 +2218,8 @@ pyflakes = [ {file = "pyflakes-2.3.1-py2.py3-none-any.whl", hash = "sha256:7893783d01b8a89811dd72d7dfd4d84ff098e5eed95cfa8905b22bbffe52efc3"}, {file = "pyflakes-2.3.1.tar.gz", hash = "sha256:f5bc8ecabc05bb9d291eb5203d6810b49040f6ff446a756326104746cc00c1db"}, ] -pygments = [ - {file = "Pygments-2.12.0-py3-none-any.whl", hash = "sha256:dc9c10fb40944260f6ed4c688ece0cd2048414940f1cea51b8b226318411c519"}, - {file = "Pygments-2.12.0.tar.gz", hash = "sha256:5eb116118f9612ff1ee89ac96437bb6b49e8f04d8a13b514ba26f620208e26eb"}, -] -pylint = [ - {file = "pylint-2.13.9-py3-none-any.whl", hash = "sha256:705c620d388035bdd9ff8b44c5bcdd235bfb49d276d488dd2c8ff1736aa42526"}, - {file = "pylint-2.13.9.tar.gz", hash = "sha256:095567c96e19e6f57b5b907e67d265ff535e588fe26b12b5ebe1fc5645b2c731"}, -] +pygments = [] +pylint = [] pymongo = [ {file = "pymongo-3.12.3-cp27-cp27m-macosx_10_14_intel.whl", hash = "sha256:c164eda0be9048f83c24b9b2656900041e069ddf72de81c17d874d0c32f6079f"}, {file = "pymongo-3.12.3-cp27-cp27m-manylinux1_i686.whl", hash = "sha256:a055d29f1302892a9389a382bed10a3f77708bcf3e49bfb76f7712fa5f391cc6"}, @@ -2711,42 +2346,10 @@ pynput = [ {file = "pynput-1.7.6-py3.9.egg", hash = "sha256:264429fbe676e98e9050ad26a7017453bdd08768adb25cafb918347cf9f1eb4a"}, {file = "pynput-1.7.6.tar.gz", hash = "sha256:3a5726546da54116b687785d38b1db56997ce1d28e53e8d22fc656d8b92e533c"}, ] -pyobjc-core = [ - {file = "pyobjc-core-8.5.tar.gz", hash = 
"sha256:704c275439856c0d1287469f0d589a7d808d48b754a93d9ce5415d4eaf06d576"}, - {file = "pyobjc_core-8.5-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:0c234143b48334443f5adcf26e668945a6d47bc1fa6223e80918c6c735a029d9"}, - {file = "pyobjc_core-8.5-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:1486ee533f0d76f666804ce89723ada4db56bfde55e56151ba512d3f849857f8"}, - {file = "pyobjc_core-8.5-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:412de06dfa728301c04b3e46fd7453320a8ae8b862e85236e547cd797a73b490"}, - {file = "pyobjc_core-8.5-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:0b3e09cccb1be574a82cc9f929ae27fc4283eccc75496cb5d51534caa6bb83a3"}, - {file = "pyobjc_core-8.5-cp38-cp38-macosx_11_0_universal2.whl", hash = "sha256:eeafe21f879666ab7f57efcc6b007c9f5f8733d367b7e380c925203ed83f000d"}, - {file = "pyobjc_core-8.5-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:c0071686976d7ea8c14690950e504a13cb22b4ebb2bc7b5ec47c1c1c0f6eff41"}, -] -pyobjc-framework-applicationservices = [ - {file = "pyobjc-framework-ApplicationServices-8.5.tar.gz", hash = "sha256:fa3015ef8e3add90af3447d7fdcc7f8dd083cc2a1d58f99a569480a2df10d2b1"}, - {file = "pyobjc_framework_ApplicationServices-8.5-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:436b16ebe448a829a8312e10208eec81a2adcae1fff674dbcc3262e1bd76e0ca"}, - {file = "pyobjc_framework_ApplicationServices-8.5-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:408958d14aa7fcf46f2163754c211078bc63be1368934d86188202914dce077d"}, - {file = "pyobjc_framework_ApplicationServices-8.5-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:1d6cd4ce192859a22e208da4d7177a1c3ceb1ef2f64c339fd881102b1210cadd"}, - {file = "pyobjc_framework_ApplicationServices-8.5-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:0251d092adb1d2d116fd9f147ceef0e53b158a46c21245131c40b9d7b786d0db"}, - {file = "pyobjc_framework_ApplicationServices-8.5-cp38-cp38-macosx_11_0_universal2.whl", hash = "sha256:9742e69fe6d4545d0e02b0ad0a7a2432bc9944569ee07d6e90ffa5ef614df9f7"}, - {file = "pyobjc_framework_ApplicationServices-8.5-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:16f5677c14ea903c6aaca1dd121521825c39e816cae696d6ae32c0b287252ab2"}, -] -pyobjc-framework-cocoa = [ - {file = "pyobjc-framework-Cocoa-8.5.tar.gz", hash = "sha256:569bd3a020f64b536fb2d1c085b37553e50558c9f907e08b73ffc16ae68e1861"}, - {file = "pyobjc_framework_Cocoa-8.5-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:7a7c160416696bf6035dfcdf0e603aaa52858d6afcddfcc5ab41733619ac2529"}, - {file = "pyobjc_framework_Cocoa-8.5-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:6ceba444282030be8596b812260e8d28b671254a51052ad778d32da6e17db847"}, - {file = "pyobjc_framework_Cocoa-8.5-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:f46b2b161b8dd40c7b9e00bc69636c3e6480b2704a69aee22ee0154befbe163a"}, - {file = "pyobjc_framework_Cocoa-8.5-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:b31d425aee8698cbf62b187338f5ca59427fa4dca2153a73866f7cb410713119"}, - {file = "pyobjc_framework_Cocoa-8.5-cp38-cp38-macosx_11_0_universal2.whl", hash = "sha256:898359ac1f76eedec8aa156847682378a8950824421c40edb89391286e607dc4"}, - {file = "pyobjc_framework_Cocoa-8.5-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:baa2947f76b119a3360973d74d57d6dada87ac527bab9a88f31596af392f123c"}, -] -pyobjc-framework-quartz = [ - {file = "pyobjc-framework-Quartz-8.5.tar.gz", hash = "sha256:d2bc5467a792ddc04814f12a1e9c2fcaf699a1c3ad3d4264cfdce6b9c7b10624"}, - {file = "pyobjc_framework_Quartz-8.5-cp310-cp310-macosx_10_9_universal2.whl", hash = 
"sha256:e9f0fb663f7872c9de94169031ac42b91ad01bd4cad49a9f1a0164be8f028426"}, - {file = "pyobjc_framework_Quartz-8.5-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:567eec91287cfe9a1b6433717192c585935de8f3daa28d82ce72fdd6c7ac00f6"}, - {file = "pyobjc_framework_Quartz-8.5-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:9f910ab41a712ffc7a8c3e3716a2d6f39ea4419004b26a2fd2d2f740ff5c262c"}, - {file = "pyobjc_framework_Quartz-8.5-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:29d07066781628278bf0e5278abcfc96ef6724c66c5629a0b4c214d319a82e55"}, - {file = "pyobjc_framework_Quartz-8.5-cp38-cp38-macosx_11_0_universal2.whl", hash = "sha256:72abcde1a3d72be11f2c881c9b9872044c8f2de86d2047b67fe771713638b107"}, - {file = "pyobjc_framework_Quartz-8.5-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:8809b9a2df2f461697bdb45b6d1b5a4f881f88f09450e3990858e64e3e26c530"}, -] +pyobjc-core = [] +pyobjc-framework-applicationservices = [] +pyobjc-framework-cocoa = [] +pyobjc-framework-quartz = [] pyparsing = [ {file = "pyparsing-2.4.7-py2.py3-none-any.whl", hash = "sha256:ef9d7589ef3c200abe66653d3f1ab1033c3c419ae9b9bdb1240a85b024efc88b"}, {file = "pyparsing-2.4.7.tar.gz", hash = "sha256:c203ec8783bf771a155b207279b9bccb8dea02d8f0c9e5f8ead507bc3246ecc1"}, @@ -2770,14 +2373,8 @@ python-dateutil = [ {file = "python-dateutil-2.8.2.tar.gz", hash = "sha256:0123cacc1627ae19ddf3c27a5de5bd67ee4586fbdd6440d9748f8abb483d3e86"}, {file = "python_dateutil-2.8.2-py2.py3-none-any.whl", hash = "sha256:961d03dc3453ebbc59dbdea9e4e11c5651520a876d0f4db161e8674aae935da9"}, ] -python-engineio = [ - {file = "python-engineio-3.14.2.tar.gz", hash = "sha256:eab4553f2804c1ce97054c8b22cf0d5a9ab23128075248b97e1a5b2f29553085"}, - {file = "python_engineio-3.14.2-py2.py3-none-any.whl", hash = "sha256:5a9e6086d192463b04a1428ff1f85b6ba631bbb19d453b144ffc04f530542b84"}, -] -python-socketio = [ - {file = "python-socketio-4.6.1.tar.gz", hash = "sha256:cd1f5aa492c1eb2be77838e837a495f117e17f686029ebc03d62c09e33f4fa10"}, - {file = "python_socketio-4.6.1-py2.py3-none-any.whl", hash = "sha256:5a21da53fdbdc6bb6c8071f40e13d100e0b279ad997681c2492478e06f370523"}, -] +python-engineio = [] +python-socketio = [] python-xlib = [ {file = "python-xlib-0.31.tar.gz", hash = "sha256:74d83a081f532bc07f6d7afcd6416ec38403d68f68b9b9dc9e1f28fbf2d799e9"}, {file = "python_xlib-0.31-py2.py3-none-any.whl", hash = "sha256:1ec6ce0de73d9e6592ead666779a5732b384e5b8fb1f1886bd0a81cafa477759"}, @@ -2785,10 +2382,7 @@ python-xlib = [ python3-xlib = [ {file = "python3-xlib-0.15.tar.gz", hash = "sha256:dc4245f3ae4aa5949c1d112ee4723901ade37a96721ba9645f2bfa56e5b383f8"}, ] -pytz = [ - {file = "pytz-2022.1-py2.py3-none-any.whl", hash = "sha256:e68985985296d9a66a881eb3193b0906246245294a881e7c8afe623866ac6a5c"}, - {file = "pytz-2022.1.tar.gz", hash = "sha256:1e760e2fe6a8163bc0b3d9a19c4f84342afa0a2affebfaa84b01b978a02ecaa7"}, -] +pytz = [] pywin32 = [ {file = "pywin32-301-cp35-cp35m-win32.whl", hash = "sha256:93367c96e3a76dfe5003d8291ae16454ca7d84bb24d721e0b74a07610b7be4a7"}, {file = "pywin32-301-cp35-cp35m-win_amd64.whl", hash = "sha256:9635df6998a70282bd36e7ac2a5cef9ead1627b0a63b17c731312c7a0daebb72"}, @@ -2805,10 +2399,7 @@ pywin32-ctypes = [ {file = "pywin32-ctypes-0.2.0.tar.gz", hash = "sha256:24ffc3b341d457d48e8922352130cf2644024a4ff09762a2261fd34c36ee5942"}, {file = "pywin32_ctypes-0.2.0-py2.py3-none-any.whl", hash = "sha256:9dc2d991b3479cc2df15930958b674a48a227d5361d413827a4cfd0b5876fc98"}, ] -"qt.py" = [ - {file = "Qt.py-1.3.7-py2.py3-none-any.whl", hash = 
"sha256:150099d1c6f64c9621a2c9d79d45102ec781c30ee30ee69fc082c6e9be7324fe"}, - {file = "Qt.py-1.3.7.tar.gz", hash = "sha256:803c7bdf4d6230f9a466be19d55934a173eabb61406d21cb91e80c2a3f773b1f"}, -] +"qt.py" = [] qtawesome = [ {file = "QtAwesome-0.7.3-py2.py3-none-any.whl", hash = "sha256:ddf4530b4af71cec13b24b88a4cdb56ec85b1e44c43c42d0698804c7137b09b0"}, {file = "QtAwesome-0.7.3.tar.gz", hash = "sha256:b98b9038d19190e83ab26d91c4d8fc3a36591ee2bc7f5016d4438b8240d097bd"}, @@ -2821,18 +2412,9 @@ recommonmark = [ {file = "recommonmark-0.7.1-py2.py3-none-any.whl", hash = "sha256:1b1db69af0231efce3fa21b94ff627ea33dee7079a01dd0a7f8482c3da148b3f"}, {file = "recommonmark-0.7.1.tar.gz", hash = "sha256:bdb4db649f2222dcd8d2d844f0006b958d627f732415d399791ee436a3686d67"}, ] -requests = [ - {file = "requests-2.27.1-py2.py3-none-any.whl", hash = "sha256:f22fa1e554c9ddfd16e6e41ac79759e17be9e492b3587efa038054674760e72d"}, - {file = "requests-2.27.1.tar.gz", hash = "sha256:68d7c56fd5a8999887728ef304a6d12edc7be74f1cfa47714fc8b414525c9a61"}, -] -rsa = [ - {file = "rsa-4.8-py3-none-any.whl", hash = "sha256:95c5d300c4e879ee69708c428ba566c59478fd653cc3a22243eeb8ed846950bb"}, - {file = "rsa-4.8.tar.gz", hash = "sha256:5c6bd9dc7a543b7fe4304a631f8a8a3b674e2bbfc49c2ae96200cdbe55df6b17"}, -] -secretstorage = [ - {file = "SecretStorage-3.3.2-py3-none-any.whl", hash = "sha256:755dc845b6ad76dcbcbc07ea3da75ae54bb1ea529eb72d15f83d26499a5df319"}, - {file = "SecretStorage-3.3.2.tar.gz", hash = "sha256:0a8eb9645b320881c222e827c26f4cfcf55363e8b374a021981ef886657a912f"}, -] +requests = [] +rsa = [] +secretstorage = [] semver = [ {file = "semver-2.13.0-py2.py3-none-any.whl", hash = "sha256:ced8b23dceb22134307c1b8abfa523da14198793d9787ac838e70e29e77458d4"}, {file = "semver-2.13.0.tar.gz", hash = "sha256:fa0fe2722ee1c3f57eac478820c3a5ae2f624af8264cbdf9000c980ff7f75e3f"}, @@ -2842,10 +2424,7 @@ six = [ {file = "six-1.16.0-py2.py3-none-any.whl", hash = "sha256:8abb2f1d86890a2dfb989f9a77cfcfd3e47c2a354b01111771326f8aa26e0254"}, {file = "six-1.16.0.tar.gz", hash = "sha256:1e61c37477a1626458e36f7b1d82aa5c9b094fa4802892072e49de9c60c4c926"}, ] -slack-sdk = [ - {file = "slack_sdk-3.17.0-py2.py3-none-any.whl", hash = "sha256:0816efc43d1d2db8286e8dbcbb2e86fd0f71c206c01c521c2cb054ecb40f9ced"}, - {file = "slack_sdk-3.17.0.tar.gz", hash = "sha256:860cd0e50c454b955f14321c8c5486a47cc1e0e84116acdb009107f836752feb"}, -] +slack-sdk = [] smmap = [ {file = "smmap-5.0.0-py3-none-any.whl", hash = "sha256:2aba19d6a040e78d8b09de5c57e96207b09ed71d8e55ce0959eeee6c8e190d94"}, {file = "smmap-5.0.0.tar.gz", hash = "sha256:c840e62059cd3be204b0c9c9f74be2c09d5648eddd4580d9314c3ecde0b30936"}, @@ -2854,18 +2433,9 @@ snowballstemmer = [ {file = "snowballstemmer-2.2.0-py2.py3-none-any.whl", hash = "sha256:c8e1716e83cc398ae16824e5572ae04e0d9fc2c6b985fb0f900f5f0c96ecba1a"}, {file = "snowballstemmer-2.2.0.tar.gz", hash = "sha256:09b16deb8547d3412ad7b590689584cd0fe25ec8db3be37788be3810cbf19cb1"}, ] -speedcopy = [ - {file = "speedcopy-2.1.4-py3-none-any.whl", hash = "sha256:e09eb1de67ae0e0b51d5b99a28882009d565a37a3cb3c6bae121e3a5d3cccb17"}, - {file = "speedcopy-2.1.4.tar.gz", hash = "sha256:eff007a97e49ec1934df4fa8074f4bd1cf4a3b14c5499d914988785cff0c199a"}, -] -sphinx = [ - {file = "Sphinx-5.0.1-py3-none-any.whl", hash = "sha256:36aa2a3c2f6d5230be94585bc5d74badd5f9ed8f3388b8eedc1726fe45b1ad30"}, - {file = "Sphinx-5.0.1.tar.gz", hash = "sha256:f4da1187785a5bc7312cc271b0e867a93946c319d106363e102936a3d9857306"}, -] -sphinx-qt-documentation = [ - {file = 
"sphinx_qt_documentation-0.4-py3-none-any.whl", hash = "sha256:fa131093f75cd1bd48699cd132e18e4d46ba9eaadc070e6026867cea75ecdb7b"}, - {file = "sphinx_qt_documentation-0.4.tar.gz", hash = "sha256:f43ba17baa93e353fb94045027fb67f9d935ed158ce8662de93f08b88eec6774"}, -] +speedcopy = [] +sphinx = [] +sphinx-qt-documentation = [] sphinx-rtd-theme = [ {file = "sphinx_rtd_theme-1.0.0-py2.py3-none-any.whl", hash = "sha256:4d35a56f4508cfee4c4fb604373ede6feae2a306731d533f409ef5c3496fdbd8"}, {file = "sphinx_rtd_theme-1.0.0.tar.gz", hash = "sha256:eec6d497e4c2195fa0e8b2016b337532b8a699a68bcb22a512870e16925c6a5c"}, @@ -2914,44 +2484,13 @@ tomli = [ {file = "tomli-2.0.1-py3-none-any.whl", hash = "sha256:939de3e7a6161af0c887ef91b7d41a53e7c5a1ca976325f429cb46ea9bc30ecc"}, {file = "tomli-2.0.1.tar.gz", hash = "sha256:de526c12914f0c550d15924c62d72abc48d6fe7364aa87328337a31007fe8a4f"}, ] -typed-ast = [ - {file = "typed_ast-1.5.4-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:669dd0c4167f6f2cd9f57041e03c3c2ebf9063d0757dc89f79ba1daa2bfca9d4"}, - {file = "typed_ast-1.5.4-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:211260621ab1cd7324e0798d6be953d00b74e0428382991adfddb352252f1d62"}, - {file = "typed_ast-1.5.4-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:267e3f78697a6c00c689c03db4876dd1efdfea2f251a5ad6555e82a26847b4ac"}, - {file = "typed_ast-1.5.4-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:c542eeda69212fa10a7ada75e668876fdec5f856cd3d06829e6aa64ad17c8dfe"}, - {file = "typed_ast-1.5.4-cp310-cp310-win_amd64.whl", hash = "sha256:a9916d2bb8865f973824fb47436fa45e1ebf2efd920f2b9f99342cb7fab93f72"}, - {file = "typed_ast-1.5.4-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:79b1e0869db7c830ba6a981d58711c88b6677506e648496b1f64ac7d15633aec"}, - {file = "typed_ast-1.5.4-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a94d55d142c9265f4ea46fab70977a1944ecae359ae867397757d836ea5a3f47"}, - {file = "typed_ast-1.5.4-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:183afdf0ec5b1b211724dfef3d2cad2d767cbefac291f24d69b00546c1837fb6"}, - {file = "typed_ast-1.5.4-cp36-cp36m-win_amd64.whl", hash = "sha256:639c5f0b21776605dd6c9dbe592d5228f021404dafd377e2b7ac046b0349b1a1"}, - {file = "typed_ast-1.5.4-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:cf4afcfac006ece570e32d6fa90ab74a17245b83dfd6655a6f68568098345ff6"}, - {file = "typed_ast-1.5.4-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ed855bbe3eb3715fca349c80174cfcfd699c2f9de574d40527b8429acae23a66"}, - {file = "typed_ast-1.5.4-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:6778e1b2f81dfc7bc58e4b259363b83d2e509a65198e85d5700dfae4c6c8ff1c"}, - {file = "typed_ast-1.5.4-cp37-cp37m-win_amd64.whl", hash = "sha256:0261195c2062caf107831e92a76764c81227dae162c4f75192c0d489faf751a2"}, - {file = "typed_ast-1.5.4-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:2efae9db7a8c05ad5547d522e7dbe62c83d838d3906a3716d1478b6c1d61388d"}, - {file = "typed_ast-1.5.4-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:7d5d014b7daa8b0bf2eaef684295acae12b036d79f54178b92a2b6a56f92278f"}, - {file = "typed_ast-1.5.4-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:370788a63915e82fd6f212865a596a0fefcbb7d408bbbb13dea723d971ed8bdc"}, - {file = 
"typed_ast-1.5.4-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:4e964b4ff86550a7a7d56345c7864b18f403f5bd7380edf44a3c1fb4ee7ac6c6"}, - {file = "typed_ast-1.5.4-cp38-cp38-win_amd64.whl", hash = "sha256:683407d92dc953c8a7347119596f0b0e6c55eb98ebebd9b23437501b28dcbb8e"}, - {file = "typed_ast-1.5.4-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:4879da6c9b73443f97e731b617184a596ac1235fe91f98d279a7af36c796da35"}, - {file = "typed_ast-1.5.4-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:3e123d878ba170397916557d31c8f589951e353cc95fb7f24f6bb69adc1a8a97"}, - {file = "typed_ast-1.5.4-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ebd9d7f80ccf7a82ac5f88c521115cc55d84e35bf8b446fcd7836eb6b98929a3"}, - {file = "typed_ast-1.5.4-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:98f80dee3c03455e92796b58b98ff6ca0b2a6f652120c263efdba4d6c5e58f72"}, - {file = "typed_ast-1.5.4-cp39-cp39-win_amd64.whl", hash = "sha256:0fdbcf2fef0ca421a3f5912555804296f0b0960f0418c440f5d6d3abb549f3e1"}, - {file = "typed_ast-1.5.4.tar.gz", hash = "sha256:39e21ceb7388e4bb37f4c679d72707ed46c2fbf2a5609b8b8ebc4b067d977df2"}, -] -typing-extensions = [ - {file = "typing_extensions-4.0.1-py3-none-any.whl", hash = "sha256:7f001e5ac290a0c0401508864c7ec868be4e701886d5b573a9528ed3973d9d3b"}, - {file = "typing_extensions-4.0.1.tar.gz", hash = "sha256:4ca091dea149f945ec56afb48dae714f21e8692ef22a395223bcd328961b6a0e"}, -] +typed-ast = [] +typing-extensions = [] uritemplate = [ {file = "uritemplate-3.0.1-py2.py3-none-any.whl", hash = "sha256:07620c3f3f8eed1f12600845892b0e036a2420acf513c53f7de0abd911a5894f"}, {file = "uritemplate-3.0.1.tar.gz", hash = "sha256:5af8ad10cec94f215e3f48112de2022e1d5a37ed427fbd88652fa908f2ab7cae"}, ] -urllib3 = [ - {file = "urllib3-1.26.9-py2.py3-none-any.whl", hash = "sha256:44ece4d53fb1706f667c9bd1c648f5469a2ec925fcf3a776667042d645472c14"}, - {file = "urllib3-1.26.9.tar.gz", hash = "sha256:aabaf16477806a5e1dd19aa41f8c2b7950dd3c746362d7e3223dbe6de6ac448e"}, -] +urllib3 = [] wcwidth = [ {file = "wcwidth-0.2.5-py2.py3-none-any.whl", hash = "sha256:beb4802a9cebb9144e99086eff703a642a13d6a0052920003a230f3294bbe784"}, {file = "wcwidth-0.2.5.tar.gz", hash = "sha256:c4d647b99872929fdb7bdcaa4fbe7f01413ed3d98077df798530e5b04f116c83"}, @@ -2960,151 +2499,10 @@ websocket-client = [ {file = "websocket-client-0.59.0.tar.gz", hash = "sha256:d376bd60eace9d437ab6d7ee16f4ab4e821c9dae591e1b783c58ebd8aaf80c5c"}, {file = "websocket_client-0.59.0-py2.py3-none-any.whl", hash = "sha256:2e50d26ca593f70aba7b13a489435ef88b8fc3b5c5643c1ce8808ff9b40f0b32"}, ] -wrapt = [ - {file = "wrapt-1.14.1-cp27-cp27m-macosx_10_9_x86_64.whl", hash = "sha256:1b376b3f4896e7930f1f772ac4b064ac12598d1c38d04907e696cc4d794b43d3"}, - {file = "wrapt-1.14.1-cp27-cp27m-manylinux1_i686.whl", hash = "sha256:903500616422a40a98a5a3c4ff4ed9d0066f3b4c951fa286018ecdf0750194ef"}, - {file = "wrapt-1.14.1-cp27-cp27m-manylinux1_x86_64.whl", hash = "sha256:5a9a0d155deafd9448baff28c08e150d9b24ff010e899311ddd63c45c2445e28"}, - {file = "wrapt-1.14.1-cp27-cp27m-manylinux2010_i686.whl", hash = "sha256:ddaea91abf8b0d13443f6dac52e89051a5063c7d014710dcb4d4abb2ff811a59"}, - {file = "wrapt-1.14.1-cp27-cp27m-manylinux2010_x86_64.whl", hash = "sha256:36f582d0c6bc99d5f39cd3ac2a9062e57f3cf606ade29a0a0d6b323462f4dd87"}, - {file = "wrapt-1.14.1-cp27-cp27mu-manylinux1_i686.whl", hash = 
"sha256:7ef58fb89674095bfc57c4069e95d7a31cfdc0939e2a579882ac7d55aadfd2a1"}, - {file = "wrapt-1.14.1-cp27-cp27mu-manylinux1_x86_64.whl", hash = "sha256:e2f83e18fe2f4c9e7db597e988f72712c0c3676d337d8b101f6758107c42425b"}, - {file = "wrapt-1.14.1-cp27-cp27mu-manylinux2010_i686.whl", hash = "sha256:ee2b1b1769f6707a8a445162ea16dddf74285c3964f605877a20e38545c3c462"}, - {file = "wrapt-1.14.1-cp27-cp27mu-manylinux2010_x86_64.whl", hash = "sha256:833b58d5d0b7e5b9832869f039203389ac7cbf01765639c7309fd50ef619e0b1"}, - {file = "wrapt-1.14.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:80bb5c256f1415f747011dc3604b59bc1f91c6e7150bd7db03b19170ee06b320"}, - {file = "wrapt-1.14.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:07f7a7d0f388028b2df1d916e94bbb40624c59b48ecc6cbc232546706fac74c2"}, - {file = "wrapt-1.14.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:02b41b633c6261feff8ddd8d11c711df6842aba629fdd3da10249a53211a72c4"}, - {file = "wrapt-1.14.1-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:2fe803deacd09a233e4762a1adcea5db5d31e6be577a43352936179d14d90069"}, - {file = "wrapt-1.14.1-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:257fd78c513e0fb5cdbe058c27a0624c9884e735bbd131935fd49e9fe719d310"}, - {file = "wrapt-1.14.1-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:4fcc4649dc762cddacd193e6b55bc02edca674067f5f98166d7713b193932b7f"}, - {file = "wrapt-1.14.1-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:11871514607b15cfeb87c547a49bca19fde402f32e2b1c24a632506c0a756656"}, - {file = "wrapt-1.14.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:8ad85f7f4e20964db4daadcab70b47ab05c7c1cf2a7c1e51087bfaa83831854c"}, - {file = "wrapt-1.14.1-cp310-cp310-win32.whl", hash = "sha256:a9a52172be0b5aae932bef82a79ec0a0ce87288c7d132946d645eba03f0ad8a8"}, - {file = "wrapt-1.14.1-cp310-cp310-win_amd64.whl", hash = "sha256:6d323e1554b3d22cfc03cd3243b5bb815a51f5249fdcbb86fda4bf62bab9e164"}, - {file = "wrapt-1.14.1-cp35-cp35m-manylinux1_i686.whl", hash = "sha256:43ca3bbbe97af00f49efb06e352eae40434ca9d915906f77def219b88e85d907"}, - {file = "wrapt-1.14.1-cp35-cp35m-manylinux1_x86_64.whl", hash = "sha256:6b1a564e6cb69922c7fe3a678b9f9a3c54e72b469875aa8018f18b4d1dd1adf3"}, - {file = "wrapt-1.14.1-cp35-cp35m-manylinux2010_i686.whl", hash = "sha256:00b6d4ea20a906c0ca56d84f93065b398ab74b927a7a3dbd470f6fc503f95dc3"}, - {file = "wrapt-1.14.1-cp35-cp35m-manylinux2010_x86_64.whl", hash = "sha256:a85d2b46be66a71bedde836d9e41859879cc54a2a04fad1191eb50c2066f6e9d"}, - {file = "wrapt-1.14.1-cp35-cp35m-win32.whl", hash = "sha256:dbcda74c67263139358f4d188ae5faae95c30929281bc6866d00573783c422b7"}, - {file = "wrapt-1.14.1-cp35-cp35m-win_amd64.whl", hash = "sha256:b21bb4c09ffabfa0e85e3a6b623e19b80e7acd709b9f91452b8297ace2a8ab00"}, - {file = "wrapt-1.14.1-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:9e0fd32e0148dd5dea6af5fee42beb949098564cc23211a88d799e434255a1f4"}, - {file = "wrapt-1.14.1-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9736af4641846491aedb3c3f56b9bc5568d92b0692303b5a305301a95dfd38b1"}, - {file = "wrapt-1.14.1-cp36-cp36m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:5b02d65b9ccf0ef6c34cba6cf5bf2aab1bb2f49c6090bafeecc9cd81ad4ea1c1"}, - {file = "wrapt-1.14.1-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = 
"sha256:21ac0156c4b089b330b7666db40feee30a5d52634cc4560e1905d6529a3897ff"}, - {file = "wrapt-1.14.1-cp36-cp36m-musllinux_1_1_aarch64.whl", hash = "sha256:9f3e6f9e05148ff90002b884fbc2a86bd303ae847e472f44ecc06c2cd2fcdb2d"}, - {file = "wrapt-1.14.1-cp36-cp36m-musllinux_1_1_i686.whl", hash = "sha256:6e743de5e9c3d1b7185870f480587b75b1cb604832e380d64f9504a0535912d1"}, - {file = "wrapt-1.14.1-cp36-cp36m-musllinux_1_1_x86_64.whl", hash = "sha256:d79d7d5dc8a32b7093e81e97dad755127ff77bcc899e845f41bf71747af0c569"}, - {file = "wrapt-1.14.1-cp36-cp36m-win32.whl", hash = "sha256:81b19725065dcb43df02b37e03278c011a09e49757287dca60c5aecdd5a0b8ed"}, - {file = "wrapt-1.14.1-cp36-cp36m-win_amd64.whl", hash = "sha256:b014c23646a467558be7da3d6b9fa409b2c567d2110599b7cf9a0c5992b3b471"}, - {file = "wrapt-1.14.1-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:88bd7b6bd70a5b6803c1abf6bca012f7ed963e58c68d76ee20b9d751c74a3248"}, - {file = "wrapt-1.14.1-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b5901a312f4d14c59918c221323068fad0540e34324925c8475263841dbdfe68"}, - {file = "wrapt-1.14.1-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d77c85fedff92cf788face9bfa3ebaa364448ebb1d765302e9af11bf449ca36d"}, - {file = "wrapt-1.14.1-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8d649d616e5c6a678b26d15ece345354f7c2286acd6db868e65fcc5ff7c24a77"}, - {file = "wrapt-1.14.1-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:7d2872609603cb35ca513d7404a94d6d608fc13211563571117046c9d2bcc3d7"}, - {file = "wrapt-1.14.1-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:ee6acae74a2b91865910eef5e7de37dc6895ad96fa23603d1d27ea69df545015"}, - {file = "wrapt-1.14.1-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:2b39d38039a1fdad98c87279b48bc5dce2c0ca0d73483b12cb72aa9609278e8a"}, - {file = "wrapt-1.14.1-cp37-cp37m-win32.whl", hash = "sha256:60db23fa423575eeb65ea430cee741acb7c26a1365d103f7b0f6ec412b893853"}, - {file = "wrapt-1.14.1-cp37-cp37m-win_amd64.whl", hash = "sha256:709fe01086a55cf79d20f741f39325018f4df051ef39fe921b1ebe780a66184c"}, - {file = "wrapt-1.14.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:8c0ce1e99116d5ab21355d8ebe53d9460366704ea38ae4d9f6933188f327b456"}, - {file = "wrapt-1.14.1-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:e3fb1677c720409d5f671e39bac6c9e0e422584e5f518bfd50aa4cbbea02433f"}, - {file = "wrapt-1.14.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:642c2e7a804fcf18c222e1060df25fc210b9c58db7c91416fb055897fc27e8cc"}, - {file = "wrapt-1.14.1-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:7b7c050ae976e286906dd3f26009e117eb000fb2cf3533398c5ad9ccc86867b1"}, - {file = "wrapt-1.14.1-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ef3f72c9666bba2bab70d2a8b79f2c6d2c1a42a7f7e2b0ec83bb2f9e383950af"}, - {file = "wrapt-1.14.1-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:01c205616a89d09827986bc4e859bcabd64f5a0662a7fe95e0d359424e0e071b"}, - {file = "wrapt-1.14.1-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:5a0f54ce2c092aaf439813735584b9537cad479575a09892b8352fea5e988dc0"}, - {file = "wrapt-1.14.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:2cf71233a0ed05ccdabe209c606fe0bac7379fdcf687f39b944420d2a09fdb57"}, - {file = "wrapt-1.14.1-cp38-cp38-win32.whl", hash = 
"sha256:aa31fdcc33fef9eb2552cbcbfee7773d5a6792c137b359e82879c101e98584c5"}, - {file = "wrapt-1.14.1-cp38-cp38-win_amd64.whl", hash = "sha256:d1967f46ea8f2db647c786e78d8cc7e4313dbd1b0aca360592d8027b8508e24d"}, - {file = "wrapt-1.14.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:3232822c7d98d23895ccc443bbdf57c7412c5a65996c30442ebe6ed3df335383"}, - {file = "wrapt-1.14.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:988635d122aaf2bdcef9e795435662bcd65b02f4f4c1ae37fbee7401c440b3a7"}, - {file = "wrapt-1.14.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9cca3c2cdadb362116235fdbd411735de4328c61425b0aa9f872fd76d02c4e86"}, - {file = "wrapt-1.14.1-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d52a25136894c63de15a35bc0bdc5adb4b0e173b9c0d07a2be9d3ca64a332735"}, - {file = "wrapt-1.14.1-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:40e7bc81c9e2b2734ea4bc1aceb8a8f0ceaac7c5299bc5d69e37c44d9081d43b"}, - {file = "wrapt-1.14.1-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:b9b7a708dd92306328117d8c4b62e2194d00c365f18eff11a9b53c6f923b01e3"}, - {file = "wrapt-1.14.1-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:6a9a25751acb379b466ff6be78a315e2b439d4c94c1e99cb7266d40a537995d3"}, - {file = "wrapt-1.14.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:34aa51c45f28ba7f12accd624225e2b1e5a3a45206aa191f6f9aac931d9d56fe"}, - {file = "wrapt-1.14.1-cp39-cp39-win32.whl", hash = "sha256:dee0ce50c6a2dd9056c20db781e9c1cfd33e77d2d569f5d1d9321c641bb903d5"}, - {file = "wrapt-1.14.1-cp39-cp39-win_amd64.whl", hash = "sha256:dee60e1de1898bde3b238f18340eec6148986da0455d8ba7848d50470a7a32fb"}, - {file = "wrapt-1.14.1.tar.gz", hash = "sha256:380a85cf89e0e69b7cfbe2ea9f765f004ff419f34194018a6827ac0e3edfed4d"}, -] +wrapt = [] wsrpc-aiohttp = [ {file = "wsrpc-aiohttp-3.2.0.tar.gz", hash = "sha256:f467abc51bcdc760fc5aeb7041abdeef46eeca3928dc43dd6e7fa7a533563818"}, {file = "wsrpc_aiohttp-3.2.0-py3-none-any.whl", hash = "sha256:fa9b0bf5cb056898cb5c9f64cbc5eacb8a5dd18ab1b7f0cd4a2208b4a7fde282"}, ] -yarl = [ - {file = "yarl-1.7.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:f2a8508f7350512434e41065684076f640ecce176d262a7d54f0da41d99c5a95"}, - {file = "yarl-1.7.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:da6df107b9ccfe52d3a48165e48d72db0eca3e3029b5b8cb4fe6ee3cb870ba8b"}, - {file = "yarl-1.7.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:a1d0894f238763717bdcfea74558c94e3bc34aeacd3351d769460c1a586a8b05"}, - {file = "yarl-1.7.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:dfe4b95b7e00c6635a72e2d00b478e8a28bfb122dc76349a06e20792eb53a523"}, - {file = "yarl-1.7.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c145ab54702334c42237a6c6c4cc08703b6aa9b94e2f227ceb3d477d20c36c63"}, - {file = "yarl-1.7.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1ca56f002eaf7998b5fcf73b2421790da9d2586331805f38acd9997743114e98"}, - {file = "yarl-1.7.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:1d3d5ad8ea96bd6d643d80c7b8d5977b4e2fb1bab6c9da7322616fd26203d125"}, - {file = "yarl-1.7.2-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:167ab7f64e409e9bdd99333fe8c67b5574a1f0495dcfd905bc7454e766729b9e"}, - {file = 
"yarl-1.7.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:95a1873b6c0dd1c437fb3bb4a4aaa699a48c218ac7ca1e74b0bee0ab16c7d60d"}, - {file = "yarl-1.7.2-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:6152224d0a1eb254f97df3997d79dadd8bb2c1a02ef283dbb34b97d4f8492d23"}, - {file = "yarl-1.7.2-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:5bb7d54b8f61ba6eee541fba4b83d22b8a046b4ef4d8eb7f15a7e35db2e1e245"}, - {file = "yarl-1.7.2-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:9c1f083e7e71b2dd01f7cd7434a5f88c15213194df38bc29b388ccdf1492b739"}, - {file = "yarl-1.7.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:f44477ae29025d8ea87ec308539f95963ffdc31a82f42ca9deecf2d505242e72"}, - {file = "yarl-1.7.2-cp310-cp310-win32.whl", hash = "sha256:cff3ba513db55cc6a35076f32c4cdc27032bd075c9faef31fec749e64b45d26c"}, - {file = "yarl-1.7.2-cp310-cp310-win_amd64.whl", hash = "sha256:c9c6d927e098c2d360695f2e9d38870b2e92e0919be07dbe339aefa32a090265"}, - {file = "yarl-1.7.2-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:9b4c77d92d56a4c5027572752aa35082e40c561eec776048330d2907aead891d"}, - {file = "yarl-1.7.2-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c01a89a44bb672c38f42b49cdb0ad667b116d731b3f4c896f72302ff77d71656"}, - {file = "yarl-1.7.2-cp36-cp36m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c19324a1c5399b602f3b6e7db9478e5b1adf5cf58901996fc973fe4fccd73eed"}, - {file = "yarl-1.7.2-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3abddf0b8e41445426d29f955b24aeecc83fa1072be1be4e0d194134a7d9baee"}, - {file = "yarl-1.7.2-cp36-cp36m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:6a1a9fe17621af43e9b9fcea8bd088ba682c8192d744b386ee3c47b56eaabb2c"}, - {file = "yarl-1.7.2-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:8b0915ee85150963a9504c10de4e4729ae700af11df0dc5550e6587ed7891e92"}, - {file = "yarl-1.7.2-cp36-cp36m-musllinux_1_1_aarch64.whl", hash = "sha256:29e0656d5497733dcddc21797da5a2ab990c0cb9719f1f969e58a4abac66234d"}, - {file = "yarl-1.7.2-cp36-cp36m-musllinux_1_1_i686.whl", hash = "sha256:bf19725fec28452474d9887a128e98dd67eee7b7d52e932e6949c532d820dc3b"}, - {file = "yarl-1.7.2-cp36-cp36m-musllinux_1_1_ppc64le.whl", hash = "sha256:d6f3d62e16c10e88d2168ba2d065aa374e3c538998ed04996cd373ff2036d64c"}, - {file = "yarl-1.7.2-cp36-cp36m-musllinux_1_1_s390x.whl", hash = "sha256:ac10bbac36cd89eac19f4e51c032ba6b412b3892b685076f4acd2de18ca990aa"}, - {file = "yarl-1.7.2-cp36-cp36m-musllinux_1_1_x86_64.whl", hash = "sha256:aa32aaa97d8b2ed4e54dc65d241a0da1c627454950f7d7b1f95b13985afd6c5d"}, - {file = "yarl-1.7.2-cp36-cp36m-win32.whl", hash = "sha256:87f6e082bce21464857ba58b569370e7b547d239ca22248be68ea5d6b51464a1"}, - {file = "yarl-1.7.2-cp36-cp36m-win_amd64.whl", hash = "sha256:ac35ccde589ab6a1870a484ed136d49a26bcd06b6a1c6397b1967ca13ceb3913"}, - {file = "yarl-1.7.2-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:a467a431a0817a292121c13cbe637348b546e6ef47ca14a790aa2fa8cc93df63"}, - {file = "yarl-1.7.2-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6ab0c3274d0a846840bf6c27d2c60ba771a12e4d7586bf550eefc2df0b56b3b4"}, - {file = "yarl-1.7.2-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d260d4dc495c05d6600264a197d9d6f7fc9347f21d2594926202fd08cf89a8ba"}, - {file = "yarl-1.7.2-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = 
"sha256:fc4dd8b01a8112809e6b636b00f487846956402834a7fd59d46d4f4267181c41"}, - {file = "yarl-1.7.2-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:c1164a2eac148d85bbdd23e07dfcc930f2e633220f3eb3c3e2a25f6148c2819e"}, - {file = "yarl-1.7.2-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:67e94028817defe5e705079b10a8438b8cb56e7115fa01640e9c0bb3edf67332"}, - {file = "yarl-1.7.2-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:89ccbf58e6a0ab89d487c92a490cb5660d06c3a47ca08872859672f9c511fc52"}, - {file = "yarl-1.7.2-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:8cce6f9fa3df25f55521fbb5c7e4a736683148bcc0c75b21863789e5185f9185"}, - {file = "yarl-1.7.2-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:211fcd65c58bf250fb994b53bc45a442ddc9f441f6fec53e65de8cba48ded986"}, - {file = "yarl-1.7.2-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:c10ea1e80a697cf7d80d1ed414b5cb8f1eec07d618f54637067ae3c0334133c4"}, - {file = "yarl-1.7.2-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:52690eb521d690ab041c3919666bea13ab9fbff80d615ec16fa81a297131276b"}, - {file = "yarl-1.7.2-cp37-cp37m-win32.whl", hash = "sha256:695ba021a9e04418507fa930d5f0704edbce47076bdcfeeaba1c83683e5649d1"}, - {file = "yarl-1.7.2-cp37-cp37m-win_amd64.whl", hash = "sha256:c17965ff3706beedafd458c452bf15bac693ecd146a60a06a214614dc097a271"}, - {file = "yarl-1.7.2-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:fce78593346c014d0d986b7ebc80d782b7f5e19843ca798ed62f8e3ba8728576"}, - {file = "yarl-1.7.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:c2a1ac41a6aa980db03d098a5531f13985edcb451bcd9d00670b03129922cd0d"}, - {file = "yarl-1.7.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:39d5493c5ecd75c8093fa7700a2fb5c94fe28c839c8e40144b7ab7ccba6938c8"}, - {file = "yarl-1.7.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1eb6480ef366d75b54c68164094a6a560c247370a68c02dddb11f20c4c6d3c9d"}, - {file = "yarl-1.7.2-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5ba63585a89c9885f18331a55d25fe81dc2d82b71311ff8bd378fc8004202ff6"}, - {file = "yarl-1.7.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:e39378894ee6ae9f555ae2de332d513a5763276a9265f8e7cbaeb1b1ee74623a"}, - {file = "yarl-1.7.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:c0910c6b6c31359d2f6184828888c983d54d09d581a4a23547a35f1d0b9484b1"}, - {file = "yarl-1.7.2-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:6feca8b6bfb9eef6ee057628e71e1734caf520a907b6ec0d62839e8293e945c0"}, - {file = "yarl-1.7.2-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:8300401dc88cad23f5b4e4c1226f44a5aa696436a4026e456fe0e5d2f7f486e6"}, - {file = "yarl-1.7.2-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:788713c2896f426a4e166b11f4ec538b5736294ebf7d5f654ae445fd44270832"}, - {file = "yarl-1.7.2-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:fd547ec596d90c8676e369dd8a581a21227fe9b4ad37d0dc7feb4ccf544c2d59"}, - {file = "yarl-1.7.2-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:737e401cd0c493f7e3dd4db72aca11cfe069531c9761b8ea474926936b3c57c8"}, - {file = "yarl-1.7.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:baf81561f2972fb895e7844882898bda1eef4b07b5b385bcd308d2098f1a767b"}, - {file = "yarl-1.7.2-cp38-cp38-win32.whl", hash = 
"sha256:ede3b46cdb719c794427dcce9d8beb4abe8b9aa1e97526cc20de9bd6583ad1ef"}, - {file = "yarl-1.7.2-cp38-cp38-win_amd64.whl", hash = "sha256:cc8b7a7254c0fc3187d43d6cb54b5032d2365efd1df0cd1749c0c4df5f0ad45f"}, - {file = "yarl-1.7.2-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:580c1f15500e137a8c37053e4cbf6058944d4c114701fa59944607505c2fe3a0"}, - {file = "yarl-1.7.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:3ec1d9a0d7780416e657f1e405ba35ec1ba453a4f1511eb8b9fbab81cb8b3ce1"}, - {file = "yarl-1.7.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:3bf8cfe8856708ede6a73907bf0501f2dc4e104085e070a41f5d88e7faf237f3"}, - {file = "yarl-1.7.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1be4bbb3d27a4e9aa5f3df2ab61e3701ce8fcbd3e9846dbce7c033a7e8136746"}, - {file = "yarl-1.7.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:534b047277a9a19d858cde163aba93f3e1677d5acd92f7d10ace419d478540de"}, - {file = "yarl-1.7.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c6ddcd80d79c96eb19c354d9dca95291589c5954099836b7c8d29278a7ec0bda"}, - {file = "yarl-1.7.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:9bfcd43c65fbb339dc7086b5315750efa42a34eefad0256ba114cd8ad3896f4b"}, - {file = "yarl-1.7.2-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:f64394bd7ceef1237cc604b5a89bf748c95982a84bcd3c4bbeb40f685c810794"}, - {file = "yarl-1.7.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:044daf3012e43d4b3538562da94a88fb12a6490652dbc29fb19adfa02cf72eac"}, - {file = "yarl-1.7.2-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:368bcf400247318382cc150aaa632582d0780b28ee6053cd80268c7e72796dec"}, - {file = "yarl-1.7.2-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:bab827163113177aee910adb1f48ff7af31ee0289f434f7e22d10baf624a6dfe"}, - {file = "yarl-1.7.2-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:0cba38120db72123db7c58322fa69e3c0efa933040ffb586c3a87c063ec7cae8"}, - {file = "yarl-1.7.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:59218fef177296451b23214c91ea3aba7858b4ae3306dde120224cfe0f7a6ee8"}, - {file = "yarl-1.7.2-cp39-cp39-win32.whl", hash = "sha256:1edc172dcca3f11b38a9d5c7505c83c1913c0addc99cd28e993efeaafdfaa18d"}, - {file = "yarl-1.7.2-cp39-cp39-win_amd64.whl", hash = "sha256:797c2c412b04403d2da075fb93c123df35239cd7b4cc4e0cd9e5839b73f52c58"}, - {file = "yarl-1.7.2.tar.gz", hash = "sha256:45399b46d60c253327a460e99856752009fcee5f5d3c80b2f7c0cae1c38d56dd"}, -] -zipp = [ - {file = "zipp-3.8.0-py3-none-any.whl", hash = "sha256:c4f6e5bbf48e74f7a38e7cc5b0480ff42b0ae5178957d564d18932525d5cf099"}, - {file = "zipp-3.8.0.tar.gz", hash = "sha256:56bf8aadb83c24db6c4b577e13de374ccfb67da2078beba1d037c17980bf43ad"}, -] +yarl = [] +zipp = [] diff --git a/pyproject.toml b/pyproject.toml index 9cbdc295ff..e01cc71201 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -1,6 +1,6 @@ [tool.poetry] name = "OpenPype" -version = "3.13.1-nightly.2" # OpenPype +version = "3.14.1-nightly.1" # OpenPype description = "Open VFX and Animation pipeline with support." 
authors = ["OpenPype Team "] license = "MIT License" @@ -71,7 +71,6 @@ pysftp = "^0.2.9" dropbox = "^11.20.0" aiohttp-middlewares = "^2.0.0" - [tool.poetry.dev-dependencies] flake8 = "^3.7" autopep8 = "^1.4" @@ -80,13 +79,14 @@ cx_freeze = "~6.9" GitPython = "^3.1.17" jedi = "^0.13" Jinja2 = "^2.11" +markupsafe = "2.0.1" pycodestyle = "^2.5.0" pydocstyle = "^3.0.0" pylint = "^2.4.4" pytest = "^6.1" pytest-cov = "*" pytest-print = "*" -Sphinx = "*" +Sphinx = "5.0.1" sphinx-rtd-theme = "*" sphinxcontrib-websupport = "*" sphinx-qt-documentation = "*" @@ -142,6 +142,10 @@ hash = "3894dec7e4e521463891a869586850e8605f5fd604858b674c87323bf33e273d" url = "https://distribute.openpype.io/thirdparty/oiio-2.2.0-darwin.tgz" hash = "sha256:..." +[openpype.thirdparty.ocioconfig] +url = "https://distribute.openpype.io/thirdparty/OpenColorIO-Configs-1.0.2.zip" +hash = "4ac17c1f7de83465e6f51dd352d7117e07e765b66d00443257916c828e35b6ce" + [tool.pyright] include = [ "igniter", diff --git a/start.py b/start.py index 5cdffafb6e..d1198a85e4 100644 --- a/start.py +++ b/start.py @@ -629,6 +629,9 @@ def _determine_mongodb() -> str: def _initialize_environment(openpype_version: OpenPypeVersion) -> None: version_path = openpype_version.path + if not version_path: + _print(f"!!! Version {openpype_version} doesn't have path set.") + raise ValueError("No path set in specified OpenPype version.") os.environ["OPENPYPE_VERSION"] = str(openpype_version) # set OPENPYPE_REPOS_ROOT to point to currently used OpenPype version. os.environ["OPENPYPE_REPOS_ROOT"] = os.path.normpath( @@ -686,7 +689,7 @@ def _find_frozen_openpype(use_version: str = None, # Collect OpenPype versions installed_version = OpenPypeVersion.get_installed_version() # Expected version that should be used by studio settings - # - this option is used only if version is not explictly set and if + # - this option is used only if version is not explicitly set and if # studio has set explicit version in settings studio_version = OpenPypeVersion.get_expected_studio_version(use_staging) @@ -696,8 +699,7 @@ def _find_frozen_openpype(use_version: str = None, # Version says to use latest version _print(">>> Finding latest version defined by use version") openpype_version = bootstrap.find_latest_openpype_version( - use_staging, compatible_with=installed_version - ) + use_staging) else: _print(f">>> Finding specified version \"{use_version}\"") openpype_version = bootstrap.find_openpype_version( @@ -709,18 +711,11 @@ def _find_frozen_openpype(use_version: str = None, f"Requested version \"{use_version}\" was not found." 
) - if not openpype_version.is_compatible(installed_version): - raise OpenPypeVersionIncompatible(( - f"Requested version \"{use_version}\" is not compatible " - f"with installed version \"{installed_version}\"" - )) - elif studio_version is not None: # Studio has defined a version to use _print(f">>> Finding studio version \"{studio_version}\"") openpype_version = bootstrap.find_openpype_version( - studio_version, use_staging, compatible_with=installed_version - ) + studio_version, use_staging) if openpype_version is None: raise OpenPypeVersionNotFound(( "Requested OpenPype version " @@ -731,11 +726,11 @@ def _find_frozen_openpype(use_version: str = None, else: # Default behavior to use latest version _print(( - ">>> Finding latest version compatible " + ">>> Finding latest version " f"with [ {installed_version} ]")) openpype_version = bootstrap.find_latest_openpype_version( - use_staging, compatible_with=installed_version - ) + use_staging) + if openpype_version is None: if use_staging: reason = "Didn't find any staging versions." @@ -753,6 +748,23 @@ def _find_frozen_openpype(use_version: str = None, _initialize_environment(openpype_version) return version_path + in_headless_mode = os.getenv("OPENPYPE_HEADLESS_MODE") == "1" + if not installed_version.is_compatible(openpype_version): + message = "Version {} is not compatible with installed version {}." + # Show UI to user + if not in_headless_mode: + igniter.show_message_dialog( + "Incompatible OpenPype installation", + message.format( + "{}".format(openpype_version), + "{}".format(installed_version) + ) + ) + # Raise incompatible error + raise OpenPypeVersionIncompatible( + message.format(openpype_version, installed_version) + ) + # test if latest detected is installed (in user data dir) is_inside = False try: @@ -765,7 +777,7 @@ def _find_frozen_openpype(use_version: str = None, if not is_inside: # install latest version to user data dir - if os.getenv("OPENPYPE_HEADLESS_MODE") == "1": + if in_headless_mode: version_path = bootstrap.install_version( openpype_version, force=True ) @@ -944,7 +956,12 @@ def _boot_print_versions(use_staging, local_version, openpype_root): openpype_versions = bootstrap.find_openpype( include_zips=True, staging=use_staging, - compatible_with=compatible_with) + ) + openpype_versions = [ + version for version in openpype_versions + if version.is_compatible( + OpenPypeVersion.get_installed_version()) + ] list_versions(openpype_versions, local_version) diff --git a/tools/fetch_thirdparty_libs.py b/tools/fetch_thirdparty_libs.py index b616beab27..421cc32dbd 100644 --- a/tools/fetch_thirdparty_libs.py +++ b/tools/fetch_thirdparty_libs.py @@ -109,13 +109,20 @@ except AttributeError: for k, v in thirdparty.items(): _print(f"processing {k}") - destination_path = openpype_root / "vendor" / "bin" / k / platform_name - url = v.get(platform_name).get("url") + destination_path = openpype_root / "vendor" / "bin" / k + if not v.get(platform_name): _print(("missing definition for current " - f"platform [ {platform_name} ]"), 1) - sys.exit(1) + f"platform [ {platform_name} ]"), 2) + _print("trying to get universal url for all platforms") + url = v.get("url") + if not url: + _print("cannot get url", 1) + sys.exit(1) + else: + url = v.get(platform_name).get("url") + destination_path = destination_path / platform_name parsed_url = urlparse(url) @@ -147,7 +154,13 @@ for k, v in thirdparty.items(): # get file with checksum _print("Calculating sha256 ...", 2) calc_checksum = sha256_sum(temp_file) - if 
v.get(platform_name).get("hash") != calc_checksum: + + if v.get(platform_name): + item_hash = v.get(platform_name).get("hash") + else: + item_hash = v.get("hash") + + if item_hash != calc_checksum: _print("Downloaded files checksum invalid.") sys.exit(1) diff --git a/website/docs/admin_hosts_maya.md b/website/docs/admin_hosts_maya.md index 93bf32798f..0e77f29fc2 100644 --- a/website/docs/admin_hosts_maya.md +++ b/website/docs/admin_hosts_maya.md @@ -120,3 +120,54 @@ raw json. You can configure path mapping using Maya `dirmap` command. This will add bi-directional mapping between list of paths specified in **Settings**. You can find it in **Settings -> Project Settings -> Maya -> Maya Directory Mapping** ![Dirmap settings](assets/maya-admin_dirmap_settings.png) + +## Templated Build Workfile + +Build a workfile from a template designed by users, helping to ensure a homogeneous subset hierarchy and consistent imports. The template is stored as a file, which makes it easy to define, change and customize for production needs. + + **1. Make a template** + +Make your template. Add families and everything needed for your tasks. Here is an example template for the modeling task using a placeholder to import a gauge. + +![maya outliner](assets/maya-workfile-outliner.png) + +If needed, you can add placeholders when the template needs to load some assets. **OpenPype > Template Builder > Create Placeholder** + +![create placeholder](assets/maya-create_placeholder.png) + +- **Configure placeholders** + +Fill in the necessary fields (the optional fields are regex filters) + +![new place holder](assets/maya-placeholder_new.png) + + + - Builder type: Whether the placeholder should load the current asset's representations or linked assets' representations + + - Representation: Representation that will be loaded (ex: ma, abc, png, etc...) + + - Family: Family of the representation to load (main, look, image, etc ...) + + - Loader: Placeholder loader name that will be used to load corresponding representations + + - Order: Priority for the current placeholder loader (priority is lowest first, highest last) + +- **Save your template** + + + **2. Configure Template** + +- **Go to Studio settings > Project > Your DCC > Templated Build Settings** +- Add a profile for your task and enter the path to your template + +![setting build template](assets/settings/template_build_workfile.png) + +**3. Build your workfile** + +- Open Maya + +- Build your workfile + +![maya build template](assets/maya-build_workfile_from_template.png) + + diff --git a/website/docs/admin_openpype_commands.md b/website/docs/admin_openpype_commands.md index 53fc12410f..85f661d51e 100644 --- a/website/docs/admin_openpype_commands.md +++ b/website/docs/admin_openpype_commands.md @@ -40,7 +40,6 @@ For more information [see here](admin_use.md#run-openpype). | module | Run command line arguments for modules. | | | repack-version | Tool to re-create version zip. | [📑](#repack-version-arguments) | | tray | Launch OpenPype Tray. | [📑](#tray-arguments)
| interactive | Start python like interactive console session. | | | projectmanager | Launch Project Manager UI | [📑](#projectmanager-arguments) | | settings | Open Settings UI | [📑](#settings-arguments) | -| standalonepublisher | Open Standalone Publisher UI | [📑](#standalonepublisher-arguments) | --- ### `tray` arguments {#tray-arguments} ```shell openpype_console tray ``` --- -### `launch` arguments {#eventserver-arguments} -You have to set either proper environment variables to provide URL and credentials or use -option to specify them. -| Argument | Description | -| --- | --- | -| `--ftrack-url` | URL to ftrack server (can be set with `FTRACK_SERVER`) | -| `--ftrack-user` |user name to log in to ftrack (can be set with `FTRACK_API_USER`) | -| `--ftrack-api-key` | ftrack api key (can be set with `FTRACK_API_KEY`) | -| `--legacy` | run event server without mongo storing | -| `--clockify-api-key` | Clockify API key (can be set with `CLOCKIFY_API_KEY`) | -| `--clockify-workspace` | Clockify workspace (can be set with `CLOCKIFY_WORKSPACE`) | - -To run ftrack event server: -```shell -openpype_console eventserver --ftrack-url= --ftrack-user= --ftrack-api-key= -``` - ---- ### `launch` arguments {#launch-arguments} | Argument | Description | @@ -159,12 +139,6 @@ openpypeconsole settings ``` --- -### `standalonepublisher` arguments {#standalonepublisher-arguments} -`standalonepublisher` has no command-line arguments. -```shell -openpype_console standalonepublisher -``` - ### `repack-version` arguments {#repack-version-arguments} Takes path to unzipped and possibly modified OpenPype version. Files will be zipped, checksums recalculated and version will be determined by folder name diff --git a/website/docs/admin_releases.md b/website/docs/admin_releases.md new file mode 100644 index 0000000000..bba5a22110 --- /dev/null +++ b/website/docs/admin_releases.md @@ -0,0 +1,9 @@ +--- +id: admin_releases +title: Releases +sidebar_label: Releases +--- + +Information about releases can be found on the GitHub [Releases page](https://github.com/pypeclub/OpenPype/releases). + +You can find the features and bugfixes in the codebase, or the full changelog for advanced users. diff --git a/website/docs/artist_concepts.md b/website/docs/artist_concepts.md index 9005cffe87..7582540811 100644 --- a/website/docs/artist_concepts.md +++ b/website/docs/artist_concepts.md @@ -10,6 +10,8 @@ sidebar_label: Key Concepts In our pipeline all the main entities the project is made from are internally considered *'Assets'*. Episode, sequence, shot, character, prop, etc. All of these behave identically in the pipeline. Asset names need to be absolutely unique within the project because they are their key identifier. +OpenPype has a limitation regarding duplicated names: names of assets must be unique across the whole project. + ### Subset Usually, an asset needs to be created in multiple *'flavours'*. A character might have multiple different looks, model needs to be published in different resolutions, a standard animation rig might not be usable in a crowd system and so on. 'Subsets' are here to accommodate all this variety that might be needed within a single asset. A model might have subset: *'main'*, *'proxy'*, *'sculpt'*, while data of *'look'* family could have subsets *'main'*, *'dirty'*, *'damaged'*. Subsets have some recommendations for their names, but ultimately it's up to the artist to use them for separation of publishes when needed.
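Because asset names are key identifiers, studios often pre-validate them before creating entities. As an illustration only — this is not an OpenPype API, and the helper and sample names below are hypothetical — a minimal duplicate check might look like this:

```python
# Illustration only (hypothetical helper, not an OpenPype API): asset
# names are key identifiers, so a name duplicated anywhere in a project
# is invalid and worth catching before entities are created.
def find_duplicate_asset_names(asset_names):
    """Return the set of names that appear more than once."""
    seen = set()
    duplicates = set()
    for name in asset_names:
        if name in seen:
            duplicates.add(name)
        seen.add(name)
    return duplicates


# 'sh010' clashes even if the two shots live under different sequences.
print(find_duplicate_asset_names(["ep01", "sh010", "charHero", "sh010"]))
```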
@@ -24,6 +26,11 @@ A numbered iteration of a given subset. Each version contains at least one [repr Each published variant can come out of the software in multiple representations. All of them hold exactly the same data, but in different formats. A model, for example, might be saved as `.OBJ`, Alembic, Maya geometry or as all of them, to be ready for pickup in any other applications supporting these formats. + +#### Naming convention + +At this moment, names of assets, tasks, subsets or representations can contain only letters, numbers and underscores. + ### Family Each published [subset][3b89d8e0] can have exactly one family assigned to it. Family determines the type of data that the subset holds. Family doesn't dictate the file type, but can enforce certain technical specifications. For example OpenPype default configuration expects `model` family to only contain geometry without any shaders or joints when it is published. diff --git a/website/docs/artist_hosts_unreal.md b/website/docs/artist_hosts_unreal.md index 1ff09893e3..45a0c8bb6f 100644 --- a/website/docs/artist_hosts_unreal.md +++ b/website/docs/artist_hosts_unreal.md @@ -8,6 +8,20 @@ sidebar_label: Unreal OpenPype supports Unreal in similar ways as in other DCCs Yet there are few specific you need to be aware of. +### Creating the Unreal project + +Selecting a task and opening it with Unreal will generate the Unreal project, if it hasn't been created before. +By default, OpenPype includes the plugin that will be built together with the project. + +Alternatively, the environment variable `"OPENPYPE_UNREAL_PLUGIN"` can be set to the path of a compiled version of the plugin. +The version of the compiled plugin must match the version of Unreal with which the project is being created. + +:::note +Unreal version 5.0 onwards requires the following environment variable: + +`"UE_PYTHONPATH": "{PYTHONPATH}"` +::: + ### Project naming Unreal doesn't support project names starting with non-alphabetic character. So names like `123_myProject` are @@ -15,9 +29,9 @@ invalid. If OpenPype detects such name it automatically prepends letter **P** to ## OpenPype global tools -OpenPype global tools can be found in *Window* main menu: +OpenPype global tools can be found in Unreal's toolbar and in the *Tools* main menu: -![Unreal OpenPype Menu](assets/unreal-avalon_tools.jpg) +![Unreal OpenPype Menu](assets/unreal_openpype_tools.png) - [Create](artist_tools.md#creator) - [Load](artist_tools.md#loader) @@ -31,10 +45,118 @@ OpenPype global tools can be found in *Window* main menu: To import Static Mesh model, just choose **OpenPype → Load ...** and select your mesh. Static meshes are transferred as FBX files as specified in [Unreal Engine 4 Static Mesh Pipeline](https://docs.unrealengine.com/en-US/Engine/Content/Importing/FBX/StaticMeshes/index.html). This action will create new folder with subset name (`unrealStaticMeshMain_CON` for example) and put all data into it. Inside, you can find: -![Unreal Container Content](assets/unreal-container.jpg) +![Unreal Container Content](assets/unreal_container.jpg) -In this case there is **lambert1**, material pulled from Maya when this static mesh was published, **unrealStaticMeshCube** is the geometry itself, **unrealStaticMeshCube_CON** is a *AssetContainer* type and is there to mark this directory as Avalon Container (to track changes) and to hold OpenPype metadata.
### Publishing
-Publishing of Static Mesh works in similar ways. Select your mesh in *Content Browser* and **OpenPype → Create ...**. This will create folder named by subset you've chosen - for example **unrealStaticMeshDefault_INS**. It this folder is that mesh and *Avalon Publish Instance* asset marking this folder as publishable instance and holding important metadata on it. If you want to publish this instance, go **OpenPype → Publish ...**
\ No newline at end of file
+Publishing of a Static Mesh works in a similar way. Select your mesh in the *Content Browser* and choose **OpenPype → Create ...**. This will create a folder named by the subset you've chosen - for example **unrealStaticMeshDefault_INS**. In this folder are the mesh and an *Avalon Publish Instance* asset marking this folder as a publishable instance and holding important metadata. If you want to publish this instance, go to **OpenPype → Publish ...**
+
+## Layout
+
+There are two different layout options in Unreal, depending on the type of project you are working on.
+One only imports the layout and saves it in a level.
+The other uses [Master Sequences](https://docs.unrealengine.com/4.27/en-US/AnimatingObjects/Sequencer/Overview/TracksShot/) to track the whole level sequence hierarchy.
+You can choose in the Project Settings if you want to generate the level sequences.
+
+![Unreal OP Settings Level Sequence](assets/unreal_setting_level_sequence.png)
+
+### Loading
+
+To load a layout, click on the OpenPype icon in Unreal’s main taskbar and select **Load**.
+
+![Unreal OP Tools Load](assets/unreal_openpype_tools_load.png)
+
+Select the task on the left, then right-click on the layout asset and select **Load Layout**.
+
+![Unreal Layout Load](assets/unreal_load_layout.png)
+
+If you need to load multiple layouts, you can select more than one task on the left and load them together.
+
+![Unreal Layout Load Batch](assets/unreal_load_layout_batch.png)
+
+### Navigating the project
+
+The layout will be imported into the directory `/Content/OpenPype` and split into two subfolders:
+- *Assets*, which will contain all the rigs and models contained in the layout;
+- *Asset name* (in the following example, *episode 2*), a folder named after the **asset** of the current **task**.
+
+![Unreal Layout Loading Result](assets/unreal_layout_loading_result.png)
+
+If you chose to generate the level sequences, in the second folder you will find the master level for the task (usually an episode), the level sequence and the folders for all the scenes in the episode.
+Otherwise, you will find the level generated for the loaded layout.
+
+#### Layout without level sequences
+
+In the layout folder, you will find the level with the imported layout and an object of the *AssetContainer* type. The latter is there to mark this directory as an Avalon Container (to track changes) and to hold OpenPype metadata.
+
+![Unreal Layout Loading No Sequence](assets/unreal_layout_loading_no_sequence.png)
+
+The layout level should contain only the data included in the layout. To add lighting or other elements, like an environment, you have to create a master level and add the layout level as a [streaming level](https://docs.unrealengine.com/5.0/en-US/level-streaming-in-unreal-engine/).
+
+Create the master level and open it. Then, open the *Levels* window (from the menu **Windows → Levels**). Click on **Levels → Add Existing** and select the layout level and the other levels you wish to include in the scene. The following example shows a master level to which a light level and the layout level have been added.
+
+![Unreal Add Level](assets/unreal_add_level.png)
+![Unreal Level List](assets/unreal_level_list_no_sequences.png)
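+
+The *Add Existing* step can also be scripted with Unreal's Python API. A minimal sketch (the level package path is only an example; use the path of your own layout level):
+
+```python
+import unreal
+
+# Add the layout level to the currently open master level as a
+# streaming sub-level that is always loaded.
+world = unreal.EditorLevelLibrary.get_editor_world()
+unreal.EditorLevelUtils.add_level_to_world(
+    world,
+    "/Game/OpenPype/episode_2/layout_level",  # example package path
+    unreal.LevelStreamingAlwaysLoaded,
+)
+```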
+#### Layout with level sequences
+
+In the episode folder, you will find the master level for the episode, the master level sequence and the folders for all the scenes in the episode.
+
+After opening the master level, open the *Levels* window (from the menu **Windows → Levels**), and you will see the list of levels for each shot of the episode for which a layout has been loaded.
+
+![Unreal Level List](assets/unreal_level_list.png)
+
+If it has not been added already, you will need to add the environment to the level. Click on **Levels → Add Existing** and select the level with the environment (check with the studio where it is located).
+
+![Unreal Add Level](assets/unreal_add_level.png)
+
+After adding the environment level to the master level, you will need to set it as always loaded by right-clicking it and selecting **Change Streaming Method → Always Loaded**.
+
+![Unreal Level Streaming Method](assets/unreal_level_streaming_method.png)
+
+### Update layouts
+
+To manage loaded layouts, click on the OpenPype icon in Unreal’s main taskbar and select **Manage**.
+
+![Unreal OP Tools Manage](assets/unreal_openpype_tools_manage.png)
+
+You will get a list of all the assets that have been loaded into the project.
+The version number is shown in red if it isn’t the latest version. Right-click on the element and select **Update** if you need to update the layout.
+
+:::note
+**DO NOT** update rigs or models imported with a layout. Update only the layout.
+:::
+
+## Rendering
+
+:::note
+Rendering requires a layout loaded with the option to create the level sequences turned **on**.
+:::
+
+To render and publish an episode, a scene or a shot, you will need to create a publish instance. A render publish instance is based on one level sequence: to render the whole episode, create the instance for the episode’s level sequence; to render just one shot, create it for that shot’s level sequence.
+
+Navigate to the folder that contains the level sequence that you need to render. Select the level sequence, then click on the OpenPype icon in Unreal’s main taskbar and select **Create**.
+
+![Unreal OP Tools Create](assets/unreal_openpype_tools_create.png)
+
+In the Instance Creator, select **Unreal - Render**, give it a name, and click **Create**.
+
+![Unreal OP Instance Creator](assets/unreal_create_render.png)
+
+The render instance will be created in `/Content/OpenPype/PublishInstances`.
+
+Select the instance you need to render, then click on the OpenPype icon in Unreal’s main taskbar and select **Render**. You can render more than one instance at a time if needed: just select all the instances that you need to render before selecting the **Render** button from the OpenPype menu.
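+
+With many instances, that selection can also be done in one go from Unreal's Python console. A sketch, assuming the default `PublishInstances` folder mentioned above and that the Render tool acts on the current Content Browser selection:
+
+```python
+import unreal
+
+# "/Game" corresponds to "/Content": select every render instance
+# in the Content Browser before triggering OpenPype → Render.
+instance_paths = unreal.EditorAssetLibrary.list_assets(
+    "/Game/OpenPype/PublishInstances"
+)
+unreal.EditorAssetLibrary.sync_browser_to_objects(instance_paths)
+```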
+ +![Unreal OP Tools Render](assets/unreal_openpype_tools_render.png) + +Once the render is finished, click on the OpenPype icon in Unreal’s main taskbar, and select **Publish**. + +![Unreal OP Tools Publish](assets/unreal_openpype_tools_publish.png) + +On the left, you will see the render instances. They will be automatically reorganised to have an instance for each shot. So, for example, if you have created the render instance for the whole episode, here you will have an instance for each shot in the episode. + +![Unreal Publish Render](assets/unreal_publish_render.png) + +Click on the play button in the bottom right, and it will start the publishing process. diff --git a/website/docs/assets/maya-build_workfile_from_template.png b/website/docs/assets/maya-build_workfile_from_template.png new file mode 100644 index 0000000000..7ef87861fe Binary files /dev/null and b/website/docs/assets/maya-build_workfile_from_template.png differ diff --git a/website/docs/assets/maya-create_placeholder.png b/website/docs/assets/maya-create_placeholder.png new file mode 100644 index 0000000000..3f49fe2e2b Binary files /dev/null and b/website/docs/assets/maya-create_placeholder.png differ diff --git a/website/docs/assets/maya-placeholder_new.png b/website/docs/assets/maya-placeholder_new.png new file mode 100644 index 0000000000..106a5275cd Binary files /dev/null and b/website/docs/assets/maya-placeholder_new.png differ diff --git a/website/docs/assets/maya-workfile-outliner.png b/website/docs/assets/maya-workfile-outliner.png new file mode 100644 index 0000000000..fbd1bbd03b Binary files /dev/null and b/website/docs/assets/maya-workfile-outliner.png differ diff --git a/website/docs/assets/settings/template_build_workfile.png b/website/docs/assets/settings/template_build_workfile.png new file mode 100644 index 0000000000..1bea5b01f5 Binary files /dev/null and b/website/docs/assets/settings/template_build_workfile.png differ diff --git a/website/docs/assets/settings_dev.png b/website/docs/assets/settings_dev.png new file mode 100644 index 0000000000..4d0359461e Binary files /dev/null and b/website/docs/assets/settings_dev.png differ diff --git a/website/docs/assets/unreal-avalon_tools.jpg b/website/docs/assets/unreal-avalon_tools.jpg deleted file mode 100644 index 531fbe516a..0000000000 Binary files a/website/docs/assets/unreal-avalon_tools.jpg and /dev/null differ diff --git a/website/docs/assets/unreal-container.jpg b/website/docs/assets/unreal-container.jpg deleted file mode 100644 index f0c0a61e95..0000000000 Binary files a/website/docs/assets/unreal-container.jpg and /dev/null differ diff --git a/website/docs/assets/unreal_add_level.png b/website/docs/assets/unreal_add_level.png new file mode 100644 index 0000000000..caeef03d10 Binary files /dev/null and b/website/docs/assets/unreal_add_level.png differ diff --git a/website/docs/assets/unreal_container.jpg b/website/docs/assets/unreal_container.jpg new file mode 100644 index 0000000000..0fda640b00 Binary files /dev/null and b/website/docs/assets/unreal_container.jpg differ diff --git a/website/docs/assets/unreal_create_render.png b/website/docs/assets/unreal_create_render.png new file mode 100644 index 0000000000..2e3ef20b35 Binary files /dev/null and b/website/docs/assets/unreal_create_render.png differ diff --git a/website/docs/assets/unreal_layout_loading_no_sequence.png b/website/docs/assets/unreal_layout_loading_no_sequence.png new file mode 100644 index 0000000000..ed05d77f53 Binary files /dev/null and 
b/website/docs/assets/unreal_layout_loading_no_sequence.png differ diff --git a/website/docs/assets/unreal_layout_loading_result.png b/website/docs/assets/unreal_layout_loading_result.png new file mode 100644 index 0000000000..55b329110b Binary files /dev/null and b/website/docs/assets/unreal_layout_loading_result.png differ diff --git a/website/docs/assets/unreal_level_list.png b/website/docs/assets/unreal_level_list.png new file mode 100644 index 0000000000..2fc0c1bfc7 Binary files /dev/null and b/website/docs/assets/unreal_level_list.png differ diff --git a/website/docs/assets/unreal_level_list_no_sequences.png b/website/docs/assets/unreal_level_list_no_sequences.png new file mode 100644 index 0000000000..7ed912b68b Binary files /dev/null and b/website/docs/assets/unreal_level_list_no_sequences.png differ diff --git a/website/docs/assets/unreal_level_streaming_method.png b/website/docs/assets/unreal_level_streaming_method.png new file mode 100644 index 0000000000..8f817abd2e Binary files /dev/null and b/website/docs/assets/unreal_level_streaming_method.png differ diff --git a/website/docs/assets/unreal_level_streaming_method_no_sequences.png b/website/docs/assets/unreal_level_streaming_method_no_sequences.png new file mode 100644 index 0000000000..77a2754ded Binary files /dev/null and b/website/docs/assets/unreal_level_streaming_method_no_sequences.png differ diff --git a/website/docs/assets/unreal_load_layout.png b/website/docs/assets/unreal_load_layout.png new file mode 100644 index 0000000000..ffad60ae9b Binary files /dev/null and b/website/docs/assets/unreal_load_layout.png differ diff --git a/website/docs/assets/unreal_load_layout_batch.png b/website/docs/assets/unreal_load_layout_batch.png new file mode 100644 index 0000000000..dd2f2f3e8f Binary files /dev/null and b/website/docs/assets/unreal_load_layout_batch.png differ diff --git a/website/docs/assets/unreal_openpype_tools.png b/website/docs/assets/unreal_openpype_tools.png new file mode 100644 index 0000000000..bf7d850ab2 Binary files /dev/null and b/website/docs/assets/unreal_openpype_tools.png differ diff --git a/website/docs/assets/unreal_openpype_tools_create.png b/website/docs/assets/unreal_openpype_tools_create.png new file mode 100644 index 0000000000..9cfb95f2a1 Binary files /dev/null and b/website/docs/assets/unreal_openpype_tools_create.png differ diff --git a/website/docs/assets/unreal_openpype_tools_load.png b/website/docs/assets/unreal_openpype_tools_load.png new file mode 100644 index 0000000000..4909feac3b Binary files /dev/null and b/website/docs/assets/unreal_openpype_tools_load.png differ diff --git a/website/docs/assets/unreal_openpype_tools_manage.png b/website/docs/assets/unreal_openpype_tools_manage.png new file mode 100644 index 0000000000..af7b182842 Binary files /dev/null and b/website/docs/assets/unreal_openpype_tools_manage.png differ diff --git a/website/docs/assets/unreal_openpype_tools_publish.png b/website/docs/assets/unreal_openpype_tools_publish.png new file mode 100644 index 0000000000..ab4c10c4ca Binary files /dev/null and b/website/docs/assets/unreal_openpype_tools_publish.png differ diff --git a/website/docs/assets/unreal_openpype_tools_render.png b/website/docs/assets/unreal_openpype_tools_render.png new file mode 100644 index 0000000000..377dc2951e Binary files /dev/null and b/website/docs/assets/unreal_openpype_tools_render.png differ diff --git a/website/docs/assets/unreal_publish_render.png b/website/docs/assets/unreal_publish_render.png new file mode 100644 index 
0000000000..674b0ac30e Binary files /dev/null and b/website/docs/assets/unreal_publish_render.png differ diff --git a/website/docs/assets/unreal_setting_level_sequence.png b/website/docs/assets/unreal_setting_level_sequence.png new file mode 100644 index 0000000000..5a8adc6257 Binary files /dev/null and b/website/docs/assets/unreal_setting_level_sequence.png differ diff --git a/website/docs/changelog.md b/website/docs/changelog.md deleted file mode 100644 index 448592b930..0000000000 --- a/website/docs/changelog.md +++ /dev/null @@ -1,1138 +0,0 @@ ---- -id: changelog -title: Changelog -sidebar_label: Changelog ---- - -## [2.18.0](https://github.com/pypeclub/openpype/tree/2.18.0) -_**release date:** (2021-05-18)_ - -[Full Changelog](https://github.com/pypeclub/openpype/compare/2.17.3...2.18.0) - -**Enhancements:** - -- Use SubsetLoader and multiple contexts for delete_old_versions [\#1484](ttps://github.com/pypeclub/OpenPype/pull/1484)) -- TVPaint: Increment workfile version on successful publish. [\#1489](https://github.com/pypeclub/OpenPype/pull/1489) -- Maya: Use of multiple deadline servers [\#1483](https://github.com/pypeclub/OpenPype/pull/1483) - -**Fixed bugs:** - -- Use instance frame start instead of timeline. [\#1486](https://github.com/pypeclub/OpenPype/pull/1486) -- Maya: Redshift - set proper start frame on proxy [\#1480](https://github.com/pypeclub/OpenPype/pull/1480) -- Maya: wrong collection of playblasted frames [\#1517](https://github.com/pypeclub/OpenPype/pull/1517) -- Existing subsets hints in creator [\#1502](https://github.com/pypeclub/OpenPype/pull/1502) - - -### [2.17.3](https://github.com/pypeclub/openpype/tree/2.17.3) -_**release date:** (2021-05-06)_ - -[Full Changelog](https://github.com/pypeclub/openpype/compare/CI/3.0.0-rc.3...2.17.3) - -**Fixed bugs:** - -- Nuke: workfile version synced to db version always [\#1479](https://github.com/pypeclub/OpenPype/pull/1479) - -### [2.17.2](https://github.com/pypeclub/openpype/tree/2.17.2) -_**release date:** (2021-05-04)_ - -[Full Changelog](https://github.com/pypeclub/openpype/compare/CI/3.0.0-rc.1...2.17.2) - -**Enhancements:** - -- Forward/Backward compatible apps and tools with OpenPype 3 [\#1463](https://github.com/pypeclub/OpenPype/pull/1463) - -### [2.17.1](https://github.com/pypeclub/openpype/tree/2.17.1) -_**release date:** (2021-04-30)_ - -[Full Changelog](https://github.com/pypeclub/openpype/compare/2.17.0...2.17.1) - -**Enhancements:** - -- Faster settings UI loading [\#1442](https://github.com/pypeclub/OpenPype/pull/1442) -- Nuke: deadline submission with gpu [\#1414](https://github.com/pypeclub/OpenPype/pull/1414) -- TVPaint frame range definition [\#1424](https://github.com/pypeclub/OpenPype/pull/1424) -- PS - group all published instances [\#1415](https://github.com/pypeclub/OpenPype/pull/1415) -- Add task name to context pop up. [\#1383](https://github.com/pypeclub/OpenPype/pull/1383) -- Enhance review letterbox feature. 
[\#1371](https://github.com/pypeclub/OpenPype/pull/1371) -- AE add duration validation [\#1363](https://github.com/pypeclub/OpenPype/pull/1363) - -**Fixed bugs:** - -- Houdini menu filename [\#1417](https://github.com/pypeclub/OpenPype/pull/1417) -- Nuke: fixing undo for loaded mov and sequence [\#1433](https://github.com/pypeclub/OpenPype/pull/1433) -- AE - validation for duration was 1 frame shorter [\#1426](https://github.com/pypeclub/OpenPype/pull/1426) - -**Merged pull requests:** - -- Maya: Vray - problem getting all file nodes for look publishing [\#1399](https://github.com/pypeclub/OpenPype/pull/1399) -- Maya: Support for Redshift proxies [\#1360](https://github.com/pypeclub/OpenPype/pull/1360) - -## [2.17.0](https://github.com/pypeclub/openpype/tree/2.17.0) -_**release date:** (2021-04-20)_ - -[Full Changelog](https://github.com/pypeclub/openpype/compare/CI/3.0.0-beta.2...2.17.0) - -**Enhancements:** - -- Forward compatible ftrack group [\#1243](https://github.com/pypeclub/OpenPype/pull/1243) -- Maya: Make tx option configurable with presets [\#1328](https://github.com/pypeclub/OpenPype/pull/1328) -- TVPaint asset name validation [\#1302](https://github.com/pypeclub/OpenPype/pull/1302) -- TV Paint: Set initial project settings. [\#1299](https://github.com/pypeclub/OpenPype/pull/1299) -- TV Paint: Validate mark in and out. [\#1298](https://github.com/pypeclub/OpenPype/pull/1298) -- Validate project settings [\#1297](https://github.com/pypeclub/OpenPype/pull/1297) -- After Effects: added SubsetManager [\#1234](https://github.com/pypeclub/OpenPype/pull/1234) -- Show error message in pyblish UI [\#1206](https://github.com/pypeclub/OpenPype/pull/1206) - -**Fixed bugs:** - -- Hiero: fixing source frame from correct object [\#1362](https://github.com/pypeclub/OpenPype/pull/1362) -- Nuke: fix colourspace, prerenders and nuke panes opening [\#1308](https://github.com/pypeclub/OpenPype/pull/1308) -- AE remove orphaned instance from workfile - fix self.stub [\#1282](https://github.com/pypeclub/OpenPype/pull/1282) -- Nuke: deadline submission with search replaced env values from preset [\#1194](https://github.com/pypeclub/OpenPype/pull/1194) -- Ftrack custom attributes in bulks [\#1312](https://github.com/pypeclub/OpenPype/pull/1312) -- Ftrack optional pypclub role [\#1303](https://github.com/pypeclub/OpenPype/pull/1303) -- After Effects: remove orphaned instances [\#1275](https://github.com/pypeclub/OpenPype/pull/1275) -- Avalon schema names [\#1242](https://github.com/pypeclub/OpenPype/pull/1242) -- Handle duplication of Task name [\#1226](https://github.com/pypeclub/OpenPype/pull/1226) -- Modified path of plugin loads for Harmony and TVPaint [\#1217](https://github.com/pypeclub/OpenPype/pull/1217) -- Regex checks in profiles filtering [\#1214](https://github.com/pypeclub/OpenPype/pull/1214) -- Update custom ftrack session attributes [\#1202](https://github.com/pypeclub/OpenPype/pull/1202) -- Nuke: write node colorspace ignore `default\(\)` label [\#1199](https://github.com/pypeclub/OpenPype/pull/1199) - -## [2.16.0](https://github.com/pypeclub/pype/tree/2.16.0) - - _**release date:** 2021-03-22_ - -[Full Changelog](https://github.com/pypeclub/pype/compare/2.15.3...2.16.0) - -**Enhancements:** - -- Nuke: deadline submit limit group filter [\#1167](https://github.com/pypeclub/pype/pull/1167) -- Maya: support for Deadline Group and Limit Groups - backport 2.x [\#1156](https://github.com/pypeclub/pype/pull/1156) -- Maya: fixes for Redshift support 
[\#1152](https://github.com/pypeclub/pype/pull/1152) -- Nuke: adding preset for a Read node name to all img and mov Loaders [\#1146](https://github.com/pypeclub/pype/pull/1146) -- nuke deadline submit with environ var from presets overrides [\#1142](https://github.com/pypeclub/pype/pull/1142) -- Change timers after task change [\#1138](https://github.com/pypeclub/pype/pull/1138) -- Nuke: shortcuts for Pype menu [\#1127](https://github.com/pypeclub/pype/pull/1127) -- Nuke: workfile template [\#1124](https://github.com/pypeclub/pype/pull/1124) -- Sites local settings by site name [\#1117](https://github.com/pypeclub/pype/pull/1117) -- Reset loader's asset selection on context change [\#1106](https://github.com/pypeclub/pype/pull/1106) -- Bulk mov render publishing [\#1101](https://github.com/pypeclub/pype/pull/1101) -- Photoshop: mark publishable instances [\#1093](https://github.com/pypeclub/pype/pull/1093) -- Added ability to define BG color for extract review [\#1088](https://github.com/pypeclub/pype/pull/1088) -- TVPaint extractor enhancement [\#1080](https://github.com/pypeclub/pype/pull/1080) -- Photoshop: added support for .psb in workfiles [\#1078](https://github.com/pypeclub/pype/pull/1078) -- Optionally add task to subset name [\#1072](https://github.com/pypeclub/pype/pull/1072) -- Only extend clip range when collecting. [\#1008](https://github.com/pypeclub/pype/pull/1008) -- Collect audio for farm reviews. [\#1073](https://github.com/pypeclub/pype/pull/1073) - - -**Fixed bugs:** - -- Fix path spaces in jpeg extractor [\#1174](https://github.com/pypeclub/pype/pull/1174) -- Maya: Bugfix: superclass for CreateCameraRig [\#1166](https://github.com/pypeclub/pype/pull/1166) -- Maya: Submit to Deadline - fix typo in condition [\#1163](https://github.com/pypeclub/pype/pull/1163) -- Avoid dot in repre extension [\#1125](https://github.com/pypeclub/pype/pull/1125) -- Fix versions variable usage in standalone publisher [\#1090](https://github.com/pypeclub/pype/pull/1090) -- Collect instance data fix subset query [\#1082](https://github.com/pypeclub/pype/pull/1082) -- Fix getting the camera name. [\#1067](https://github.com/pypeclub/pype/pull/1067) -- Nuke: Ensure "NUKE\_TEMP\_DIR" is not part of the Deadline job environment. 
[\#1064](https://github.com/pypeclub/pype/pull/1064) - -### [2.15.3](https://github.com/pypeclub/pype/tree/2.15.3) - - _**release date:** 2021-02-26_ - -[Full Changelog](https://github.com/pypeclub/pype/compare/2.15.2...2.15.3) - -**Enhancements:** - -- Maya: speedup renderable camera collection [\#1053](https://github.com/pypeclub/pype/pull/1053) -- Harmony - add regex search to filter allowed task names for collectin… [\#1047](https://github.com/pypeclub/pype/pull/1047) - -**Fixed bugs:** - -- Ftrack integrate hierarchy fix [\#1085](https://github.com/pypeclub/pype/pull/1085) -- Explicit subset filter in anatomy instance data [\#1059](https://github.com/pypeclub/pype/pull/1059) -- TVPaint frame offset [\#1057](https://github.com/pypeclub/pype/pull/1057) -- Auto fix unicode strings [\#1046](https://github.com/pypeclub/pype/pull/1046) - -### [2.15.2](https://github.com/pypeclub/pype/tree/2.15.2) - - _**release date:** 2021-02-19_ - -[Full Changelog](https://github.com/pypeclub/pype/compare/2.15.1...2.15.2) - -**Enhancements:** - -- Maya: Vray scene publishing [\#1013](https://github.com/pypeclub/pype/pull/1013) - -**Fixed bugs:** - -- Fix entity move under project [\#1040](https://github.com/pypeclub/pype/pull/1040) -- smaller nuke fixes from production [\#1036](https://github.com/pypeclub/pype/pull/1036) -- TVPaint thumbnail extract fix [\#1031](https://github.com/pypeclub/pype/pull/1031) - -### [2.15.1](https://github.com/pypeclub/pype/tree/2.15.1) - - _**release date:** 2021-02-12_ - -[Full Changelog](https://github.com/pypeclub/pype/compare/2.15.0...2.15.1) - -**Enhancements:** - -- Delete version as loader action [\#1011](https://github.com/pypeclub/pype/pull/1011) -- Delete old versions [\#445](https://github.com/pypeclub/pype/pull/445) - -**Fixed bugs:** - -- PS - remove obsolete functions from pywin32 [\#1006](https://github.com/pypeclub/pype/pull/1006) -- Clone description of review session objects. [\#922](https://github.com/pypeclub/pype/pull/922) - -## [2.15.0](https://github.com/pypeclub/pype/tree/2.15.0) - - _**release date:** 2021-02-09_ - -[Full Changelog](https://github.com/pypeclub/pype/compare/2.14.6...2.15.0) - -**Enhancements:** - -- Resolve - loading and updating clips [\#932](https://github.com/pypeclub/pype/pull/932) -- Release/2.15.0 [\#926](https://github.com/pypeclub/pype/pull/926) -- Photoshop: add option for template.psd and prelaunch hook [\#894](https://github.com/pypeclub/pype/pull/894) -- Nuke: deadline presets [\#993](https://github.com/pypeclub/pype/pull/993) -- Maya: Alembic only set attributes that exists. [\#986](https://github.com/pypeclub/pype/pull/986) -- Harmony: render local and handle fixes [\#981](https://github.com/pypeclub/pype/pull/981) -- PSD Bulk export of ANIM group [\#965](https://github.com/pypeclub/pype/pull/965) -- AE - added prelaunch hook for opening last or workfile from template [\#944](https://github.com/pypeclub/pype/pull/944) -- PS - safer handling of loading of workfile [\#941](https://github.com/pypeclub/pype/pull/941) -- Maya: Handling Arnold referenced AOVs [\#938](https://github.com/pypeclub/pype/pull/938) -- TVPaint: switch layer IDs for layer names during identification [\#903](https://github.com/pypeclub/pype/pull/903) -- TVPaint audio/sound loader [\#893](https://github.com/pypeclub/pype/pull/893) -- Clone review session with children. 
[\#891](https://github.com/pypeclub/pype/pull/891) -- Simple compositing data packager for freelancers [\#884](https://github.com/pypeclub/pype/pull/884) -- Harmony deadline submission [\#881](https://github.com/pypeclub/pype/pull/881) -- Maya: Optionally hide image planes from reviews. [\#840](https://github.com/pypeclub/pype/pull/840) -- Maya: handle referenced AOVs for Vray [\#824](https://github.com/pypeclub/pype/pull/824) -- DWAA/DWAB support on windows [\#795](https://github.com/pypeclub/pype/pull/795) -- Unreal: animation, layout and setdress updates [\#695](https://github.com/pypeclub/pype/pull/695) - -**Fixed bugs:** - -- Maya: Looks - disable hardlinks [\#995](https://github.com/pypeclub/pype/pull/995) -- Fix Ftrack custom attribute update [\#982](https://github.com/pypeclub/pype/pull/982) -- Prores ks in burnin script [\#960](https://github.com/pypeclub/pype/pull/960) -- terminal.py crash on import [\#839](https://github.com/pypeclub/pype/pull/839) -- Extract review handle bizarre pixel aspect ratio [\#990](https://github.com/pypeclub/pype/pull/990) -- Nuke: add nuke related env var to sumbission [\#988](https://github.com/pypeclub/pype/pull/988) -- Nuke: missing preset's variable [\#984](https://github.com/pypeclub/pype/pull/984) -- Get creator by name fix [\#979](https://github.com/pypeclub/pype/pull/979) -- Fix update of project's tasks on Ftrack sync [\#972](https://github.com/pypeclub/pype/pull/972) -- nuke: wrong frame offset in mov loader [\#971](https://github.com/pypeclub/pype/pull/971) -- Create project structure action fix multiroot [\#967](https://github.com/pypeclub/pype/pull/967) -- PS: remove pywin installation from hook [\#964](https://github.com/pypeclub/pype/pull/964) -- Prores ks in burnin script [\#959](https://github.com/pypeclub/pype/pull/959) -- Subset family is now stored in subset document [\#956](https://github.com/pypeclub/pype/pull/956) -- DJV new version arguments [\#954](https://github.com/pypeclub/pype/pull/954) -- TV Paint: Fix single frame Sequence [\#953](https://github.com/pypeclub/pype/pull/953) -- nuke: missing `file` knob update [\#933](https://github.com/pypeclub/pype/pull/933) -- Photoshop: Create from single layer was failing [\#920](https://github.com/pypeclub/pype/pull/920) -- Nuke: baking mov with correct colorspace inherited from write [\#909](https://github.com/pypeclub/pype/pull/909) -- Launcher fix actions discover [\#896](https://github.com/pypeclub/pype/pull/896) -- Get the correct file path for the updated mov. 
[\#889](https://github.com/pypeclub/pype/pull/889) -- Maya: Deadline submitter - shared data access violation [\#831](https://github.com/pypeclub/pype/pull/831) -- Maya: Take into account vray master AOV switch [\#822](https://github.com/pypeclub/pype/pull/822) - -**Merged pull requests:** - -- Refactor blender to 3.0 format [\#934](https://github.com/pypeclub/pype/pull/934) - -### [2.14.6](https://github.com/pypeclub/pype/tree/2.14.6) - - _**release date:** 2021-01-15_ - -[Full Changelog](https://github.com/pypeclub/pype/compare/2.14.5...2.14.6) - -**Fixed bugs:** - -- Nuke: improving of hashing path [\#885](https://github.com/pypeclub/pype/pull/885) - -**Merged pull requests:** - -- Hiero: cut videos with correct secons [\#892](https://github.com/pypeclub/pype/pull/892) -- Faster sync to avalon preparation [\#869](https://github.com/pypeclub/pype/pull/869) - -### [2.14.5](https://github.com/pypeclub/pype/tree/2.14.5) - - _**release date:** 2021-01-06_ - -[Full Changelog](https://github.com/pypeclub/pype/compare/2.14.4...2.14.5) - -**Merged pull requests:** - -- Pype logger refactor [\#866](https://github.com/pypeclub/pype/pull/866) - -### [2.14.4](https://github.com/pypeclub/pype/tree/2.14.4) - - _**release date:** 2020-12-18_ - -[Full Changelog](https://github.com/pypeclub/pype/compare/2.14.3...2.14.4) - -**Merged pull requests:** - -- Fix - AE - added explicit cast to int [\#837](https://github.com/pypeclub/pype/pull/837) - -### [2.14.3](https://github.com/pypeclub/pype/tree/2.14.3) - - _**release date:** 2020-12-16_ - -[Full Changelog](https://github.com/pypeclub/pype/compare/2.14.2...2.14.3) - -**Fixed bugs:** - -- TVPaint repair invalid metadata [\#809](https://github.com/pypeclub/pype/pull/809) -- Feature/push hier value to nonhier action [\#807](https://github.com/pypeclub/pype/pull/807) -- Harmony: fix palette and image sequence loader [\#806](https://github.com/pypeclub/pype/pull/806) - -**Merged pull requests:** - -- respecting space in path [\#823](https://github.com/pypeclub/pype/pull/823) - -### [2.14.2](https://github.com/pypeclub/pype/tree/2.14.2) - - _**release date:** 2020-12-04_ - -[Full Changelog](https://github.com/pypeclub/pype/compare/2.14.1...2.14.2) - -**Enhancements:** - -- Collapsible wrapper in settings [\#767](https://github.com/pypeclub/pype/pull/767) - -**Fixed bugs:** - -- Harmony: template extraction and palettes thumbnails on mac [\#768](https://github.com/pypeclub/pype/pull/768) -- TVPaint store context to workfile metadata \(764\) [\#766](https://github.com/pypeclub/pype/pull/766) -- Extract review audio cut fix [\#763](https://github.com/pypeclub/pype/pull/763) - -**Merged pull requests:** - -- AE: fix publish after background load [\#781](https://github.com/pypeclub/pype/pull/781) -- TVPaint store members key [\#769](https://github.com/pypeclub/pype/pull/769) - -### [2.14.1](https://github.com/pypeclub/pype/tree/2.14.1) - - _**release date:** 2020-11-27_ - -[Full Changelog](https://github.com/pypeclub/pype/compare/2.14.0...2.14.1) - -**Enhancements:** - -- Settings required keys in modifiable dict [\#770](https://github.com/pypeclub/pype/pull/770) -- Extract review may not add audio to output [\#761](https://github.com/pypeclub/pype/pull/761) - -**Fixed bugs:** - -- After Effects: frame range, file format and render source scene fixes [\#760](https://github.com/pypeclub/pype/pull/760) -- Hiero: trimming review with clip event number [\#754](https://github.com/pypeclub/pype/pull/754) -- TVPaint: fix updating of loaded subsets 
[\#752](https://github.com/pypeclub/pype/pull/752) -- Maya: Vray handling of default aov [\#748](https://github.com/pypeclub/pype/pull/748) -- Maya: multiple renderable cameras in layer didn't work [\#744](https://github.com/pypeclub/pype/pull/744) -- Ftrack integrate custom attributes fix [\#742](https://github.com/pypeclub/pype/pull/742) - - - -## [2.14.0](https://github.com/pypeclub/pype/tree/2.14.0) - - _**release date:** 2020-11-24_ - -[Full Changelog](https://github.com/pypeclub/pype/compare/2.13.7...2.14.0) - -**Enhancements:** - -- Ftrack: Event for syncing shot or asset status with tasks.[\#736](https://github.com/pypeclub/pype/pull/736) -- Maya: add camera rig publishing option [\#721](https://github.com/pypeclub/pype/pull/721) -- Maya: Ask user to select non-default camera from scene or create a new. [\#678](https://github.com/pypeclub/pype/pull/678) -- Maya: Camera name can be added to burnins. [\#674](https://github.com/pypeclub/pype/pull/674) -- Sort instances by label in pyblish gui [\#719](https://github.com/pypeclub/pype/pull/719) -- Synchronize ftrack hierarchical and shot attributes [\#716](https://github.com/pypeclub/pype/pull/716) -- Standalone Publisher: Publish editorial from separate image sequences [\#699](https://github.com/pypeclub/pype/pull/699) -- Render publish plugins abstraction [\#687](https://github.com/pypeclub/pype/pull/687) -- TV Paint: image loader with options [\#675](https://github.com/pypeclub/pype/pull/675) -- **TV Paint (Beta):** initial implementation of creators and local rendering [\#693](https://github.com/pypeclub/pype/pull/693) -- **After Effects (Beta):** base integration with loaders [\#667](https://github.com/pypeclub/pype/pull/667) -- Harmony: Javascript refactoring and overall stability improvements [\#666](https://github.com/pypeclub/pype/pull/666) - -**Fixed bugs:** - -- TVPaint: extract review fix [\#740](https://github.com/pypeclub/pype/pull/740) -- After Effects: Review were not being sent to ftrack [\#738](https://github.com/pypeclub/pype/pull/738) -- Maya: vray proxy was not loading [\#722](https://github.com/pypeclub/pype/pull/722) -- Maya: Vray expected file fixes [\#682](https://github.com/pypeclub/pype/pull/682) - -**Deprecated:** - -- Removed artist view from pyblish gui [\#717](https://github.com/pypeclub/pype/pull/717) -- Maya: disable legacy override check for cameras [\#715](https://github.com/pypeclub/pype/pull/715) - - - - -### [2.13.7](https://github.com/pypeclub/pype/tree/2.13.7) - - _**release date:** 2020-11-19_ - -[Full Changelog](https://github.com/pypeclub/pype/compare/2.13.6...2.13.7) - -**Merged pull requests:** - -- fix\(SP\): getting fps from context instead of nonexistent entity [\#729](https://github.com/pypeclub/pype/pull/729) - - - - -### [2.13.6](https://github.com/pypeclub/pype/tree/2.13.6) - - _**release date:** 2020-11-15_ - -[Full Changelog](https://github.com/pypeclub/pype/compare/2.13.5...2.13.6) - -**Fixed bugs:** - -- Maya workfile version wasn't syncing with renders properly [\#711](https://github.com/pypeclub/pype/pull/711) -- Maya: Fix for publishing multiple cameras with review from the same scene [\#710](https://github.com/pypeclub/pype/pull/710) - - - - -### [2.13.5](https://github.com/pypeclub/pype/tree/2.13.5) - - _**release date:** 2020-11-12_ - -[Full Changelog](https://github.com/pypeclub/pype/compare/2.13.4...2.13.5) - - -**Fixed bugs:** - -- Wrong thumbnail file was picked when publishing sequence in standalone publisher [\#703](https://github.com/pypeclub/pype/pull/703) -- Fix: 
Burnin data pass and FFmpeg tool check [\#701](https://github.com/pypeclub/pype/pull/701) - - - - -### [2.13.4](https://github.com/pypeclub/pype/tree/2.13.4) - - _**release date:** 2020-11-09_ - -[Full Changelog](https://github.com/pypeclub/pype/compare/2.13.3...2.13.4) - - -**Fixed bugs:** - -- Photoshop unhiding hidden layers [\#688](https://github.com/pypeclub/pype/issues/688) -- Nuke: Favorite directories "shot dir" "project dir" - not working \#684 [\#685](https://github.com/pypeclub/pype/pull/685) - - - - - -### [2.13.3](https://github.com/pypeclub/pype/tree/2.13.3) - - _**release date:** _2020-11-03_ - -[Full Changelog](https://github.com/pypeclub/pype/compare/2.13.2...2.13.3) - -**Fixed bugs:** - -- Fix ffmpeg executable path with spaces [\#680](https://github.com/pypeclub/pype/pull/680) -- Hotfix: Added default version number [\#679](https://github.com/pypeclub/pype/pull/679) - - - - -### [2.13.2](https://github.com/pypeclub/pype/tree/2.13.2) - - _**release date:** 2020-10-28_ - -[Full Changelog](https://github.com/pypeclub/pype/compare/2.13.1...2.13.2) - -**Fixed bugs:** - -- Nuke: wrong conditions when fixing legacy write nodes [\#665](https://github.com/pypeclub/pype/pull/665) - - - - -### [2.13.1](https://github.com/pypeclub/pype/tree/2.13.1) - - _**release date:** 2020-10-23_ - -[Full Changelog](https://github.com/pypeclub/pype/compare/2.13.0...2.13.1) - -**Fixed bugs:** - -- Photoshop: Layer name is not propagating to metadata [\#654](https://github.com/pypeclub/pype/issues/654) -- Photoshop: Loader in fails with "can't set attribute" [\#650](https://github.com/pypeclub/pype/issues/650) -- Hiero: Review video file adding one frame to the end [\#659](https://github.com/pypeclub/pype/issues/659) - - - -## [2.13.0](https://github.com/pypeclub/pype/tree/2.13.0) - - _**release date:** 2020-10-16_ - -[Full Changelog](https://github.com/pypeclub/pype/compare/2.12.5...2.13.0) - -**Enhancements:** - -- Deadline Output Folder [\#636](https://github.com/pypeclub/pype/issues/636) -- Nuke Camera Loader [\#565](https://github.com/pypeclub/pype/issues/565) -- Deadline publish job shows publishing output folder [\#649](https://github.com/pypeclub/pype/pull/649) -- Get latest version in lib [\#642](https://github.com/pypeclub/pype/pull/642) -- Improved publishing of multiple representation from SP [\#638](https://github.com/pypeclub/pype/pull/638) -- TvPaint: launch shot work file from within Ftrack [\#631](https://github.com/pypeclub/pype/pull/631) -- Add mp4 support for RV action. 
[\#628](https://github.com/pypeclub/pype/pull/628) -- Maya: allow renders to have version synced with workfile [\#618](https://github.com/pypeclub/pype/pull/618) -- Renaming nukestudio host folder to hiero [\#617](https://github.com/pypeclub/pype/pull/617) -- Harmony: More efficient publishing [\#615](https://github.com/pypeclub/pype/pull/615) -- Ftrack server action improvement [\#608](https://github.com/pypeclub/pype/pull/608) -- Deadline user defaults to pype username if present [\#607](https://github.com/pypeclub/pype/pull/607) -- Standalone publisher now has icon [\#606](https://github.com/pypeclub/pype/pull/606) -- Nuke render write targeting knob improvement [\#603](https://github.com/pypeclub/pype/pull/603) -- Animated pyblish gui [\#602](https://github.com/pypeclub/pype/pull/602) -- Maya: Deadline - make use of asset dependencies optional [\#591](https://github.com/pypeclub/pype/pull/591) -- Nuke: Publishing, loading and updating alembic cameras [\#575](https://github.com/pypeclub/pype/pull/575) -- Maya: add look assigner to pype menu even if scriptsmenu is not available [\#573](https://github.com/pypeclub/pype/pull/573) -- Store task types in the database [\#572](https://github.com/pypeclub/pype/pull/572) -- Maya: Tiled EXRs to scanline EXRs render option [\#512](https://github.com/pypeclub/pype/pull/512) -- Fusion: basic integration refresh [\#452](https://github.com/pypeclub/pype/pull/452) - -**Fixed bugs:** - -- Burnin script did not propagate ffmpeg output [\#640](https://github.com/pypeclub/pype/issues/640) -- Pyblish-pype spacer in terminal wasn't transparent [\#646](https://github.com/pypeclub/pype/pull/646) -- Lib subprocess without logger [\#645](https://github.com/pypeclub/pype/pull/645) -- Nuke: prevent crash if we only have single frame in sequence [\#644](https://github.com/pypeclub/pype/pull/644) -- Burnin script logs better output [\#641](https://github.com/pypeclub/pype/pull/641) -- Missing audio on farm submission. [\#639](https://github.com/pypeclub/pype/pull/639) -- review from imagesequence error [\#633](https://github.com/pypeclub/pype/pull/633) -- Hiero: wrong order of fps clip instance data collecting [\#627](https://github.com/pypeclub/pype/pull/627) -- Add source for review instances. 
[\#625](https://github.com/pypeclub/pype/pull/625) -- Task processing in event sync [\#623](https://github.com/pypeclub/pype/pull/623) -- sync to avalon doesn t remove renamed task [\#619](https://github.com/pypeclub/pype/pull/619) -- Intent publish setting wasn't working with default value [\#562](https://github.com/pypeclub/pype/pull/562) -- Maya: Updating a look where the shader name changed, leaves the geo without a shader [\#514](https://github.com/pypeclub/pype/pull/514) - - -### [2.12.5](https://github.com/pypeclub/pype/tree/2.12.5) - -_**release date:** 2020-10-14_ - -[Full Changelog](https://github.com/pypeclub/pype/compare/2.12.4...2.12.5) - -**Fixed Bugs:** - -- Harmony: Disable application launch logic [\#637](https://github.com/pypeclub/pype/pull/637) - -### [2.12.4](https://github.com/pypeclub/pype/tree/2.12.4) - -_**release date:** 2020-10-08_ - -[Full Changelog](https://github.com/pypeclub/pype/compare/2.12.3...2.12.4) - -**Fixed bugs:** - -- Sync to avalon doesn't remove renamed task [\#605](https://github.com/pypeclub/pype/issues/605) - - -**Merged pull requests:** - -- NukeStudio: small fixes [\#622](https://github.com/pypeclub/pype/pull/622) -- NukeStudio: broken order of plugins [\#620](https://github.com/pypeclub/pype/pull/620) - -### [2.12.3](https://github.com/pypeclub/pype/tree/2.12.3) - -_**release date:** 2020-10-06_ - -[Full Changelog](https://github.com/pypeclub/pype/compare/2.12.2...2.12.3) - -**Fixed bugs:** - -- Harmony: empty scene contamination [\#583](https://github.com/pypeclub/pype/issues/583) -- Edit publishing in SP doesn't respect shot selection for publishing [\#542](https://github.com/pypeclub/pype/issues/542) -- Pathlib breaks compatibility with python2 hosts [\#281](https://github.com/pypeclub/pype/issues/281) -- Maya: fix maya scene type preset exception [\#569](https://github.com/pypeclub/pype/pull/569) -- Standalone publisher editorial plugins interfering [\#580](https://github.com/pypeclub/pype/pull/580) - -### [2.12.2](https://github.com/pypeclub/pype/tree/2.12.2) - -_**release date:** 2020-09-25_ - -[Full Changelog](https://github.com/pypeclub/pype/compare/2.12.1...2.12.2) - -**Fixed bugs:** - -- Harmony: Saving heavy scenes will crash [\#507](https://github.com/pypeclub/pype/issues/507) -- Extract review a representation name with `\*\_burnin` [\#388](https://github.com/pypeclub/pype/issues/388) -- Hierarchy data was not considering active instances [\#551](https://github.com/pypeclub/pype/pull/551) - -### [2.12.1](https://github.com/pypeclub/pype/tree/2.12.1) - -_**release date:** 2020-09-15_ - -[Full Changelog](https://github.com/pypeclub/pype/compare/2.12.0...2.12.1) - -**Fixed bugs:** - -- dependency security alert ! [\#484](https://github.com/pypeclub/pype/issues/484) -- Maya: RenderSetup is missing update [\#106](https://github.com/pypeclub/pype/issues/106) -- \ extract effects creates new instance [\#78](https://github.com/pypeclub/pype/issues/78) - - - - -## [2.12.0](https://github.com/pypeclub/pype/tree/2.12.0) ## - -_**release date:** 09 Sept 2020_ - -[Full Changelog](https://github.com/pypeclub/pype/compare/2.11.8...2.12.0) - -**Enhancements:** - -- Pype now uses less mongo connections [\#509](https://github.com/pypeclub/pype/pull/509) -- Nuke: adding image loader [\#499](https://github.com/pypeclub/pype/pull/499) -- Completely new application launcher [\#443](https://github.com/pypeclub/pype/pull/443) -- Maya: Optional skip review on renders. 
[\#441](https://github.com/pypeclub/pype/pull/441) -- Ftrack: Option to push status from task to latest version [\#440](https://github.com/pypeclub/pype/pull/440) -- Maya: Properly containerize image plane loads. [\#434](https://github.com/pypeclub/pype/pull/434) -- Option to keep the review files. [\#426](https://github.com/pypeclub/pype/pull/426) -- Maya: Isolate models during preview publishing [\#425](https://github.com/pypeclub/pype/pull/425) -- Ftrack attribute group is backwards compatible [\#418](https://github.com/pypeclub/pype/pull/418) -- Maya: Publishing of tile renderings on Deadline [\#398](https://github.com/pypeclub/pype/pull/398) -- Slightly better logging gui [\#383](https://github.com/pypeclub/pype/pull/383) -- Standalonepublisher: editorial family features expansion [\#411](https://github.com/pypeclub/pype/pull/411) - -**Fixed bugs:** - -- Maya: Fix tile order for Draft Tile Assembler [\#511](https://github.com/pypeclub/pype/pull/511) -- Remove extra dash [\#501](https://github.com/pypeclub/pype/pull/501) -- Fix: strip dot from repre names in single frame renders [\#498](https://github.com/pypeclub/pype/pull/498) -- Better handling of destination during integrating [\#485](https://github.com/pypeclub/pype/pull/485) -- Fix: allow thumbnail creation for single frame renders [\#460](https://github.com/pypeclub/pype/pull/460) -- added missing argument to launch\_application in ftrack app handler [\#453](https://github.com/pypeclub/pype/pull/453) -- Burnins: Copy bit rate of input video to match quality. [\#448](https://github.com/pypeclub/pype/pull/448) -- Standalone publisher is now independent from tray [\#442](https://github.com/pypeclub/pype/pull/442) -- Bugfix/empty enumerator attributes [\#436](https://github.com/pypeclub/pype/pull/436) -- Fixed wrong order of "other" category collapssing in publisher [\#435](https://github.com/pypeclub/pype/pull/435) -- Multiple reviews where being overwritten to one. 
[\#424](https://github.com/pypeclub/pype/pull/424) -- Cleanup plugin fail on instances without staging dir [\#420](https://github.com/pypeclub/pype/pull/420) -- deprecated -intra parameter in ffmpeg to new `-g` [\#417](https://github.com/pypeclub/pype/pull/417) -- Delivery action can now work with entered path [\#397](https://github.com/pypeclub/pype/pull/397) - - - - - -### [2.11.8](https://github.com/pypeclub/pype/tree/2.11.8) ## - -_**release date:** 27 Aug 2020_ - -[Full Changelog](https://github.com/pypeclub/pype/compare/2.11.7...2.11.8) - -**Fixed bugs:** - -- pyblish pype - other group is collapsed before plugins are done [\#431](https://github.com/pypeclub/pype/issues/431) -- Alpha white edges in harmony on PNGs [\#412](https://github.com/pypeclub/pype/issues/412) -- harmony image loader picks wrong representations [\#404](https://github.com/pypeclub/pype/issues/404) -- Clockify crash when response contain symbol not allowed by UTF-8 [\#81](https://github.com/pypeclub/pype/issues/81) - - - - -### [2.11.7](https://github.com/pypeclub/pype/tree/2.11.7) ## - -_**release date:** 21 Aug 2020_ - - -[Full Changelog](https://github.com/pypeclub/pype/compare/2.11.6...2.11.7) - -**Fixed bugs:** - -- Clean Up Baked Movie [\#369](https://github.com/pypeclub/pype/issues/369) -- celaction last workfile wasn't picked up correctly [\#459](https://github.com/pypeclub/pype/pull/459) - - - -### [2.11.5](https://github.com/pypeclub/pype/tree/2.11.5) ## - -_**release date:** 13 Aug 2020_ - -[Full Changelog](https://github.com/pypeclub/pype/compare/2.11.4...2.11.5) - -**Enhancements:** - -- Standalone publisher now only groups sequence if the extension is known [\#439](https://github.com/pypeclub/pype/pull/439) - -**Fixed bugs:** - -- Logs have been disable for editorial by default to speed up publishing [\#433](https://github.com/pypeclub/pype/pull/433) -- Various fixes for celaction [\#430](https://github.com/pypeclub/pype/pull/430) -- Harmony: invalid variable scope in validate scene settings [\#428](https://github.com/pypeclub/pype/pull/428) -- Harmomny: new representation name for audio was not accepted [\#427](https://github.com/pypeclub/pype/pull/427) - - - - -### [2.11.3](https://github.com/pypeclub/pype/tree/2.11.3) ## - -_**release date:** 4 Aug 2020_ - -[Full Changelog](https://github.com/pypeclub/pype/compare/2.11.2...2.11.3) - -**Fixed bugs:** - -- Harmony: publishing performance issues [\#408](https://github.com/pypeclub/pype/pull/408) - - - - -## 2.11.0 ## - -_**release date:** 27 July 2020_ - -**new:** -- _(blender)_ namespace support [\#341](https://github.com/pypeclub/pype/pull/341) -- _(blender)_ start end frames [\#330](https://github.com/pypeclub/pype/pull/330) -- _(blender)_ camera asset [\#322](https://github.com/pypeclub/pype/pull/322) -- _(pype)_ toggle instances per family in pyblish GUI [\#320](https://github.com/pypeclub/pype/pull/320) -- _(pype)_ current release version is now shown in the tray menu [#379](https://github.com/pypeclub/pype/pull/379) - - -**improved:** -- _(resolve)_ tagging for publish [\#239](https://github.com/pypeclub/pype/issues/239) -- _(pype)_ Support publishing a subset of shots with standalone editorial [\#336](https://github.com/pypeclub/pype/pull/336) -- _(harmony)_ Basic support for palettes [\#324](https://github.com/pypeclub/pype/pull/324) -- _(photoshop)_ Flag outdated containers on startup and publish. 
[\#309](https://github.com/pypeclub/pype/pull/309) -- _(harmony)_ Flag Outdated containers [\#302](https://github.com/pypeclub/pype/pull/302) -- _(photoshop)_ Publish review [\#298](https://github.com/pypeclub/pype/pull/298) -- _(pype)_ Optional Last workfile launch [\#365](https://github.com/pypeclub/pype/pull/365) - - -**fixed:** -- _(premiere)_ workflow fixes [\#346](https://github.com/pypeclub/pype/pull/346) -- _(pype)_ pype-setup does not work with space in path [\#327](https://github.com/pypeclub/pype/issues/327) -- _(ftrack)_ Ftrack delete action cause circular error [\#206](https://github.com/pypeclub/pype/issues/206) -- _(nuke)_ Priority was forced to 50 [\#345](https://github.com/pypeclub/pype/pull/345) -- _(nuke)_ Fix ValidateNukeWriteKnobs [\#340](https://github.com/pypeclub/pype/pull/340) -- _(maya)_ If camera attributes are connected, we can ignore them. [\#339](https://github.com/pypeclub/pype/pull/339) -- _(pype)_ stop appending of tools environment to existing env [\#337](https://github.com/pypeclub/pype/pull/337) -- _(ftrack)_ Ftrack timeout needs to look at AVALON\_TIMEOUT [\#325](https://github.com/pypeclub/pype/pull/325) -- _(harmony)_ Only zip files are supported. [\#310](https://github.com/pypeclub/pype/pull/310) -- _(pype)_ hotfix/Fix event server mongo uri [\#305](https://github.com/pypeclub/pype/pull/305) -- _(photoshop)_ Subset was not named or validated correctly. [\#304](https://github.com/pypeclub/pype/pull/304) - - - - - -## 2.10.0 ## - -_**release date:** 17 June 2020_ - -**new:** -- _(harmony)_ **Toon Boom Harmony** has been greatly extended to support rigging, scene build, animation and rendering workflows. [#270](https://github.com/pypeclub/pype/issues/270) [#271](https://github.com/pypeclub/pype/issues/271) [#190](https://github.com/pypeclub/pype/issues/190) [#191](https://github.com/pypeclub/pype/issues/191) [#172](https://github.com/pypeclub/pype/issues/172) [#168](https://github.com/pypeclub/pype/issues/168) -- _(pype)_ Added support for rudimentary **edl publishing** into individual shots. [#265](https://github.com/pypeclub/pype/issues/265) -- _(celaction)_ Simple **Celaction** integration has been added with support for workfiles and rendering. [#255](https://github.com/pypeclub/pype/issues/255) -- _(maya)_ Support for multiple job types when submitting to the farm. We can now render Maya or Standalone render jobs for Vray and Arnold (limited support for arnold) [#204](https://github.com/pypeclub/pype/issues/204) -- _(photoshop)_ Added initial support for Photoshop [#232](https://github.com/pypeclub/pype/issues/232) - -**improved:** -- _(blender)_ Updated support for rigs and added support Layout family [#233](https://github.com/pypeclub/pype/issues/233) [#226](https://github.com/pypeclub/pype/issues/226) -- _(premiere)_ It is now possible to choose different storage root for workfiles of different task types. [#255](https://github.com/pypeclub/pype/issues/255) -- _(maya)_ Support for unmerged AOVs in Redshift multipart EXRs [#197](https://github.com/pypeclub/pype/issues/197) -- _(pype)_ Pype repository has been refactored in preparation for 3.0 release [#169](https://github.com/pypeclub/pype/issues/169) -- _(deadline)_ All file dependencies are now passed to deadline from maya to prevent premature start of rendering if caches or textures haven't been coppied over yet. [#195](https://github.com/pypeclub/pype/issues/195) -- _(nuke)_ Script validation can now be made optional. 
[#194](https://github.com/pypeclub/pype/issues/194) -- _(pype)_ Publishing can now be stopped at any time. [#194](https://github.com/pypeclub/pype/issues/194) - -**fix:** -- _(pype)_ Pyblish-lite has been integrated into pype repository, plus various publishing GUI fixes. [#274](https://github.com/pypeclub/pype/issues/274) [#275](https://github.com/pypeclub/pype/issues/275) [#268](https://github.com/pypeclub/pype/issues/268) [#227](https://github.com/pypeclub/pype/issues/227) [#238](https://github.com/pypeclub/pype/issues/238) -- _(maya)_ Alembic extractor was getting wrong frame range type in certain scenarios [#254](https://github.com/pypeclub/pype/issues/254) -- _(maya)_ Attaching a render to subset in maya was not passing validation in certain scenarios [#256](https://github.com/pypeclub/pype/issues/256) -- _(ftrack)_ Various small fixes to ftrack sync [#263](https://github.com/pypeclub/pype/issues/263) [#259](https://github.com/pypeclub/pype/issues/259) -- _(maya)_ Look extraction is now able to skp invalid connections in shaders [#207](https://github.com/pypeclub/pype/issues/207) - - - - - -## 2.9.0 ## - -_**release date:** 25 May 2020_ - -**new:** -- _(pype)_ Support for **Multiroot projects**. You can now store project data on multiple physical or virtual storages and target individual publishes to these locations. For instance render can be stored on a faster storage than the rest of the project. [#145](https://github.com/pypeclub/pype/issues/145), [#38](https://github.com/pypeclub/pype/issues/38) -- _(harmony)_ Basic implementation of **Toon Boom Harmony** has been added. [#142](https://github.com/pypeclub/pype/issues/142) -- _(pype)_ OSX support is in public beta now. There are issues to be expected, but the main implementation should be functional. [#141](https://github.com/pypeclub/pype/issues/141) - - -**improved:** - -- _(pype)_ **Review extractor** has been completely rebuilt. It now supports granular filtering so you can create **multiple outputs** for different tasks, families or hosts. [#103](https://github.com/pypeclub/pype/issues/103), [#166](https://github.com/pypeclub/pype/issues/166), [#165](https://github.com/pypeclub/pype/issues/165) -- _(pype)_ **Burnin** generation had been extended to **support same multi-output filtering** as review extractor [#103](https://github.com/pypeclub/pype/issues/103) -- _(pype)_ Publishing file templates can now be specified in config for each individual family [#114](https://github.com/pypeclub/pype/issues/114) -- _(pype)_ Studio specific plugins can now be appended to pype standard publishing plugins. [#112](https://github.com/pypeclub/pype/issues/112) -- _(nukestudio)_ Reviewable clips no longer need to be previously cut, exported and re-imported to timeline. **Pype can now dynamically cut reviewable quicktimes** from continuous offline footage during publishing. [#23](https://github.com/pypeclub/pype/issues/23) -- _(deadline)_ Deadline can now correctly differentiate between staging and production pype. [#154](https://github.com/pypeclub/pype/issues/154) -- _(deadline)_ `PYPE_PYTHON_EXE` env variable can now be used to direct publishing to explicit python installation. [#120](https://github.com/pypeclub/pype/issues/120) -- _(nuke)_ Nuke now check for new version of loaded data on file open. [#140](https://github.com/pypeclub/pype/issues/140) -- _(nuke)_ frame range and limit checkboxes are now exposed on write node. 
[#119](https://github.com/pypeclub/pype/issues/119) - - - -**fix:** - -- _(nukestudio)_ Project Location was using backslashes which was breaking nukestudio native exporting in certains configurations [#82](https://github.com/pypeclub/pype/issues/82) -- _(nukestudio)_ Duplicity in hierarchy tags was prone to throwing publishing error [#130](https://github.com/pypeclub/pype/issues/130), [#144](https://github.com/pypeclub/pype/issues/144) -- _(ftrack)_ multiple stability improvements [#157](https://github.com/pypeclub/pype/issues/157), [#159](https://github.com/pypeclub/pype/issues/159), [#128](https://github.com/pypeclub/pype/issues/128), [#118](https://github.com/pypeclub/pype/issues/118), [#127](https://github.com/pypeclub/pype/issues/127) -- _(deadline)_ multipart EXRs were stopping review publishing on the farm. They are still not supported for automatic review generation, but the publish will go through correctly without the quicktime. [#155](https://github.com/pypeclub/pype/issues/155) -- _(deadline)_ If deadline is non-responsive it will no longer freeze host when publishing [#149](https://github.com/pypeclub/pype/issues/149) -- _(deadline)_ Sometimes deadline was trying to launch render before all the source data was coppied over. [#137](https://github.com/pypeclub/pype/issues/137) _(harmony)_ Basic implementation of **Toon Boom Harmony** has been added. [#142](https://github.com/pypeclub/pype/issues/142) -- _(nuke)_ Filepath knob wasn't updated properly. [#131](https://github.com/pypeclub/pype/issues/131) -- _(maya)_ When extracting animation, the "Write Color Set" options on the instance were not respected. [#108](https://github.com/pypeclub/pype/issues/108) -- _(maya)_ Attribute overrides for AOV only worked for the legacy render layers. Now it works for new render setup as well [#132](https://github.com/pypeclub/pype/issues/132) -- _(maya)_ Stability and usability improvements in yeti workflow [#104](https://github.com/pypeclub/pype/issues/104) - - - - - -## 2.8.0 ## - -_**release date:** 20 April 2020_ - -**new:** - -- _(pype)_ Option to generate slates from json templates. [PYPE-628] [#26](https://github.com/pypeclub/pype/issues/26) -- _(pype)_ It is now possible to automate loading of published subsets into any scene. Documentation will follow :). [PYPE-611] [#24](https://github.com/pypeclub/pype/issues/24) - -**fix:** - -- _(maya)_ Some Redshift render tokens could break publishing. [PYPE-778] [#33](https://github.com/pypeclub/pype/issues/33) -- _(maya)_ Publish was not preserving maya file extension. [#39](https://github.com/pypeclub/pype/issues/39) -- _(maya)_ Rig output validator was failing on nodes without shapes. [#40](https://github.com/pypeclub/pype/issues/40) -- _(maya)_ Yeti caches can now be properly versioned up in the scene inventory. [#40](https://github.com/pypeclub/pype/issues/40) -- _(nuke)_ Build first workfiles was not accepting jpeg sequences. [#34](https://github.com/pypeclub/pype/issues/34) -- _(deadline)_ Trying to generate ffmpeg review from multipart EXRs no longer crashes publishing. [PYPE-781] -- _(deadline)_ Render publishing is more stable in multiplatform environments. [PYPE-775] - - - - - -## 2.7.0 ## - -_**release date:** 30 March 2020_ - -**new:** - -- _(maya)_ Artist can now choose to load multiple references of the same subset at once [PYPE-646, PYPS-81] -- _(nuke)_ Option to use named OCIO colorspaces for review colour baking. [PYPS-82] -- _(pype)_ Pype can now work with `master` versions for publishing and loading. 
These are non-versioned publishes that are overwritten with the latest version during publish. These are now supported in all the GUIs, but their publishing is deactivated by default. [PYPE-653] -- _(blender)_ Added support for basic blender workflow. We currently support `rig`, `model` and `animation` families. [PYPE-768] -- _(pype)_ Source timecode can now be used in burn-ins. [PYPE-777] -- _(pype)_ Review outputs profiles can now specify delivery resolution different than project setting [PYPE-759] -- _(nuke)_ Bookmark to current context is now added automatically to all nuke browser windows. [PYPE-712] - -**change:** - -- _(maya)_ It is now possible to publish camera without. baking. Keep in mind that unbaked cameras can't be guaranteed to work in other hosts. [PYPE-595] -- _(maya)_ All the renders from maya are now grouped in the loader by their Layer name. [PYPE-482] -- _(nuke/hiero)_ Any publishes from nuke and hiero can now be versioned independently of the workfile. [PYPE-728] - - -**fix:** - -- _(nuke)_ Mixed slashes caused issues in ocio config path. -- _(pype)_ Intent field in pyblish GUI was passing label instead of value to ftrack. [PYPE-733] -- _(nuke)_ Publishing of pre-renders was inconsistent. [PYPE-766] -- _(maya)_ Handles and frame ranges were inconsistent in various places during publishing. -- _(nuke)_ Nuke was crashing if it ran into certain missing knobs. For example DPX output missing `autocrop` [PYPE-774] -- _(deadline)_ Project overrides were not working properly with farm render publishing. -- _(hiero)_ Problems with single frame plates publishing. -- _(maya)_ Redshift RenderPass token were breaking render publishing. [PYPE-778] -- _(nuke)_ Build first workfile was not accepting jpeg sequences. -- _(maya)_ Multipart (Multilayer) EXRs were breaking review publishing due to FFMPEG incompatiblity [PYPE-781] - - - - -## 2.6.0 ## - -_**release date:** 9 March 2020_ - -**change:** -- _(maya)_ render publishing has been simplified and made more robust. Render setup layers are now automatically added to publishing subsets and `render globals` family has been replaced with simple `render` [PYPE-570] -- _(avalon)_ change context and workfiles apps, have been merged into one, that allows both actions to be performed at the same time. [PYPE-747] -- _(pype)_ thumbnails are now automatically propagate to asset from the last published subset in the loader -- _(ftrack)_ publishing comment and intent are now being published to ftrack note as well as describtion. [PYPE-727] -- _(pype)_ when overriding existing version new old representations are now overriden, instead of the new ones just being appended. (to allow this behaviour, the version validator need to be disabled. [PYPE-690]) -- _(pype)_ burnin preset has been significantly simplified. It now doesn't require passing function to each field, but only need the actual text template. to use this, all the current burnin PRESETS MUST BE UPDATED for all the projects. -- _(ftrack)_ credentials are now stored on a per server basis, so it's possible to switch between ftrack servers without having to log in and out. [PYPE-723] - - -**new:** -- _(pype)_ production and development deployments now have different colour of the tray icon. Orange for Dev and Green for production [PYPE-718] -- _(maya)_ renders can now be attached to a publishable subset rather than creating their own subset. 
For example it is possible to create a reviewable `look` or `model` render and have it correctly attached as a representation of the subsets [PYPE-451] -- _(maya)_ after saving current scene into a new context (as a new shot for instance), all the scene publishing subsets data gets re-generated automatically to match the new context [PYPE-532] -- _(pype)_ we now support project specific publish, load and create plugins [PYPE-740] -- _(ftrack)_ new action that allow archiving/deleting old published versions. User can keep how many of the latest version to keep when the action is ran. [PYPE-748, PYPE-715] -- _(ftrack)_ it is now possible to monitor and restart ftrack event server using ftrack action. [PYPE-658] -- _(pype)_ validator that prevent accidental overwrites of previously published versions. [PYPE-680] -- _(avalon)_ avalon core updated to version 5.6.0 -- _(maya)_ added validator to make sure that relative paths are used when publishing arnold standins. -- _(nukestudio)_ it is now possible to extract and publish audio family from clip in nuke studio [PYPE-682] - -**fix**: -- _(maya)_ maya set framerange button was ignoring handles [PYPE-719] -- _(ftrack)_ sync to avalon was sometime crashing when ran on empty project -- _(nukestudio)_ publishing same shots after they've been previously archived/deleted would result in a crash. [PYPE-737] -- _(nuke)_ slate workflow was breaking in certain scenarios. [PYPE-730] -- _(pype)_ rendering publish workflow has been significantly improved to prevent error resulting from implicit render collection. [PYPE-665, PYPE-746] -- _(pype)_ launching application on a non-synced project resulted in obscure [PYPE-528] -- _(pype)_ missing keys in burnins no longer result in an error. [PYPE-706] -- _(ftrack)_ create folder structure action was sometimes failing for project managers due to wrong permissions. -- _(Nukestudio)_ using `source` in the start frame tag could result in wrong frame range calculation -- _(ftrack)_ sync to avalon action and event have been improved by catching more edge cases and provessing them properly. - - - - -## 2.5.0 ## - -_**release date:** 11 Feb 2020_ - -**change:** -- _(pype)_ added many logs for easier debugging -- _(pype)_ review presets can now be separated between 2d and 3d renders [PYPE-693] -- _(pype)_ anatomy module has been greatly improved to allow for more dynamic pulblishing and faster debugging [PYPE-685] -- _(pype)_ avalon schemas have been moved from `pype-config` to `pype` repository, for simplification. [PYPE-670] -- _(ftrack)_ updated to latest ftrack API -- _(ftrack)_ publishing comments now appear in ftrack also as a note on version with customisable category [PYPE-645] -- _(ftrack)_ delete asset/subset action had been improved. It is now able to remove multiple entities and descendants of the selected entities [PYPE-361, PYPS-72] -- _(workfiles)_ added date field to workfiles app [PYPE-603] -- _(maya)_ old deprecated loader have been removed in favour of a single unified reference loader (old scenes will upgrade automatically to the new loader upon opening) [PYPE-633, PYPE-697] -- _(avalon)_ core updated to 5.5.15 [PYPE-671] -- _(nuke)_ library loader is now available in nuke [PYPE-698] - - -**new:** -- _(pype)_ added pype render wrapper to allow rendering on mixed platform farms. [PYPE-634] -- _(pype)_ added `pype launch` command. It let's admin run applications with dynamically built environment based on the given context. 
[PYPE-634] -- _(pype)_ added support for extracting review sequences with burnins [PYPE-657] -- _(publish)_ users can now set intent next to a comment when publishing. This will then be reflected on an attribute in ftrack. [PYPE-632] -- _(burnin)_ timecode can now be added to burnin -- _(burnin)_ datetime keys can now be added to burnin and anatomy [PYPE-651] -- _(burnin)_ anatomy templates can now be used in burnins. [PYPE=626] -- _(nuke)_ new validator for render resolution -- _(nuke)_ support for attach slate to nuke renders [PYPE-630] -- _(nuke)_ png sequences were added to loaders -- _(maya)_ added maya 2020 compatibility [PYPE-677] -- _(maya)_ ability to publish and load .ASS standin sequences [PYPS-54] -- _(pype)_ thumbnails can now be published and are visible in the loader. `AVALON_THUMBNAIL_ROOT` environment variable needs to be set for this to work [PYPE-573, PYPE-132] -- _(blender)_ base implementation of blender was added with publishing and loading of .blend files [PYPE-612] -- _(ftrack)_ new action for preparing deliveries [PYPE-639] - - -**fix**: -- _(burnin)_ more robust way of finding ffmpeg for burnins. -- _(pype)_ improved UNC paths remapping when sending to farm. -- _(pype)_ float frames sometimes made their way to representation context in database, breaking loaders [PYPE-668] -- _(pype)_ `pype install --force` was failing sometimes [PYPE-600] -- _(pype)_ padding in published files got calculated wrongly sometimes. It is now instead being always read from project anatomy. [PYPE-667] -- _(publish)_ comment publishing was failing in certain situations -- _(ftrack)_ multiple edge case scenario fixes in auto sync and sync-to-avalon action -- _(ftrack)_ sync to avalon now works on empty projects -- _(ftrack)_ thumbnail update event was failing when deleting entities [PYPE-561] -- _(nuke)_ loader applies proper colorspaces from Presets -- _(nuke)_ publishing handles didn't always work correctly [PYPE-686] -- _(maya)_ assembly publishing and loading wasn't working correctly - - - - - - -## 2.4.0 ## - -_**release date:** 9 Dec 2019_ - -**change:** -- _(ftrack)_ version to status ftrack event can now be configured from Presets - - based on preset `presets/ftracc/ftrack_config.json["status_version_to_task"]` -- _(ftrack)_ sync to avalon event has been completely re-written. It now supports most of the project management situations on ftrack including moving, renaming and deleting entities, updating attributes and working with tasks. -- _(ftrack)_ sync to avalon action has been also re-writen. It is now much faster (up to 100 times depending on a project structure), has much better logging and reporting on encountered problems, and is able to handle much more complex situations. -- _(ftrack)_ sync to avalon trigger by checking `auto-sync` toggle on ftrack [PYPE-504] -- _(pype)_ various new features in the REST api -- _(pype)_ new visual identity used across pype -- _(pype)_ started moving all requirements to pip installation rather than vendorising them in pype repository. Due to a few yet unreleased packages, this means that pype can temporarily be only installed in the offline mode. - -**new:** -- _(nuke)_ support for publishing gizmos and loading them as viewer processes -- _(nuke)_ support for publishing nuke nodes from backdrops and loading them back -- _(pype)_ burnins can now work with start and end frames as keys - - use keys `{frame_start}`, `{frame_end}` and `{current_frame}` in burnin preset to use them. 
[PYPS-44,PYPS-73, PYPE-602] -- _(pype)_ option to filter logs by user and level in loggin GUI -- _(pype)_ image family added to standalone publisher [PYPE-574] -- _(pype)_ matchmove family added to standalone publisher [PYPE-574] -- _(nuke)_ validator for comparing arbitrary knobs with values from presets -- _(maya)_ option to force maya to copy textures in the new look publish rather than hardlinking them -- _(pype)_ comments from pyblish GUI are now being added to ftrack version -- _(maya)_ validator for checking outdated containers in the scene -- _(maya)_ option to publish and load arnold standin sequence [PYPE-579, PYPS-54] - -**fix**: -- _(pype)_ burnins were not respecting codec of the input video -- _(nuke)_ lot's of various nuke and nuke studio fixes across the board [PYPS-45] -- _(pype)_ workfiles app is not launching with the start of the app by default [PYPE-569] -- _(ftrack)_ ftrack integration during publishing was failing under certain situations [PYPS-66] -- _(pype)_ minor fixes in REST api -- _(ftrack)_ status change event was crashing when the target status was missing [PYPS-68] -- _(ftrack)_ actions will try to reconnect if they fail for some reason -- _(maya)_ problems with fps mapping when using float FPS values -- _(deadline)_ overall improvements to deadline publishing -- _(setup)_ environment variables are now remapped on the fly based on the platform pype is running on. This fixes many issues in mixed platform environments. - - - - -## 2.3.6 # - -_**release date:** 27 Nov 2019_ - -**hotfix**: -- _(ftrack)_ was hiding important debug logo -- _(nuke)_ crashes during workfile publishing -- _(ftrack)_ event server crashes because of signal problems -- _(muster)_ problems with muster render submissions -- _(ftrack)_ thumbnail update event syntax errors - - - - -## 2.3.0 ## - -_release date: 6 Oct 2019_ - -**new**: -- _(maya)_ support for yeti rigs and yeti caches -- _(maya)_ validator for comparing arbitrary attributes against ftrack -- _(pype)_ burnins can now show current date and time -- _(muster)_ pools can now be set in render globals in maya -- _(pype)_ Rest API has been implemented in beta stage -- _(nuke)_ LUT loader has been added -- _(pype)_ rudimentary user module has been added as preparation for user management -- _(pype)_ a simple logging GUI has been added to pype tray -- _(nuke)_ nuke can now bake input process into mov -- _(maya)_ imported models now have selection handle displayed by defaulting -- _(avalon)_ it's is now possible to load multiple assets at once using loader -- _(maya)_ added ability to automatically connect yeti rig to a mesh upon loading - -**changed**: -- _(ftrack)_ event server now runs two parallel processes and is able to keep queue of events to process. 
-- _(nuke)_ task name is now added to all rendered subsets -- _(pype)_ adding more families to standalone publisher -- _(pype)_ standalone publisher now uses pyblish-lite -- _(pype)_ standalone publisher can now create review quicktimes -- _(ftrack)_ queries to ftrack were sped up -- _(ftrack)_ multiple ftrack action have been deprecated -- _(avalon)_ avalon upstream has been updated to 5.5.0 -- _(nukestudio)_ published transforms can now be animated -- - -**fix**: -- _(maya)_ fps popup button didn't work in some cases -- _(maya)_ geometry instances and references in maya were losing shader assignments -- _(muster)_ muster rendering templates were not working correctly -- _(maya)_ arnold tx texture conversion wasn't respecting colorspace set by the artist -- _(pype)_ problems with avalon db sync -- _(maya)_ ftrack was rounding FPS making it inconsistent -- _(pype)_ wrong icon names in Creator -- _(maya)_ scene inventory wasn't showing anything if representation was removed from database after it's been loaded to the scene -- _(nukestudio)_ multiple bugs squashed -- _(loader)_ loader was taking long time to show all the loading action when first launcher in maya - -## 2.2.0 ## -_**release date:** 8 Sept 2019_ - -**new**: -- _(pype)_ add customisable workflow for creating quicktimes from renders or playblasts -- _(nuke)_ option to choose deadline chunk size on write nodes -- _(nukestudio)_ added option to publish soft effects (subTrackItems) from NukeStudio as subsets including LUT files. these can then be loaded in nuke or NukeStudio -- _(nuke)_ option to build nuke script from previously published latest versions of plate and render subsets. -- _(nuke)_ nuke writes now have deadline tab. -- _(ftrack)_ Prepare Project action can now be used for creating the base folder structure on disk and in ftrack, setting up all the initial project attributes and it automatically prepares `pype_project_config` folder for the given project. -- _(clockify)_ Added support for time tracking in clockify. This currently in addition to ftrack time logs, but does not completely replace them. -- _(pype)_ any attributes in Creator and Loader plugins can now be customised using pype preset system - -**changed**: -- nukestudio now uses workio API for workfiles -- _(maya)_ "FIX FPS" prompt in maya now appears in the middle of the screen -- _(muster)_ can now be configured with custom templates -- _(pype)_ global publishing plugins can now be configured using presets as well as host specific ones - - -**fix**: -- wrong version retrieval from path in certain scenarios -- nuke reset resolution wasn't working in certain scenarios - -## 2.1.0 ## -_release date: 6 Aug 2019_ - -A large cleanup release. Most of the change are under the hood. - -**new**: -- _(pype)_ add customisable workflow for creating quicktimes from renders or playblasts -- _(pype)_ Added configurable option to add burnins to any generated quicktimes -- _(ftrack)_ Action that identifies what machines pype is running on. -- _(system)_ unify subprocess calls -- _(maya)_ add audio to review quicktimes -- _(nuke)_ add crop before write node to prevent overscan problems in ffmpeg -- **Nuke Studio** publishing and workfiles support -- **Muster** render manager support -- _(nuke)_ Framerange, FPS and Resolution are set automatically at startup -- _(maya)_ Ability to load published sequences as image planes -- _(system)_ Ftrack event that sets asset folder permissions based on task assignees in ftrack. 
-- _(maya)_ Pyblish plugin that allow validation of maya attributes
-- _(system)_ added better startup logging to tray debug, including basic connection information
-- _(avalon)_ option to group published subsets to groups in the loader
-- _(avalon)_ loader family filters are working now
-
-**changed**:
-- change multiple key attributes to unify their behaviour across the pipeline
-  - `frameRate` to `fps`
-  - `startFrame` to `frameStart`
-  - `endFrame` to `frameEnd`
-  - `fstart` to `frameStart`
-  - `fend` to `frameEnd`
-  - `handle_start` to `handleStart`
-  - `handle_end` to `handleEnd`
-  - `resolution_width` to `resolutionWidth`
-  - `resolution_height` to `resolutionHeight`
-  - `pixel_aspect` to `pixelAspect`
-
-- _(nuke)_ write nodes are now created inside group with only some attributes editable by the artist
-- rendered frames are now deleted from temporary location after their publishing is finished.
-- _(ftrack)_ RV action can now be launched from any entity
-- after publishing only refresh button is now available in pyblish UI
-- added context instance pyblish-lite so that artist knows if context plugin fails
-- _(avalon)_ allow opening selected files using enter key
-- _(avalon)_ core updated to v5.2.9 with our forked changes on top
-
-**fix**:
-- faster hierarchy retrieval from db
-- _(nuke)_ A lot of stability enhancements
-- _(nuke studio)_ A lot of stability enhancements
-- _(nuke)_ now only renders a single write node on farm
-- _(ftrack)_ pype would crash when launcher project level task
-- work directory was sometimes not being created correctly
-- major pype.lib cleanup. Removing of unused functions, merging those that were doing the same and general house cleaning.
-- _(avalon)_ subsets in maya 2019 weren't behaving correctly in the outliner
diff --git a/website/docs/dev_settings.md b/website/docs/dev_settings.md
new file mode 100644
index 0000000000..94590345e8
--- /dev/null
+++ b/website/docs/dev_settings.md
@@ -0,0 +1,899 @@
+---
+id: dev_settings
+title: Settings
+sidebar_label: Settings
+---
+
+Settings give the ability to change how OpenPype behaves in certain situations. Settings are split into 3 categories: **system settings**, **project anatomy** and **project settings**. Project anatomy and project settings are grouped into a single category, but there is a technical difference (explained later). The only difference between system and project settings is that system settings can't be handled on a project level: their values must be available no matter which project they are queried for. Settings can be accessed through headless entities or through the settings UI.
+
+There is one more category, **local settings**, but these can't be changed or defined as easily. Local settings change how settings work per machine and can affect both system and project settings, but at this moment they are hardcoded to predefined values.
+
+## Settings schemas
+System and project settings are defined by settings schemas. A schema defines the structure of the output value, which value types the output will contain, how the settings are stored and how their UI inputs will look.
+
+## Settings values
+The output of settings is a json serializable value. There are 3 possible types of values: **default values**, **studio overrides** and **project overrides**. Default values must always be available for all settings schemas; their values are stored in the code. Default values are what everyone who has just installed OpenPype will use. It is good practice to set example values, but they should actually be relevant.
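+
+As a purely illustrative sketch of how these relate (the `ExtractThumbnail` key and its values are invented for this example, and the exact shape of the override metadata is an internal detail), default values might look like the first block below, while a studio override, described in the next paragraph, would store only the modified subset:
+
+```javascript
+// Default values stored in code (invented example values)
+{
+    "ExtractThumbnail": {
+        "enabled": true,
+        "ffmpeg_args": []
+    }
+}
+
+// A studio override stored in mongo keeps only the changed part
+// (plus metadata marking which keys are overridden, omitted here)
+{
+    "ExtractThumbnail": {
+        "enabled": false
+    }
+}
+```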
+
+Overrides are what make settings a powerful tool. Overrides contain only a part of the settings, with additional metadata that describes which parts of the settings values should be replaced by the override values. Using overrides gives the ability to save only specific values and use default values for the rest. It is especially useful in project settings, which have up to 2 levels of overrides. In project settings, **default values** are used as the base, on which **studio overrides** and then **project overrides** are applied. In practice it is possible to save only studio overrides, which affect all projects. Changes in studio overrides are then propagated to all projects without project overrides. But values can also be locked on a project level so studio overrides are not used.
+
+## Settings storage
+As was mentioned, default values are stored in repository files. Overrides are stored in the Mongo database. The value in mongo contains only overrides with metadata, so its content on its own is useless and must be used in combination with the default values. System settings and project settings are stored in a special collection. A single document represents one set of overrides, together with the OpenPype version for which it is stored. Settings are versioned and are loaded in a specific order: the current OpenPype version's overrides, or the first lower version available. If there are no overrides with the same or a lower version then the first higher version is used. If there are no overrides at all then no overrides are applied.
+
+Project anatomy is stored in the project document, thus it is not versioned and its values are always overridden. Any changes in the anatomy schema may have a drastic effect on production and OpenPype updates.
+
+## Settings schema items
+As was mentioned, schema items define the output type of values, how they are stored and how they look in the UI.
+- schemas are (by default) defined by json files
+- OpenPype core system settings schemas are stored in `~/openpype/settings/entities/schemas/system_schema/` and project settings in `~/openpype/settings/entities/schemas/projects_schema/`
+  - both contain `schema_main.json` which are the entry points
+- OpenPype modules/addons can define their settings schemas using `BaseModuleSettingsDef`; in that case some functionality may be slightly modified
+- a single schema item is represented by a dictionary (object) in json which has a `"type"` key
+  - **type** is the only common key which is required for all schema items
+- each item may have "input modifiers" (other keys in the dictionary) and they may be required or optional based on the type
+- there are special keys shared across all items
+  - `"is_file"` - this key is used when default values are stored in a file. Its value matches the filename where the values are stored
+    - the key is validated; it must be unique in the hierarchy, otherwise it won't be possible to store default values
+    - it makes sense to fill it only if its value is `true`
+
+  - `"is_group"` - defines that all values under the key in the settings hierarchy will be overridden if any value is modified
+    - this key is not allowed for all inputs as they may not have the technical ability to handle it
+    - the key is validated; it must be unique in the hierarchy and is automatically filled on the last possible item if it is not defined in schemas
+    - it makes sense to fill it only if its value is `true`
+- all entities can have a `"tooltip"` key set with a description which will be shown in the UI on hover
+
+### Inner schema
+Settings schemas are big json files which would become unmanageable if they were in a single file. To be able to split them into multiple files and keep them organized, the special types `schema` and `template` were added. Both types refer to a different file by its filename. If a json file contains a dictionary it is considered a `schema`; if it contains a list it is considered a `template`.
+
+#### schema
+A schema item is replaced by the content of the referenced schema name. It is recommended that a schema file is used only once in the settings hierarchy; templates are meant for reusing.
+- schema must have a `"name"` key, which is the name of the schema that should be used
+
+```javascript
+{
+    "type": "schema",
+    "name": "my_schema_name"
+}
+```
+
+#### template
+Templates are almost the same as schema items but can contain one or more items which can be formatted with additional data, or some keys can be skipped if needed. Templates are meant for reusing the same schemas with the ability to modify their content.
+
+- legacy name is `schema_template` (still usable)
+- template must have a `"name"` key, which is the name of the template file that should be used
+- to fill formatting keys use `"template_data"`
+- all items in the template, except `__default_values__`, will replace the `template` item in the original schema
+- a template may contain other templates
+
+```javascript
+// Example template json file content
+[
+    {
+        // Define default values for formatting values
+        // - gives the ability to set the value but keep a default
+        "__default_values__": {
+            "multipath_executables": true
+        }
+    }, {
+        "type": "raw-json",
+        "label": "{host_label} Environments",
+        "key": "{host_name}_environments"
+    }, {
+        "type": "path",
+        "key": "{host_name}_executables",
+        "label": "{host_label} - Full paths to executables",
+        "multiplatform": "{multipath_executables}",
+        "multipath": true
+    }
+]
+```
+```javascript
+// Example usage of the template in schema
+{
+    "type": "dict",
+    "key": "template_examples",
+    "label": "Schema template examples",
+    "children": [
+        {
+            "type": "template",
+            "name": "example_template",
+            "template_data": [
+                {
+                    "host_label": "Maya 2019",
+                    "host_name": "maya_2019",
+                    "multipath_executables": false
+                },
+                {
+                    "host_label": "Maya 2020",
+                    "host_name": "maya_2020"
+                },
+                {
+                    "host_label": "Maya 2021",
+                    "host_name": "maya_2021"
+                }
+            ]
+        }
+    ]
+}
+```
+```javascript
+// The same schema defined without templates
+{
+    "type": "dict",
+    "key": "template_examples",
+    "label": "Schema template examples",
+    "children": [
+        {
+            "type": "raw-json",
+            "label": "Maya 2019 Environments",
+            "key": "maya_2019_environments"
+        }, {
+            "type": "path",
+            "key": "maya_2019_executables",
+            "label": "Maya 2019 - Full paths to executables",
+            "multiplatform": false,
+            "multipath": true
+        }, {
+            "type": "raw-json",
+            "label": "Maya 2020 Environments",
+            "key": "maya_2020_environments"
+        }, {
+            "type": "path",
+            "key": "maya_2020_executables",
+            "label": "Maya 2020 - Full paths to executables",
+            "multiplatform": true,
+            "multipath": true
+        }, {
+            "type": "raw-json",
+            "label": "Maya 2021 Environments",
+            "key": "maya_2021_environments"
+        }, {
+            "type": "path",
+            "key": "maya_2021_executables",
+            "label": "Maya 2021 - Full paths to executables",
+            "multiplatform": true,
+            "multipath": true
+        }
+    ]
+}
+```
+
+Template data can be used only to fill values in a template, not keys. It is also possible to define default values for unfilled fields; to do so, one of the items in the list must be a dictionary with the key `"__default_values__"` and a value that is a dictionary of default key: value pairs (as in the example above).
+```javascript
+{
+    ...
+    // Allowed
+    "key": "{to_fill}"
+    ...
+    // Not allowed
+    "{to_fill}": "value"
+    ...
+}
+```
+
+Because formatting values can only be strings, it is possible to use formatting values which are replaced with different types.
+```javascript
+// Template data
+{
+    "template_data": {
+        "executable_multiplatform": {
+            "type": "schema",
+            "name": "my_multiplatform_schema"
+        }
+    }
+}
+// Template content
+{
+    ...
+    // Allowed - value is replaced with dictionary
+    "multiplatform": "{executable_multiplatform}"
+    ...
+    // Not allowed - there is no way how it could be replaced
+    "multiplatform": "{executable_multiplatform}_enhanced_string"
+    ...
+}
+```
+
+#### dynamic_schema
+A dynamic schema item marks a place in the settings schema where schemas defined by `BaseModuleSettingsDef` can be placed.
+- example:
+```javascript
+{
+    "type": "dynamic_schema",
+    "name": "project_settings/global"
+}
+```
+- a `BaseModuleSettingsDef` with `get_settings_schemas` implemented can return a dictionary where the key defines a dynamic schema name and the value the schemas that will be put there
+- dynamic schemas work almost the same way as templates
+  - one item can be replaced by multiple items (or by 0 items)
+- the goal is to dynamically load settings of OpenPype modules without having their schemas or default values in the core repository
+  - values of these schemas are saved using the `BaseModuleSettingsDef` methods
+- we recommend using `JsonFilesSettingsDef`, which has a full implementation of storing default values to json files
+  - it only requires implementing the method `get_settings_root_path`, which should return the path to the root directory where the settings schema can be found and default values will be saved
+
+### Basic Dictionary inputs
+These inputs wrap other inputs into a {key: value} relation.
+
+#### dict
+- this is a dictionary type wrapping more inputs with keys defined in the schema
+- may be used as dynamic children (e.g. in [list](#list) or [dict-modifiable](#dict-modifiable))
+  - in that case the only key modifier is `children`, which is a list of its keys
+  - USAGE: e.g. a list of dictionaries where each dictionary has the same structure
+- if it is not used as dynamic children then it must have `"key"` defined, under which its values are stored
+- may be with or without `"label"` (only for GUI)
+  - `"label"` must be set to be able to mark the item as a group with the `"is_group"` key set to `True`
+- an item with a label can visually wrap its children
+  - this option is enabled by default; to turn it off set `"use_label_wrap"` to `False`
+  - label wrap is collapsible by default
+    - that can be set with key `"collapsible"` to `True`/`False`
+    - with key `"collapsed"` as `True`/`False` it can be set whether it is collapsed when the GUI is opened (Default: `False`)
+  - it is possible to add a lighter background with `"highlight_content"` (Default: `False`)
+    - the lighter background has its limits: after 3-4 nested highlighted items there is not much difference in the color
+- output is a dictionary: the item's `"key"` maps to its children's values
+```javascript
+// Example
+{
+    "key": "applications",
+    "type": "dict",
+    "label": "Applications",
+    "collapsible": true,
+    "highlight_content": true,
+    "is_group": true,
+    "is_file": true,
+    "children": [
+        ...ITEMS...
+    ]
+}
+
+// Without label
+{
+    "type": "dict",
+    "key": "global",
+    "children": [
+        ...ITEMS...
+ ] +} + +// When used as widget +{ + "type": "list", + "key": "profiles", + "label": "Profiles", + "object_type": { + "type": "dict", + "children": [ + { + "key": "families", + "label": "Families", + "type": "list", + "object_type": "text" + }, { + "key": "hosts", + "label": "Hosts", + "type": "list", + "object_type": "text" + } + ... + ] + } +} +``` + +#### dict-roots +- entity can be used only in Project settings +- keys of dictionary are based on current project roots +- they are not updated "live" it is required to save root changes and then + modify values on this entity + # TODO do live updates +```javascript +{ + "type": "dict-roots", + "key": "roots", + "label": "Roots", + "object_type": { + "type": "path", + "multiplatform": true, + "multipath": false + } +} +``` + +#### dict-conditional +- is similar to `dict` but has always available one enum entity + - the enum entity has single selection and it's value define other children entities +- each value of enumerator have defined children that will be used + - there is no way how to have shared entities across multiple enum items +- value from enumerator is also stored next to other values + - to define the key under which will be enum value stored use `enum_key` + - `enum_key` must match key regex and any enum item can't have children with same key + - `enum_label` is label of the entity for UI purposes +- enum items are define with `enum_children` + - it's a list where each item represents single item for the enum + - all items in `enum_children` must have at least `key` key which represents value stored under `enum_key` + - enum items can define `label` for UI purposes + - most important part is that item can define `children` key where are definitions of it's children (`children` value works the same way as in `dict`) +- to set default value for `enum_key` set it with `enum_default` +- entity must have defined `"label"` if is not used as widget +- is set as group if any parent is not group (can't have children as group) +- may be with or without `"label"` (only for GUI) + - `"label"` must be set to be able to mark item as group with `"is_group"` key set to True +- item with label can visually wrap its children + - this option is enabled by default to turn off set `"use_label_wrap"` to `False` + - label wrap is by default collapsible + - that can be set with key `"collapsible"` to `True`/`False` + - with key `"collapsed"` as `True`/`False` can be set that is collapsed when GUI is opened (Default: `False`) + - it is possible to add lighter background with `"highlight_content"` (Default: `False`) + - lighter background has limits of maximum applies after 3-4 nested highlighted items there is not much difference in the color +- for UI purposes was added `enum_is_horizontal` which will make combobox appear next to children inputs instead of on top of them (Default: `False`) + - this has extended ability of `enum_on_right` which will move combobox to right side next to children widgets (Default: `False`) +- output is dictionary `{the "key": children values}` +- using this type as template item for list type can be used to create infinite hierarchies + +```javascript +// Example +{ + "type": "dict-conditional", + "key": "my_key", + "label": "My Key", + "enum_key": "type", + "enum_label": "label", + "enum_children": [ + // Each item must be a dictionary with 'key' + { + "key": "action", + "label": "Action", + "children": [ + { + "type": "text", + "key": "key", + "label": "Key" + }, + { + "type": "text", + "key": "label", + "label": 
"Label" + }, + { + "type": "text", + "key": "command", + "label": "Comand" + } + ] + }, + { + "key": "menu", + "label": "Menu", + "children": [ + { + "key": "children", + "label": "Children", + "type": "list", + "object_type": "text" + } + ] + }, + { + // Separator does not have children as "separator" value is enough + "key": "separator", + "label": "Separator" + } + ] +} +``` + +How output of the schema could look like on save: +```javascript +{ + "type": "separator" +} + +{ + "type": "action", + "key": "action_1", + "label": "Action 1", + "command": "run command -arg" +} + +{ + "type": "menu", + "children": [ + "child_1", + "child_2" + ] +} +``` + +### Inputs for setting any kind of value (`Pure` inputs) +- all inputs must have defined `"key"` if are not used as dynamic item + - they can also have defined `"label"` + +#### boolean +- simple checkbox, nothing more to set +```javascript +{ + "type": "boolean", + "key": "my_boolean_key", + "label": "Do you want to use Pype?" +} +``` + +#### number +- number input, can be used for both integer and float + - key `"decimal"` defines how many decimal places will be used, 0 is for integer input (Default: `0`) + - key `"minimum"` as minimum allowed number to enter (Default: `-99999`) + - key `"maxium"` as maximum allowed number to enter (Default: `99999`) +- key `"steps"` will change single step value of UI inputs (using arrows and wheel scroll) +- for UI it is possible to show slider to enable this option set `show_slider` to `true` +```javascript +{ + "type": "number", + "key": "fps", + "label": "Frame rate (FPS)" + "decimal": 2, + "minimum": 1, + "maximum": 300000 +} +``` + +```javascript +{ + "type": "number", + "key": "ratio", + "label": "Ratio" + "decimal": 3, + "minimum": 0, + "maximum": 1, + "show_slider": true +} +``` + +#### text +- simple text input + - key `"multiline"` allows to enter multiple lines of text (Default: `False`) + - key `"placeholder"` allows to show text inside input when is empty (Default: `None`) + +```javascript +{ + "type": "text", + "key": "deadline_pool", + "label": "Deadline pool" +} +``` + +#### path-input +- Do not use this input in schema please (use `path` instead) +- this input is implemented to add additional features to text input +- this is meant to be used in proxy input `path` + +#### raw-json +- a little bit enhanced text input for raw json +- can store dictionary (`{}`) or list (`[]`) but not both + - by default stores dictionary to change it to list set `is_list` to `True` +- has validations of json format +- output can be stored as string + - this is to allow any keys in dictionary + - set key `store_as_string` to `true` + - code using that setting must expected that value is string and use json module to convert it to python types + +```javascript +{ + "type": "raw-json", + "key": "profiles", + "label": "Extract Review profiles", + "is_list": true +} +``` + +#### enum +- enumeration of values that are predefined in schema +- multiselection can be allowed with setting key `"multiselection"` to `True` (Default: `False`) +- values are defined under value of key `"enum_items"` as list + - each item in list is simple dictionary where value is label and key is value which will be stored + - should be possible to enter single dictionary if order of items doesn't matter +- it is possible to set default selected value/s with `default` attribute + - it is recommended to use this option only in single selection mode + - at the end this option is used only when defying default settings value or in dynamic 
+
+```javascript
+{
+    "key": "tags",
+    "label": "Tags",
+    "type": "enum",
+    "multiselection": true,
+    "enum_items": [
+        {"burnin": "Add burnins"},
+        {"ftrackreview": "Add to Ftrack"},
+        {"delete": "Delete output"},
+        {"slate-frame": "Add slate frame"},
+        {"no-handles": "Skip handle frames"}
+    ]
+}
+```
+
+#### anatomy-templates-enum
+- enumeration of all available anatomy template keys
+- has only single selection mode
+- it is possible to define a default value with `default`
+  - `"work"` is used if a default value is not specified
+- enum values are not updated on the fly; it is required to save templates and reset settings to recache the values
+```javascript
+{
+    "key": "host",
+    "label": "Host name",
+    "type": "anatomy-templates-enum",
+    "default": "publish"
+}
+```
+
+#### hosts-enum
+- enumeration of available hosts
+- multiselection can be allowed by setting key `"multiselection"` to `True` (Default: `False`)
+- it is possible to add an empty value (represented by an empty string) by setting `"use_empty_value"` to `True` (Default: `False`)
+- it is possible to set `"custom_labels"` for host names, where the key `""` is the empty value (Default: `{}`)
+- to filter host names it is required to define `"hosts_filter"`, which is a list of the host names that will be available
+  - do not pass an empty string if `use_empty_value` is enabled
+  - a whitelist is used because silently ignoring host names would be more dangerous in some cases
+```javascript
+{
+    "key": "host",
+    "label": "Host name",
+    "type": "hosts-enum",
+    "multiselection": false,
+    "use_empty_value": true,
+    "custom_labels": {
+        "": "N/A",
+        "nuke": "Nuke"
+    },
+    "hosts_filter": [
+        "nuke"
+    ]
+}
+```
+
+#### apps-enum
+- enumeration of available applications and their variants from system settings
+  - applications without a host name are excluded
+- can be used only in project settings
+- has only `multiselection`
+- used only in project anatomy
+```javascript
+{
+    "type": "apps-enum",
+    "key": "applications",
+    "label": "Applications"
+}
+```
+
+#### tools-enum
+- enumeration of available tools and their variants from system settings
+- can be used only in project settings
+- has only `multiselection`
+- used only in project anatomy
+```javascript
+{
+    "type": "tools-enum",
+    "key": "tools_env",
+    "label": "Tools"
+}
+```
+
+#### task-types-enum
+- enumeration of task types from the current project
+- enum values are not updated on the fly; modifications of task types on a project require a save and reset to be propagated to this enum
+- has `multiselection` set to `True` by default but it can be changed to `False` in the schema
+
+#### deadline_url-enum
+- deadline module specific enumerator using deadline system settings to fill its values
+- TODO: move this type to the deadline module
+
+### Inputs for setting value using Pure inputs
+- these inputs also have a required `"key"`
+- the attribute `"label"` is required in a few conditions
+  - when the item is marked `as_group` or when `use_label_wrap` is used
+- they use Pure inputs "as widgets"
+
+#### list
+- output is a list
+- items can be added and removed
+- items in the list must be of the same type
+- to wrap the item in a collapsible widget with a label on top set `use_label_wrap` to `True`
+  - when this is used `collapsible` and `collapsed` can be set (same as the `dict` item does)
+- the type of items is defined with the key `"object_type"`
+- there are 3 possible ways how to set the type:
+  1.) a dictionary with item modifiers (the `number` input has `minimum`, `maximum` and `decimals`); in that case the item type must be set as the value of `"type"` (example below)
+  2.) the item type name as a string without modifiers (e.g. [text](#text))
+  3.) an enhancement of 1.): there is also support for the `template` type, but be careful about endless loops of templates
+    - the goal of using `template` is to easily change the same item definitions in multiple lists
+
+1.) with item modifiers
+```javascript
+{
+    "type": "list",
+    "key": "exclude_ports",
+    "label": "Exclude ports",
+    "object_type": {
+        "type": "number", // number item type
+        "minimum": 1, // minimum modifier
+        "maximum": 65535 // maximum modifier
+    }
+}
+```
+
+2.) without modifiers
+```javascript
+{
+    "type": "list",
+    "key": "exclude_ports",
+    "label": "Exclude ports",
+    "object_type": "text"
+}
+```
+
+3.) with template definition
+```javascript
+// Schema of list item where template is used
+{
+    "type": "list",
+    "key": "menu_items",
+    "label": "Menu Items",
+    "object_type": {
+        "type": "template",
+        "name": "template_object_example"
+    }
+}
+
+// WARNING:
+// In this example the template uses itself, which will work in `list`
+// but may cause an issue in other entity types (e.g. `dict`).
+
+'template_object_example.json' :
+[
+    {
+        "type": "dict-conditional",
+        "use_label_wrap": true,
+        "collapsible": true,
+        "key": "menu_items",
+        "label": "Menu items",
+        "enum_key": "type",
+        "enum_label": "Type",
+        "enum_children": [
+            {
+                "key": "action",
+                "label": "Action",
+                "children": [
+                    {
+                        "type": "text",
+                        "key": "key",
+                        "label": "Key"
+                    }
+                ]
+            }, {
+                "key": "menu",
+                "label": "Menu",
+                "children": [
+                    {
+                        "key": "children",
+                        "label": "Children",
+                        "type": "list",
+                        "object_type": {
+                            "type": "template",
+                            "name": "template_object_example"
+                        }
+                    }
+                ]
+            }
+        ]
+    }
+]
+```
+
+#### dict-modifiable
+- one of the dictionary inputs; this one is used only as a value input
+- items in this input can be removed and added the same way as in the `list` input
+- value items in the dictionary must be of the same type
+- required keys may be defined under `"required_keys"`
+  - required keys must be defined as a list (e.g. `["key_1"]`) and are moved to the top
+  - these keys can't be removed or edited (it is possible to edit the label if the item is collapsible)
+- the type of items is defined with the key `"object_type"`
+  - there are 2 possible ways how to set the object type (examples below):
+    1. just a type name as a string without modifiers (e.g. `"text"`)
+    2. a full type with modifiers as a dictionary (the `number` input has `minimum`, `maximum` and `decimals`); in that case the item type must be set as the value of `"type"`
+- this input can be collapsible
+  - `"use_label_wrap"` must be set to `True` (default behavior)
+  - that can be set with key `"collapsible"` as `True`/`False` (Default: `True`)
+  - with key `"collapsed"` as `True`/`False` it can be set whether it is collapsed when the GUI is opened (Default: `False`)
+
+1. **Object type** without modifiers
+```javascript
+{
+    "type": "dict-modifiable",
+    "object_type": "text",
+    "is_group": true,
+    "key": "templates_mapping",
+    "label": "Muster - Templates mapping",
+    "is_file": true
+}
+```
+
+2. **Object type** with item modifiers
+```javascript
+{
+    "type": "dict-modifiable",
+    "object_type": {
+        "type": "number",
+        "minimum": 0,
+        "maximum": 300
+    },
+    "is_group": true,
+    "key": "templates_mapping",
+    "label": "Muster - Templates mapping",
+    "is_file": true
+}
+```
+
+#### path
+- input for paths, uses `path-input` internally
+- has 2 input modifiers: `"multiplatform"` and `"multipath"`
+  - `"multiplatform"` - adds `"windows"`, `"linux"` and `"darwin"` path inputs (result is a dictionary)
+  - `"multipath"` - makes it possible to enter multiple paths
+  - if both are enabled the result is a dictionary of lists
+
+```javascript
+{
+    "type": "path",
+    "key": "ffmpeg_path",
+    "label": "FFmpeg path",
+    "multiplatform": true,
+    "multipath": true
+}
+```
+
+#### list-strict
+- input for a strict number of items in a list
+- each child item can be a different type with different possible modifiers
+- it is possible to display them in a horizontal or vertical layout
+  - key `"horizontal"` as `True`/`False` (Default: `True`)
+- each child may have `"label"` defined, which is shown next to the input
+  - the label does not reflect modifications or overrides (TODO)
+- children items are defined under the key `"object_types"`, which is a list of dictionaries
+  - the key `"children"` is not used here because it is used for hierarchy validations in the schema
+- USAGE: for colors, transformations, etc. Custom numbers and different modifiers give the ability to define if a color is HUE or RGB, 0-255, 0-1, 0-100 etc.
+
+```javascript
+{
+    "type": "list-strict",
+    "key": "color",
+    "label": "Color",
+    "object_types": [
+        {
+            "label": "Red",
+            "type": "number",
+            "minimum": 0,
+            "maximum": 255,
+            "decimal": 0
+        }, {
+            "label": "Green",
+            "type": "number",
+            "minimum": 0,
+            "maximum": 255,
+            "decimal": 0
+        }, {
+            "label": "Blue",
+            "type": "number",
+            "minimum": 0,
+            "maximum": 255,
+            "decimal": 0
+        }, {
+            "label": "Alpha",
+            "type": "number",
+            "minimum": 0,
+            "maximum": 1,
+            "decimal": 6
+        }
+    ]
+}
+```
+
+#### color
+- pre-implemented entity to store and load color values
+- the entity stores and expects a list of 4 integers in range 0-255
+  - the integers represent RGBA [Red, Green, Blue, Alpha]
+- has the modifier `"use_alpha"`, which can be `True`/`False`
+  - alpha is always `255` if set to `False`, and the alpha slider is not visible in the UI
+
+```javascript
+{
+    "type": "color",
+    "key": "bg_color",
+    "label": "Background Color"
+}
+```
+
+### Anatomy
+Anatomy represents data stored on the project document. This item handles the **Project Anatomy**.
+
+#### anatomy
+- the entity is just an enhanced [dict](#dict) item
+- anatomy always has all keys overridden with overrides
+
+### Noninteractive items
+Items used only for UI purposes.
+
+#### label
+- adds a label with a note or explanation
+- it is possible to use html tags inside the label
+- set `word_wrap` to `true`/`false` if you want to enable word wrapping in the UI (default: `false`)
+
+```javascript
+{
+    "type": "label",
+    "label": "RED LABEL: Normal label"
+}
+```
+
+#### separator
+- legacy name is `splitter` (still usable)
+- visual separator of items (more a divider than a separator)
+
+```javascript
+{
+    "type": "separator"
+}
+```
+
+### Proxy wrappers
+- should wrap multiple inputs only visually
+- these do not have a `"key"` and do not allow the `"is_file"` or `"is_group"` modifiers to be enabled
+- can't be used as a widget (first item in e.g. `list`, `dict-modifiable`, etc.)
+
+#### form
+- wraps inputs into a form-like layout
+- should be used only for Pure inputs
+
+```javascript
+{
+    "type": "dict-form",
+    "children": [
+        {
+            "type": "text",
+            "key": "deadline_department",
+            "label": "Deadline department"
+        }, {
+            "type": "number",
+            "key": "deadline_priority",
+            "label": "Deadline priority"
+        }, {
+            ...
+        }
+    ]
+}
+```
+
+
+#### collapsible-wrap
+- wraps inputs into a collapsible widget
+  - looks like `dict` but does not hold `"key"`
+- should be used only for Pure inputs
+
+```javascript
+{
+    "type": "collapsible-wrap",
+    "label": "Collapsible example",
+    "children": [
+        {
+            "type": "text",
+            "key": "_example_input_collapsible",
+            "label": "Example input in collapsible wrapper"
+        }, {
+            ...
+        }
+    ]
+}
+```
+
+
+## How to add new settings
+Always start by modifying or adding a new schema and don't worry about values. When you think the schema is ready to use, launch OpenPype settings in development mode using `poetry run python ./start.py settings --dev` or the prepared script in `~/openpype/tools/run_settings(.sh|.ps1)`. Settings opened in development mode have the checkbox `Modify defaults` available in the bottom left corner. When it is checked, default values are modified and saved on `Save`. This is the recommended approach for creating default settings, instead of directly modifying the files.
+
+![Modify default settings](assets/settings_dev.png)
diff --git a/website/docs/module_ftrack.md b/website/docs/module_ftrack.md
index 667782754f..6d5529b512 100644
--- a/website/docs/module_ftrack.md
+++ b/website/docs/module_ftrack.md
@@ -13,7 +13,7 @@ Ftrack is currently the main project management option for OpenPype. This docume
 ## Prepare Ftrack for OpenPype
 ### Server URL
-If you want to connect Ftrack to OpenPype you might need to make few changes in Ftrack settings. These changes would take a long time to do manually, so we prepared a few Ftrack actions to help you out. First, you'll need to launch OpenPype settings, enable [Ftrack module](admin_settings_system.md#Ftrack), and enter the address to your Ftrack server. 
+If you want to connect Ftrack to OpenPype you might need to make a few changes in Ftrack settings. These changes would take a long time to do manually, so we prepared a few Ftrack actions to help you out. First, you'll need to launch OpenPype settings, enable [Ftrack module](admin_settings_system.md#Ftrack), and enter the address to your Ftrack server.
 ### Login
 Once your server is configured, restart OpenPype and you should be prompted to enter your [Ftrack credentials](artist_ftrack.md#How-to-use-Ftrack-in-OpenPype) to be able to run our Ftrack actions. If you are already logged in to Ftrack in your browser, it is enough to press `Ftrack login` and it will connect automatically.
@@ -26,7 +26,7 @@ You can only use our Ftrack Actions and publish to Ftrack if each artist is logg
 ### Custom Attributes
 After successfully connecting OpenPype with you Ftrack, you can right click on any project in Ftrack and you should see a bunch of actions available. The most important one is called `OpenPype Admin` and contains multiple options inside.
-To prepare Ftrack for working with OpenPype you'll need to run [OpenPype Admin - Create/Update Custom Attributes](manager_ftrack_actions.md#create-update-avalon-attributes), which creates and sets the Custom Attributes necessary for OpenPype to function. 
+To prepare Ftrack for working with OpenPype you'll need to run [OpenPype Admin - Create/Update Custom Attributes](manager_ftrack_actions.md#create-update-avalon-attributes), which creates and sets the Custom Attributes necessary for OpenPype to function.
@@ -34,7 +34,7 @@ To prepare Ftrack for working with OpenPype you'll need to run [OpenPype Admin -
 Ftrack Event Server is the key to automation of many tasks like _status change_, _thumbnail update_, _automatic synchronization to Avalon database_ and many more. Event server should run at all times to perform the required processing as it is not possible to catch some of them retrospectively with enough certainty.
 ### Running event server
-There are specific launch arguments for event server. With `openpype_console eventserver` you can launch event server but without prior preparation it will terminate immediately. The reason is that event server requires 3 pieces of information: _Ftrack server url_, _paths to events_ and _credentials (Username and API key)_. Ftrack server URL and Event path are set from OpenPype's environments by default, but the credentials must be done separatelly for security reasons.
+There are specific launch arguments for event server. With `openpype_console module ftrack eventserver` you can launch the event server, but without prior preparation it will terminate immediately. The reason is that the event server requires 3 pieces of information: _Ftrack server url_, _paths to events_ and _credentials (Username and API key)_. Ftrack server URL and Event path are set from OpenPype's environments by default, but the credentials must be provided separately for security reasons.
@@ -53,7 +53,7 @@ There are specific launch arguments for event server.
 - **`--ftrack-api-key "00000aaa-11bb-22cc-33dd-444444eeeee"`** : User's API key
 - `--ftrack-url "https://yourdomain.ftrackapp.com/"` : Ftrack server URL _(it is not needed to enter if you have set `FTRACK_SERVER` in OpenPype' environments)_
-So if you want to use OpenPype's environments then you can launch event server for first time with these arguments `openpype_console.exe eventserver --ftrack-user "my.username" --ftrack-api-key "00000aaa-11bb-22cc-33dd-444444eeeee" --store-credentials`. Since that time, if everything was entered correctly, you can launch event server with `openpype_console.exe eventserver`.
+So if you want to use OpenPype's environments then you can launch the event server for the first time with these arguments: `openpype_console.exe module ftrack eventserver --ftrack-user "my.username" --ftrack-api-key "00000aaa-11bb-22cc-33dd-444444eeeee" --store-credentials`. From then on, if everything was entered correctly, you can launch the event server with `openpype_console.exe module ftrack eventserver`.
@@ -72,7 +72,7 @@ We do not recommend setting your Ftrack user and api key environments in a persi
 ### Where to run event server
-We recommend you to run event server on stable server machine with ability to connect to Avalon database and Ftrack web server. Best practice we recommend is to run event server as service. It can be Windows or Linux.
+We recommend running the event server on a stable server machine with the ability to connect to the OpenPype database and the Ftrack web server. Best practice is to run the event server as a service. It can be Windows or Linux.
 :::important
 Event server should **not** run more than once! It may cause major issues.
- add content to the file: ```sh #!/usr/bin/env bash -export OPENPYPE_DEBUG=1 export OPENPYPE_MONGO= pushd /mnt/path/to/openpype -./openpype_console eventserver --ftrack-user --ftrack-api-key +./openpype_console module ftrack eventserver --ftrack-user --ftrack-api-key --debug ``` - change file permission: `sudo chmod 0755 /opt/openpype/run_event_server.sh` @@ -140,14 +139,13 @@ WantedBy=multi-user.target - create service file: `openpype-ftrack-eventserver.bat` -- add content to the service file: +- add content to the service file: ```sh @echo off -set OPENPYPE_DEBUG=1 set OPENPYPE_MONGO= pushd \\path\to\openpype -openpype_console.exe eventserver --ftrack-user --ftrack-api-key +openpype_console.exe module ftrack eventserver --ftrack-user --ftrack-api-key --debug ``` - download and install `nssm.cc` - create Windows service according to nssm.cc manual @@ -174,7 +172,7 @@ This event updates entities on their changes Ftrack. When new entity is created Deleting an entity by Ftrack's default is not processed for security reasons _(to delete entity use [Delete Asset/Subset action](manager_ftrack_actions.md#delete-asset-subset))_. ::: -### Synchronize Hierarchical and Entity Attributes +### Synchronize Hierarchical and Entity Attributes Auto-synchronization of hierarchical attributes from Ftrack entities. @@ -190,7 +188,7 @@ Change status of next task from `Not started` to `Ready` when previous task is a Multiple detailed rules for next task update can be configured in the settings. -### Delete Avalon ID from new entity +### Delete Avalon ID from new entity Is used to remove value from `Avalon/Mongo Id` Custom Attribute when entity is created. @@ -215,7 +213,7 @@ This event handler allows setting of different status to a first created Asset V This is useful for example if first version publish doesn't contain any actual reviewable work, but is only used for roundtrip conform check, in which case this version could receive status `pending conform` instead of standard `pending review` ### Update status on next task -Change status on next task by task types order when task status state changed to "Done". All tasks with the same Task mapping of next task status changes From → To. Some status can be ignored. +Change status on next task by task types order when task status state changed to "Done". All tasks with the same Task mapping of next task status changes From → To. Some status can be ignored. ## Publish plugins @@ -238,7 +236,7 @@ Add Ftrack Family: enabled #### Advanced adding if additional families present -In special cases adding 'ftrack' based on main family ('Families' set higher) is not enough. +In special cases adding 'ftrack' based on main family ('Families' set higher) is not enough. (For example upload to Ftrack for 'plate' main family should only happen if 'review' is contained in instance 'families', not added in other cases. ) -![Collect Ftrack Family](assets/ftrack/ftrack-collect-advanced.png) \ No newline at end of file +![Collect Ftrack Family](assets/ftrack/ftrack-collect-advanced.png) diff --git a/website/docs/system_introduction.md b/website/docs/system_introduction.md index 71c5d64aa8..05627b5359 100644 --- a/website/docs/system_introduction.md +++ b/website/docs/system_introduction.md @@ -17,7 +17,7 @@ various usage scenarios. You can find detailed breakdown of technical requirements [here](dev_requirements), but in general OpenPype should be able to operate in most studios fairly quickly. 
@@ -174,7 +172,7 @@ This event updates entities on their changes Ftrack. When new entity is created
Deleting an entity is not processed by default for security reasons _(to delete an entity use the [Delete Asset/Subset action](manager_ftrack_actions.md#delete-asset-subset))_.
:::

-### Synchronize Hierarchical and Entity Attributes 
+### Synchronize Hierarchical and Entity Attributes

Auto-synchronization of hierarchical attributes from Ftrack entities.
@@ -190,7 +188,7 @@ Change status of next task from `Not started` to `Ready` when previous task is a

Multiple detailed rules for the next task update can be configured in the settings.

-### Delete Avalon ID from new entity 
+### Delete Avalon ID from new entity

Is used to remove the value from the `Avalon/Mongo Id` Custom Attribute when an entity is created.
@@ -215,7 +213,7 @@ This event handler allows setting of different status to a first created Asset V
This is useful, for example, if the first version publish doesn't contain any actual reviewable work but is only used for a roundtrip conform check, in which case this version could receive the status `pending conform` instead of the standard `pending review`.

### Update status on next task
-Change status on next task by task types order when task status state changed to "Done". All tasks with the same Task mapping of next task status changes From → To. Some status can be ignored. 
+Change the status of the next task (following task type order) when a task's status switches to a "Done" state. The mapping of next-task status changes (From → To) is configurable, and some statuses can be ignored.

## Publish plugins
@@ -238,7 +236,7 @@ Add Ftrack Family: enabled

#### Advanced adding if additional families present

-In special cases adding 'ftrack' based on main family ('Families' set higher) is not enough. 
+In special cases, adding 'ftrack' based on the main family (the 'Families' filter set above) is not enough.
(For example, upload to Ftrack for the 'plate' main family should only happen if 'review' is contained in the instance 'families'; it is not added in other cases.)

-![Collect Ftrack Family](assets/ftrack/ftrack-collect-advanced.png)
\ No newline at end of file
+![Collect Ftrack Family](assets/ftrack/ftrack-collect-advanced.png)
diff --git a/website/docs/system_introduction.md b/website/docs/system_introduction.md
index 71c5d64aa8..05627b5359 100644
--- a/website/docs/system_introduction.md
+++ b/website/docs/system_introduction.md
@@ -17,7 +17,7 @@ various usage scenarios.

You can find a detailed breakdown of technical requirements [here](dev_requirements), but in general
OpenPype should be able to operate in most studios fairly quickly.
The main obstacles are usually related to workflows and habits that
-might now be fully compatible with what OpenPype is expecting or enforcing.
+might not be fully compatible with what OpenPype is expecting or enforcing.

It is recommended to go through the artist [key concepts](artist_concepts) to get an idea about the basics. Keep in mind that if you run into any workflows that are not supported, it's usually just because we haven't hit that particular case and it can most likely be added upon request.
@@ -48,24 +48,3 @@ to the table

- Some DCCs do not support using Environment variables in file paths. This makes it very hard to maintain full multiplatform compatibility as well as variable storage roots.
- Relying on a VPN connection and using it to work directly off network storage will be painfully slow.
-
-
-## Repositories
-
-### [OpenPype](https://github.com/pypeclub/pype)
-
-This is where the vast majority of the code that works with your data lives. It acts
-as Avalon-Config, if we're speaking in avalon terms.
-
-Avalon gives us the ability to work with a certain host, say Maya, in a standardized manner, but OpenPype defines **how** we work with all the data, allows most of the behavior to be configured on a very granular level and provides comprehensive build and installation tools for it.
-
-Thanks to that, we are able to maintain one codebase for the vast majority of the features across all our clients' deployments while keeping the option to tailor the pipeline to each individual studio.
-
-### [Avalon-core](https://github.com/pypeclub/avalon-core)
-
-Avalon-core is the heart of OpenPype. It provides the base functionality including key GUIs (albeit expanded and modified by us), database connection, standards for data structures, working with entities and some universal tools.
-
-Avalon is being actively developed and maintained by a community of studios and TDs from around the world, with the Pype Club team being an active contributor as well.
-
-Due to the extensive work we've done on OpenPype and the need to react quickly to production needs, we
-maintain our own fork of avalon-core, which is kept up to date with upstream changes as much as possible.
diff --git a/website/docs/upgrade_notes.md b/website/docs/upgrade_notes.md
deleted file mode 100644
index 8231cf997d..0000000000
--- a/website/docs/upgrade_notes.md
+++ /dev/null
@@ -1,165 +0,0 @@
----
-id: update_notes
-title: Update Notes
-sidebar_label: Update Notes
----
-
-
-## **Updating to 2.13.0** ##
-
-### MongoDB
-
-**Must**
-
-Due to changes in how tasks are stored in the database (we added task types and the possibility of more arbitrary data), we must take a few precautions when updating.
-1. Make sure that the ftrack event server with sync to avalon is NOT running during the update.
-2. Any project that is to be worked on with 2.13 must be synced from ftrack to avalon with the updated sync to avalon action, or using an updated event server sync to avalon event.
-
-If a 2.12 event server runs when trying to update the project sync with 2.13, it will override any changes.
-
-### Nuke Studio / hiero
-
-Make sure to re-generate pype tags and replace any `task` tags on your shots with the new ones. This will allow you to make multiple tasks of the same type, but with different task names, at the same time.
-
-### Nuke
-
-Due to a minor update to the nuke write node, artists will be prompted to update their write nodes before being able to publish any old shots.
There is a "repair" action for this in the publisher, so it doesn't have to be done manually. - - - - -## **Updating to 2.12.0** ## - -### Apps and tools - -**Must** - -run Create/Update Custom attributes action (to update custom attributes group) -check if studio has set custom intent values and move values to ~/config/presets/global/intent.json - -**Optional** - -Set true/false on application and tools by studio usage (eliminate app list in Ftrack and time for registering Ftrack ations) - - - - -## **Updating to 2.11.0** ## - -### Maya in deadline - -We added or own maya deadline plugin to make render management easier. It operates the same as standard mayaBatch in deadline, but allow us to separate Pype sumitted jobs from standard submitter. You'll need to follow this guide to update this [install pype deadline](https://pype.club/docs/admin_hosts#pype-dealine-supplement-code) - - - - -## **Updating to 2.9.0** ## - -### Review and Burnin PRESETS - -This release introduces a major update to working with review and burnin presets. They can now be much more granular and can target extremely specific usecases. The change is backwards compatible with previous format of review and burnin presets, however we highly recommend updating all the presets to the new format. Documentation on what this looks like can be found on pype main [documentation page](https://pype.club/docs/admin_presets_plugins#publishjson). - -### Multiroot and storages - -With the support of multiroot projects, we removed the old `storage.json` from configuration and replaced it with simpler `config/anatomy/roots.json`. This is a required change, but only needs to be done once per studio during the update to 2.9.0. [Read More](https://pype.club/docs/next/admin_config#roots) - - - - -## **Updating to 2.7.0** ## - -### Master Versions -To activate `master` version workflow you need to activate `integrateMasterVersion` plugin in the `config/presets/plugins/global/publish.json` - -``` -"IntegrateMasterVersion": {"enabled": true}, -``` - -### Ftrack - -Make sure that `intent` attributes in ftrack is set correctly. It should follow this setup unless you have your custom values -``` -{ - "label": "Intent", - "key": "intent", - "type": "enumerator", - "entity_type": "assetversion", - "group": "avalon", - "config": { - "multiselect": false, - "data": [ - {"test": "Test"}, - {"wip": "WIP"}, - {"final": "Final"} - ] - } -``` - - - - -## **Updating to 2.6.0** ## - -### Dev vs Prod - -If you want to differentiate between dev and prod deployments of pype, you need to add `config.ini` file to `pype-setup/pypeapp` folder with content. - -``` -[Default] -dev=true -``` - -### Ftrack - -You will have to log in to ftrack in pype after the update. You should be automatically prompted with the ftrack login window when you launch 2.6 release for the first time. - -Event server has to be restarted after the update to enable the ability to control it via action. - -### Presets - -There is a major change in the way how burnin presets are being stored. We simplified the preset format, however that means the currently running production configs need to be tweaked to match the new format. 
-
-:::note Example of converting burnin preset from 2.5 to 2.6
-
-2.5 burnin preset
-
-```
-"burnins":{
-    "TOP_LEFT": {
-        "function": "text",
-        "text": "{dd}/{mm}/{yyyy}"
-    },
-    "TOP_CENTERED": {
-        "function": "text",
-        "text": ""
-    },
-    "TOP_RIGHT": {
-        "function": "text",
-        "text": "v{version:0>3}"
-    },
-    "BOTTOM_LEFT": {
-        "function": "text",
-        "text": "{frame_start}-{current_frame}-{frame_end}"
-    },
-    "BOTTOM_CENTERED": {
-        "function": "text",
-        "text": "{asset}"
-    },
-    "BOTTOM_RIGHT": {
-        "function": "frame_numbers",
-        "text": "{username}"
-    }
-}
-```
-
-2.6 burnin preset
-```
-"burnins":{
-    "TOP_LEFT": "{dd}/{mm}/{yyyy}",
-    "TOP_CENTER": "",
-    "TOP_RIGHT": "v{version:0>3}",
-    "BOTTOM_LEFT": "{frame_start}-{current_frame}-{frame_end}",
-    "BOTTOM_CENTERED": "{asset}",
-    "BOTTOM_RIGHT": "{username}"
-}
-```
-:::
diff --git a/website/sidebars.js b/website/sidebars.js
index 9d60a5811c..920a3134f6 100644
--- a/website/sidebars.js
+++ b/website/sidebars.js
@@ -109,11 +109,7 @@ module.exports = {
        "admin_hosts_tvpaint"
      ],
    },
-    {
-      type: "category",
-      label: "Releases",
-      items: ["changelog", "update_notes"],
-    },
+    "admin_releases",
    {
      type: "category",
      collapsed: false,
@@ -152,6 +148,7 @@ module.exports = {
      "dev_build",
      "dev_testing",
      "dev_contribute",
+      "dev_settings",
      {
        type: "category",
        label: "Hosts integrations",
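To verify sidebar changes like the ones above locally, previewing the site should work along these lines, assuming the `website` folder uses the standard Docusaurus package scripts:

```sh
# Hypothetical local preview; assumes the standard Docusaurus scripts.
cd website
yarn install   # or: npm install
yarn start     # serves the docs with the updated sidebar at http://localhost:3000
```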