diff --git a/CHANGELOG.md b/CHANGELOG.md
index d2ae3c9eae..7f92fdc9f5 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -1,23 +1,80 @@
 # Changelog

+## [3.5.0-nightly.6](https://github.com/pypeclub/OpenPype/tree/HEAD)
+
+[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.4.1...HEAD)
+
+**Deprecated:**
+
+- Maya: Change mayaAscii family to mayaScene [\#2106](https://github.com/pypeclub/OpenPype/pull/2106)
+
+**🆕 New features**
+
+- Added running configurable disk mapping command before start of OP [\#2091](https://github.com/pypeclub/OpenPype/pull/2091)
+- SFTP provider [\#2073](https://github.com/pypeclub/OpenPype/pull/2073)
+- Maya: Validate setdress top group [\#2068](https://github.com/pypeclub/OpenPype/pull/2068)
+
+**🚀 Enhancements**
+
+- Settings UI: Project model refreshing and sorting [\#2104](https://github.com/pypeclub/OpenPype/pull/2104)
+- Create Read From Rendered - Disable Relative paths by default [\#2093](https://github.com/pypeclub/OpenPype/pull/2093)
+- Added choosing different dirmap mapping if workfile synched locally [\#2088](https://github.com/pypeclub/OpenPype/pull/2088)
+- General: Remove IdleManager module [\#2084](https://github.com/pypeclub/OpenPype/pull/2084)
+- Tray UI: Message box about missing settings defaults [\#2080](https://github.com/pypeclub/OpenPype/pull/2080)
+- Tray UI: Show menu where first click happened [\#2079](https://github.com/pypeclub/OpenPype/pull/2079)
+- Global: add global validators to settings [\#2078](https://github.com/pypeclub/OpenPype/pull/2078)
+- Use CRF for burnin when available [\#2070](https://github.com/pypeclub/OpenPype/pull/2070)
+- Project manager: Filter first item after selection of project [\#2069](https://github.com/pypeclub/OpenPype/pull/2069)
+- Nuke: Adding `still` image family workflow [\#2064](https://github.com/pypeclub/OpenPype/pull/2064)
+- Maya: validate authorized loaded plugins [\#2062](https://github.com/pypeclub/OpenPype/pull/2062)
+- Tools: add support for pyenv on windows [\#2051](https://github.com/pypeclub/OpenPype/pull/2051)
+
+**🐛 Bug fixes**
+
+- General: Disk mapping group [\#2120](https://github.com/pypeclub/OpenPype/pull/2120)
+- Hiero: publishing effect first time makes wrong resources path [\#2115](https://github.com/pypeclub/OpenPype/pull/2115)
+- Add startup script for Houdini Core. [\#2110](https://github.com/pypeclub/OpenPype/pull/2110)
+- TVPaint: Behavior name of loop also accept repeat [\#2109](https://github.com/pypeclub/OpenPype/pull/2109)
+- Ftrack: Project settings save custom attributes skip unknown attributes [\#2103](https://github.com/pypeclub/OpenPype/pull/2103)
+- Fix broken import in sftp provider [\#2100](https://github.com/pypeclub/OpenPype/pull/2100)
+- Global: Fix docstring on publish plugin extract review [\#2097](https://github.com/pypeclub/OpenPype/pull/2097)
+- General: Cloud mongo ca certificate issue [\#2095](https://github.com/pypeclub/OpenPype/pull/2095)
+- TVPaint: Creator use context from workfile [\#2087](https://github.com/pypeclub/OpenPype/pull/2087)
+- Blender: fix texture missing when publishing blend files [\#2085](https://github.com/pypeclub/OpenPype/pull/2085)
+- General: Startup validations oiio tool path fix on linux [\#2083](https://github.com/pypeclub/OpenPype/pull/2083)
+- Deadline: Collect deadline server does not check existence of deadline key [\#2082](https://github.com/pypeclub/OpenPype/pull/2082)
+- Blender: fixed Curves with modifiers in Rigs [\#2081](https://github.com/pypeclub/OpenPype/pull/2081)
+- Maya: Fix multi-camera renders [\#2065](https://github.com/pypeclub/OpenPype/pull/2065)
+- Fix Sync Queue when project disabled [\#2063](https://github.com/pypeclub/OpenPype/pull/2063)
+
+**Merged pull requests:**
+
+- Blender: Fix NoneType error when animation\_data is missing for a rig [\#2101](https://github.com/pypeclub/OpenPype/pull/2101)
+- Delivery Action Files Sequence fix [\#2096](https://github.com/pypeclub/OpenPype/pull/2096)
+- Bump pywin32 from 300 to 301 [\#2086](https://github.com/pypeclub/OpenPype/pull/2086)
+- Nuke UI scaling [\#2077](https://github.com/pypeclub/OpenPype/pull/2077)
+
 ## [3.4.1](https://github.com/pypeclub/OpenPype/tree/3.4.1) (2021-09-23)

-[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.4.0...3.4.1)
+[Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.4.1-nightly.1...3.4.1)
+
+**🆕 New features**
+
+- Settings: Flag project as deactivated and hide from tools' view [\#2008](https://github.com/pypeclub/OpenPype/pull/2008)

 **🚀 Enhancements**

 - General: Startup validations [\#2054](https://github.com/pypeclub/OpenPype/pull/2054)
 - Nuke: proxy mode validator [\#2052](https://github.com/pypeclub/OpenPype/pull/2052)
-- Ftrack: Removed ftrack interface [\#2049](https://github.com/pypeclub/OpenPype/pull/2049)
 - Settings UI: Deferred set value on entity [\#2044](https://github.com/pypeclub/OpenPype/pull/2044)
 - Loader: Families filtering [\#2043](https://github.com/pypeclub/OpenPype/pull/2043)
 - Settings UI: Project view enhancements [\#2042](https://github.com/pypeclub/OpenPype/pull/2042)
 - Settings for Nuke IncrementScriptVersion [\#2039](https://github.com/pypeclub/OpenPype/pull/2039)
+- Loader & Library loader: Use tools from OpenPype [\#2038](https://github.com/pypeclub/OpenPype/pull/2038)
 - Adding predefined project folders creation in PM [\#2030](https://github.com/pypeclub/OpenPype/pull/2030)
 - WebserverModule: Removed interface of webserver module [\#2028](https://github.com/pypeclub/OpenPype/pull/2028)
 - TimersManager: Removed interface of timers manager [\#2024](https://github.com/pypeclub/OpenPype/pull/2024)
 - Feature Maya import asset from scene inventory [\#2018](https://github.com/pypeclub/OpenPype/pull/2018)
-- Settings: Flag project as deactivated and hide from tools' view [\#2008](https://github.com/pypeclub/OpenPype/pull/2008)

 **🐛 Bug fixes**
@@ -35,14 +92,18 @@
 [Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.4.0-nightly.6...3.4.0)

+### 📖 Documentation
+
+- Documentation: Ftrack launch arguments update [\#2014](https://github.com/pypeclub/OpenPype/pull/2014)
+
 **🆕 New features**

 - Nuke: Compatibility with Nuke 13 [\#2003](https://github.com/pypeclub/OpenPype/pull/2003)

 **🚀 Enhancements**

+- Ftrack: Removed ftrack interface [\#2049](https://github.com/pypeclub/OpenPype/pull/2049)
 - Added possibility to configure of synchronization of workfile version… [\#2041](https://github.com/pypeclub/OpenPype/pull/2041)
-- Loader & Library loader: Use tools from OpenPype [\#2038](https://github.com/pypeclub/OpenPype/pull/2038)
 - General: Task types in profiles [\#2036](https://github.com/pypeclub/OpenPype/pull/2036)
 - Console interpreter: Handle invalid sizes on initialization [\#2022](https://github.com/pypeclub/OpenPype/pull/2022)
 - Ftrack: Show OpenPype versions in event server status [\#2019](https://github.com/pypeclub/OpenPype/pull/2019)
@@ -54,15 +115,6 @@
 - Configurable items for providers without Settings [\#1987](https://github.com/pypeclub/OpenPype/pull/1987)
 - Global: Example addons [\#1986](https://github.com/pypeclub/OpenPype/pull/1986)
 - Standalone Publisher: Extract harmony zip handle workfile template [\#1982](https://github.com/pypeclub/OpenPype/pull/1982)
-- Settings UI: Number sliders [\#1978](https://github.com/pypeclub/OpenPype/pull/1978)
-- Workfiles: Support more workfile templates [\#1966](https://github.com/pypeclub/OpenPype/pull/1966)
-- Launcher: Fix crashes on action click [\#1964](https://github.com/pypeclub/OpenPype/pull/1964)
-- Settings: Minor fixes in UI and missing default values [\#1963](https://github.com/pypeclub/OpenPype/pull/1963)
-- Blender: Toggle system console works on windows [\#1962](https://github.com/pypeclub/OpenPype/pull/1962)
-- Global: Settings defined by Addons/Modules [\#1959](https://github.com/pypeclub/OpenPype/pull/1959)
-- CI: change release numbering triggers [\#1954](https://github.com/pypeclub/OpenPype/pull/1954)
-- Global: Avalon Host name collector [\#1949](https://github.com/pypeclub/OpenPype/pull/1949)
-- OpenPype: Add version validation and `--headless` mode and update progress 🔄 [\#1939](https://github.com/pypeclub/OpenPype/pull/1939)

 **🐛 Bug fixes**
@@ -77,55 +129,15 @@
 - Nuke: last version from path gets correct version [\#1990](https://github.com/pypeclub/OpenPype/pull/1990)
 - nuke, resolve, hiero: precollector order less than 0.5 [\#1984](https://github.com/pypeclub/OpenPype/pull/1984)
 - Last workfile with multiple work templates [\#1981](https://github.com/pypeclub/OpenPype/pull/1981)
-- Collectors order [\#1977](https://github.com/pypeclub/OpenPype/pull/1977)
-- Stop timer was within validator order range. [\#1975](https://github.com/pypeclub/OpenPype/pull/1975)
-- Ftrack: arrow submodule has https url source [\#1974](https://github.com/pypeclub/OpenPype/pull/1974)
-- Ftrack: Fix hosts attribute in collect ftrack username [\#1972](https://github.com/pypeclub/OpenPype/pull/1972)
-- Deadline: Houdini plugins in different hierarchy [\#1970](https://github.com/pypeclub/OpenPype/pull/1970)
-- Removed deprecated submodules [\#1967](https://github.com/pypeclub/OpenPype/pull/1967)
-- Global: ExtractJpeg can handle filepaths with spaces [\#1961](https://github.com/pypeclub/OpenPype/pull/1961)
-
-### 📖 Documentation
-
-- Documentation: Ftrack launch arguments update [\#2014](https://github.com/pypeclub/OpenPype/pull/2014)
-- Nuke Quick Start / Tutorial [\#1952](https://github.com/pypeclub/OpenPype/pull/1952)

 ## [3.3.1](https://github.com/pypeclub/OpenPype/tree/3.3.1) (2021-08-20)

 [Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.3.1-nightly.1...3.3.1)

-**🐛 Bug fixes**
-
-- TVPaint: Fixed rendered frame indexes [\#1946](https://github.com/pypeclub/OpenPype/pull/1946)
-- Maya: Menu actions fix [\#1945](https://github.com/pypeclub/OpenPype/pull/1945)
-- standalone: editorial shared object problem [\#1941](https://github.com/pypeclub/OpenPype/pull/1941)
-- Bugfix nuke deadline app name [\#1928](https://github.com/pypeclub/OpenPype/pull/1928)
-
 ## [3.3.0](https://github.com/pypeclub/OpenPype/tree/3.3.0) (2021-08-17)

 [Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.3.0-nightly.11...3.3.0)

-**🆕 New features**
-
-- Settings UI: Breadcrumbs in settings [\#1932](https://github.com/pypeclub/OpenPype/pull/1932)
-
-**🚀 Enhancements**
-
-- Python console interpreter [\#1940](https://github.com/pypeclub/OpenPype/pull/1940)
-- Global: Updated logos and Default settings [\#1927](https://github.com/pypeclub/OpenPype/pull/1927)
-- Check for missing ✨ Python when using `pyenv` [\#1925](https://github.com/pypeclub/OpenPype/pull/1925)
-
-**🐛 Bug fixes**
-
-- Fix - ftrack family was added incorrectly in some cases [\#1935](https://github.com/pypeclub/OpenPype/pull/1935)
-- Fix - Deadline publish on Linux started Tray instead of headless publishing [\#1930](https://github.com/pypeclub/OpenPype/pull/1930)
-- Maya: Validate Model Name - repair accident deletion in settings defaults [\#1929](https://github.com/pypeclub/OpenPype/pull/1929)
-- Nuke: submit to farm failed due `ftrack` family remove [\#1926](https://github.com/pypeclub/OpenPype/pull/1926)
-
-**Merged pull requests:**
-
-- Fix - make AE workfile publish to Ftrack configurable [\#1937](https://github.com/pypeclub/OpenPype/pull/1937)
-
 ## [3.2.0](https://github.com/pypeclub/OpenPype/tree/3.2.0) (2021-07-13)

 [Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.2.0-nightly.7...3.2.0)
- -""" -import sys +"""Tools used in **Igniter** GUI.""" import os -from typing import Dict, Union +from typing import Union from urllib.parse import urlparse, parse_qs from pathlib import Path import platform +import certifi from pymongo import MongoClient from pymongo.errors import ( ServerSelectionTimeoutError, @@ -22,89 +16,32 @@ from pymongo.errors import ( ) -def decompose_url(url: str) -> Dict: - """Decompose mongodb url to its separate components. - - Args: - url (str): Mongodb url. - - Returns: - dict: Dictionary of components. +def should_add_certificate_path_to_mongo_url(mongo_url): + """Check if should add ca certificate to mongo url. + Since 30.9.2021 cloud mongo requires newer certificates that are not + available on most of workstation. This adds path to certifi certificate + which is valid for it. To add the certificate path url must have scheme + 'mongodb+srv' or has 'ssl=true' or 'tls=true' in url query. """ - components = { - "scheme": None, - "host": None, - "port": None, - "username": None, - "password": None, - "auth_db": None - } + parsed = urlparse(mongo_url) + query = parse_qs(parsed.query) + lowered_query_keys = set(key.lower() for key in query.keys()) + add_certificate = False + # Check if url 'ssl' or 'tls' are set to 'true' + for key in ("ssl", "tls"): + if key in query and "true" in query["ssl"]: + add_certificate = True + break - result = urlparse(url) - if result.scheme is None: - _url = "mongodb://{}".format(url) - result = urlparse(_url) + # Check if url contains 'mongodb+srv' + if not add_certificate and parsed.scheme == "mongodb+srv": + add_certificate = True - components["scheme"] = result.scheme - components["host"] = result.hostname - try: - components["port"] = result.port - except ValueError: - raise RuntimeError("invalid port specified") - components["username"] = result.username - components["password"] = result.password - - try: - components["auth_db"] = parse_qs(result.query)['authSource'][0] - except KeyError: - # no auth db provided, mongo will use the one we are connecting to - pass - - return components - - -def compose_url(scheme: str = None, - host: str = None, - username: str = None, - password: str = None, - port: int = None, - auth_db: str = None) -> str: - """Compose mongodb url from its individual components. 
diff --git a/openpype/cli.py b/openpype/cli.py
index 18cc1c63cd..c69407e295 100644
--- a/openpype/cli.py
+++ b/openpype/cli.py
@@ -283,3 +283,18 @@ def run(script):
     args_string = " ".join(args[1:])
     print(f"... running: {script} {args_string}")
     runpy.run_path(script, run_name="__main__", )
+
+
+@main.command()
+@click.argument("folder", nargs=-1)
+@click.option("-m",
+              "--mark",
+              help="Run tests marked by",
+              default=None)
+@click.option("-p",
+              "--pyargs",
+              help="Run tests from package",
+              default=None)
+def runtests(folder, mark, pyargs):
+    """Run all automatic tests after proper initialization via start.py"""
+    PypeCommands().run_tests(folder, mark, pyargs)
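The new `runtests` command simply forwards its arguments to `PypeCommands().run_tests`, whose body is not shown in this diff. A hedged sketch of what that forwarding could look like if it delegates to pytest (the pytest mapping below is an assumption, not the actual implementation):

```python
import pytest

def run_tests(folder, mark=None, pyargs=None):
    # Hypothetical mapping of the click options onto pytest arguments.
    args = list(folder) if folder else []
    if mark:
        args.extend(["-m", mark])          # pytest marker expression
    if pyargs:
        args.extend(["--pyargs", pyargs])  # import tests from an installed package
    pytest.main(args)
```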
diff --git a/openpype/hooks/pre_global_host_data.py b/openpype/hooks/pre_global_host_data.py
index c669d91ad5..b32fb5e44a 100644
--- a/openpype/hooks/pre_global_host_data.py
+++ b/openpype/hooks/pre_global_host_data.py
@@ -43,6 +43,8 @@ class GlobalHostDataHook(PreLaunchHook):
             "env": self.launch_context.env,
+            "last_workfile_path": self.data.get("last_workfile_path"),
+
             "log": self.log
         })
diff --git a/openpype/hosts/blender/plugins/load/load_rig.py b/openpype/hosts/blender/plugins/load/load_rig.py
index 5573c081e1..6062c293df 100644
--- a/openpype/hosts/blender/plugins/load/load_rig.py
+++ b/openpype/hosts/blender/plugins/load/load_rig.py
@@ -66,12 +66,16 @@ class BlendRigLoader(plugin.AssetLoader):
         objects = []
         nodes = list(container.children)

-        for obj in nodes:
-            obj.parent = asset_group
+        allowed_types = ['ARMATURE', 'MESH']

         for obj in nodes:
-            objects.append(obj)
-            nodes.extend(list(obj.children))
+            if obj.type in allowed_types:
+                obj.parent = asset_group
+
+        for obj in nodes:
+            if obj.type in allowed_types:
+                objects.append(obj)
+                nodes.extend(list(obj.children))

         objects.reverse()
@@ -107,7 +111,8 @@
             if action is not None:
                 local_obj.animation_data.action = action
-            elif local_obj.animation_data.action is not None:
+            elif (local_obj.animation_data and
+                  local_obj.animation_data.action is not None):
                 plugin.prepare_data(
                     local_obj.animation_data.action, group_name)
@@ -126,7 +131,30 @@
         objects.reverse()

-        bpy.data.orphans_purge(do_local_ids=False)
+        curves = [obj for obj in data_to.objects if obj.type == 'CURVE']
+
+        for curve in curves:
+            local_obj = plugin.prepare_data(curve, group_name)
+            plugin.prepare_data(local_obj.data, group_name)
+
+            local_obj.use_fake_user = True
+
+            for mod in local_obj.modifiers:
+                mod_target_name = mod.object.name
+                mod.object = bpy.data.objects.get(
+                    f"{group_name}:{mod_target_name}")
+
+            if not local_obj.get(AVALON_PROPERTY):
+                local_obj[AVALON_PROPERTY] = dict()
+
+            avalon_info = local_obj[AVALON_PROPERTY]
+            avalon_info.update({"container_name": group_name})
+
+            local_obj.parent = asset_group
+            objects.append(local_obj)
+
+        while bpy.data.orphans_purge(do_local_ids=False):
+            pass

         bpy.ops.object.select_all(action='DESELECT')
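One detail worth calling out in the rig loader change above: `bpy.data.orphans_purge()` returns the number of data-blocks it removed, and removing an orphan can orphan that block's own dependencies, which is why the single call became a loop. A minimal sketch of the idea:

```python
import bpy

# Purge repeatedly: freeing an object can orphan its mesh, freeing the
# mesh can orphan its materials, and so on. Stop once a pass removes
# nothing (orphans_purge returns 0).
while bpy.data.orphans_purge(do_local_ids=False):
    pass
```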
diff --git a/openpype/hosts/blender/plugins/publish/extract_blend.py b/openpype/hosts/blender/plugins/publish/extract_blend.py
index 6687c9fe76..e880b1bde0 100644
--- a/openpype/hosts/blender/plugins/publish/extract_blend.py
+++ b/openpype/hosts/blender/plugins/publish/extract_blend.py
@@ -28,6 +28,16 @@ class ExtractBlend(openpype.api.Extractor):
         for obj in instance:
             data_blocks.add(obj)
+            # Pack used images in the blend files.
+            if obj.type == 'MESH':
+                for material_slot in obj.material_slots:
+                    mat = material_slot.material
+                    if mat and mat.use_nodes:
+                        tree = mat.node_tree
+                        if tree.type == 'SHADER':
+                            for node in tree.nodes:
+                                if node.bl_idname == 'ShaderNodeTexImage':
+                                    node.image.pack()

         bpy.data.libraries.write(filepath, data_blocks)
diff --git a/openpype/hosts/houdini/startup/scripts/houdinicore.py b/openpype/hosts/houdini/startup/scripts/houdinicore.py
new file mode 100644
index 0000000000..4233d68c15
--- /dev/null
+++ b/openpype/hosts/houdini/startup/scripts/houdinicore.py
@@ -0,0 +1,9 @@
+from avalon import api, houdini
+
+
+def main():
+    print("Installing OpenPype ...")
+    api.install(houdini)
+
+
+main()
diff --git a/openpype/hosts/maya/api/__init__.py b/openpype/hosts/maya/api/__init__.py
index 3412803714..725a075417 100644
--- a/openpype/hosts/maya/api/__init__.py
+++ b/openpype/hosts/maya/api/__init__.py
@@ -64,14 +64,23 @@
 def process_dirmap(project_settings):
     # type: (dict) -> None
     """Go through all paths in Settings and set them using `dirmap`.

+    If the artist has Site Sync enabled, take the dirmap mapping directly
+    from Local Settings when the artist is syncing a workfile locally.
+
     Args:
         project_settings (dict): Settings for current project.

     """
-    if not project_settings["maya"].get("maya-dirmap"):
+    local_mapping = _get_local_sync_dirmap(project_settings)
+    if not project_settings["maya"].get("maya-dirmap") and not local_mapping:
         return
-    mapping = project_settings["maya"]["maya-dirmap"]["paths"] or {}
-    mapping_enabled = project_settings["maya"]["maya-dirmap"]["enabled"]
+
+    mapping = local_mapping or \
+        project_settings["maya"]["maya-dirmap"]["paths"] \
+        or {}
+    mapping_enabled = project_settings["maya"]["maya-dirmap"]["enabled"] \
+        or bool(local_mapping)
+
     if not mapping or not mapping_enabled:
         return
     if mapping.get("source-path") and mapping_enabled is True:
@@ -94,10 +103,72 @@
         continue


+def _get_local_sync_dirmap(project_settings):
+    """
+    Return dirmap if sync to local project is enabled.
+
+    Only valid mapping is from roots of remote site to local site set in
+    Local Settings.
+
+    Args:
+        project_settings (dict)
+    Returns:
+        dict : { "source-path": [XXX], "destination-path": [YYYY]}
+    """
+    import json
+    mapping = {}
+
+    if not project_settings["global"]["sync_server"]["enabled"]:
+        log.debug("Site Sync not enabled")
+        return mapping
+
+    from openpype.settings.lib import get_site_local_overrides
+    from openpype.modules import ModulesManager
+
+    manager = ModulesManager()
+    sync_module = manager.modules_by_name["sync_server"]
+
+    project_name = os.getenv("AVALON_PROJECT")
+    sync_settings = sync_module.get_sync_project_setting(
+        os.getenv("AVALON_PROJECT"), exclude_locals=False, cached=False)
+    log.debug(json.dumps(sync_settings, indent=4))
+
+    active_site = sync_module.get_local_normalized_site(
+        sync_module.get_active_site(project_name))
+    remote_site = sync_module.get_local_normalized_site(
+        sync_module.get_remote_site(project_name))
+    log.debug("active {} - remote {}".format(active_site, remote_site))
+
+    if active_site == "local" \
+            and project_name in sync_module.get_enabled_projects()\
+            and active_site != remote_site:
+        overrides = get_site_local_overrides(os.getenv("AVALON_PROJECT"),
+                                             active_site)
+        for root_name, value in overrides.items():
+            if os.path.isdir(value):
+                try:
+                    mapping["destination-path"] = [value]
+                    mapping["source-path"] = [sync_settings["sites"]\
+                                             [remote_site]\
+                                             ["root"]\
+                                             [root_name]]
+                except (IndexError, KeyError):
+                    # missing corresponding destination path
+                    log.debug("overrides: {}".format(overrides))
+                    log.error(
+                        ("invalid dirmap mapping, missing corresponding"
+                         " destination directory."))
+                    break
+
+    log.debug("local sync mapping:: {}".format(mapping))
+    return mapping
+
+
 def uninstall():
     pyblish.deregister_plugin_path(PUBLISH_PATH)
     avalon.deregister_plugin_path(avalon.Loader, LOAD_PATH)
     avalon.deregister_plugin_path(avalon.Creator, CREATE_PATH)
+    avalon.deregister_plugin_path(avalon.InventoryAction, INVENTORY_PATH)

     menu.uninstall()
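To make the mapping shape concrete: `_get_local_sync_dirmap` returns parallel `source-path`/`destination-path` lists, and `process_dirmap` applies those pairs through Maya's `dirmap` command. A minimal sketch with made-up paths:

```python
from maya import cmds

# Shape returned by _get_local_sync_dirmap (paths are examples):
mapping = {
    "source-path": ["P:/projects"],             # remote site root
    "destination-path": ["C:/local/projects"],  # local override root
}

cmds.dirmap(enable=True)
for src, dst in zip(mapping["source-path"], mapping["destination-path"]):
    cmds.dirmap(mapDirectory=(src, dst))
```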
diff --git a/openpype/hosts/maya/api/lib_renderproducts.py b/openpype/hosts/maya/api/lib_renderproducts.py
index fb99584c5d..4983109d58 100644
--- a/openpype/hosts/maya/api/lib_renderproducts.py
+++ b/openpype/hosts/maya/api/lib_renderproducts.py
@@ -114,6 +114,8 @@ class RenderProduct(object):
     aov = attr.ib(default=None)  # source aov
     driver = attr.ib(default=None)  # source driver
     multipart = attr.ib(default=False)  # multichannel file
+    camera = attr.ib(default=None)  # used only when rendering
+                                    # from multiple cameras


 def get(layer, render_instance=None):
@@ -183,6 +185,16 @@
         self.layer_data = self._get_layer_data()
         self.layer_data.products = self.get_render_products()

+    def has_camera_token(self):
+        # type: () -> bool
+        """Check if camera token is in image prefix.
+
+        Returns:
+            bool: True/False if camera token is present.
+
+        """
+        return "<camera>" in self.layer_data.filePrefix.lower()
+
     @abstractmethod
     def get_render_products(self):
         """To be implemented by renderer class.
@@ -307,7 +319,7 @@
         # Deadline allows submitting renders with a custom frame list
        # to support those cases we might want to allow 'custom frames'
         # to be overridden to `ExpectFiles` class?
-        layer_data = LayerMetadata(
+        return LayerMetadata(
             frameStart=int(self.get_render_attribute("startFrame")),
             frameEnd=int(self.get_render_attribute("endFrame")),
             frameStep=int(self.get_render_attribute("byFrameStep")),
@@ -321,7 +333,6 @@
             defaultExt=self._get_attr("defaultRenderGlobals.imfPluginKey"),
             filePrefix=file_prefix
         )
-        return layer_data

     def _generate_file_sequence(
             self, layer_data,
@@ -330,7 +341,7 @@
             force_cameras=None):
         # type: (LayerMetadata, str, str, list) -> list
         expected_files = []
-        cameras = force_cameras if force_cameras else layer_data.cameras
+        cameras = force_cameras or layer_data.cameras
         ext = force_ext or layer_data.defaultExt
         for cam in cameras:
             file_prefix = layer_data.filePrefix
@@ -361,8 +372,8 @@
         )
         return expected_files

-    def get_files(self, product, camera):
-        # type: (RenderProduct, str) -> list
+    def get_files(self, product):
+        # type: (RenderProduct) -> list
         """Return list of expected files.

         It will translate render token strings ('<RenderPass>', etc.) to
@@ -373,7 +384,6 @@
         Args:
             product (RenderProduct): Render product to be used for file
                 generation.
-            camera (str): Camera name.

         Returns:
             List of files
@@ -383,7 +393,7 @@
             self.layer_data,
             force_aov_name=product.productName,
             force_ext=product.ext,
-            force_cameras=[camera]
+            force_cameras=[product.camera]
         )

     def get_renderable_cameras(self):
@@ -460,15 +470,21 @@
         return prefix

-    def _get_aov_render_products(self, aov):
+    def _get_aov_render_products(self, aov, cameras=None):
         """Return all render products for the AOV"""
-        products = list()
+        products = []
         aov_name = self._get_attr(aov, "name")
         ai_drivers = cmds.listConnections("{}.outputs".format(aov),
                                           source=True,
                                           destination=False,
                                           type="aiAOVDriver") or []
+        if not cameras:
+            cameras = [
+                self.sanitize_camera_name(
+                    self.get_renderable_cameras()[0]
+                )
+            ]

         for ai_driver in ai_drivers:
             # todo: check aiAOVDriver.prefix as it could have
@@ -497,30 +513,37 @@
                 name = "beauty"

             # Support Arnold light groups for AOVs
-            # Global AOV: When disabled the main layer is not written: `{pass}`
+            # Global AOV: When disabled the main layer is
+            #             not written: `{pass}`
             # All Light Groups: When enabled, a `{pass}_lgroups` file is
-            #                   written and is always merged into a single file
+            #                   written and is always merged into a
+            #                   single file
-            # Light Groups List: When set, a product per light group is written
+            # Light Groups List: When set, a product per light
+            #                    group is written
             # e.g. {pass}_front, {pass}_rim
             global_aov = self._get_attr(aov, "globalAov")
             if global_aov:
-                product = RenderProduct(productName=name,
-                                        ext=ext,
-                                        aov=aov_name,
-                                        driver=ai_driver)
-                products.append(product)
+                for camera in cameras:
+                    product = RenderProduct(productName=name,
+                                            ext=ext,
+                                            aov=aov_name,
+                                            driver=ai_driver,
+                                            camera=camera)
+                    products.append(product)

             all_light_groups = self._get_attr(aov, "lightGroups")
             if all_light_groups:
                 # All light groups are enabled. A single multipart
                 # Render Product
-                product = RenderProduct(productName=name + "_lgroups",
-                                        ext=ext,
-                                        aov=aov_name,
-                                        driver=ai_driver,
-                                        # Always multichannel output
-                                        multipart=True)
-                products.append(product)
+                for camera in cameras:
+                    product = RenderProduct(productName=name + "_lgroups",
+                                            ext=ext,
+                                            aov=aov_name,
+                                            driver=ai_driver,
+                                            # Always multichannel output
+                                            multipart=True,
+                                            camera=camera)
+                    products.append(product)
             else:
                 value = self._get_attr(aov, "lightGroupsList")
                 if not value:
@@ -529,11 +552,15 @@
                 for light_group in selected_light_groups:
                     # Render Product per selected light group
                     aov_light_group_name = "{}_{}".format(name, light_group)
-                    product = RenderProduct(productName=aov_light_group_name,
-                                            aov=aov_name,
-                                            driver=ai_driver,
-                                            ext=ext)
-                    products.append(product)
+                    for camera in cameras:
+                        product = RenderProduct(
+                            productName=aov_light_group_name,
+                            aov=aov_name,
+                            driver=ai_driver,
+                            ext=ext,
+                            camera=camera
+                        )
+                        products.append(product)

         return products
@@ -556,17 +583,26 @@
             # anyway.
             return []

-        default_ext = self._get_attr("defaultRenderGlobals.imfPluginKey")
-        beauty_product = RenderProduct(productName="beauty",
-                                       ext=default_ext,
-                                       driver="defaultArnoldDriver")
+        # check if camera token is in prefix. If so, and we have list of
+        # renderable cameras, generate render product for each and every
+        # of them.
+        cameras = [
+            self.sanitize_camera_name(c)
+            for c in self.get_renderable_cameras()
+        ]

+        default_ext = self._get_attr("defaultRenderGlobals.imfPluginKey")
+        beauty_products = [RenderProduct(
+            productName="beauty",
+            ext=default_ext,
+            driver="defaultArnoldDriver",
+            camera=camera) for camera in cameras]

         # AOVs > Legacy > Maya Render View > Mode
         aovs_enabled = bool(
             self._get_attr("defaultArnoldRenderOptions.aovMode")
         )
         if not aovs_enabled:
-            return [beauty_product]
+            return beauty_products

         # Common > File Output > Merge AOVs or <renderpass>
         # We don't need to check for Merge AOVs due to overridden
@@ -575,8 +611,9 @@
             "<renderpass>" in self.layer_data.filePrefix.lower()
         )
         if not has_renderpass_token:
-            beauty_product.multipart = True
-            return [beauty_product]
+            for product in beauty_products:
+                product.multipart = True
+            return beauty_products

         # AOVs are set to be rendered separately. We should expect
         # <renderpass> token in path.
@@ -598,14 +635,14 @@
                 continue

             # For now stick to the legacy output format.
-            aov_products = self._get_aov_render_products(aov)
+            aov_products = self._get_aov_render_products(aov, cameras)
             products.extend(aov_products)

-        if not any(product.aov == "RGBA" for product in products):
+        if all(product.aov != "RGBA" for product in products):
             # Append default 'beauty' as this is arnolds default.
             # However, it is excluded whenever a RGBA pass is enabled.
             # For legibility add the beauty layer as first entry
-            products.insert(0, beauty_product)
+            products += beauty_products

         # TODO: Output Denoising AOVs?
@@ -670,6 +707,11 @@ class RenderProductsVray(ARenderProducts):
             # anyway.
             return []

+        cameras = [
+            self.sanitize_camera_name(c)
+            for c in self.get_renderable_cameras()
+        ]
+
         image_format_str = self._get_attr("vraySettings.imageFormatStr")
         default_ext = image_format_str
         if default_ext in {"exr (multichannel)", "exr (deep)"}:
@@ -680,13 +722,21 @@
         # add beauty as default when not disabled
         dont_save_rgb = self._get_attr("vraySettings.dontSaveRgbChannel")
         if not dont_save_rgb:
-            products.append(RenderProduct(productName="", ext=default_ext))
+            for camera in cameras:
+                products.append(
+                    RenderProduct(productName="",
+                                  ext=default_ext,
+                                  camera=camera))

         # separate alpha file
         separate_alpha = self._get_attr("vraySettings.separateAlpha")
         if separate_alpha:
-            products.append(RenderProduct(productName="Alpha",
-                                          ext=default_ext))
+            for camera in cameras:
+                products.append(
+                    RenderProduct(productName="Alpha",
+                                  ext=default_ext,
+                                  camera=camera)
+                )

         if image_format_str == "exr (multichannel)":
             # AOVs are merged in m-channel file, only main layer is rendered
@@ -716,19 +766,23 @@
             # instead seems to output multiple Render Products,
             # specifically "Self_Illumination" and "Environment"
             product_names = ["Self_Illumination", "Environment"]
-            for name in product_names:
-                product = RenderProduct(productName=name,
-                                        ext=default_ext,
-                                        aov=aov)
-                products.append(product)
+            for camera in cameras:
+                for name in product_names:
+                    product = RenderProduct(productName=name,
+                                            ext=default_ext,
+                                            aov=aov,
+                                            camera=camera)
+                    products.append(product)

             # Continue as we've processed this special case AOV
             continue

         aov_name = self._get_vray_aov_name(aov)
-        product = RenderProduct(productName=aov_name,
-                                ext=default_ext,
-                                aov=aov)
-        products.append(product)
+        for camera in cameras:
+            product = RenderProduct(productName=aov_name,
+                                    ext=default_ext,
+                                    aov=aov,
+                                    camera=camera)
+            products.append(product)

         return products
@@ -875,6 +929,11 @@ class RenderProductsRedshift(ARenderProducts):
             # anyway.
             return []

+        cameras = [
+            self.sanitize_camera_name(c)
+            for c in self.get_renderable_cameras()
+        ]
+
         # For Redshift we don't directly return upon forcing multilayer
         # due to some AOVs still being written into separate files,
         # like Cryptomatte.
@@ -933,11 +992,14 @@
             for light_group in light_groups:
                 aov_light_group_name = "{}_{}".format(aov_name, light_group)
-                product = RenderProduct(productName=aov_light_group_name,
-                                        aov=aov_name,
-                                        ext=ext,
-                                        multipart=aov_multipart)
-                products.append(product)
+                for camera in cameras:
+                    product = RenderProduct(
+                        productName=aov_light_group_name,
+                        aov=aov_name,
+                        ext=ext,
+                        multipart=aov_multipart,
+                        camera=camera)
+                    products.append(product)

             if light_groups:
                 light_groups_enabled = True
@@ -945,11 +1007,13 @@
                 # Redshift AOV Light Select always renders the global AOV
                 # even when light groups are present so we don't need to
                 # exclude it when light groups are active
-                product = RenderProduct(productName=aov_name,
-                                        aov=aov_name,
-                                        ext=ext,
-                                        multipart=aov_multipart)
-                products.append(product)
+                for camera in cameras:
+                    product = RenderProduct(productName=aov_name,
+                                            aov=aov_name,
+                                            ext=ext,
+                                            multipart=aov_multipart,
+                                            camera=camera)
+                    products.append(product)

         # When a Beauty AOV is added manually, it will be rendered as
         # 'Beauty_other' in file name and "standard" beauty will have
@@ -959,10 +1023,12 @@
             return products

         beauty_name = "Beauty_other" if has_beauty_aov else ""
-        products.insert(0,
-                        RenderProduct(productName=beauty_name,
-                                      ext=ext,
-                                      multipart=multipart))
+        for camera in cameras:
+            products.insert(0,
+                            RenderProduct(productName=beauty_name,
+                                          ext=ext,
+                                          multipart=multipart,
+                                          camera=camera))

         return products
@@ -987,6 +1053,16 @@ class RenderProductsRenderman(ARenderProducts):
         :func:`ARenderProducts.get_render_products()`

     """
+        cameras = [
+            self.sanitize_camera_name(c)
+            for c in self.get_renderable_cameras()
+        ]
+
+        if not cameras:
+            cameras = [
+                self.sanitize_camera_name(
+                    self.get_renderable_cameras()[0])
+            ]
         products = []

         default_ext = "exr"
@@ -1000,9 +1076,11 @@
             if aov_name == "rmanDefaultDisplay":
                 aov_name = "beauty"

-            product = RenderProduct(productName=aov_name,
-                                    ext=default_ext)
-            products.append(product)
+            for camera in cameras:
+                product = RenderProduct(productName=aov_name,
+                                        ext=default_ext,
+                                        camera=camera)
+                products.append(product)

         return products
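The net effect of threading `camera` through every `RenderProduct`: when the image prefix contains the camera token, one product is generated per renderable camera and the token is substituted per file name. A simplified illustration — the real substitution happens in `_generate_file_sequence` from the layer metadata, so the plain string replace below is only a sketch:

```python
# Simplified illustration of the per-camera expansion (values made up).
prefix = "maya/<Scene>/<RenderLayer>/<Camera>/beauty"
cameras = ["persp", "top"]

for cam in cameras:
    print(prefix.replace("<Camera>", cam))
# maya/<Scene>/<RenderLayer>/persp/beauty
# maya/<Scene>/<RenderLayer>/top/beauty
```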
diff --git a/openpype/hosts/maya/api/plugin.py b/openpype/hosts/maya/api/plugin.py
index 448cb814d9..fdad0e0989 100644
--- a/openpype/hosts/maya/api/plugin.py
+++ b/openpype/hosts/maya/api/plugin.py
@@ -123,7 +123,7 @@ class ReferenceLoader(api.Loader):
         count = options.get("count") or 1
         for c in range(0, count):
             namespace = namespace or lib.unique_namespace(
-                asset["name"] + "_",
+                "{}_{}_".format(asset["name"], context["subset"]["name"]),
                 prefix="_" if asset["name"][0].isdigit() else "",
                 suffix="_",
             )
diff --git a/openpype/hosts/maya/plugins/create/create_mayaascii.py b/openpype/hosts/maya/plugins/create/create_mayaascii.py
index f51e126c00..8bbdf107c6 100644
--- a/openpype/hosts/maya/plugins/create/create_mayaascii.py
+++ b/openpype/hosts/maya/plugins/create/create_mayaascii.py
@@ -1,11 +1,11 @@
 from openpype.hosts.maya.api import plugin


-class CreateMayaAscii(plugin.Creator):
-    """Raw Maya Ascii file export"""
+class CreateMayaScene(plugin.Creator):
+    """Raw Maya Scene file export"""

-    name = "mayaAscii"
-    label = "Maya Ascii"
-    family = "mayaAscii"
+    name = "mayaScene"
+    label = "Maya Scene"
+    family = "mayaScene"
     icon = "file-archive-o"
     defaults = ['Main']
diff --git a/openpype/hosts/maya/plugins/create/create_setdress.py b/openpype/hosts/maya/plugins/create/create_setdress.py
index 8274a6dd83..4246183fdb 100644
--- a/openpype/hosts/maya/plugins/create/create_setdress.py
+++ b/openpype/hosts/maya/plugins/create/create_setdress.py
@@ -9,3 +9,8 @@ class CreateSetDress(plugin.Creator):
     family = "setdress"
     icon = "cubes"
     defaults = ["Main", "Anim"]
+
+    def __init__(self, *args, **kwargs):
+        super(CreateSetDress, self).__init__(*args, **kwargs)
+
+        self.data["exactSetMembersOnly"] = True
diff --git a/openpype/hosts/maya/plugins/inventory/import_modelrender.py b/openpype/hosts/maya/plugins/inventory/import_modelrender.py
new file mode 100644
index 0000000000..e3cad4cf2e
--- /dev/null
+++ b/openpype/hosts/maya/plugins/inventory/import_modelrender.py
@@ -0,0 +1,92 @@
+from avalon import api, io
+
+
+class ImportModelRender(api.InventoryAction):
+
+    label = "Import Model Render Sets"
+    icon = "industry"
+    color = "#55DDAA"
+
+    scene_type_regex = "meta.render.m[ab]"
+    look_data_type = "meta.render.json"
+
+    @staticmethod
+    def is_compatible(container):
+        return (
+            container.get("loader") == "ReferenceLoader"
+            and container.get("name", "").startswith("model")
+        )
+
+    def process(self, containers):
+        from maya import cmds
+
+        for container in containers:
+            con_name = container["objectName"]
+            nodes = []
+            for n in cmds.sets(con_name, query=True, nodesOnly=True) or []:
+                if cmds.nodeType(n) == "reference":
+                    nodes += cmds.referenceQuery(n, nodes=True)
+                else:
+                    nodes.append(n)
+
+            repr_doc = io.find_one({
+                "_id": io.ObjectId(container["representation"]),
+            })
+            version_id = repr_doc["parent"]
+
+            print("Importing render sets for model %r" % con_name)
+            self.assign_model_render_by_version(nodes, version_id)
+
+    def assign_model_render_by_version(self, nodes, version_id):
+        """Assign nodes a specific published model render data version by id.
+
+        This assumes the nodes correspond with the asset.
+
+        Args:
+            nodes(list): nodes to assign render data to
+            version_id (bson.ObjectId): database id of the version of model
+
+        Returns:
+            None
+        """
+        import json
+        from maya import cmds
+        from avalon import maya, io, pipeline
+        from openpype.hosts.maya.api import lib
+
+        # Get representations of shader file and relationships
+        look_repr = io.find_one({
+            "type": "representation",
+            "parent": version_id,
+            "name": {"$regex": self.scene_type_regex},
+        })
+        if not look_repr:
+            print("No model render sets for this model version..")
+            return
+
+        json_repr = io.find_one({
+            "type": "representation",
+            "parent": version_id,
+            "name": self.look_data_type,
+        })
+
+        context = pipeline.get_representation_context(look_repr["_id"])
+        maya_file = pipeline.get_representation_path_from_context(context)
+
+        context = pipeline.get_representation_context(json_repr["_id"])
+        json_file = pipeline.get_representation_path_from_context(context)
+
+        # Import the look file
+        with maya.maintained_selection():
+            shader_nodes = cmds.file(maya_file,
+                                     i=True,  # import
+                                     returnNewNodes=True)
+            # imprint context data
+
+        # Load relationships
+        shader_relation = json_file
+        with open(shader_relation, "r") as f:
+            relationships = json.load(f)
+
+        # Assign relationships
+        lib.apply_shaders(relationships, shader_nodes, nodes)
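The look-representation lookup above relies on a Mongo `$regex`; a quick check that the pattern lines up with the `meta.render.ma` / `meta.render.mb` representation names produced by `ExtractModelRenderSets` (its `scene_type_prefix` appears further down in this diff):

```python
import re

scene_type_regex = "meta.render.m[ab]"

assert re.search(scene_type_regex, "meta.render.ma")
assert re.search(scene_type_regex, "meta.render.mb")
# The JSON sidecar is matched exactly via look_data_type instead.
assert not re.search(scene_type_regex, "meta.render.json")
```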
diff --git a/openpype/hosts/maya/plugins/load/load_reference.py b/openpype/hosts/maya/plugins/load/load_reference.py
index d5952ed267..cfe8149218 100644
--- a/openpype/hosts/maya/plugins/load/load_reference.py
+++ b/openpype/hosts/maya/plugins/load/load_reference.py
@@ -13,6 +13,7 @@ class ReferenceLoader(openpype.hosts.maya.api.plugin.ReferenceLoader):
                 "pointcache",
                 "animation",
                 "mayaAscii",
+                "mayaScene",
                 "setdress",
                 "layout",
                 "camera",
@@ -40,14 +41,13 @@
             family = "model"

         with maya.maintained_selection():
-
-            groupName = "{}:{}".format(namespace, name)
+            groupName = "{}:_GRP".format(namespace)
             cmds.loadPlugin("AbcImport.mll", quiet=True)
             nodes = cmds.file(self.fname,
                               namespace=namespace,
                               sharedReferenceFile=False,
                               groupReference=True,
-                              groupName="{}:{}".format(namespace, name),
+                              groupName=groupName,
                               reference=True,
                               returnNewNodes=True)
@@ -71,7 +71,7 @@
             except:  # noqa: E722
                 pass

-            if family not in ["layout", "setdress", "mayaAscii"]:
+            if family not in ["layout", "setdress", "mayaAscii", "mayaScene"]:
                 for root in roots:
                     root.setParent(world=True)
diff --git a/openpype/hosts/maya/plugins/publish/collect_look.py b/openpype/hosts/maya/plugins/publish/collect_look.py
index 0dde52447d..9c047b252f 100644
--- a/openpype/hosts/maya/plugins/publish/collect_look.py
+++ b/openpype/hosts/maya/plugins/publish/collect_look.py
@@ -223,8 +223,8 @@ class CollectLook(pyblish.api.InstancePlugin):

     def process(self, instance):
         """Collect the Look in the instance with the correct layer settings"""
-
-        with lib.renderlayer(instance.data["renderlayer"]):
+        renderlayer = instance.data.get("renderlayer", "defaultRenderLayer")
+        with lib.renderlayer(renderlayer):
             self.collect(instance)

     def collect(self, instance):
@@ -357,6 +357,23 @@
         for vray_node in vray_plugin_nodes:
             history.extend(cmds.listHistory(vray_node))

+        # handling render attribute sets
+        render_set_types = [
+            "VRayDisplacement",
+            "VRayLightMesh",
+            "VRayObjectProperties",
+            "RedshiftObjectId",
+            "RedshiftMeshParameters",
+        ]
+        render_sets = cmds.ls(look_sets, type=render_set_types)
+        if render_sets:
+            history.extend(
+                cmds.listHistory(render_sets,
+                                 future=False,
+                                 pruneDagObjects=True)
+                or []
+            )
+
         files = cmds.ls(history, type="file", long=True)
         files.extend(cmds.ls(history, type="aiImage", long=True))
         files.extend(cmds.ls(history, type="RedshiftNormalMap", long=True))
@@ -550,3 +567,45 @@
                 "source": source,  # required for resources
                 "files": files,
                 "color_space": color_space}  # required for resources
+
+
+class CollectModelRenderSets(CollectLook):
+    """Collect render attribute sets for model instance.
+
+    Collects additional render attribute sets so they can be
+    published with model.
+
+    """
+
+    order = pyblish.api.CollectorOrder + 0.21
+    families = ["model"]
+    label = "Collect Model Render Sets"
+    hosts = ["maya"]
+    maketx = True
+
+    def collect_sets(self, instance):
+        """Collect all related objectSets except shadingEngines
+
+        Args:
+            instance (list): all nodes to be published
+
+        Returns:
+            dict
+        """
+
+        sets = {}
+        for node in instance:
+            related_sets = lib.get_related_sets(node)
+            if not related_sets:
+                continue
+
+            for objset in related_sets:
+                if objset in sets:
+                    continue
+
+                if "shadingEngine" in cmds.nodeType(objset, inherited=True):
+                    continue
+
+                sets[objset] = {"uuid": lib.get_id(objset), "members": list()}
+
+        return sets
diff --git a/openpype/hosts/maya/plugins/publish/collect_mayaascii.py b/openpype/hosts/maya/plugins/publish/collect_maya_scene.py
similarity index 82%
rename from openpype/hosts/maya/plugins/publish/collect_mayaascii.py
rename to openpype/hosts/maya/plugins/publish/collect_maya_scene.py
index b02f61b7c6..eb21b17989 100644
--- a/openpype/hosts/maya/plugins/publish/collect_mayaascii.py
+++ b/openpype/hosts/maya/plugins/publish/collect_maya_scene.py
@@ -3,14 +3,14 @@ from maya import cmds

 import pyblish.api


-class CollectMayaAscii(pyblish.api.InstancePlugin):
-    """Collect May Ascii Data
+class CollectMayaScene(pyblish.api.InstancePlugin):
+    """Collect Maya Scene Data

     """

     order = pyblish.api.CollectorOrder + 0.2
     label = 'Collect Model Data'
-    families = ["mayaAscii"]
+    families = ["mayaScene"]

     def process(self, instance):
         # Extract only current frame (override)
diff --git a/openpype/hosts/maya/plugins/publish/collect_render.py b/openpype/hosts/maya/plugins/publish/collect_render.py
index 5049647ff9..575cc2456b 100644
--- a/openpype/hosts/maya/plugins/publish/collect_render.py
+++ b/openpype/hosts/maya/plugins/publish/collect_render.py
@@ -174,10 +174,16 @@ class CollectMayaRender(pyblish.api.ContextPlugin):
             assert render_products, "no render products generated"
             exp_files = []
             for product in render_products:
-                for camera in layer_render_products.layer_data.cameras:
-                    exp_files.append(
-                        {product.productName: layer_render_products.get_files(
-                            product, camera)})
+                product_name = product.productName
+                if product.camera and layer_render_products.has_camera_token():
+                    product_name = "{}{}".format(
+                        product.camera,
+                        "_" + product_name if product_name else "")
+                exp_files.append(
+                    {
+                        product_name: layer_render_products.get_files(
+                            product)
+                    })

             self.log.info("multipart: {}".format(
                 layer_render_products.multipart))
@@ -199,12 +205,14 @@
             # replace relative paths with absolute. Render products are
             # returned as list of dictionaries.
+            publish_meta_path = None
             for aov in exp_files:
                 full_paths = []
                 for file in aov[aov.keys()[0]]:
                     full_path = os.path.join(workspace, "renders", file)
                     full_path = full_path.replace("\\", "/")
                     full_paths.append(full_path)
+                    publish_meta_path = os.path.dirname(full_path)
                 aov_dict[aov.keys()[0]] = full_paths

             frame_start_render = int(self.get_render_attribute(
@@ -230,6 +238,26 @@
                 frame_end_handle = frame_end_render

             full_exp_files.append(aov_dict)
+
+            # find common path to store metadata
+            # so if image prefix is branching to many directories
+            # metadata file will be located in top-most common
+            # directory.
+            # TODO: use `os.path.commonpath()` after switch to Python 3
+            common_publish_meta_path = os.path.splitdrive(
+                publish_meta_path)[0]
+            if common_publish_meta_path:
+                common_publish_meta_path += os.path.sep
+            for part in publish_meta_path.split("/"):
+                common_publish_meta_path = os.path.join(
+                    common_publish_meta_path, part)
+                if part == expected_layer_name:
+                    break
+            common_publish_meta_path = common_publish_meta_path.replace(
+                "\\", "/")
+            self.log.info(
+                "Publish meta path: {}".format(common_publish_meta_path))
+
             self.log.info(full_exp_files)
             self.log.info("collecting layer: {}".format(layer_name))

             # Get layer specific settings, might be overrides
@@ -262,6 +290,7 @@
                 # which was submitted originally
                 "source": filepath,
                 "expectedFiles": full_exp_files,
+                "publishRenderMetadataFolder": common_publish_meta_path,
                 "resolutionWidth": cmds.getAttr("defaultResolution.width"),
                 "resolutionHeight": cmds.getAttr("defaultResolution.height"),
                 "pixelAspect": cmds.getAttr("defaultResolution.pixelAspect"),
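The hand-rolled common-path walk above exists because Maya still runs this code under Python 2; the TODO points at the Python 3 one-liner, which would roughly be:

```python
import os

# Python 3 replacement for the manual walk (paths are examples):
paths = [
    "C:/work/renders/layer1/persp/beauty.0001.exr",
    "C:/work/renders/layer1/top/beauty.0001.exr",
]
common = os.path.commonpath(paths).replace("\\", "/")
print(common)  # C:/work/renders/layer1
```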
diff --git a/openpype/hosts/maya/plugins/publish/collect_scene.py b/openpype/hosts/maya/plugins/publish/collect_workfile.py
similarity index 97%
rename from openpype/hosts/maya/plugins/publish/collect_scene.py
rename to openpype/hosts/maya/plugins/publish/collect_workfile.py
index be2a294f26..ee676f50d0 100644
--- a/openpype/hosts/maya/plugins/publish/collect_scene.py
+++ b/openpype/hosts/maya/plugins/publish/collect_workfile.py
@@ -4,7 +4,7 @@ import os
 from maya import cmds


-class CollectMayaScene(pyblish.api.ContextPlugin):
+class CollectWorkfile(pyblish.api.ContextPlugin):
     """Inject the current working file into context"""

     order = pyblish.api.CollectorOrder - 0.01
diff --git a/openpype/hosts/maya/plugins/publish/extract_look.py b/openpype/hosts/maya/plugins/publish/extract_look.py
index f09d50d714..bbf25ebdc7 100644
--- a/openpype/hosts/maya/plugins/publish/extract_look.py
+++ b/openpype/hosts/maya/plugins/publish/extract_look.py
@@ -122,7 +122,7 @@ def no_workspace_dir():


 class ExtractLook(openpype.api.Extractor):
-    """Extract Look (Maya Ascii + JSON)
+    """Extract Look (Maya Scene + JSON)

     Only extracts the sets (shadingEngines and alike) alongside a .json file
     that stores it relationships for the sets and "attribute" data for the
@@ -130,11 +130,12 @@

     """

-    label = "Extract Look (Maya ASCII + JSON)"
+    label = "Extract Look (Maya Scene + JSON)"
     hosts = ["maya"]
     families = ["look"]
     order = pyblish.api.ExtractorOrder + 0.2
     scene_type = "ma"
+    look_data_type = "json"

     @staticmethod
     def get_renderer_name():
@@ -176,6 +177,8 @@
             # no preset found
             pass

+        return "mayaAscii" if self.scene_type == "ma" else "mayaBinary"
+
     def process(self, instance):
         """Plugin entry point.
@@ -183,10 +186,12 @@
             instance: Instance to process.

         """
+        _scene_type = self.get_maya_scene_type(instance)
+
         # Define extract output file path
         dir_path = self.staging_dir(instance)
         maya_fname = "{0}.{1}".format(instance.name, self.scene_type)
-        json_fname = "{0}.json".format(instance.name)
+        json_fname = "{0}.{1}".format(instance.name, self.look_data_type)

         # Make texture dump folder
         maya_path = os.path.join(dir_path, maya_fname)
@@ -196,11 +201,100 @@

         # Remove all members of the sets so they are not included in the
         # exported file by accident
-        self.log.info("Extract sets (Maya ASCII) ...")
+        self.log.info("Extract sets (%s) ..." % _scene_type)
         lookdata = instance.data["lookData"]
         relationships = lookdata["relationships"]
         sets = relationships.keys()

+        results = self.process_resources(instance, staging_dir=dir_path)
+        transfers = results["fileTransfers"]
+        hardlinks = results["fileHardlinks"]
+        hashes = results["fileHashes"]
+        remap = results["attrRemap"]
+
+        # Extract in correct render layer
+        layer = instance.data.get("renderlayer", "defaultRenderLayer")
+        with lib.renderlayer(layer):
+            # TODO: Ensure membership edits don't become renderlayer overrides
+            with lib.empty_sets(sets, force=True):
+                # To avoid Maya trying to automatically remap the file
+                # textures relative to the `workspace -directory` we force
+                # it to a fake temporary workspace. This fixes textures
+                # getting incorrectly remapped. (LKD-17, PLN-101)
+                with no_workspace_dir():
+                    with lib.attribute_values(remap):
+                        with avalon.maya.maintained_selection():
+                            cmds.select(sets, noExpand=True)
+                            cmds.file(
+                                maya_path,
+                                force=True,
+                                typ=_scene_type,
+                                exportSelected=True,
+                                preserveReferences=False,
+                                channels=True,
+                                constraints=True,
+                                expressions=True,
+                                constructionHistory=True,
+                            )
+
+        # Write the JSON data
+        self.log.info("Extract json..")
+        data = {
+            "attributes": lookdata["attributes"],
+            "relationships": relationships
+        }
+
+        with open(json_path, "w") as f:
+            json.dump(data, f)
+
+        if "files" not in instance.data:
+            instance.data["files"] = []
+        if "hardlinks" not in instance.data:
+            instance.data["hardlinks"] = []
+        if "transfers" not in instance.data:
+            instance.data["transfers"] = []
+
+        instance.data["files"].append(maya_fname)
+        instance.data["files"].append(json_fname)
+
+        if instance.data.get("representations") is None:
+            instance.data["representations"] = []
+
+        instance.data["representations"].append(
+            {
+                "name": self.scene_type,
+                "ext": self.scene_type,
+                "files": os.path.basename(maya_fname),
+                "stagingDir": os.path.dirname(maya_fname),
+            }
+        )
+        instance.data["representations"].append(
+            {
+                "name": self.look_data_type,
+                "ext": self.look_data_type,
+                "files": os.path.basename(json_fname),
+                "stagingDir": os.path.dirname(json_fname),
+            }
+        )
+
+        # Set up the resources transfers/links for the integrator
+        instance.data["transfers"].extend(transfers)
+        instance.data["hardlinks"].extend(hardlinks)
+
+        # Source hash for the textures
+        instance.data["sourceHashes"] = hashes
+
+        """
+        self.log.info("Returning colorspaces to their original values ...")
+        for attr, value in remap.items():
+            self.log.info("  - {}: {}".format(attr, value))
+            cmds.setAttr(attr, value, type="string")
+        """
+        self.log.info("Extracted instance '%s' to: %s" % (instance.name,
+                                                          maya_path))
+
+    def process_resources(self, instance, staging_dir):
+
         # Extract the textures to transfer, possibly convert with maketx and
         # remap the node paths to the destination path. Note that a source
         # might be included more than once amongst the resources as they could
@@ -218,7 +312,6 @@
             color_space = resource.get("color_space")

             for f in resource["files"]:
-
                 files_metadata[os.path.normpath(f)] = {
                     "color_space": color_space}
                 # files.update(os.path.normpath(f))
@@ -244,7 +337,7 @@
                 source, mode, texture_hash = self._process_texture(
                     filepath,
                     do_maketx,
-                    staging=dir_path,
+                    staging=staging_dir,
                     linearize=linearize,
                     force=force_copy
                 )
@@ -299,85 +392,13 @@

         self.log.info("Finished remapping destinations ...")

-        # Extract in correct render layer
-        layer = instance.data.get("renderlayer", "defaultRenderLayer")
-        with lib.renderlayer(layer):
-            # TODO: Ensure membership edits don't become renderlayer overrides
-            with lib.empty_sets(sets, force=True):
-                # To avoid Maya trying to automatically remap the file
-                # textures relative to the `workspace -directory` we force
-                # it to a fake temporary workspace. This fixes textures
-                # getting incorrectly remapped. (LKD-17, PLN-101)
-                with no_workspace_dir():
-                    with lib.attribute_values(remap):
-                        with avalon.maya.maintained_selection():
-                            cmds.select(sets, noExpand=True)
-                            cmds.file(
-                                maya_path,
-                                force=True,
-                                typ="mayaAscii",
-                                exportSelected=True,
-                                preserveReferences=False,
-                                channels=True,
-                                constraints=True,
-                                expressions=True,
-                                constructionHistory=True,
-                            )
-
-        # Write the JSON data
-        self.log.info("Extract json..")
-        data = {
-            "attributes": lookdata["attributes"],
-            "relationships": relationships
+        return {
+            "fileTransfers": transfers,
+            "fileHardlinks": hardlinks,
+            "fileHashes": hashes,
+            "attrRemap": remap,
         }

-        with open(json_path, "w") as f:
-            json.dump(data, f)
-
-        if "files" not in instance.data:
-            instance.data["files"] = []
-        if "hardlinks" not in instance.data:
-            instance.data["hardlinks"] = []
-        if "transfers" not in instance.data:
-            instance.data["transfers"] = []
-
-        instance.data["files"].append(maya_fname)
-        instance.data["files"].append(json_fname)
-
-        instance.data["representations"] = []
-        instance.data["representations"].append(
-            {
-                "name": "ma",
-                "ext": "ma",
-                "files": os.path.basename(maya_fname),
-                "stagingDir": os.path.dirname(maya_fname),
-            }
-        )
-        instance.data["representations"].append(
-            {
-                "name": "json",
-                "ext": "json",
-                "files": os.path.basename(json_fname),
-                "stagingDir": os.path.dirname(json_fname),
-            }
-        )
-
-        # Set up the resources transfers/links for the integrator
-        instance.data["transfers"].extend(transfers)
-        instance.data["hardlinks"].extend(hardlinks)
-
-        # Source hash for the textures
-        instance.data["sourceHashes"] = hashes
-
-        """
-        self.log.info("Returning colorspaces to their original values ...")
-        for attr, value in remap.items():
-            self.log.info("  - {}: {}".format(attr, value))
-            cmds.setAttr(attr, value, type="string")
-        """
-        self.log.info("Extracted instance '%s' to: %s" % (instance.name,
-                                                          maya_path))
-
     def resource_destination(self, instance, filepath, do_maketx):
         """Get resource destination path.
@@ -467,3 +488,26 @@
             return converted, COPY, texture_hash

         return filepath, COPY, texture_hash
+
+
+class ExtractModelRenderSets(ExtractLook):
+    """Extract model render attribute sets as model metadata
+
+    Only extracts the render attrib sets (NO shadingEngines) alongside
+    a .json file that stores its relationships for the sets and "attribute"
+    data for the instance members.
+
+    """
+
+    label = "Model Render Sets"
+    hosts = ["maya"]
+    families = ["model"]
+    scene_type_prefix = "meta.render."
+    look_data_type = "meta.render.json"
+
+    def get_maya_scene_type(self, instance):
+        typ = super(ExtractModelRenderSets, self).get_maya_scene_type(instance)
+        # add prefix
+        self.scene_type = self.scene_type_prefix + self.scene_type
+
+        return typ
+ + """ + + label = "Model Render Sets" + hosts = ["maya"] + families = ["model"] + scene_type_prefix = "meta.render." + look_data_type = "meta.render.json" + + def get_maya_scene_type(self, instance): + typ = super(ExtractModelRenderSets, self).get_maya_scene_type(instance) + # add prefix + self.scene_type = self.scene_type_prefix + self.scene_type + + return typ diff --git a/openpype/hosts/maya/plugins/publish/extract_maya_scene_raw.py b/openpype/hosts/maya/plugins/publish/extract_maya_scene_raw.py index 3c2b70900d..e7fb5bc8cb 100644 --- a/openpype/hosts/maya/plugins/publish/extract_maya_scene_raw.py +++ b/openpype/hosts/maya/plugins/publish/extract_maya_scene_raw.py @@ -17,6 +17,7 @@ class ExtractMayaSceneRaw(openpype.api.Extractor): label = "Maya Scene (Raw)" hosts = ["maya"] families = ["mayaAscii", + "mayaScene", "setdress", "layout", "camerarig", diff --git a/openpype/plugins/publish/validate_instance_in_context.py b/openpype/hosts/maya/plugins/publish/validate_instance_in_context.py similarity index 65% rename from openpype/plugins/publish/validate_instance_in_context.py rename to openpype/hosts/maya/plugins/publish/validate_instance_in_context.py index 61b4d82027..7b8c335062 100644 --- a/openpype/plugins/publish/validate_instance_in_context.py +++ b/openpype/hosts/maya/plugins/publish/validate_instance_in_context.py @@ -5,6 +5,8 @@ from __future__ import absolute_import import pyblish.api import openpype.api +from maya import cmds + class SelectInvalidInstances(pyblish.api.Action): """Select invalid instances in Outliner.""" @@ -18,13 +20,12 @@ class SelectInvalidInstances(pyblish.api.Action): # Get the errored instances failed = [] for result in context.data["results"]: - if result["error"] is None: - continue - if result["instance"] is None: - continue - if result["instance"] in failed: - continue - if result["plugin"] != plugin: + if ( + result["error"] is None + or result["instance"] is None + or result["instance"] in failed + or result["plugin"] != plugin + ): continue failed.append(result["instance"]) @@ -44,25 +45,10 @@ class SelectInvalidInstances(pyblish.api.Action): self.deselect() def select(self, instances): - if "nuke" in pyblish.api.registered_hosts(): - import avalon.nuke.lib - import nuke - avalon.nuke.lib.select_nodes( - [nuke.toNode(str(x)) for x in instances] - ) - - if "maya" in pyblish.api.registered_hosts(): - from maya import cmds - cmds.select(instances, replace=True, noExpand=True) + cmds.select(instances, replace=True, noExpand=True) def deselect(self): - if "nuke" in pyblish.api.registered_hosts(): - import avalon.nuke.lib - avalon.nuke.lib.reset_selection() - - if "maya" in pyblish.api.registered_hosts(): - from maya import cmds - cmds.select(deselect=True) + cmds.select(deselect=True) class RepairSelectInvalidInstances(pyblish.api.Action): @@ -92,23 +78,14 @@ class RepairSelectInvalidInstances(pyblish.api.Action): context_asset = context.data["assetEntity"]["name"] for instance in instances: - if "nuke" in pyblish.api.registered_hosts(): - import openpype.hosts.nuke.api as nuke_api - origin_node = instance[0] - nuke_api.lib.recreate_instance( - origin_node, avalon_data={"asset": context_asset} - ) - else: - self.set_attribute(instance, context_asset) + self.set_attribute(instance, context_asset) def set_attribute(self, instance, context_asset): - if "maya" in pyblish.api.registered_hosts(): - from maya import cmds - cmds.setAttr( - instance.data.get("name") + ".asset", - context_asset, - type="string" - ) + cmds.setAttr( + instance.data.get("name") + 
".asset", + context_asset, + type="string" + ) class ValidateInstanceInContext(pyblish.api.InstancePlugin): @@ -124,7 +101,7 @@ class ValidateInstanceInContext(pyblish.api.InstancePlugin): order = openpype.api.ValidateContentsOrder label = "Instance in same Context" optional = True - hosts = ["maya", "nuke"] + hosts = ["maya"] actions = [SelectInvalidInstances, RepairSelectInvalidInstances] def process(self, instance): diff --git a/openpype/hosts/maya/plugins/publish/validate_loaded_plugin.py b/openpype/hosts/maya/plugins/publish/validate_loaded_plugin.py new file mode 100644 index 0000000000..9306d8ce15 --- /dev/null +++ b/openpype/hosts/maya/plugins/publish/validate_loaded_plugin.py @@ -0,0 +1,47 @@ +import pyblish.api +import maya.cmds as cmds +import openpype.api +import os + + +class ValidateLoadedPlugin(pyblish.api.ContextPlugin): + """Ensure there are no unauthorized loaded plugins""" + + label = "Loaded Plugin" + order = pyblish.api.ValidatorOrder + host = ["maya"] + actions = [openpype.api.RepairContextAction] + + @classmethod + def get_invalid(cls, context): + + invalid = [] + loaded_plugin = cmds.pluginInfo(query=True, listPlugins=True) + # get variable from OpenPype settings + whitelist_native_plugins = cls.whitelist_native_plugins + authorized_plugins = cls.authorized_plugins or [] + + for plugin in loaded_plugin: + if not whitelist_native_plugins and os.getenv('MAYA_LOCATION') \ + in cmds.pluginInfo(plugin, query=True, path=True): + continue + if plugin not in authorized_plugins: + invalid.append(plugin) + + return invalid + + def process(self, context): + + invalid = self.get_invalid(context) + if invalid: + raise RuntimeError( + "Found forbidden plugin name: {}".format(", ".join(invalid)) + ) + + @classmethod + def repair(cls, context): + """Unload forbidden plugins""" + + for plugin in cls.get_invalid(context): + cmds.pluginInfo(plugin, edit=True, autoload=False) + cmds.unloadPlugin(plugin, force=True) diff --git a/openpype/hosts/maya/plugins/publish/validate_rendersettings.py b/openpype/hosts/maya/plugins/publish/validate_rendersettings.py index 7c795db43d..65ddacfc57 100644 --- a/openpype/hosts/maya/plugins/publish/validate_rendersettings.py +++ b/openpype/hosts/maya/plugins/publish/validate_rendersettings.py @@ -76,7 +76,7 @@ class ValidateRenderSettings(pyblish.api.InstancePlugin): r'%a||', re.IGNORECASE) R_LAYER_TOKEN = re.compile( r'%l||', re.IGNORECASE) - R_CAMERA_TOKEN = re.compile(r'%c|', re.IGNORECASE) + R_CAMERA_TOKEN = re.compile(r'%c|Camera>') R_SCENE_TOKEN = re.compile(r'%s|', re.IGNORECASE) DEFAULT_PADDING = 4 @@ -126,7 +126,9 @@ class ValidateRenderSettings(pyblish.api.InstancePlugin): if len(cameras) > 1 and not re.search(cls.R_CAMERA_TOKEN, prefix): invalid = True cls.log.error("Wrong image prefix [ {} ] - " - "doesn't have: '' token".format(prefix)) + "doesn't have: '' token".format(prefix)) + cls.log.error( + "Note that to needs to have capital 'C' at the beginning") # renderer specific checks if renderer == "vray": diff --git a/openpype/hosts/maya/plugins/publish/validate_setdress_root.py b/openpype/hosts/maya/plugins/publish/validate_setdress_root.py new file mode 100644 index 0000000000..0b4842d208 --- /dev/null +++ b/openpype/hosts/maya/plugins/publish/validate_setdress_root.py @@ -0,0 +1,25 @@ + +import pyblish.api +import openpype.api + + +class ValidateSetdressRoot(pyblish.api.InstancePlugin): + """ + """ + + order = openpype.api.ValidateContentsOrder + label = "SetDress Root" + hosts = ["maya"] + families = ["setdress"] + + def process(self, 
instance): + from maya import cmds + + if instance.data.get("exactSetMembersOnly"): + return + + set_member = instance.data["setMembers"] + root = cmds.ls(set_member, assemblies=True, long=True) + + if not root or root[0] not in set_member: + raise Exception("Setdress top root node is not being published.") diff --git a/openpype/hosts/nuke/__init__.py b/openpype/hosts/nuke/__init__.py index f1e81617e0..366f704dd8 100644 --- a/openpype/hosts/nuke/__init__.py +++ b/openpype/hosts/nuke/__init__.py @@ -21,6 +21,7 @@ def add_implementation_envs(env, _app): new_nuke_paths.append(norm_path) env["NUKE_PATH"] = os.pathsep.join(new_nuke_paths) + env.pop("QT_AUTO_SCREEN_SCALE_FACTOR", None) # Try to add QuickTime to PATH quick_time_path = "C:/Program Files (x86)/QuickTime/QTSystem" diff --git a/openpype/hosts/nuke/api/lib.py b/openpype/hosts/nuke/api/lib.py index 8948cb4d78..6fe9af744b 100644 --- a/openpype/hosts/nuke/api/lib.py +++ b/openpype/hosts/nuke/api/lib.py @@ -288,14 +288,15 @@ def script_name(): def add_button_write_to_read(node): name = "createReadNode" label = "Create Read From Rendered" - value = "import write_to_read;write_to_read.write_to_read(nuke.thisNode())" + value = "import write_to_read;\ + write_to_read.write_to_read(nuke.thisNode(), allow_relative=False)" knob = nuke.PyScript_Knob(name, label, value) knob.clearFlag(nuke.STARTLINE) node.addKnob(knob) def create_write_node(name, data, input=None, prenodes=None, - review=True, linked_knobs=None): + review=True, linked_knobs=None, farm=True): ''' Creating write node which is group node Arguments: @@ -421,7 +422,15 @@ def create_write_node(name, data, input=None, prenodes=None, )) continue - if knob and value: + if not knob and not value: + continue + + log.info((knob, value)) + + if isinstance(value, str): + if "[" in value: + now_node[knob].setExpression(value) + else: now_node[knob].setValue(value) # connect to previous node @@ -466,7 +475,7 @@ def create_write_node(name, data, input=None, prenodes=None, # imprinting group node anlib.set_avalon_knob_data(GN, data["avalon"]) anlib.add_publish_knob(GN) - add_rendering_knobs(GN) + add_rendering_knobs(GN, farm) if review: add_review_knob(GN) @@ -526,7 +535,7 @@ def create_write_node(name, data, input=None, prenodes=None, return GN -def add_rendering_knobs(node): +def add_rendering_knobs(node, farm=True): ''' Adds additional rendering knobs to given node Arguments: @@ -535,9 +544,13 @@ def add_rendering_knobs(node): Return: node (obj): with added knobs ''' + knob_options = [ + "Use existing frames", "Local"] + if farm: + knob_options.append("On farm") + if "render" not in node.knobs(): - knob = nuke.Enumeration_Knob("render", "", [ - "Use existing frames", "Local", "On farm"]) + knob = nuke.Enumeration_Knob("render", "", knob_options) knob.clearFlag(nuke.STARTLINE) node.addKnob(knob) return node diff --git a/openpype/hosts/nuke/plugins/create/create_write_still.py b/openpype/hosts/nuke/plugins/create/create_write_still.py new file mode 100644 index 0000000000..eebb5613c3 --- /dev/null +++ b/openpype/hosts/nuke/plugins/create/create_write_still.py @@ -0,0 +1,141 @@ +from collections import OrderedDict +from openpype.hosts.nuke.api import ( + plugin, + lib) +import nuke + + +class CreateWriteStill(plugin.PypeCreator): + # change this to template preset + name = "WriteStillFrame" + label = "Create Write Still Image" + hosts = ["nuke"] + n_class = "Write" + family = "still" + icon = "image" + defaults = [ + "ImageFrame{:0>4}".format(nuke.frame()), + "MPFrame{:0>4}".format(nuke.frame()), + 
"LayoutFrame{:0>4}".format(nuke.frame()) + ] + + def __init__(self, *args, **kwargs): + super(CreateWriteStill, self).__init__(*args, **kwargs) + + data = OrderedDict() + + data["family"] = self.family + data["families"] = self.n_class + + for k, v in self.data.items(): + if k not in data.keys(): + data.update({k: v}) + + self.data = data + self.nodes = nuke.selectedNodes() + self.log.debug("_ self.data: '{}'".format(self.data)) + + def process(self): + + inputs = [] + outputs = [] + instance = nuke.toNode(self.data["subset"]) + selected_node = None + + # use selection + if (self.options or {}).get("useSelection"): + nodes = self.nodes + + if not (len(nodes) < 2): + msg = ("Select only one node. " + "The node you want to connect to, " + "or tick off `Use selection`") + self.log.error(msg) + nuke.message(msg) + return + + if len(nodes) == 0: + msg = ( + "No nodes selected. Please select a single node to connect" + " to or tick off `Use selection`" + ) + self.log.error(msg) + nuke.message(msg) + return + + selected_node = nodes[0] + inputs = [selected_node] + outputs = selected_node.dependent() + + if instance: + if (instance.name() in selected_node.name()): + selected_node = instance.dependencies()[0] + + # if node already exist + if instance: + # collect input / outputs + inputs = instance.dependencies() + outputs = instance.dependent() + selected_node = inputs[0] + # remove old one + nuke.delete(instance) + + # recreate new + write_data = { + "nodeclass": self.n_class, + "families": [self.family], + "avalon": self.data + } + + # add creator data + creator_data = {"creator": self.__class__.__name__} + self.data.update(creator_data) + write_data.update(creator_data) + + self.log.info("Adding template path from plugin") + write_data.update({ + "fpath_template": ( + "{work}/renders/nuke/{subset}/{subset}.{ext}")}) + + _prenodes = [ + { + "name": "FrameHold01", + "class": "FrameHold", + "knobs": [ + ("first_frame", nuke.frame()) + ], + "dependent": None + } + ] + + write_node = lib.create_write_node( + self.name, + write_data, + input=selected_node, + review=False, + prenodes=_prenodes, + farm=False, + linked_knobs=["channels", "___", "first", "last", "use_limit"]) + + # relinking to collected connections + for i, input in enumerate(inputs): + write_node.setInput(i, input) + + write_node.autoplace() + + for output in outputs: + output.setInput(0, write_node) + + # link frame hold to group node + write_node.begin() + for n in nuke.allNodes(): + # get write node + if n.Class() in "Write": + w_node = n + write_node.end() + + w_node["use_limit"].setValue(True) + w_node["first"].setValue(nuke.frame()) + w_node["last"].setValue(nuke.frame()) + + return write_node diff --git a/openpype/hosts/nuke/plugins/load/load_image.py b/openpype/hosts/nuke/plugins/load/load_image.py index 8bc266f01b..9b8bc43d12 100644 --- a/openpype/hosts/nuke/plugins/load/load_image.py +++ b/openpype/hosts/nuke/plugins/load/load_image.py @@ -13,7 +13,7 @@ class LoadImage(api.Loader): """Load still image into Nuke""" families = ["render", "source", "plate", "review", "image"] - representations = ["exr", "dpx", "jpg", "jpeg", "png", "psd"] + representations = ["exr", "dpx", "jpg", "jpeg", "png", "psd", "tiff"] label = "Load Image" order = -10 diff --git a/openpype/hosts/nuke/plugins/publish/extract_render_local.py b/openpype/hosts/nuke/plugins/publish/extract_render_local.py index 49609f70e0..bc7b41c733 100644 --- a/openpype/hosts/nuke/plugins/publish/extract_render_local.py +++ 
b/openpype/hosts/nuke/plugins/publish/extract_render_local.py @@ -17,7 +17,7 @@ class NukeRenderLocal(openpype.api.Extractor): order = pyblish.api.ExtractorOrder label = "Render Local" hosts = ["nuke"] - families = ["render.local", "prerender.local"] + families = ["render.local", "prerender.local", "still.local"] def process(self, instance): families = instance.data["families"] @@ -66,13 +66,23 @@ class NukeRenderLocal(openpype.api.Extractor): instance.data["representations"] = [] collected_frames = os.listdir(out_dir) - repre = { - 'name': ext, - 'ext': ext, - 'frameStart': "%0{}d".format(len(str(last_frame))) % first_frame, - 'files': collected_frames, - "stagingDir": out_dir - } + + if len(collected_frames) == 1: + repre = { + 'name': ext, + 'ext': ext, + 'files': collected_frames.pop(), + "stagingDir": out_dir + } + else: + repre = { + 'name': ext, + 'ext': ext, + 'frameStart': "%0{}d".format( + len(str(last_frame))) % first_frame, + 'files': collected_frames, + "stagingDir": out_dir + } instance.data["representations"].append(repre) self.log.info("Extracted instance '{0}' to: {1}".format( @@ -89,6 +99,9 @@ class NukeRenderLocal(openpype.api.Extractor): instance.data['family'] = 'prerender' families.remove('prerender.local') families.insert(0, "prerender") + elif "still.local" in families: + instance.data['family'] = 'image' + families.remove('still.local') instance.data["families"] = families collections, remainder = clique.assemble(collected_frames) diff --git a/openpype/hosts/nuke/plugins/publish/precollect_writes.py b/openpype/hosts/nuke/plugins/publish/precollect_writes.py index 47189c31fc..189f28f7c6 100644 --- a/openpype/hosts/nuke/plugins/publish/precollect_writes.py +++ b/openpype/hosts/nuke/plugins/publish/precollect_writes.py @@ -64,7 +64,7 @@ class CollectNukeWrites(pyblish.api.InstancePlugin): ) if [fm for fm in _families_test - if fm in ["render", "prerender"]]: + if fm in ["render", "prerender", "still"]]: if "representations" not in instance.data: instance.data["representations"] = list() @@ -100,7 +100,13 @@ class CollectNukeWrites(pyblish.api.InstancePlugin): frame_start_str, frame_slate_str) collected_frames.insert(0, slate_frame) - representation['files'] = collected_frames + if collected_frames_len == 1: + representation['files'] = collected_frames.pop() + if "still" in _families_test: + instance.data['family'] = 'image' + instance.data["families"].remove('still') + else: + representation['files'] = collected_frames instance.data["representations"].append(representation) except Exception: instance.data["representations"].append(representation) diff --git a/openpype/hosts/nuke/plugins/publish/validate_instance_in_context.py b/openpype/hosts/nuke/plugins/publish/validate_instance_in_context.py new file mode 100644 index 0000000000..ddf46a0873 --- /dev/null +++ b/openpype/hosts/nuke/plugins/publish/validate_instance_in_context.py @@ -0,0 +1,110 @@ +# -*- coding: utf-8 -*- +"""Validate if instance asset is the same as context asset.""" +from __future__ import absolute_import + +import nuke + +import pyblish.api +import openpype.api +import avalon.nuke.lib +import openpype.hosts.nuke.api as nuke_api + + +class SelectInvalidInstances(pyblish.api.Action): + """Select invalid instances in Outliner.""" + + label = "Select Instances" + icon = "briefcase" + on = "failed" + + def process(self, context, plugin): + """Process invalid validators and select invalid instances.""" + # Get the errored instances + failed = [] + for result in context.data["results"]: + if ( + 
result["error"] is None + or result["instance"] is None + or result["instance"] in failed + or result["plugin"] != plugin + ): + continue + + failed.append(result["instance"]) + + # Apply pyblish.logic to get the instances for the plug-in + instances = pyblish.api.instances_by_plugin(failed, plugin) + + if instances: + self.log.info( + "Selecting invalid nodes: %s" % ", ".join( + [str(x) for x in instances] + ) + ) + self.select(instances) + else: + self.log.info("No invalid nodes found.") + self.deselect() + + def select(self, instances): + avalon.nuke.lib.select_nodes( + [nuke.toNode(str(x)) for x in instances] + ) + + def deselect(self): + avalon.nuke.lib.reset_selection() + + +class RepairSelectInvalidInstances(pyblish.api.Action): + """Repair the instance asset.""" + + label = "Repair" + icon = "wrench" + on = "failed" + + def process(self, context, plugin): + # Get the errored instances + failed = [] + for result in context.data["results"]: + if ( + result["error"] is None + or result["instance"] is None + or result["instance"] in failed + or result["plugin"] != plugin + ): + continue + + failed.append(result["instance"]) + + # Apply pyblish.logic to get the instances for the plug-in + instances = pyblish.api.instances_by_plugin(failed, plugin) + + context_asset = context.data["assetEntity"]["name"] + for instance in instances: + origin_node = instance[0] + nuke_api.lib.recreate_instance( + origin_node, avalon_data={"asset": context_asset} + ) + + +class ValidateInstanceInContext(pyblish.api.InstancePlugin): + """Validator to check if instance asset match context asset. + + When working in per-shot style you always publish data in context of + current asset (shot). This validator checks if this is so. It is optional + so it can be disabled when needed. + + Action on this validator will select invalid instances in Outliner. + """ + + order = openpype.api.ValidateContentsOrder + label = "Instance in same Context" + hosts = ["nuke"] + actions = [SelectInvalidInstances, RepairSelectInvalidInstances] + optional = True + + def process(self, instance): + asset = instance.data.get("asset") + context_asset = instance.context.data["assetEntity"]["name"] + msg = "{} has asset {}".format(instance.name, asset) + assert asset == context_asset, msg diff --git a/openpype/hosts/nuke/plugins/publish/validate_rendered_frames.py b/openpype/hosts/nuke/plugins/publish/validate_rendered_frames.py index 0c88014649..29faf867d2 100644 --- a/openpype/hosts/nuke/plugins/publish/validate_rendered_frames.py +++ b/openpype/hosts/nuke/plugins/publish/validate_rendered_frames.py @@ -55,7 +55,7 @@ class ValidateRenderedFrames(pyblish.api.InstancePlugin): """ Validates file output. 
""" order = pyblish.api.ValidatorOrder + 0.1 - families = ["render", "prerender"] + families = ["render", "prerender", "still"] label = "Validate rendered frame" hosts = ["nuke", "nukestudio"] @@ -71,6 +71,9 @@ class ValidateRenderedFrames(pyblish.api.InstancePlugin): self.log.error(msg) raise ValidationException(msg) + if isinstance(repre["files"], str): + return + collections, remainder = clique.assemble(repre["files"]) self.log.info("collections: {}".format(str(collections))) self.log.info("remainder: {}".format(str(remainder))) diff --git a/openpype/hosts/nuke/startup/write_to_read.py b/openpype/hosts/nuke/startup/write_to_read.py index 295a6e3c85..f5cf66b357 100644 --- a/openpype/hosts/nuke/startup/write_to_read.py +++ b/openpype/hosts/nuke/startup/write_to_read.py @@ -9,7 +9,9 @@ SINGLE_FILE_FORMATS = ['avi', 'mp4', 'mxf', 'mov', 'mpg', 'mpeg', 'wmv', 'm4v', 'm2v'] -def evaluate_filepath_new(k_value, k_eval, project_dir, first_frame): +def evaluate_filepath_new( + k_value, k_eval, project_dir, first_frame, allow_relative): + # get combined relative path combined_relative_path = None if k_eval is not None and project_dir is not None: @@ -26,8 +28,9 @@ def evaluate_filepath_new(k_value, k_eval, project_dir, first_frame): combined_relative_path = None try: - k_value = k_value % first_frame - if os.path.exists(k_value): + # k_value = k_value % first_frame + if os.path.isdir(os.path.basename(k_value)): + # doesn't check for file, only parent dir filepath = k_value elif os.path.exists(k_eval): filepath = k_eval @@ -37,10 +40,12 @@ def evaluate_filepath_new(k_value, k_eval, project_dir, first_frame): filepath = os.path.abspath(filepath) except Exception as E: - log.error("Cannot create Read node. Perhaps it needs to be rendered first :) Error: `{}`".format(E)) + log.error("Cannot create Read node. Perhaps it needs to be \ + rendered first :) Error: `{}`".format(E)) return None filepath = filepath.replace('\\', '/') + # assumes last number is a sequence counter current_frame = re.findall(r'\d+', filepath)[-1] padding = len(current_frame) basename = filepath[: filepath.rfind(current_frame)] @@ -51,11 +56,13 @@ def evaluate_filepath_new(k_value, k_eval, project_dir, first_frame): pass else: # Image sequence needs hashes + # to do still with no number not handled filepath = basename + '#' * padding + '.' + filetype # relative path? 
make it relative again - if not isinstance(project_dir, type(None)): - filepath = filepath.replace(project_dir, '.') + if allow_relative: + if (not isinstance(project_dir, type(None))) and project_dir != "": + filepath = filepath.replace(project_dir, '.') # get first and last frame from disk frames = [] @@ -95,41 +102,40 @@ def create_read_node(ndata, comp_start): return -def write_to_read(gn): +def write_to_read(gn, + allow_relative=False): + comp_start = nuke.Root().knob('first_frame').value() - comp_end = nuke.Root().knob('last_frame').value() project_dir = nuke.Root().knob('project_directory').getValue() if not os.path.exists(project_dir): project_dir = nuke.Root().knob('project_directory').evaluate() group_read_nodes = [] - with gn: height = gn.screenHeight() # get group height and position new_xpos = int(gn.knob('xpos').value()) new_ypos = int(gn.knob('ypos').value()) + height + 20 group_writes = [n for n in nuke.allNodes() if n.Class() == "Write"] - print("__ group_writes: {}".format(group_writes)) if group_writes != []: # there can be only 1 write node, taking first n = group_writes[0] if n.knob('file') is not None: - file_path_new = evaluate_filepath_new( + myfile, firstFrame, lastFrame = evaluate_filepath_new( n.knob('file').getValue(), n.knob('file').evaluate(), project_dir, - comp_start + comp_start, + allow_relative ) - if not file_path_new: + if not myfile: return - myfiletranslated, firstFrame, lastFrame = file_path_new # get node data ndata = { - 'filepath': myfiletranslated, - 'firstframe': firstFrame, - 'lastframe': lastFrame, + 'filepath': myfile, + 'firstframe': int(firstFrame), + 'lastframe': int(lastFrame), 'new_xpos': new_xpos, 'new_ypos': new_ypos, 'colorspace': n.knob('colorspace').getValue(), @@ -139,7 +145,6 @@ def write_to_read(gn): } group_read_nodes.append(ndata) - # create reads in one go for oneread in group_read_nodes: # create read node diff --git a/openpype/hosts/tvpaint/api/__init__.py b/openpype/hosts/tvpaint/api/__init__.py index 57a03d38b7..1c50987d6d 100644 --- a/openpype/hosts/tvpaint/api/__init__.py +++ b/openpype/hosts/tvpaint/api/__init__.py @@ -1,6 +1,8 @@ import os import logging +import requests + import avalon.api import pyblish.api from avalon.tvpaint import pipeline @@ -8,6 +10,7 @@ from avalon.tvpaint.communication_server import register_localization_file from .lib import set_context_settings from openpype.hosts import tvpaint +from openpype.api import get_current_project_settings log = logging.getLogger(__name__) @@ -51,6 +54,19 @@ def initial_launch(): set_context_settings() +def application_exit(): + data = get_current_project_settings() + stop_timer = data["tvpaint"]["stop_timer_on_application_exit"] + + if not stop_timer: + return + + # Stop application timer. 
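+    # For illustration, with OPENPYPE_WEBSERVER_URL set to a (hypothetical)
+    # "http://localhost:8079", the POST below goes to
+    # "http://localhost:8079/timers_manager/stop_timer".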
+ webserver_url = os.environ.get("OPENPYPE_WEBSERVER_URL") + rest_api_url = "{}/timers_manager/stop_timer".format(webserver_url) + requests.post(rest_api_url) + + def install(): log.info("OpenPype - Installing TVPaint integration") localization_file = os.path.join(HOST_DIR, "resources", "avalon.loc") @@ -67,6 +83,7 @@ def install(): pyblish.api.register_callback("instanceToggled", on_instance_toggle) avalon.api.on("application.launched", initial_launch) + avalon.api.on("application.exit", application_exit) def uninstall(): diff --git a/openpype/hosts/tvpaint/api/plugin.py b/openpype/hosts/tvpaint/api/plugin.py index 3eb9a5be31..e148e44a27 100644 --- a/openpype/hosts/tvpaint/api/plugin.py +++ b/openpype/hosts/tvpaint/api/plugin.py @@ -3,4 +3,17 @@ from avalon.tvpaint import pipeline class Creator(PypeCreatorMixin, pipeline.Creator): - pass + @classmethod + def get_dynamic_data(cls, *args, **kwargs): + dynamic_data = super(Creator, cls).get_dynamic_data(*args, **kwargs) + + # Change asset and name by current workfile context + workfile_context = pipeline.get_current_workfile_context() + asset_name = workfile_context.get("asset") + task_name = workfile_context.get("task") + if "asset" not in dynamic_data and asset_name: + dynamic_data["asset"] = asset_name + + if "task" not in dynamic_data and task_name: + dynamic_data["task"] = task_name + return dynamic_data diff --git a/openpype/hosts/tvpaint/plugins/publish/extract_sequence.py b/openpype/hosts/tvpaint/plugins/publish/extract_sequence.py index 36f0b0c954..c45ff53c3c 100644 --- a/openpype/hosts/tvpaint/plugins/publish/extract_sequence.py +++ b/openpype/hosts/tvpaint/plugins/publish/extract_sequence.py @@ -606,7 +606,7 @@ class ExtractSequence(pyblish.api.Extractor): self._copy_image(eq_frame_filepath, new_filepath) layer_files_by_frame[frame_idx] = new_filepath - elif pre_behavior == "loop": + elif pre_behavior in ("loop", "repeat"): # Loop backwards from last frame of layer for frame_idx in reversed(range(mark_in_index, frame_start_index)): eq_frame_idx_offset = ( @@ -678,7 +678,7 @@ class ExtractSequence(pyblish.api.Extractor): self._copy_image(eq_frame_filepath, new_filepath) layer_files_by_frame[frame_idx] = new_filepath - elif post_behavior == "loop": + elif post_behavior in ("loop", "repeat"): # Loop backwards from last frame of layer for frame_idx in range(frame_end_index + 1, mark_out_index + 1): eq_frame_idx = frame_idx % frame_count diff --git a/openpype/lib/applications.py b/openpype/lib/applications.py index 245f2ee9a2..cc8cb8e7be 100644 --- a/openpype/lib/applications.py +++ b/openpype/lib/applications.py @@ -1162,8 +1162,12 @@ def prepare_host_environments(data, implementation_envs=True): if final_env is None: final_env = loaded_env + keys_to_remove = set(data["env"].keys()) - set(final_env.keys()) + # Update env data["env"].update(final_env) + for key in keys_to_remove: + data["env"].pop(key, None) def apply_project_environments_value(project_name, env, project_settings=None): @@ -1349,23 +1353,23 @@ def _prepare_last_workfile(data, workdir, workfile_template_key): ) # Last workfile path - last_workfile_path = "" - extensions = avalon.api.HOST_WORKFILE_EXTENSIONS.get( - app.host_name - ) - if extensions: - anatomy = data["anatomy"] - # Find last workfile - file_template = anatomy.templates[workfile_template_key]["file"] - workdir_data.update({ - "version": 1, - "user": get_openpype_username(), - "ext": extensions[0] - }) + last_workfile_path = data.get("last_workfile_path") or "" + if not last_workfile_path: + extensions = 
avalon.api.HOST_WORKFILE_EXTENSIONS.get(app.host_name) - last_workfile_path = avalon.api.last_workfile( - workdir, file_template, workdir_data, extensions, True - ) + if extensions: + anatomy = data["anatomy"] + # Find last workfile + file_template = anatomy.templates["work"]["file"] + workdir_data.update({ + "version": 1, + "user": get_openpype_username(), + "ext": extensions[0] + }) + + last_workfile_path = avalon.api.last_workfile( + workdir, file_template, workdir_data, extensions, True + ) if os.path.exists(last_workfile_path): log.debug(( diff --git a/openpype/lib/delivery.py b/openpype/lib/delivery.py index 943cd9fcaf..5735cbc99d 100644 --- a/openpype/lib/delivery.py +++ b/openpype/lib/delivery.py @@ -1,9 +1,11 @@ """Functions useful for delivery action or loader""" import os import shutil +import glob import clique import collections + def collect_frames(files): """ Returns dict of source path and its frame, if from sequence @@ -228,7 +230,16 @@ def process_sequence( Returns: (collections.defaultdict , int) """ - if not os.path.exists(src_path): + + def hash_path_exist(myPath): + res = myPath.replace('#', '*') + glob_search_results = glob.glob(res) + if len(glob_search_results) > 0: + return True + else: + return False + + if not hash_path_exist(src_path): msg = "{} doesn't exist for {}".format(src_path, repre["_id"]) report_items["Source file was not found"].append(msg) diff --git a/openpype/lib/mongo.py b/openpype/lib/mongo.py index 8bfaba75d6..0fd4517b5b 100644 --- a/openpype/lib/mongo.py +++ b/openpype/lib/mongo.py @@ -3,6 +3,7 @@ import sys import time import logging import pymongo +import certifi if sys.version_info[0] == 2: from urlparse import urlparse, parse_qs @@ -85,12 +86,33 @@ def get_default_components(): return decompose_url(mongo_url) -def extract_port_from_url(url): - parsed_url = urlparse(url) - if parsed_url.scheme is None: - _url = "mongodb://{}".format(url) - parsed_url = urlparse(_url) - return parsed_url.port +def should_add_certificate_path_to_mongo_url(mongo_url): + """Check if should add ca certificate to mongo url. + + Since 30.9.2021 cloud mongo requires newer certificates that are not + available on most of workstation. This adds path to certifi certificate + which is valid for it. To add the certificate path url must have scheme + 'mongodb+srv' or has 'ssl=true' or 'tls=true' in url query. + """ + parsed = urlparse(mongo_url) + query = parse_qs(parsed.query) + lowered_query_keys = set(key.lower() for key in query.keys()) + add_certificate = False + # Check if url 'ssl' or 'tls' are set to 'true' + for key in ("ssl", "tls"): + if key in query and "true" in query["ssl"]: + add_certificate = True + break + + # Check if url contains 'mongodb+srv' + if not add_certificate and parsed.scheme == "mongodb+srv": + add_certificate = True + + # Check if url does already contain certificate path + if add_certificate and "tlscafile" in lowered_query_keys: + add_certificate = False + + return add_certificate def validate_mongo_connection(mongo_uri): @@ -106,26 +128,9 @@ def validate_mongo_connection(mongo_uri): passed so probably couldn't connect to mongo server. """ - parsed = urlparse(mongo_uri) - # Force validation of scheme - if parsed.scheme not in ["mongodb", "mongodb+srv"]: - raise pymongo.errors.InvalidURI(( - "Invalid URI scheme:" - " URI must begin with 'mongodb://' or 'mongodb+srv://'" - )) - # we have mongo connection string. Let's try if we can connect. 
- components = decompose_url(mongo_uri) - mongo_args = { - "host": compose_url(**components), - "serverSelectionTimeoutMS": 1000 - } - port = components.get("port") - if port is not None: - mongo_args["port"] = int(port) - - # Create connection - client = pymongo.MongoClient(**mongo_args) - client.server_info() + client = OpenPypeMongoConnection.create_connection( + mongo_uri, retry_attempts=1 + ) client.close() @@ -151,6 +156,8 @@ class OpenPypeMongoConnection: # Naive validation of existing connection try: connection.server_info() + with connection.start_session(): + pass except Exception: connection = None @@ -162,38 +169,53 @@ class OpenPypeMongoConnection: return connection @classmethod - def create_connection(cls, mongo_url, timeout=None): + def create_connection(cls, mongo_url, timeout=None, retry_attempts=None): + parsed = urlparse(mongo_url) + # Force validation of scheme + if parsed.scheme not in ["mongodb", "mongodb+srv"]: + raise pymongo.errors.InvalidURI(( + "Invalid URI scheme:" + " URI must begin with 'mongodb://' or 'mongodb+srv://'" + )) + if timeout is None: timeout = int(os.environ.get("AVALON_TIMEOUT") or 1000) kwargs = { - "host": mongo_url, "serverSelectionTimeoutMS": timeout } + if should_add_certificate_path_to_mongo_url(mongo_url): + kwargs["ssl_ca_certs"] = certifi.where() - port = extract_port_from_url(mongo_url) - if port is not None: - kwargs["port"] = int(port) + mongo_client = pymongo.MongoClient(mongo_url, **kwargs) - mongo_client = pymongo.MongoClient(**kwargs) + if retry_attempts is None: + retry_attempts = 3 - for _retry in range(3): + elif not retry_attempts: + retry_attempts = 1 + + last_exc = None + valid = False + t1 = time.time() + for attempt in range(1, retry_attempts + 1): try: - t1 = time.time() mongo_client.server_info() - - except Exception: - cls.log.warning("Retrying...") - time.sleep(1) - timeout *= 1.5 - - else: + with mongo_client.start_session(): + pass + valid = True break - else: - raise IOError(( - "ERROR: Couldn't connect to {} in less than {:.3f}ms" - ).format(mongo_url, timeout)) + except Exception as exc: + last_exc = exc + if attempt < retry_attempts: + cls.log.warning( + "Attempt {} failed. Retrying... 
".format(attempt) + ) + time.sleep(1) + + if not valid: + raise last_exc cls.log.info("Connected to {}, delay {:.3f}s".format( mongo_url, time.time() - t1 diff --git a/openpype/modules/default_modules/deadline/plugins/publish/collect_deadline_server_from_instance.py b/openpype/modules/default_modules/deadline/plugins/publish/collect_deadline_server_from_instance.py index 784616615d..1bc4eaa067 100644 --- a/openpype/modules/default_modules/deadline/plugins/publish/collect_deadline_server_from_instance.py +++ b/openpype/modules/default_modules/deadline/plugins/publish/collect_deadline_server_from_instance.py @@ -11,7 +11,7 @@ import pyblish.api class CollectDeadlineServerFromInstance(pyblish.api.InstancePlugin): """Collect Deadline Webservice URL from instance.""" - order = pyblish.api.CollectorOrder + order = pyblish.api.CollectorOrder + 0.02 label = "Deadline Webservice from the Instance" families = ["rendering"] @@ -46,24 +46,25 @@ class CollectDeadlineServerFromInstance(pyblish.api.InstancePlugin): ["deadline"] ) - try: - default_servers = deadline_settings["deadline_urls"] - project_servers = ( - render_instance.context.data - ["project_settings"] - ["deadline"] - ["deadline_servers"] - ) - deadline_servers = { - k: default_servers[k] - for k in project_servers - if k in default_servers - } - - except AttributeError: - # Handle situation were we had only one url for deadline. - return render_instance.context.data["defaultDeadline"] + default_server = render_instance.context.data["defaultDeadline"] + instance_server = render_instance.data.get("deadlineServers") + if not instance_server: + return default_server + default_servers = deadline_settings["deadline_urls"] + project_servers = ( + render_instance.context.data + ["project_settings"] + ["deadline"] + ["deadline_servers"] + ) + deadline_servers = { + k: default_servers[k] + for k in project_servers + if k in default_servers + } + # This is Maya specific and may not reflect real selection of deadline + # url as dictionary keys in Python 2 are not ordered return deadline_servers[ list(deadline_servers.keys())[ int(render_instance.data.get("deadlineServers")) diff --git a/openpype/modules/default_modules/deadline/plugins/publish/submit_maya_deadline.py b/openpype/modules/default_modules/deadline/plugins/publish/submit_maya_deadline.py index 1ab3dc2554..2d43b0d085 100644 --- a/openpype/modules/default_modules/deadline/plugins/publish/submit_maya_deadline.py +++ b/openpype/modules/default_modules/deadline/plugins/publish/submit_maya_deadline.py @@ -351,6 +351,11 @@ class MayaSubmitDeadline(pyblish.api.InstancePlugin): f.replace(orig_scene, new_scene) ) instance.data["expectedFiles"] = [new_exp] + + if instance.data.get("publishRenderMetadataFolder"): + instance.data["publishRenderMetadataFolder"] = \ + instance.data["publishRenderMetadataFolder"].replace( + orig_scene, new_scene) self.log.info("Scene name was switched {} -> {}".format( orig_scene, new_scene )) diff --git a/openpype/modules/default_modules/deadline/plugins/publish/submit_publish_job.py b/openpype/modules/default_modules/deadline/plugins/publish/submit_publish_job.py index 19e3174384..6b07749819 100644 --- a/openpype/modules/default_modules/deadline/plugins/publish/submit_publish_job.py +++ b/openpype/modules/default_modules/deadline/plugins/publish/submit_publish_job.py @@ -385,6 +385,7 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin): """ task = os.environ["AVALON_TASK"] subset = instance_data["subset"] + cameras = instance_data.get("cameras", []) instances 
= [] # go through aovs in expected files for aov, files in exp_files[0].items(): @@ -410,7 +411,11 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin): task[0].upper(), task[1:], subset[0].upper(), subset[1:]) - subset_name = '{}_{}'.format(group_name, aov) + cam = [c for c in cameras if c in col.head] + if cam: + subset_name = '{}_{}_{}'.format(group_name, cam, aov) + else: + subset_name = '{}_{}'.format(group_name, aov) if isinstance(col, (list, tuple)): staging = os.path.dirname(col[0]) diff --git a/openpype/modules/default_modules/ftrack/event_handlers_user/action_create_cust_attrs.py b/openpype/modules/default_modules/ftrack/event_handlers_user/action_create_cust_attrs.py index 3869d8ad08..0bd243ab4c 100644 --- a/openpype/modules/default_modules/ftrack/event_handlers_user/action_create_cust_attrs.py +++ b/openpype/modules/default_modules/ftrack/event_handlers_user/action_create_cust_attrs.py @@ -10,6 +10,7 @@ from openpype_modules.ftrack.lib import ( CUST_ATTR_GROUP, CUST_ATTR_TOOLS, CUST_ATTR_APPLICATIONS, + CUST_ATTR_INTENT, default_custom_attributes_definition, app_definitions_from_app_manager, @@ -431,7 +432,7 @@ class CustomAttributes(BaseAction): intent_custom_attr_data = { "label": "Intent", - "key": "intent", + "key": CUST_ATTR_INTENT, "type": "enumerator", "entity_type": "assetversion", "group": CUST_ATTR_GROUP, diff --git a/openpype/modules/default_modules/ftrack/ftrack_module.py b/openpype/modules/default_modules/ftrack/ftrack_module.py index c73f9b100d..5a1fdbc276 100644 --- a/openpype/modules/default_modules/ftrack/ftrack_module.py +++ b/openpype/modules/default_modules/ftrack/ftrack_module.py @@ -1,7 +1,6 @@ import os import json import collections -import openpype from openpype.modules import OpenPypeModule from openpype_interfaces import ( @@ -230,7 +229,13 @@ class FtrackModule( return import ftrack_api - from openpype_modules.ftrack.lib import get_openpype_attr + from openpype_modules.ftrack.lib import ( + get_openpype_attr, + default_custom_attributes_definition, + CUST_ATTR_TOOLS, + CUST_ATTR_APPLICATIONS, + CUST_ATTR_INTENT + ) try: session = self.create_ftrack_session() @@ -255,6 +260,15 @@ class FtrackModule( project_id = project_entity["id"] + ca_defs = default_custom_attributes_definition() + hierarchical_attrs = ca_defs.get("is_hierarchical") or {} + project_attrs = ca_defs.get("show") or {} + ca_keys = ( + set(hierarchical_attrs.keys()) + | set(project_attrs.keys()) + | {CUST_ATTR_TOOLS, CUST_ATTR_APPLICATIONS, CUST_ATTR_INTENT} + ) + cust_attr, hier_attr = get_openpype_attr(session) cust_attr_by_key = {attr["key"]: attr for attr in cust_attr} hier_attrs_by_key = {attr["key"]: attr for attr in hier_attr} @@ -262,6 +276,9 @@ class FtrackModule( failed = {} missing = {} for key, value in attributes_changes.items(): + if key not in ca_keys: + continue + configuration = hier_attrs_by_key.get(key) if not configuration: configuration = cust_attr_by_key.get(key) @@ -379,3 +396,16 @@ class FtrackModule( def timer_stopped(self): if self._timers_manager_module is not None: self._timers_manager_module.timer_stopped(self.id) + + def get_task_time(self, project_name, asset_name, task_name): + session = self.create_ftrack_session() + query = ( + 'Task where name is "{}"' + ' and parent.name is "{}"' + ' and project.full_name is "{}"' + ).format(task_name, asset_name, project_name) + task_entity = session.query(query).first() + if not task_entity: + return 0 + hours_logged = (task_entity["time_logged"] / 60) / 60 + return hours_logged diff --git 
a/openpype/modules/default_modules/ftrack/ftrack_server/event_server_cli.py b/openpype/modules/default_modules/ftrack/ftrack_server/event_server_cli.py index 075694d8f6..1a76905b38 100644 --- a/openpype/modules/default_modules/ftrack/ftrack_server/event_server_cli.py +++ b/openpype/modules/default_modules/ftrack/ftrack_server/event_server_cli.py @@ -17,7 +17,8 @@ from openpype.lib import ( get_pype_execute_args, OpenPypeMongoConnection, get_openpype_version, - get_build_version + get_build_version, + validate_mongo_connection ) from openpype_modules.ftrack import FTRACK_MODULE_DIR from openpype_modules.ftrack.lib import credentials @@ -36,11 +37,15 @@ class MongoPermissionsError(Exception): def check_mongo_url(mongo_uri, log_error=False): """Checks if mongo server is responding""" try: - client = pymongo.MongoClient(mongo_uri) - # Force connection on a request as the connect=True parameter of - # MongoClient seems to be useless here - client.server_info() - client.close() + validate_mongo_connection(mongo_uri) + + except pymongo.errors.InvalidURI as err: + if log_error: + print("Can't connect to MongoDB at {} because: {}".format( + mongo_uri, err + )) + return False + except pymongo.errors.ServerSelectionTimeoutError as err: if log_error: print("Can't connect to MongoDB at {} because: {}".format( diff --git a/openpype/modules/default_modules/ftrack/lib/__init__.py b/openpype/modules/default_modules/ftrack/lib/__init__.py index 433a1f7881..80b4db9dd6 100644 --- a/openpype/modules/default_modules/ftrack/lib/__init__.py +++ b/openpype/modules/default_modules/ftrack/lib/__init__.py @@ -3,7 +3,8 @@ from .constants import ( CUST_ATTR_AUTO_SYNC, CUST_ATTR_GROUP, CUST_ATTR_TOOLS, - CUST_ATTR_APPLICATIONS + CUST_ATTR_APPLICATIONS, + CUST_ATTR_INTENT ) from .settings import ( get_ftrack_event_mongo_info diff --git a/openpype/modules/default_modules/ftrack/lib/constants.py b/openpype/modules/default_modules/ftrack/lib/constants.py index 73d5112e6d..e6e2013d2b 100644 --- a/openpype/modules/default_modules/ftrack/lib/constants.py +++ b/openpype/modules/default_modules/ftrack/lib/constants.py @@ -10,3 +10,5 @@ CUST_ATTR_AUTO_SYNC = "avalon_auto_sync" CUST_ATTR_APPLICATIONS = "applications" # Environment tools custom attribute CUST_ATTR_TOOLS = "tools_env" +# Intent custom attribute name +CUST_ATTR_INTENT = "intent" diff --git a/openpype/modules/default_modules/idle_manager/__init__.py b/openpype/modules/default_modules/idle_manager/__init__.py deleted file mode 100644 index 9d6e10bf39..0000000000 --- a/openpype/modules/default_modules/idle_manager/__init__.py +++ /dev/null @@ -1,8 +0,0 @@ -from .idle_module import ( - IdleManager -) - - -__all__ = ( - "IdleManager", -) diff --git a/openpype/modules/default_modules/idle_manager/idle_module.py b/openpype/modules/default_modules/idle_manager/idle_module.py deleted file mode 100644 index 1a6d71a961..0000000000 --- a/openpype/modules/default_modules/idle_manager/idle_module.py +++ /dev/null @@ -1,79 +0,0 @@ -import platform -import collections - -from openpype.modules import OpenPypeModule -from openpype_interfaces import ( - ITrayService, - IIdleManager -) - - -class IdleManager(OpenPypeModule, ITrayService): - """ Measure user's idle time in seconds. - Idle time resets on keyboard/mouse input. - Is able to emit signals at specific time idle. 
- """ - label = "Idle Service" - name = "idle_manager" - - def initialize(self, module_settings): - enabled = True - # Ignore on MacOs - # - pynput need root permissions and enabled access for application - if platform.system().lower() == "darwin": - enabled = False - self.enabled = enabled - - self.time_callbacks = collections.defaultdict(list) - self.idle_thread = None - - def tray_init(self): - return - - def tray_start(self): - if self.time_callbacks: - self.start_thread() - - def tray_exit(self): - self.stop_thread() - try: - self.time_callbacks = {} - except Exception: - pass - - def connect_with_modules(self, enabled_modules): - for module in enabled_modules: - if not isinstance(module, IIdleManager): - continue - - module.idle_manager = self - callbacks_items = module.callbacks_by_idle_time() or {} - for emit_time, callbacks in callbacks_items.items(): - if not isinstance(callbacks, (tuple, list, set)): - callbacks = [callbacks] - self.time_callbacks[emit_time].extend(callbacks) - - @property - def idle_time(self): - if self.idle_thread and self.idle_thread.is_running: - return self.idle_thread.idle_time - - def _create_thread(self): - from .idle_threads import IdleManagerThread - - return IdleManagerThread(self) - - def start_thread(self): - if self.idle_thread: - self.idle_thread.stop() - self.idle_thread.join() - self.idle_thread = self._create_thread() - self.idle_thread.start() - - def stop_thread(self): - if self.idle_thread: - self.idle_thread.stop() - self.idle_thread.join() - - def on_thread_stop(self): - self.set_service_failed_icon() diff --git a/openpype/modules/default_modules/idle_manager/idle_threads.py b/openpype/modules/default_modules/idle_manager/idle_threads.py deleted file mode 100644 index f19feddb77..0000000000 --- a/openpype/modules/default_modules/idle_manager/idle_threads.py +++ /dev/null @@ -1,97 +0,0 @@ -import time -import threading - -from pynput import mouse, keyboard - -from openpype.lib import PypeLogger - - -class MouseThread(mouse.Listener): - """Listens user's mouse movement.""" - - def __init__(self, callback): - super(MouseThread, self).__init__(on_move=self.on_move) - self.callback = callback - - def on_move(self, posx, posy): - self.callback() - - -class KeyboardThread(keyboard.Listener): - """Listens user's keyboard input.""" - - def __init__(self, callback): - super(KeyboardThread, self).__init__(on_press=self.on_press) - - self.callback = callback - - def on_press(self, key): - self.callback() - - -class IdleManagerThread(threading.Thread): - def __init__(self, module, *args, **kwargs): - super(IdleManagerThread, self).__init__(*args, **kwargs) - self.log = PypeLogger.get_logger(self.__class__.__name__) - self.module = module - self.threads = [] - self.is_running = False - self.idle_time = 0 - - def stop(self): - self.is_running = False - - def reset_time(self): - self.idle_time = 0 - - @property - def time_callbacks(self): - return self.module.time_callbacks - - def on_stop(self): - self.is_running = False - self.log.info("IdleManagerThread has stopped") - self.module.on_thread_stop() - - def run(self): - self.log.info("IdleManagerThread has started") - self.is_running = True - thread_mouse = MouseThread(self.reset_time) - thread_keyboard = KeyboardThread(self.reset_time) - thread_mouse.start() - thread_keyboard.start() - try: - while self.is_running: - if self.idle_time in self.time_callbacks: - for callback in self.time_callbacks[self.idle_time]: - thread = threading.Thread(target=callback) - thread.start() - self.threads.append(thread) 
- - for thread in tuple(self.threads): - if not thread.isAlive(): - thread.join() - self.threads.remove(thread) - - self.idle_time += 1 - time.sleep(1) - - except Exception: - self.log.warning( - 'Idle Manager service has failed', exc_info=True - ) - - # Threads don't have their attrs when Qt application already finished - try: - thread_mouse.stop() - thread_mouse.join() - except AttributeError: - pass - - try: - thread_keyboard.stop() - thread_keyboard.join() - except AttributeError: - pass - - self.on_stop() diff --git a/openpype/modules/default_modules/idle_manager/interfaces.py b/openpype/modules/default_modules/idle_manager/interfaces.py deleted file mode 100644 index 71cd17a64a..0000000000 --- a/openpype/modules/default_modules/idle_manager/interfaces.py +++ /dev/null @@ -1,26 +0,0 @@ -from abc import abstractmethod -from openpype.modules import OpenPypeInterface - - -class IIdleManager(OpenPypeInterface): - """Other modules interface to return callbacks by idle time in seconds. - - Expected output is dictionary with seconds as keys and callback/s - as value, value may be callback of list of callbacks. - EXAMPLE: - ``` - { - 60: self.on_minute_idle - } - ``` - """ - idle_manager = None - - @abstractmethod - def callbacks_by_idle_time(self): - pass - - @property - def idle_time(self): - if self.idle_manager: - return self.idle_manager.idle_time diff --git a/openpype/modules/default_modules/sync_server/providers/abstract_provider.py b/openpype/modules/default_modules/sync_server/providers/abstract_provider.py index 7fd25b9852..8ae0ceed79 100644 --- a/openpype/modules/default_modules/sync_server/providers/abstract_provider.py +++ b/openpype/modules/default_modules/sync_server/providers/abstract_provider.py @@ -80,7 +80,8 @@ class AbstractProvider: representation (dict): complete repre containing 'file' site (str): site name Returns: - (string) file_id of created file, raises exception + (string) file_id of created/modified file , + throws FileExistsError, FileNotFoundError exceptions """ pass @@ -103,7 +104,8 @@ class AbstractProvider: representation (dict): complete repre containing 'file' site (str): site name Returns: - None + (string) file_id of created/modified file , + throws FileExistsError, FileNotFoundError exceptions """ pass diff --git a/openpype/modules/default_modules/sync_server/providers/dropbox.py b/openpype/modules/default_modules/sync_server/providers/dropbox.py new file mode 100644 index 0000000000..0d735a0b59 --- /dev/null +++ b/openpype/modules/default_modules/sync_server/providers/dropbox.py @@ -0,0 +1,423 @@ +import os + +import dropbox + +from openpype.api import Logger +from .abstract_provider import AbstractProvider +from ..utils import EditableScopes + +log = Logger().get_logger("SyncServer") + + +class DropboxHandler(AbstractProvider): + CODE = 'dropbox' + LABEL = 'Dropbox' + + def __init__(self, project_name, site_name, tree=None, presets=None): + self.active = False + self.site_name = site_name + self.presets = presets + + if not self.presets: + log.info( + "Sync Server: There are no presets for {}.".format(site_name) + ) + return + + provider_presets = self.presets.get(self.CODE) + if not provider_presets: + msg = "Sync Server: No provider presets for {}".format(self.CODE) + log.info(msg) + return + + token = self.presets[self.CODE].get("token", "") + if not token: + msg = "Sync Server: No access token for dropbox provider" + log.info(msg) + return + + team_folder_name = self.presets[self.CODE].get("team_folder_name", "") + if not team_folder_name: + msg = 
"Sync Server: No team folder name for dropbox provider" + log.info(msg) + return + + acting_as_member = self.presets[self.CODE].get("acting_as_member", "") + if not acting_as_member: + msg = ( + "Sync Server: No acting member for dropbox provider" + ) + log.info(msg) + return + + self.dbx = None + try: + self.dbx = self._get_service( + token, acting_as_member, team_folder_name + ) + except Exception as e: + log.info("Could not establish dropbox object: {}".format(e)) + return + + super(AbstractProvider, self).__init__() + + @classmethod + def get_system_settings_schema(cls): + """ + Returns dict for editable properties on system settings level + + + Returns: + (list) of dict + """ + return [] + + @classmethod + def get_project_settings_schema(cls): + """ + Returns dict for editable properties on project settings level + + + Returns: + (list) of dict + """ + # {platform} tells that value is multiplatform and only specific OS + # should be returned + return [ + { + "type": "text", + "key": "token", + "label": "Access Token" + }, + { + "type": "text", + "key": "team_folder_name", + "label": "Team Folder Name" + }, + { + "type": "text", + "key": "acting_as_member", + "label": "Acting As Member" + }, + # roots could be overriden only on Project level, User cannot + { + 'key': "roots", + 'label': "Roots", + 'type': 'dict' + } + ] + + @classmethod + def get_local_settings_schema(cls): + """ + Returns dict for editable properties on local settings level + + + Returns: + (dict) + """ + return [] + + def _get_service(self, token, acting_as_member, team_folder_name): + dbx = dropbox.DropboxTeam(token) + + # Getting member id. + member_id = None + member_names = [] + for member in dbx.team_members_list().members: + member_names.append(member.profile.name.display_name) + if member.profile.name.display_name == acting_as_member: + member_id = member.profile.team_member_id + + if member_id is None: + raise ValueError( + "Could not find member \"{}\". Available members: {}".format( + acting_as_member, member_names + ) + ) + + # Getting team folder id. + team_folder_id = None + team_folder_names = [] + for entry in dbx.team_team_folder_list().team_folders: + team_folder_names.append(entry.name) + if entry.name == team_folder_name: + team_folder_id = entry.team_folder_id + + if team_folder_id is None: + raise ValueError( + "Could not find team folder \"{}\". Available folders: " + "{}".format( + team_folder_name, team_folder_names + ) + ) + + # Establish dropbox object. + path_root = dropbox.common.PathRoot.namespace_id(team_folder_id) + return dropbox.DropboxTeam( + token + ).with_path_root(path_root).as_user(member_id) + + def is_active(self): + """ + Returns True if provider is activated, eg. has working credentials. 
+ Returns: + (boolean) + """ + return self.dbx is not None + + @classmethod + def get_configurable_items(cls): + """ + Returns filtered dict of editable properties + + + Returns: + (dict) + """ + editable = { + 'token': { + 'scope': [EditableScopes.PROJECT], + 'label': "Access Token", + 'type': 'text', + 'namespace': ( + '{project_settings}/global/sync_server/sites/{site}/token' + ) + }, + 'team_folder_name': { + 'scope': [EditableScopes.PROJECT], + 'label': "Team Folder Name", + 'type': 'text', + 'namespace': ( + '{project_settings}/global/sync_server/sites/{site}' + '/team_folder_name' + ) + }, + 'acting_as_member': { + 'scope': [EditableScopes.PROJECT, EditableScopes.LOCAL], + 'label': "Acting As Member", + 'type': 'text', + 'namespace': ( + '{project_settings}/global/sync_server/sites/{site}' + '/acting_as_member' + ) + } + } + return editable + + def _path_exists(self, path): + try: + entries = self.dbx.files_list_folder( + path=os.path.dirname(path) + ).entries + except dropbox.exceptions.ApiError: + return False + + for entry in entries: + if entry.name == os.path.basename(path): + return True + + return False + + def upload_file(self, source_path, path, + server, collection, file, representation, site, + overwrite=False): + """ + Copy file from 'source_path' to 'target_path' on provider. + Use 'overwrite' boolean to rewrite existing file on provider + + Args: + source_path (string): + path (string): absolute path with or without name of the file + overwrite (boolean): replace existing file + + arguments for saving progress: + server (SyncServer): server instance to call update_db on + collection (str): name of collection + file (dict): info about uploaded file (matches structure from db) + representation (dict): complete repre containing 'file' + site (str): site name + Returns: + (string) file_id of created file, raises exception + """ + # Check source path. + if not os.path.exists(source_path): + raise FileNotFoundError( + "Source file {} doesn't exist.".format(source_path) + ) + + if self._path_exists(path) and not overwrite: + raise FileExistsError( + "File already exists, use 'overwrite' argument" + ) + + mode = dropbox.files.WriteMode("add", None) + if overwrite: + mode = dropbox.files.WriteMode.overwrite + + with open(source_path, "rb") as f: + self.dbx.files_upload(f.read(), path, mode=mode) + + server.update_db( + collection=collection, + new_file_id=None, + file=file, + representation=representation, + site=site, + progress=100 + ) + + return path + + def download_file(self, source_path, local_path, + server, collection, file, representation, site, + overwrite=False): + """ + Download file from provider into local system + + Args: + source_path (string): absolute path on provider + local_path (string): absolute path with or without name of the file + overwrite (boolean): replace existing file + + arguments for saving progress: + server (SyncServer): server instance to call update_db on + collection (str): name of collection + file (dict): info about uploaded file (matches structure from db) + representation (dict): complete repre containing 'file' + site (str): site name + Returns: + None + """ + # Check source path. 
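+        # Note: _path_exists() above only lists the parent folder through
+        # files_list_folder() and compares entry names, so the check works
+        # without downloading anything.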
+        if not self._path_exists(source_path):
+            raise FileNotFoundError(
+                "Source file {} doesn't exist.".format(source_path)
+            )
+
+        if os.path.exists(local_path) and not overwrite:
+            raise FileExistsError(
+                "File already exists, use 'overwrite' argument"
+            )
+
+        if os.path.exists(local_path) and overwrite:
+            os.unlink(local_path)
+
+        self.dbx.files_download_to_file(local_path, source_path)
+
+        server.update_db(
+            collection=collection,
+            new_file_id=None,
+            file=file,
+            representation=representation,
+            site=site,
+            progress=100
+        )
+
+        return os.path.basename(source_path)
+
+    def delete_file(self, path):
+        """
+        Deletes file from 'path'. Expects path to specific file.
+
+        Args:
+            path (string): absolute path to particular file
+
+        Returns:
+            None
+        """
+        if not self._path_exists(path):
+            raise FileNotFoundError("File {} doesn't exist".format(path))
+
+        self.dbx.files_delete(path)
+
+    def list_folder(self, folder_path):
+        """
+        List all files and subfolders of particular path non-recursively.
+        Args:
+            folder_path (string): absolute path on provider
+
+        Returns:
+            (list)
+        """
+        if not self._path_exists(folder_path):
+            raise FileNotFoundError(
+                "Folder \"{}\" does not exist".format(folder_path)
+            )
+
+        entry_names = []
+        for entry in self.dbx.files_list_folder(path=folder_path).entries:
+            entry_names.append(entry.name)
+        return entry_names
+
+    def create_folder(self, folder_path):
+        """
+        Create all nonexistent folders and subfolders in 'folder_path'.
+
+        Args:
+            folder_path (string): absolute path
+
+        Returns:
+            (string) 'folder_path' of the created (or existing) folder
+        """
+        if self._path_exists(folder_path):
+            return folder_path
+
+        self.dbx.files_create_folder_v2(folder_path)
+
+        return folder_path
+
+    def get_tree(self):
+        """
+        Creates folder structure for providers which do not provide
+        tree folder structure (GDrive has no accessible tree structure,
+        only parents and their parents)
+        """
+        pass
+
+    def get_roots_config(self, anatomy=None):
+        """
+        Returns root values for path resolving
+
+        Takes value from Anatomy which takes values from Settings
+        overridden by Local Settings
+
+        Returns:
+            (dict) - {"root": {"root": "/My Drive"}}
+                     OR
+                     {"root": {"root_ONE": "value", "root_TWO": "value"}}
+        Format is important for usage of Python's format(**) approach
+        """
+        return self.presets['root']
+
+    def resolve_path(self, path, root_config=None, anatomy=None):
+        """
+        Replaces all root placeholders with proper values
+
+        Args:
+            path (string): root[work]/folder...
+            root_config (dict): {'work': "c:/..."...}
+            anatomy (Anatomy): object of Anatomy
+        Returns:
+            (string): path with resolved roots
+        """
+        if not root_config:
+            root_config = self.get_roots_config(anatomy)
+
+        if root_config and not root_config.get("root"):
+            root_config = {"root": root_config}
+
+        try:
+            if not root_config:
+                raise KeyError
+
+            path = path.format(**root_config)
+        except KeyError:
+            try:
+                path = anatomy.fill_root(path)
+            except KeyError:
+                msg = "Error in resolving local root from anatomy"
+                log.error(msg)
+                raise ValueError(msg)
+
+        return path
diff --git a/openpype/modules/default_modules/sync_server/providers/gdrive.py b/openpype/modules/default_modules/sync_server/providers/gdrive.py
index f1ec0b6a0d..0aabd9fbcd 100644
--- a/openpype/modules/default_modules/sync_server/providers/gdrive.py
+++ b/openpype/modules/default_modules/sync_server/providers/gdrive.py
@@ -61,7 +61,6 @@ class GDriveHandler(AbstractProvider):
     CHUNK_SIZE = 2097152  # must be divisible by 256! used for upload chunks
 
     def __init__(self, project_name, site_name, tree=None, presets=None):
-        self.presets = None
         self.active = False
         self.project_name = project_name
         self.site_name = site_name
@@ -74,7 +73,13 @@ class GDriveHandler(AbstractProvider):
                         format(site_name))
             return
 
-        cred_path = self.presets.get("credentials_url", {}).\
+        provider_presets = self.presets.get(self.CODE)
+        if not provider_presets:
+            msg = "Sync Server: No provider presets for {}".format(self.CODE)
+            log.info(msg)
+            return
+
+        cred_path = provider_presets.get("credentials_url", {}).\
             get(platform.system().lower()) or ''
         if not os.path.exists(cred_path):
             msg = "Sync Server: No credentials for gdrive provider " + \
diff --git a/openpype/modules/default_modules/sync_server/providers/lib.py b/openpype/modules/default_modules/sync_server/providers/lib.py
index 463e49dd4d..3daee366cf 100644
--- a/openpype/modules/default_modules/sync_server/providers/lib.py
+++ b/openpype/modules/default_modules/sync_server/providers/lib.py
@@ -1,5 +1,7 @@
 from .gdrive import GDriveHandler
+from .dropbox import DropboxHandler
 from .local_drive import LocalDriveHandler
+from .sftp import SFTPHandler
 
 
 class ProviderFactory:
@@ -111,4 +113,6 @@ factory = ProviderFactory()
 # 7 denotes number of files that could be synced in single loop - learned by
 # trial and error
 factory.register_provider(GDriveHandler.CODE, GDriveHandler, 7)
+factory.register_provider(DropboxHandler.CODE, DropboxHandler, 10)
 factory.register_provider(LocalDriveHandler.CODE, LocalDriveHandler, 50)
+factory.register_provider(SFTPHandler.CODE, SFTPHandler, 20)
diff --git a/openpype/modules/default_modules/sync_server/providers/resources/dropbox.png b/openpype/modules/default_modules/sync_server/providers/resources/dropbox.png
new file mode 100644
index 0000000000..6f56e3335b
Binary files /dev/null and b/openpype/modules/default_modules/sync_server/providers/resources/dropbox.png differ
diff --git a/openpype/modules/default_modules/sync_server/providers/resources/sftp.png b/openpype/modules/default_modules/sync_server/providers/resources/sftp.png
new file mode 100644
index 0000000000..56c7a5cca3
Binary files /dev/null and b/openpype/modules/default_modules/sync_server/providers/resources/sftp.png differ
diff --git a/openpype/modules/default_modules/sync_server/providers/sftp.py b/openpype/modules/default_modules/sync_server/providers/sftp.py
new file mode 100644
index 0000000000..07450265e2
--- /dev/null
+++ b/openpype/modules/default_modules/sync_server/providers/sftp.py
@@ -0,0 +1,461 @@
+import os
+import os.path
+import time
+import sys
+import six
+import threading
+import platform
+
+from openpype.api import Logger
+from openpype.api import get_system_settings
+from .abstract_provider import AbstractProvider
+log = Logger().get_logger("SyncServer")
+
+pysftp = None
+try:
+    import pysftp
+except (ImportError, SyntaxError):
+    # Handle import failure on Python 2 hosts - only basic methods are used
+    # there, operations that need pysftp will fail.
+    log.warning("pysftp import failed, SFTP operations will not work.")
+
+
+class SFTPHandler(AbstractProvider):
+    """
+    Implementation of SFTP API.
+
+    Authentication can be done in two ways:
+        - user and password
+        - ssh key file for user (optionally password for ssh key)
+
+    Settings can be overridden per project.
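+
+    Example preset values (illustrative only, not real credentials;
+    'sftp_key' is a per-platform dict, see _get_conn below):
+        {
+            "sftp_host": "sftp.example.com",
+            "sftp_port": 22,
+            "sftp_user": "artist",
+            "sftp_pass": "",
+            "sftp_key": {"windows": "C:/keys/id_rsa.pem",
+                         "linux": "/home/artist/.ssh/id_rsa.pem",
+                         "darwin": "/Users/artist/.ssh/id_rsa.pem"},
+            "sftp_key_pass": ""
+        }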
+ + """ + CODE = 'sftp' + LABEL = 'SFTP' + + def __init__(self, project_name, site_name, tree=None, presets=None): + self.presets = None + self.active = False + self.project_name = project_name + self.site_name = site_name + self.root = None + self._conn = None + + self.presets = presets + if not self.presets: + log.warning("Sync Server: There are no presets for {}.". + format(site_name)) + return + + provider_presets = self.presets.get(self.CODE) + if not provider_presets: + msg = "Sync Server: No provider presets for {}".format(self.CODE) + log.warning(msg) + return + + # store to instance for reconnect + self.sftp_host = provider_presets["sftp_host"] + self.sftp_port = provider_presets["sftp_port"] + self.sftp_user = provider_presets["sftp_user"] + self.sftp_pass = provider_presets["sftp_pass"] + self.sftp_key = provider_presets["sftp_key"] + self.sftp_key_pass = provider_presets["sftp_key_pass"] + + self._tree = None + self.active = True + + @property + def conn(self): + """SFTP connection, cannot be used in all places though.""" + if not self._conn: + self._conn = self._get_conn() + + return self._conn + + def is_active(self): + """ + Returns True if provider is activated, eg. has working credentials. + Returns: + (boolean) + """ + return self.conn is not None + + @classmethod + def get_system_settings_schema(cls): + """ + Returns dict for editable properties on system settings level + + + Returns: + (list) of dict + """ + return [] + + @classmethod + def get_project_settings_schema(cls): + """ + Returns dict for editable properties on project settings level + + Currently not implemented in Settings yet! + + Returns: + (list) of dict + """ + # {platform} tells that value is multiplatform and only specific OS + # should be returned + editable = [ + # credentials could be overriden on Project or User level + { + 'key': "sftp_server", + 'label': "SFTP host name", + 'type': 'text' + }, + { + "type": "number", + "key": "sftp_port", + "label": "SFTP port" + }, + { + 'key': "sftp_user", + 'label': "SFTP user name", + 'type': 'text' + }, + { + 'key': "sftp_pass", + 'label': "SFTP password", + 'type': 'text' + }, + { + 'key': "sftp_key", + 'label': "SFTP user ssh key", + 'type': 'path' + }, + { + 'key': "sftp_key_pass", + 'label': "SFTP user ssh key password", + 'type': 'text' + }, + # roots could be overriden only on Project leve, User cannot + { + 'key': "roots", + 'label': "Roots", + 'type': 'dict' + } + ] + return editable + + @classmethod + def get_local_settings_schema(cls): + """ + Returns dict for editable properties on local settings level + + Currently not implemented in Settings yet! 
+
+        Returns:
+            (list) of dict
+        """
+        editable = [
+            # credentials could be overridden on Project or User level
+            {
+                'key': "sftp_user",
+                'label': "SFTP user name",
+                'type': 'text'
+            },
+            {
+                'key': "sftp_pass",
+                'label': "SFTP password",
+                'type': 'text'
+            },
+            {
+                'key': "sftp_key",
+                'label': "SFTP user ssh key",
+                'type': 'path'
+            },
+            {
+                'key': "sftp_key_pass",
+                'label': "SFTP user ssh key password",
+                'type': 'text'
+            }
+        ]
+        return editable
+
+    def get_roots_config(self, anatomy=None):
+        """
+        Returns root values for path resolving
+
+        Uses only Settings values as SFTP roots cannot be modified
+        by Local Settings
+
+        Returns:
+            (dict) - {"root": {"root": "/My Drive"}}
+                     OR
+                     {"root": {"root_ONE": "value", "root_TWO": "value"}}
+        Format is important for usage of Python's format(**) approach
+        """
+        # roots cannot be locally overridden
+        return self.presets['root']
+
+    def get_tree(self):
+        """
+        Building of the folder tree could be potentially expensive,
+        constructor provides argument that could inject previously created
+        tree.
+        Tree structure must be handled in thread safe fashion!
+        Returns:
+            (dictionary) - url to id mapping
+        """
+        # not needed in this provider
+        pass
+
+    def create_folder(self, path):
+        """
+        Create all nonexistent folders and subfolders in 'path'.
+
+        Args:
+            path (string): absolute path on provider,
+                without filename
+        Returns:
+            (string) name of lowest subfolder from 'path'
+        """
+        self.conn.makedirs(path)
+
+        return os.path.basename(path)
+
+    def upload_file(self, source_path, target_path,
+                    server, collection, file, representation, site,
+                    overwrite=False):
+        """
+        Uploads single file from 'source_path' to destination 'target_path'.
+        It creates all folders on the path if they do not exist.
+
+        Args:
+            source_path (string): absolute path to the local source file
+            target_path (string): absolute path with or without name of a file
+            overwrite (boolean): replace existing file
+
+            arguments for saving progress:
+            server (SyncServer): server instance to call update_db on
+            collection (str): name of collection
+            file (dict): info about uploaded file (matches structure from db)
+            representation (dict): complete repre containing 'file'
+            site (str): site name
+
+        Returns:
+            (string) file name of created/modified file,
+                throws FileExistsError, FileNotFoundError exceptions
+        """
+        if not os.path.isfile(source_path):
+            raise FileNotFoundError("Source file {} doesn't exist."
+                                    .format(source_path))
+
+        if self.file_path_exists(target_path):
+            if not overwrite:
+                raise FileExistsError("File {} exists, set overwrite".
+                                      format(target_path))
+
+        thread = threading.Thread(target=self._upload,
+                                  args=(source_path, target_path))
+        thread.start()
+        self._mark_progress(collection, file, representation, server,
+                            site, source_path, target_path, "upload")
+
+        return os.path.basename(target_path)
+
+    def _upload(self, source_path, target_path):
+        log.debug("copying {}->{}".format(source_path, target_path))
+        conn = self._get_conn()
+        conn.put(source_path, target_path)
+
+    def download_file(self, source_path, target_path,
+                      server, collection, file, representation, site,
+                      overwrite=False):
+        """
+        Downloads single file from 'source_path' (remote) to 'target_path'.
+        It creates all folders on the 'target_path' if they do not exist.
+        By default, an existing file on 'target_path' triggers an exception
+
+        Args:
+            source_path (string): absolute path on provider
+            target_path (string): absolute path with or without name of a file
+            overwrite (boolean): replace existing file
+
+            arguments for saving progress:
+            server (SyncServer): server instance to call update_db on
+            collection (str): name of collection
+            file (dict): info about uploaded file (matches structure from db)
+            representation (dict): complete repre containing 'file'
+            site (str): site name
+
+        Returns:
+            (string) file name of created/modified file,
+                throws FileExistsError, FileNotFoundError exceptions
+        """
+        if not self.file_path_exists(source_path):
+            raise FileNotFoundError("Source file {} doesn't exist."
+                                    .format(source_path))
+
+        if os.path.isfile(target_path):
+            if not overwrite:
+                raise FileExistsError("File {} exists, set overwrite".
+                                      format(target_path))
+
+        thread = threading.Thread(target=self._download,
+                                  args=(source_path, target_path))
+        thread.start()
+        self._mark_progress(collection, file, representation, server,
+                            site, source_path, target_path, "download")
+
+        return os.path.basename(target_path)
+
+    def _download(self, source_path, target_path):
+        log.debug("downloading {}->{}".format(source_path, target_path))
+        conn = self._get_conn()
+        conn.get(source_path, target_path)
+
+    def delete_file(self, path):
+        """
+        Deletes file from 'path'. Expects path to specific file.
+
+        Args:
+            path: absolute path to particular file
+
+        Returns:
+            None
+        """
+        if not self.file_path_exists(path):
+            raise FileNotFoundError("File {} to be deleted doesn't exist."
+                                    .format(path))
+
+        self.conn.remove(path)
+
+    def list_folder(self, folder_path):
+        """
+        List all files and subfolders of particular path non-recursively.
+
+        Args:
+            folder_path (string): absolute path on provider
+        Returns:
+            (list)
+        """
+        return self.conn.listdir(folder_path)
+
+    def folder_path_exists(self, file_path):
+        """
+        Checks if the folder from 'file_path' exists.
+        Args:
+            file_path (string): path with / as a separator
+        Returns:
+            (boolean)
+        """
+        if not file_path:
+            return False
+
+        return self.conn.isdir(file_path)
+
+    def file_path_exists(self, file_path):
+        """
+        Checks if 'file_path' exists on the provider
+
+        Args:
+            file_path (string): separated by '/', from root, with file name
+        Returns:
+            (boolean) True if found, False otherwise
+        """
+        if not file_path:
+            return False
+
+        return self.conn.isfile(file_path)
+
+    @classmethod
+    def get_presets(cls):
+        """
+        Get presets for this provider
+        Returns:
+            (dictionary) of configured sites
+        """
+        provider_presets = None
+        try:
+            provider_presets = (
+                get_system_settings()["modules"]
+                ["sync_server"]
+                ["providers"]
+                ["sftp"]
+            )
+        except KeyError:
+            log.info("Sync Server: There are no presets for SFTP provider.")
+            return
+        return provider_presets
+
+    def _get_conn(self):
+        """
+        Returns fresh sftp connection.
+
+        It seems that connection cannot be cached into self.conn, at least
+        for get and put which run in separate threads.
+
+        Returns:
+            pysftp.Connection
+        """
+        if not pysftp:
+            raise ImportError("pysftp is not available, cannot connect")
+
+        cnopts = pysftp.CnOpts()
+        cnopts.hostkeys = None
+
+        conn_params = {
+            'host': self.sftp_host,
+            'port': self.sftp_port,
+            'username': self.sftp_user,
+            'cnopts': cnopts
+        }
+        if self.sftp_pass and self.sftp_pass.strip():
+            conn_params['password'] = self.sftp_pass
+        if self.sftp_key:  # expects .pem format, not .ppk!
+            conn_params['private_key'] = \
+                self.sftp_key[platform.system().lower()]
+        if self.sftp_key_pass:
+            conn_params['private_key_pass'] = self.sftp_key_pass
+
+        return pysftp.Connection(**conn_params)
+
+    def _mark_progress(self, collection, file, representation, server, site,
+                       source_path, target_path, direction):
+        """
+        Updates progress field in DB by values 0-1.
+
+        Compares file sizes of source and target.
+        """
+        if direction == "upload":
+            source_file_size = os.path.getsize(source_path)
+        else:
+            source_file_size = self.conn.stat(source_path).st_size
+
+        target_file_size = 0
+        last_tick = status_val = None
+        while source_file_size != target_file_size:
+            if not last_tick or \
+                    time.time() - last_tick >= server.LOG_PROGRESS_SEC:
+                status_val = target_file_size / source_file_size
+                last_tick = time.time()
+                log.debug(direction + "ed %d%%." % int(status_val * 100))
+                server.update_db(collection=collection,
+                                 new_file_id=None,
+                                 file=file,
+                                 representation=representation,
+                                 site=site,
+                                 progress=status_val
+                                 )
+            try:
+                if direction == "upload":
+                    target_file_size = self.conn.stat(target_path).st_size
+                else:
+                    target_file_size = os.path.getsize(target_path)
+            except FileNotFoundError:
+                pass
+            time.sleep(0.5)
diff --git a/openpype/modules/default_modules/sync_server/sync_server.py b/openpype/modules/default_modules/sync_server/sync_server.py
index 638a4a367f..2227ec9366 100644
--- a/openpype/modules/default_modules/sync_server/sync_server.py
+++ b/openpype/modules/default_modules/sync_server/sync_server.py
@@ -221,6 +221,7 @@ def _get_configured_sites_from_setting(module, project_name, project_setting):
 
     return configured_sites
 
+
 class SyncServerThread(threading.Thread):
     """
     Separate thread running synchronization server with asyncio loop.
diff --git a/openpype/modules/default_modules/sync_server/sync_server_module.py b/openpype/modules/default_modules/sync_server/sync_server_module.py
index 7dabd45bae..f2e9237542 100644
--- a/openpype/modules/default_modules/sync_server/sync_server_module.py
+++ b/openpype/modules/default_modules/sync_server/sync_server_module.py
@@ -398,6 +398,18 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
 
         return remote_site
 
+    def get_local_normalized_site(self, site_name):
+        """
+        Return 'site_name' or 'local' if 'site_name' is a local id.
+
+        In some places Settings or Local Settings require 'local' instead
+        of the real site name.
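+
+        Example (illustrative; 'studio' stands for any configured site):
+            get_local_normalized_site(get_local_site_id())  # -> 'local'
+            get_local_normalized_site("studio")             # -> 'studio'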
+ """ + if site_name == get_local_site_id(): + site_name = self.LOCAL_SITE + + return site_name + # Methods for Settings UI to draw appropriate forms @classmethod def get_system_settings_schema(cls): diff --git a/openpype/modules/default_modules/sync_server/utils.py b/openpype/modules/default_modules/sync_server/utils.py index d4fc29ff8a..85e4e03f77 100644 --- a/openpype/modules/default_modules/sync_server/utils.py +++ b/openpype/modules/default_modules/sync_server/utils.py @@ -29,7 +29,6 @@ def time_function(method): kw['log_time'][name] = int((te - ts) * 1000) else: log.debug('%r %2.2f ms' % (method.__name__, (te - ts) * 1000)) - print('%r %2.2f ms' % (method.__name__, (te - ts) * 1000)) return result return timed diff --git a/openpype/modules/default_modules/timers_manager/idle_threads.py b/openpype/modules/default_modules/timers_manager/idle_threads.py new file mode 100644 index 0000000000..9ec27e659b --- /dev/null +++ b/openpype/modules/default_modules/timers_manager/idle_threads.py @@ -0,0 +1,160 @@ +import time +from Qt import QtCore +from pynput import mouse, keyboard + +from openpype.lib import PypeLogger + + +class IdleItem: + """Python object holds information if state of idle changed. + + This item is used to be independent from Qt objects. + """ + def __init__(self): + self.changed = False + + def reset(self): + self.changed = False + + def set_changed(self, changed=True): + self.changed = changed + + +class IdleManager(QtCore.QThread): + """ Measure user's idle time in seconds. + Idle time resets on keyboard/mouse input. + Is able to emit signals at specific time idle. + """ + time_signals = {} + idle_time = 0 + signal_reset_timer = QtCore.Signal() + + def __init__(self): + super(IdleManager, self).__init__() + self.log = PypeLogger.get_logger(self.__class__.__name__) + self.signal_reset_timer.connect(self._reset_time) + + self.idle_item = IdleItem() + + self._is_running = False + self._mouse_thread = None + self._keyboard_thread = None + + def add_time_signal(self, emit_time, signal): + """ If any module want to use IdleManager, need to use add_time_signal + + Args: + emit_time(int): Time when signal will be emitted. + signal(QtCore.Signal): Signal that will be emitted + (without objects). 
+ """ + if emit_time not in self.time_signals: + self.time_signals[emit_time] = [] + self.time_signals[emit_time].append(signal) + + @property + def is_running(self): + return self._is_running + + def _reset_time(self): + self.idle_time = 0 + + def stop(self): + self._is_running = False + + def _on_mouse_destroy(self): + self._mouse_thread = None + + def _on_keyboard_destroy(self): + self._keyboard_thread = None + + def run(self): + self.log.info('IdleManager has started') + self._is_running = True + + thread_mouse = MouseThread(self.idle_item) + thread_keyboard = KeyboardThread(self.idle_item) + + thread_mouse.destroyed.connect(self._on_mouse_destroy) + thread_keyboard.destroyed.connect(self._on_keyboard_destroy) + + self._mouse_thread = thread_mouse + self._keyboard_thread = thread_keyboard + + thread_mouse.start() + thread_keyboard.start() + + # Main loop here is each second checked if idle item changed state + while self._is_running: + if self.idle_item.changed: + self.idle_item.reset() + self.signal_reset_timer.emit() + else: + self.idle_time += 1 + + if self.idle_time in self.time_signals: + for signal in self.time_signals[self.idle_time]: + signal.emit() + time.sleep(1) + + self._post_run() + self.log.info('IdleManager has stopped') + + def _post_run(self): + # Stop threads if still exist + if self._mouse_thread is not None: + self._mouse_thread.signal_stop.emit() + self._mouse_thread.terminate() + self._mouse_thread.wait() + + if self._keyboard_thread is not None: + self._keyboard_thread.signal_stop.emit() + self._keyboard_thread.terminate() + self._keyboard_thread.wait() + + +class MouseThread(QtCore.QThread): + """Listens user's mouse movement.""" + signal_stop = QtCore.Signal() + + def __init__(self, idle_item): + super(MouseThread, self).__init__() + self.signal_stop.connect(self.stop) + self.m_listener = None + self.idle_item = idle_item + + def stop(self): + if self.m_listener is not None: + self.m_listener.stop() + + def on_move(self, *args, **kwargs): + self.idle_item.set_changed() + + def run(self): + self.m_listener = mouse.Listener(on_move=self.on_move) + self.m_listener.start() + + +class KeyboardThread(QtCore.QThread): + """Listens user's keyboard input + """ + signal_stop = QtCore.Signal() + + def __init__(self, idle_item): + super(KeyboardThread, self).__init__() + self.signal_stop.connect(self.stop) + self.k_listener = None + self.idle_item = idle_item + + def stop(self): + if self.k_listener is not None: + listener = self.k_listener + self.k_listener = None + listener.stop() + + def on_press(self, *args, **kwargs): + self.idle_item.set_changed() + + def run(self): + self.k_listener = keyboard.Listener(on_press=self.on_press) + self.k_listener.start() diff --git a/openpype/modules/default_modules/timers_manager/rest_api.py b/openpype/modules/default_modules/timers_manager/rest_api.py index ac8d8b7b74..19b72d688b 100644 --- a/openpype/modules/default_modules/timers_manager/rest_api.py +++ b/openpype/modules/default_modules/timers_manager/rest_api.py @@ -1,3 +1,5 @@ +import json + from aiohttp.web_response import Response from openpype.api import Logger @@ -28,6 +30,11 @@ class TimersManagerModuleRestApi: self.prefix + "/stop_timer", self.stop_timer ) + self.server_manager.add_route( + "GET", + self.prefix + "/get_task_time", + self.get_task_time + ) async def start_timer(self, request): data = await request.json() @@ -48,3 +55,20 @@ class TimersManagerModuleRestApi: async def stop_timer(self, request): self.module.stop_timers() return Response(status=200) + + async 
def get_task_time(self, request): + data = await request.json() + try: + project_name = data['project_name'] + asset_name = data['asset_name'] + task_name = data['task_name'] + except KeyError: + message = ( + "Payload must contain fields 'project_name, 'asset_name'," + " 'task_name'" + ) + log.warning(message) + return Response(text=message, status=404) + + time = self.module.get_task_time(project_name, asset_name, task_name) + return Response(text=json.dumps(time)) diff --git a/openpype/modules/default_modules/timers_manager/timers_manager.py b/openpype/modules/default_modules/timers_manager/timers_manager.py index 47ba0b4059..1199db0611 100644 --- a/openpype/modules/default_modules/timers_manager/timers_manager.py +++ b/openpype/modules/default_modules/timers_manager/timers_manager.py @@ -1,10 +1,9 @@ import os -import collections +import platform from openpype.modules import OpenPypeModule from openpype_interfaces import ( ITimersManager, - ITrayService, - IIdleManager + ITrayService ) from avalon.api import AvalonMongoDB @@ -68,7 +67,7 @@ class ExampleTimersManagerConnector: self._timers_manager_module.timer_stopped(self._module.id) -class TimersManager(OpenPypeModule, ITrayService, IIdleManager): +class TimersManager(OpenPypeModule, ITrayService): """ Handles about Timers. Should be able to start/stop all timers at once. @@ -93,12 +92,16 @@ class TimersManager(OpenPypeModule, ITrayService, IIdleManager): self.enabled = timers_settings["enabled"] - auto_stop = timers_settings["auto_stop"] # When timer will stop if idle manager is running (minutes) full_time = int(timers_settings["full_time"] * 60) # How many minutes before the timer is stopped will popup the message message_time = int(timers_settings["message_time"] * 60) + auto_stop = timers_settings["auto_stop"] + # Turn of auto stop on MacOs because pynput requires root permissions + if platform.system().lower() == "darwin" or full_time <= 0: + auto_stop = False + self.auto_stop = auto_stop self.time_show_message = full_time - message_time self.time_stop_timer = full_time @@ -107,24 +110,46 @@ class TimersManager(OpenPypeModule, ITrayService, IIdleManager): self.last_task = None # Tray attributes - self.signal_handler = None - self.widget_user_idle = None - self.signal_handler = None + self._signal_handler = None + self._widget_user_idle = None + self._idle_manager = None self._connectors_by_module_id = {} self._modules_by_id = {} def tray_init(self): + if not self.auto_stop: + return + + from .idle_threads import IdleManager from .widget_user_idle import WidgetUserIdle, SignalHandler - self.widget_user_idle = WidgetUserIdle(self) - self.signal_handler = SignalHandler(self) + + signal_handler = SignalHandler(self) + idle_manager = IdleManager() + widget_user_idle = WidgetUserIdle(self) + widget_user_idle.set_countdown_start(self.time_show_message) + + idle_manager.signal_reset_timer.connect( + widget_user_idle.reset_countdown + ) + idle_manager.add_time_signal( + self.time_show_message, signal_handler.signal_show_message + ) + idle_manager.add_time_signal( + self.time_stop_timer, signal_handler.signal_stop_timers + ) + + self._signal_handler = signal_handler + self._widget_user_idle = widget_user_idle + self._idle_manager = idle_manager def tray_start(self, *_a, **_kw): - return + if self._idle_manager: + self._idle_manager.start() def tray_exit(self): - """Nothing special for TimersManager.""" - return + if self._idle_manager: + self._idle_manager.stop() def start_timer(self, project_name, asset_name, task_name, hierarchy): """ @@ 
-166,6 +191,16 @@ class TimersManager(OpenPypeModule, ITrayService, IIdleManager): } self.timer_started(None, data) + def get_task_time(self, project_name, asset_name, task_name): + times = {} + for module_id, connector in self._connectors_by_module_id.items(): + if hasattr(connector, "get_task_time"): + module = self._modules_by_id[module_id] + times[module.name] = connector.get_task_time( + project_name, asset_name, task_name + ) + return times + def timer_started(self, source_id, data): for module_id, connector in self._connectors_by_module_id.items(): if module_id == source_id: @@ -205,8 +240,8 @@ class TimersManager(OpenPypeModule, ITrayService, IIdleManager): if self.is_running is False: return - self.widget_user_idle.bool_not_stopped = False - self.widget_user_idle.refresh_context() + if self._widget_user_idle is not None: + self._widget_user_idle.set_timer_stopped() self.is_running = False self.timer_stopped(None) @@ -244,70 +279,12 @@ class TimersManager(OpenPypeModule, ITrayService, IIdleManager): " for connector of module \"{}\"." ).format(module.name)) - def callbacks_by_idle_time(self): - """Implementation of IIdleManager interface.""" - # Time when message is shown - if not self.auto_stop: - return {} - - callbacks = collections.defaultdict(list) - callbacks[self.time_show_message].append(lambda: self.time_callback(0)) - - # Times when idle is between show widget and stop timers - show_to_stop_range = range( - self.time_show_message - 1, self.time_stop_timer - ) - for num in show_to_stop_range: - callbacks[num].append(lambda: self.time_callback(1)) - - # Times when widget is already shown and user restart idle - shown_and_moved_range = range( - self.time_stop_timer - self.time_show_message - ) - for num in shown_and_moved_range: - callbacks[num].append(lambda: self.time_callback(1)) - - # Time when timers are stopped - callbacks[self.time_stop_timer].append(lambda: self.time_callback(2)) - - return callbacks - - def time_callback(self, int_def): - if not self.signal_handler: - return - - if int_def == 0: - self.signal_handler.signal_show_message.emit() - elif int_def == 1: - self.signal_handler.signal_change_label.emit() - elif int_def == 2: - self.signal_handler.signal_stop_timers.emit() - - def change_label(self): - if self.is_running is False: - return - - if ( - not self.idle_manager - or self.widget_user_idle.bool_is_showed is False - ): - return - - if self.idle_manager.idle_time > self.time_show_message: - value = self.time_stop_timer - self.idle_manager.idle_time - else: - value = 1 + ( - self.time_stop_timer - - self.time_show_message - - self.idle_manager.idle_time - ) - self.widget_user_idle.change_count_widget(value) - def show_message(self): if self.is_running is False: return - if self.widget_user_idle.bool_is_showed is False: - self.widget_user_idle.show() + if not self._widget_user_idle.is_showed(): + self._widget_user_idle.reset_countdown() + self._widget_user_idle.show() # Webserver module implementation def webserver_initialization(self, server_manager): diff --git a/openpype/modules/default_modules/timers_manager/widget_user_idle.py b/openpype/modules/default_modules/timers_manager/widget_user_idle.py index cefa6bb4fb..1ecea74440 100644 --- a/openpype/modules/default_modules/timers_manager/widget_user_idle.py +++ b/openpype/modules/default_modules/timers_manager/widget_user_idle.py @@ -3,168 +3,193 @@ from openpype import resources, style class WidgetUserIdle(QtWidgets.QWidget): - SIZE_W = 300 SIZE_H = 160 def __init__(self, module): - 
super(WidgetUserIdle, self).__init__() - self.bool_is_showed = False - self.bool_not_stopped = True - - self.module = module + self.setWindowTitle("OpenPype - Stop timers") icon = QtGui.QIcon(resources.get_openpype_icon_filepath()) self.setWindowIcon(icon) + self.setWindowFlags( QtCore.Qt.WindowCloseButtonHint | QtCore.Qt.WindowMinimizeButtonHint ) - self._translate = QtCore.QCoreApplication.translate + self._is_showed = False + self._timer_stopped = False + self._countdown = 0 + self._countdown_start = 0 - self.font = QtGui.QFont() - self.font.setFamily("DejaVu Sans Condensed") - self.font.setPointSize(9) - self.font.setBold(True) - self.font.setWeight(50) - self.font.setKerning(True) + self.module = module + + msg_info = "You didn't work for a long time." + msg_question = "Would you like to stop Timers?" + msg_stopped = ( + "Your Timers were stopped. Do you want to start them again?" + ) + + lbl_info = QtWidgets.QLabel(msg_info, self) + lbl_info.setTextFormat(QtCore.Qt.RichText) + lbl_info.setWordWrap(True) + + lbl_question = QtWidgets.QLabel(msg_question, self) + lbl_question.setTextFormat(QtCore.Qt.RichText) + lbl_question.setWordWrap(True) + + lbl_stopped = QtWidgets.QLabel(msg_stopped, self) + lbl_stopped.setTextFormat(QtCore.Qt.RichText) + lbl_stopped.setWordWrap(True) + + lbl_rest_time = QtWidgets.QLabel(self) + lbl_rest_time.setTextFormat(QtCore.Qt.RichText) + lbl_rest_time.setWordWrap(True) + lbl_rest_time.setAlignment(QtCore.Qt.AlignCenter) + + form = QtWidgets.QFormLayout() + form.setContentsMargins(10, 15, 10, 5) + + form.addRow(lbl_info) + form.addRow(lbl_question) + form.addRow(lbl_stopped) + form.addRow(lbl_rest_time) + + btn_stop = QtWidgets.QPushButton("Stop timer", self) + btn_stop.setToolTip("Stop's All timers") + + btn_continue = QtWidgets.QPushButton("Continue", self) + btn_continue.setToolTip("Timer won't stop") + + btn_close = QtWidgets.QPushButton("Close", self) + btn_close.setToolTip("Close window") + + btn_restart = QtWidgets.QPushButton("Start timers", self) + btn_restart.setToolTip("Timer will be started again") + + group_layout = QtWidgets.QHBoxLayout() + group_layout.addStretch(1) + group_layout.addWidget(btn_continue) + group_layout.addWidget(btn_stop) + group_layout.addWidget(btn_restart) + group_layout.addWidget(btn_close) + + layout = QtWidgets.QVBoxLayout(self) + layout.addLayout(form) + layout.addLayout(group_layout) + + count_timer = QtCore.QTimer() + count_timer.setInterval(1000) + + btn_stop.clicked.connect(self._on_stop_clicked) + btn_continue.clicked.connect(self._on_continue_clicked) + btn_close.clicked.connect(self._close_widget) + btn_restart.clicked.connect(self._on_restart_clicked) + count_timer.timeout.connect(self._on_count_timeout) + + self.lbl_info = lbl_info + self.lbl_question = lbl_question + self.lbl_stopped = lbl_stopped + self.lbl_rest_time = lbl_rest_time + + self.btn_stop = btn_stop + self.btn_continue = btn_continue + self.btn_close = btn_close + self.btn_restart = btn_restart + + self._count_timer = count_timer self.resize(self.SIZE_W, self.SIZE_H) self.setMinimumSize(QtCore.QSize(self.SIZE_W, self.SIZE_H)) self.setMaximumSize(QtCore.QSize(self.SIZE_W+100, self.SIZE_H+100)) self.setStyleSheet(style.load_stylesheet()) - self.setLayout(self._main()) - self.refresh_context() - self.setWindowTitle('Pype - Stop timers') + def set_countdown_start(self, countdown): + self._countdown_start = countdown + if not self.is_showed(): + self.reset_countdown() - def _main(self): - self.main = QtWidgets.QVBoxLayout() - 
self.main.setObjectName('main') + def reset_countdown(self): + self._countdown = self._countdown_start + self._update_countdown_label() - self.form = QtWidgets.QFormLayout() - self.form.setContentsMargins(10, 15, 10, 5) - self.form.setObjectName('form') + def is_showed(self): + return self._is_showed - msg_info = 'You didn\'t work for a long time.' - msg_question = 'Would you like to stop Timers?' - msg_stopped = ( - 'Your Timers were stopped. Do you want to start them again?' - ) + def set_timer_stopped(self): + self._timer_stopped = True + self._refresh_context() - self.lbl_info = QtWidgets.QLabel(msg_info) - self.lbl_info.setFont(self.font) - self.lbl_info.setTextFormat(QtCore.Qt.RichText) - self.lbl_info.setObjectName("lbl_info") - self.lbl_info.setWordWrap(True) + def _update_countdown_label(self): + self.lbl_rest_time.setText(str(self._countdown)) - self.lbl_question = QtWidgets.QLabel(msg_question) - self.lbl_question.setFont(self.font) - self.lbl_question.setTextFormat(QtCore.Qt.RichText) - self.lbl_question.setObjectName("lbl_question") - self.lbl_question.setWordWrap(True) + def _on_count_timeout(self): + if self._timer_stopped or not self._is_showed: + self._count_timer.stop() + return - self.lbl_stopped = QtWidgets.QLabel(msg_stopped) - self.lbl_stopped.setFont(self.font) - self.lbl_stopped.setTextFormat(QtCore.Qt.RichText) - self.lbl_stopped.setObjectName("lbl_stopped") - self.lbl_stopped.setWordWrap(True) - - self.lbl_rest_time = QtWidgets.QLabel("") - self.lbl_rest_time.setFont(self.font) - self.lbl_rest_time.setTextFormat(QtCore.Qt.RichText) - self.lbl_rest_time.setObjectName("lbl_rest_time") - self.lbl_rest_time.setWordWrap(True) - self.lbl_rest_time.setAlignment(QtCore.Qt.AlignCenter) - - self.form.addRow(self.lbl_info) - self.form.addRow(self.lbl_question) - self.form.addRow(self.lbl_stopped) - self.form.addRow(self.lbl_rest_time) - - self.group_btn = QtWidgets.QHBoxLayout() - self.group_btn.addStretch(1) - self.group_btn.setObjectName("group_btn") - - self.btn_stop = QtWidgets.QPushButton("Stop timer") - self.btn_stop.setToolTip('Stop\'s All timers') - self.btn_stop.clicked.connect(self.stop_timer) - - self.btn_continue = QtWidgets.QPushButton("Continue") - self.btn_continue.setToolTip('Timer won\'t stop') - self.btn_continue.clicked.connect(self.continue_timer) - - self.btn_close = QtWidgets.QPushButton("Close") - self.btn_close.setToolTip('Close window') - self.btn_close.clicked.connect(self.close_widget) - - self.btn_restart = QtWidgets.QPushButton("Start timers") - self.btn_restart.setToolTip('Timer will be started again') - self.btn_restart.clicked.connect(self.restart_timer) - - self.group_btn.addWidget(self.btn_continue) - self.group_btn.addWidget(self.btn_stop) - self.group_btn.addWidget(self.btn_restart) - self.group_btn.addWidget(self.btn_close) - - self.main.addLayout(self.form) - self.main.addLayout(self.group_btn) - - return self.main - - def refresh_context(self): - self.lbl_question.setVisible(self.bool_not_stopped) - self.lbl_rest_time.setVisible(self.bool_not_stopped) - self.lbl_stopped.setVisible(not self.bool_not_stopped) - - self.btn_continue.setVisible(self.bool_not_stopped) - self.btn_stop.setVisible(self.bool_not_stopped) - self.btn_restart.setVisible(not self.bool_not_stopped) - self.btn_close.setVisible(not self.bool_not_stopped) - - def change_count_widget(self, time): - str_time = str(time) - self.lbl_rest_time.setText(str_time) - - def stop_timer(self): - self.module.stop_timers() - self.close_widget() - - def restart_timer(self): - 
self.module.restart_timers() - self.close_widget() - - def continue_timer(self): - self.close_widget() - - def closeEvent(self, event): - event.ignore() - if self.bool_not_stopped is True: - self.continue_timer() + if self._countdown <= 0: + self._stop_timers() + self.set_timer_stopped() else: - self.close_widget() + self._countdown -= 1 + self._update_countdown_label() - def close_widget(self): - self.bool_is_showed = False - self.bool_not_stopped = True - self.refresh_context() + def _refresh_context(self): + self.lbl_question.setVisible(not self._timer_stopped) + self.lbl_rest_time.setVisible(not self._timer_stopped) + self.lbl_stopped.setVisible(self._timer_stopped) + + self.btn_continue.setVisible(not self._timer_stopped) + self.btn_stop.setVisible(not self._timer_stopped) + self.btn_restart.setVisible(self._timer_stopped) + self.btn_close.setVisible(self._timer_stopped) + + def _stop_timers(self): + self.module.stop_timers() + + def _on_stop_clicked(self): + self._stop_timers() + self._close_widget() + + def _on_restart_clicked(self): + self.module.restart_timers() + self._close_widget() + + def _on_continue_clicked(self): + self._close_widget() + + def _close_widget(self): + self._is_showed = False + self._timer_stopped = False + self._refresh_context() self.hide() def showEvent(self, event): - self.bool_is_showed = True + if not self._is_showed: + self._is_showed = True + self._refresh_context() + + if not self._count_timer.isActive(): + self._count_timer.start() + super(WidgetUserIdle, self).showEvent(event) + + def closeEvent(self, event): + event.ignore() + if self._timer_stopped: + self._close_widget() + else: + self._on_continue_clicked() class SignalHandler(QtCore.QObject): signal_show_message = QtCore.Signal() - signal_change_label = QtCore.Signal() signal_stop_timers = QtCore.Signal() def __init__(self, module): super(SignalHandler, self).__init__() self.module = module self.signal_show_message.connect(module.show_message) - self.signal_change_label.connect(module.change_label) self.signal_stop_timers.connect(module.stop_timers) diff --git a/openpype/plugins/load/open_djv.py b/openpype/plugins/load/open_djv.py index 39b54364d9..5b49bb58d0 100644 --- a/openpype/plugins/load/open_djv.py +++ b/openpype/plugins/load/open_djv.py @@ -1,26 +1,28 @@ import os -import subprocess from avalon import api +from openpype.api import ApplicationManager def existing_djv_path(): - djv_paths = os.environ.get("DJV_PATH") or "" - for path in djv_paths.split(os.pathsep): - if os.path.exists(path): - return path - return None + app_manager = ApplicationManager() + djv_list = [] + for app_name, app in app_manager.applications.items(): + if 'djv' in app_name and app.find_executable(): + djv_list.append(app_name) + + return djv_list class OpenInDJV(api.Loader): """Open Image Sequence with system default""" - djv_path = existing_djv_path() - families = ["*"] if djv_path else [] + djv_list = existing_djv_path() + families = ["*"] if djv_list else [] representations = [ "cin", "dpx", "avi", "dv", "gif", "flv", "mkv", "mov", "mpg", "mpeg", "mp4", "m4v", "mxf", "iff", "z", "ifl", "jpeg", "jpg", "jfif", "lut", "1dl", "exr", "pic", "png", "ppm", "pnm", "pgm", "pbm", "rla", "rpf", - "sgi", "rgba", "rgb", "bw", "tga", "tiff", "tif", "img" + "sgi", "rgba", "rgb", "bw", "tga", "tiff", "tif", "img", "h264", ] label = "Open in DJV" @@ -41,20 +43,18 @@ class OpenInDJV(api.Loader): ) if not remainder: - seqeunce = collections[0] - first_image = list(seqeunce)[0] + sequence = collections[0] + first_image = 
list(sequence)[0] else: first_image = self.fname filepath = os.path.normpath(os.path.join(directory, first_image)) self.log.info("Opening : {}".format(filepath)) - cmd = [ - # DJV path - os.path.normpath(self.djv_path), - # PATH TO COMPONENT - os.path.normpath(filepath) - ] + last_djv_version = sorted(self.djv_list)[-1] - # Run DJV with these commands - subprocess.Popen(cmd) + app_manager = ApplicationManager() + djv = app_manager.applications.get(last_djv_version) + djv.arguments.append(filepath) + + app_manager.launch(last_djv_version) diff --git a/openpype/plugins/publish/collect_avalon_entities.py b/openpype/plugins/publish/collect_avalon_entities.py index 0b6423818e..a6120d42fe 100644 --- a/openpype/plugins/publish/collect_avalon_entities.py +++ b/openpype/plugins/publish/collect_avalon_entities.py @@ -22,6 +22,7 @@ class CollectAvalonEntities(pyblish.api.ContextPlugin): io.install() project_name = api.Session["AVALON_PROJECT"] asset_name = api.Session["AVALON_ASSET"] + task_name = api.Session["AVALON_TASK"] project_entity = io.find_one({ "type": "project", @@ -48,6 +49,12 @@ class CollectAvalonEntities(pyblish.api.ContextPlugin): data = asset_entity['data'] + # Task type + asset_tasks = data.get("tasks") or {} + task_info = asset_tasks.get(task_name) or {} + task_type = task_info.get("type") + context.data["taskType"] = task_type + frame_start = data.get("frameStart") if frame_start is None: frame_start = 1 diff --git a/openpype/plugins/publish/collect_resources_path.py b/openpype/plugins/publish/collect_resources_path.py index 98b59332da..fa181301ee 100644 --- a/openpype/plugins/publish/collect_resources_path.py +++ b/openpype/plugins/publish/collect_resources_path.py @@ -26,6 +26,7 @@ class CollectResourcesPath(pyblish.api.InstancePlugin): "animation", "model", "mayaAscii", + "mayaScene", "setdress", "layout", "ass", @@ -67,6 +68,12 @@ class CollectResourcesPath(pyblish.api.InstancePlugin): "representation": "TEMP" }) + # For the first time publish + if instance.data.get("hierarchy"): + template_data.update({ + "hierarchy": instance.data["hierarchy"] + }) + anatomy_filled = anatomy.format(template_data) if "folder" in anatomy.templates["publish"]: diff --git a/openpype/plugins/publish/extract_burnin.py b/openpype/plugins/publish/extract_burnin.py index 625125321c..207e696fb1 100644 --- a/openpype/plugins/publish/extract_burnin.py +++ b/openpype/plugins/publish/extract_burnin.py @@ -1,6 +1,5 @@ import os import re -import subprocess import json import copy import tempfile @@ -158,6 +157,11 @@ class ExtractBurnin(openpype.api.Extractor): filled_anatomy = anatomy.format_all(burnin_data) burnin_data["anatomy"] = filled_anatomy.get_solved() + # Add context data burnin_data. 
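+        # The 'custom' key exposes values collected into
+        # 'custom_burnin_data' on the instance (assumed to be set by
+        # host-specific plugins); it falls back to an empty dict.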
+ burnin_data["custom"] = ( + instance.data.get("custom_burnin_data") or {} + ) + # Add source camera name to burnin data camera_name = repre.get("camera_name") if camera_name: @@ -226,7 +230,8 @@ class ExtractBurnin(openpype.api.Extractor): "options": copy.deepcopy(burnin_options), "values": burnin_values, "full_input_path": temp_data["full_input_paths"][0], - "first_frame": temp_data["first_frame"] + "first_frame": temp_data["first_frame"], + "ffmpeg_cmd": new_repre.get("ffmpeg_cmd", "") } self.log.debug( diff --git a/openpype/plugins/publish/extract_review.py b/openpype/plugins/publish/extract_review.py index f5d6789dd4..7284483f5f 100644 --- a/openpype/plugins/publish/extract_review.py +++ b/openpype/plugins/publish/extract_review.py @@ -30,8 +30,8 @@ class ExtractReview(pyblish.api.InstancePlugin): otherwise the representation is ignored. All new representations are created and encoded by ffmpeg following - presets found in `pype-config/presets/plugins/global/ - publish.json:ExtractReview:outputs`. + presets found in OpenPype Settings interface at + `project_settings/global/publish/ExtractReview/profiles:outputs`. """ label = "Extract Review" @@ -241,7 +241,8 @@ class ExtractReview(pyblish.api.InstancePlugin): "outputName": output_name, "outputDef": output_def, "frameStartFtrack": temp_data["output_frame_start"], - "frameEndFtrack": temp_data["output_frame_end"] + "frameEndFtrack": temp_data["output_frame_end"], + "ffmpeg_cmd": subprcs_cmd }) # Force to pop these key if are in new repre diff --git a/openpype/plugins/publish/integrate_new.py b/openpype/plugins/publish/integrate_new.py index e9f9e87d52..753ed78083 100644 --- a/openpype/plugins/publish/integrate_new.py +++ b/openpype/plugins/publish/integrate_new.py @@ -63,6 +63,7 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin): "animation", "model", "mayaAscii", + "mayaScene", "setdress", "layout", "ass", diff --git a/openpype/plugins/publish/start_timer.py b/openpype/plugins/publish/start_timer.py index 6312294bf1..112d92bef0 100644 --- a/openpype/plugins/publish/start_timer.py +++ b/openpype/plugins/publish/start_timer.py @@ -1,6 +1,5 @@ import pyblish.api -from openpype.api import get_system_settings from openpype.lib import change_timer_to_current_context @@ -10,6 +9,6 @@ class StartTimer(pyblish.api.ContextPlugin): hosts = ["*"] def process(self, context): - modules_settings = get_system_settings()["modules"] + modules_settings = context.data["system_settings"]["modules"] if modules_settings["timers_manager"]["disregard_publishing"]: change_timer_to_current_context() diff --git a/openpype/plugins/publish/stop_timer.py b/openpype/plugins/publish/stop_timer.py index 5c939b5f1b..414e43a3c4 100644 --- a/openpype/plugins/publish/stop_timer.py +++ b/openpype/plugins/publish/stop_timer.py @@ -3,8 +3,6 @@ import requests import pyblish.api -from openpype.api import get_system_settings - class StopTimer(pyblish.api.ContextPlugin): label = "Stop Timer" @@ -12,7 +10,7 @@ class StopTimer(pyblish.api.ContextPlugin): hosts = ["*"] def process(self, context): - modules_settings = get_system_settings()["modules"] + modules_settings = context.data["system_settings"]["modules"] if modules_settings["timers_manager"]["disregard_publishing"]: webserver_url = os.environ.get("OPENPYPE_WEBSERVER_URL") rest_api_url = "{}/timers_manager/stop_timer".format(webserver_url) diff --git a/openpype/plugins/publish/validate_containers.py b/openpype/plugins/publish/validate_containers.py index 52df493451..784221c3b6 100644 --- 
a/openpype/plugins/publish/validate_containers.py +++ b/openpype/plugins/publish/validate_containers.py @@ -1,7 +1,5 @@ import pyblish.api - import openpype.lib -from avalon.tools import cbsceneinventory class ShowInventory(pyblish.api.Action): @@ -11,7 +9,9 @@ class ShowInventory(pyblish.api.Action): on = "failed" def process(self, context, plugin): - cbsceneinventory.show() + from avalon.tools import sceneinventory + + sceneinventory.show() class ValidateContainers(pyblish.api.ContextPlugin): diff --git a/openpype/plugins/publish/validate_intent.py b/openpype/plugins/publish/validate_intent.py index 80bcb0e164..23d57bb2b7 100644 --- a/openpype/plugins/publish/validate_intent.py +++ b/openpype/plugins/publish/validate_intent.py @@ -1,5 +1,7 @@ -import pyblish.api import os +import pyblish.api + +from openpype.lib import filter_profiles class ValidateIntent(pyblish.api.ContextPlugin): @@ -12,20 +14,49 @@ class ValidateIntent(pyblish.api.ContextPlugin): order = pyblish.api.ValidatorOrder label = "Validate Intent" - # TODO: this should be off by default and only activated viac config - tasks = ["animation"] - hosts = ["harmony"] - if os.environ.get("AVALON_TASK") not in tasks: - active = False + enabled = False + + # Can be modified by settings + profiles = [{ + "hosts": [], + "task_types": [], + "tasks": [], + "validate": False + }] def process(self, context): + # Skip if there are no profiles + validate = True + if self.profiles: + # Collect data from context + task_name = context.data.get("task") + task_type = context.data.get("taskType") + host_name = context.data.get("hostName") + + filter_data = { + "hosts": host_name, + "task_types": task_type, + "tasks": task_name + } + matching_profile = filter_profiles( + self.profiles, filter_data, logger=self.log + ) + if matching_profile: + validate = matching_profile["validate"] + + if not validate: + self.log.debug(( + "Validation of intent was skipped." + " Matching profile for current context disabled validation." + )) + return + msg = ( "Please make sure that you select the intent of this publish." 
) - intent = context.data.get("intent") - self.log.debug(intent) - assert intent, msg - + intent = context.data.get("intent") or {} + self.log.debug(str(intent)) intent_value = intent.get("value") - assert intent is not "", msg + if not intent_value: + raise AssertionError(msg) diff --git a/openpype/plugins/publish/validate_version.py b/openpype/plugins/publish/validate_version.py index 6701041541..927e024476 100644 --- a/openpype/plugins/publish/validate_version.py +++ b/openpype/plugins/publish/validate_version.py @@ -12,6 +12,9 @@ class ValidateVersion(pyblish.api.InstancePlugin): label = "Validate Version" hosts = ["nuke", "maya", "blender", "standalonepublisher"] + optional = False + active = True + def process(self, instance): version = instance.data.get("version") latest_version = instance.data.get("latestVersion") diff --git a/openpype/pype_commands.py b/openpype/pype_commands.py index c18fe36667..5288749e8b 100644 --- a/openpype/pype_commands.py +++ b/openpype/pype_commands.py @@ -257,3 +257,30 @@ class PypeCommands: def validate_jsons(self): pass + def run_tests(self, folder, mark, pyargs): + """ + Runs tests from 'folder' + + Args: + folder (str): relative path to folder with tests + mark (str): label to run tests marked by it (slow etc) + pyargs (str): package path to test + """ + print("run_tests") + import subprocess + + if folder: + folder = " ".join(list(folder)) + else: + folder = "../tests" + + mark_str = pyargs_str = '' + if mark: + mark_str = "-m {}".format(mark) + + if pyargs: + pyargs_str = "--pyargs {}".format(pyargs) + + cmd = "pytest {} {} {}".format(folder, mark_str, pyargs_str) + print("Running {}".format(cmd)) + subprocess.run(cmd) diff --git a/openpype/scripts/otio_burnin.py b/openpype/scripts/otio_burnin.py index dc8d60cb37..184d7689e3 100644 --- a/openpype/scripts/otio_burnin.py +++ b/openpype/scripts/otio_burnin.py @@ -69,7 +69,7 @@ def get_fps(str_value): return str(fps) -def _prores_codec_args(ffprobe_data): +def _prores_codec_args(ffprobe_data, source_ffmpeg_cmd): output = [] tags = ffprobe_data.get("tags") or {} @@ -108,14 +108,24 @@ def _prores_codec_args(ffprobe_data): return output -def _h264_codec_args(ffprobe_data): +def _h264_codec_args(ffprobe_data, source_ffmpeg_cmd): output = [] output.extend(["-codec:v", "h264"]) - bit_rate = ffprobe_data.get("bit_rate") - if bit_rate: - output.extend(["-b:v", bit_rate]) + # Use arguments from source if are available source arguments + if source_ffmpeg_cmd: + copy_args = ( + "-crf", + "-b:v", "-vb", + "-minrate", "-minrate:", + "-maxrate", "-maxrate:", + "-bufsize", "-bufsize:" + ) + args = source_ffmpeg_cmd.split(" ") + for idx, arg in enumerate(args): + if arg in copy_args: + output.extend([arg, args[idx + 1]]) pix_fmt = ffprobe_data.get("pix_fmt") if pix_fmt: @@ -127,15 +137,15 @@ def _h264_codec_args(ffprobe_data): return output -def get_codec_args(ffprobe_data): +def get_codec_args(ffprobe_data, source_ffmpeg_cmd): codec_name = ffprobe_data.get("codec_name") # Codec "prores" if codec_name == "prores": - return _prores_codec_args(ffprobe_data) + return _prores_codec_args(ffprobe_data, source_ffmpeg_cmd) # Codec "h264" if codec_name == "h264": - return _h264_codec_args(ffprobe_data) + return _h264_codec_args(ffprobe_data, source_ffmpeg_cmd) output = [] if codec_name: @@ -469,7 +479,7 @@ def example(input_path, output_path): def burnins_from_data( input_path, output_path, data, codec_data=None, options=None, burnin_values=None, overwrite=True, - full_input_path=None, first_frame=None + full_input_path=None, 
first_frame=None, source_ffmpeg_cmd=None ): """This method adds burnins to video/image file based on presets setting. @@ -647,7 +657,7 @@ def burnins_from_data( else: ffprobe_data = burnin._streams[0] - ffmpeg_args.extend(get_codec_args(ffprobe_data)) + ffmpeg_args.extend(get_codec_args(ffprobe_data, source_ffmpeg_cmd)) # Use group one (same as `-intra` argument, which is deprecated) ffmpeg_args_str = " ".join(ffmpeg_args) @@ -670,6 +680,7 @@ if __name__ == "__main__": options=in_data.get("options"), burnin_values=in_data.get("values"), full_input_path=in_data.get("full_input_path"), - first_frame=in_data.get("first_frame") + first_frame=in_data.get("first_frame"), + source_ffmpeg_cmd=in_data.get("ffmpeg_cmd") ) print("* Burnin script has finished") diff --git a/openpype/settings/__init__.py b/openpype/settings/__init__.py index 74f2684b2a..9d7598a948 100644 --- a/openpype/settings/__init__.py +++ b/openpype/settings/__init__.py @@ -25,7 +25,8 @@ from .lib import ( ) from .entities import ( SystemSettings, - ProjectSettings + ProjectSettings, + DefaultsNotDefined ) @@ -51,6 +52,8 @@ __all__ = ( "get_anatomy_settings", "get_environments", "get_local_settings", + "SystemSettings", - "ProjectSettings" + "ProjectSettings", + "DefaultsNotDefined" ) diff --git a/openpype/settings/defaults/project_anatomy/imageio.json b/openpype/settings/defaults/project_anatomy/imageio.json index fcebc876f5..38313a3d84 100644 --- a/openpype/settings/defaults/project_anatomy/imageio.json +++ b/openpype/settings/defaults/project_anatomy/imageio.json @@ -124,9 +124,47 @@ "value": "True" } ] + }, + { + "plugins": [ + "CreateWriteStill" + ], + "nukeNodeClass": "Write", + "knobs": [ + { + "name": "file_type", + "value": "tiff" + }, + { + "name": "datatype", + "value": "16 bit" + }, + { + "name": "compression", + "value": "Deflate" + }, + { + "name": "tile_color", + "value": "0x23ff00ff" + }, + { + "name": "channels", + "value": "rgb" + }, + { + "name": "colorspace", + "value": "sRGB" + }, + { + "name": "create_directories", + "value": "True" + } + ] } ], - "customNodes": [] + "customNodes": [ + + ] }, "regexInputs": { "inputs": [ diff --git a/openpype/settings/defaults/project_settings/global.json b/openpype/settings/defaults/project_settings/global.json index 8cc8d28e5f..45c1a59d17 100644 --- a/openpype/settings/defaults/project_settings/global.json +++ b/openpype/settings/defaults/project_settings/global.json @@ -6,7 +6,12 @@ }, "ValidateVersion": { "enabled": true, - "optional": false + "optional": false, + "active": true + }, + "ValidateIntent": { + "enabled": false, + "profiles": [] }, "IntegrateHeroVersion": { "enabled": true, @@ -19,7 +24,7 @@ "animation", "setdress", "layout", - "mayaAscii" + "mayaScene" ] }, "ExtractJpegEXR": { diff --git a/openpype/settings/defaults/project_settings/maya.json b/openpype/settings/defaults/project_settings/maya.json index 3540c3eb29..c592d74350 100644 --- a/openpype/settings/defaults/project_settings/maya.json +++ b/openpype/settings/defaults/project_settings/maya.json @@ -156,6 +156,11 @@ "CollectMayaRender": { "sync_workfile_version": false }, + "ValidateInstanceInContext": { + "enabled": true, + "optional": true, + "active": true + }, "ValidateContainers": { "enabled": true, "optional": true, @@ -169,6 +174,11 @@ "enabled": false, "attributes": {} }, + "ValidateLoadedPlugin": { + "enabled": false, + "whitelist_native_plugins": false, + "authorized_plugins": [] + }, "ValidateRenderSettings": { "arnold_render_attributes": [], "vray_render_attributes": [], @@ -479,6 +489,12 @@ 
255, 255 ], + "mayaScene": [ + 67, + 174, + 255, + 255 + ], "setdress": [ 255, 250, diff --git a/openpype/settings/defaults/project_settings/nuke.json b/openpype/settings/defaults/project_settings/nuke.json index ac35349415..dd65df02e5 100644 --- a/openpype/settings/defaults/project_settings/nuke.json +++ b/openpype/settings/defaults/project_settings/nuke.json @@ -38,6 +38,11 @@ "render" ] }, + "ValidateInstanceInContext": { + "enabled": true, + "optional": true, + "active": true + }, "ValidateContainers": { "enabled": true, "optional": true, @@ -127,7 +132,8 @@ "jpg", "jpeg", "png", - "psd" + "psd", + "tiff" ], "node_name_template": "{class_name}_{ext}" }, diff --git a/openpype/settings/defaults/project_settings/tvpaint.json b/openpype/settings/defaults/project_settings/tvpaint.json index 47f486aa98..528bf6de8e 100644 --- a/openpype/settings/defaults/project_settings/tvpaint.json +++ b/openpype/settings/defaults/project_settings/tvpaint.json @@ -1,4 +1,5 @@ { + "stop_timer_on_application_exit": false, "publish": { "ExtractSequence": { "review_bg": [ diff --git a/openpype/settings/defaults/system_settings/general.json b/openpype/settings/defaults/system_settings/general.json index d03fedf3c9..f54e8b2b16 100644 --- a/openpype/settings/defaults/system_settings/general.json +++ b/openpype/settings/defaults/system_settings/general.json @@ -7,6 +7,11 @@ "global": [] } }, + "disk_mapping": { + "windows": [], + "linux": [], + "darwin": [] + }, "openpype_path": { "windows": [], "darwin": [], diff --git a/openpype/settings/defaults/system_settings/modules.json b/openpype/settings/defaults/system_settings/modules.json index 229b867327..beb1eb4f24 100644 --- a/openpype/settings/defaults/system_settings/modules.json +++ b/openpype/settings/defaults/system_settings/modules.json @@ -179,4 +179,4 @@ "slack": { "enabled": false } -} +} \ No newline at end of file diff --git a/openpype/settings/entities/schemas/projects_schema/schema_project_syncserver.json b/openpype/settings/entities/schemas/projects_schema/schema_project_syncserver.json index cb2cc9c9d1..3211babd43 100644 --- a/openpype/settings/entities/schemas/projects_schema/schema_project_syncserver.json +++ b/openpype/settings/entities/schemas/projects_schema/schema_project_syncserver.json @@ -5,64 +5,134 @@ "collapsible": true, "checkbox_key": "enabled", "children": [ - { - "type": "boolean", - "key": "enabled", - "label": "Enabled" - }, - { - "type": "dict", - "key": "config", - "label": "Config", - "collapsible": true, - "children": [ - { - "type": "text", - "key": "retry_cnt", - "label": "Retry Count" - }, - { - "type": "text", - "key": "loop_delay", - "label": "Loop Delay" - }, - { - "type": "text", - "key": "active_site", - "label": "Active Site" - }, - { - "type": "text", - "key": "remote_site", - "label": "Remote Site" - } - ] - }, { - "type": "dict-modifiable", - "collapsible": true, - "key": "sites", - "label": "Sites", - "collapsible_key": false, - "object_type": + { + "type": "boolean", + "key": "enabled", + "label": "Enabled" + }, { "type": "dict", + "key": "config", + "label": "Config", + "collapsible": true, "children": [ - { - "type": "path", - "key": "credentials_url", - "label": "Credentials url", - "multiplatform": true - }, - { - "type": "dict-modifiable", - "key": "root", - "label": "Roots", - "collapsable": false, - "collapsable_key": false, - "object_type": "text" - } + { + "type": "text", + "key": "retry_cnt", + "label": "Retry Count" + }, + { + "type": "text", + "key": "loop_delay", + "label": "Loop Delay" + }, + { + 
"type": "text", + "key": "active_site", + "label": "Active Site" + }, + { + "type": "text", + "key": "remote_site", + "label": "Remote Site" + } ] + }, + { + "type": "dict-modifiable", + "collapsible": true, + "key": "sites", + "label": "Sites", + "collapsible_key": false, + "object_type": { + "type": "dict", + "children": [ + { + "type": "dict", + "key": "gdrive", + "label": "Google Drive", + "collapsible": true, + "children": [ + { + "type": "path", + "key": "credentials_url", + "label": "Credentials url", + "multiplatform": true + } + ] + }, + { + "type": "dict", + "key": "dropbox", + "label": "Dropbox", + "collapsible": true, + "children": [ + { + "type": "text", + "key": "token", + "label": "Access Token" + }, + { + "type": "text", + "key": "team_folder_name", + "label": "Team Folder Name" + }, + { + "type": "text", + "key": "acting_as_member", + "label": "Acting As Member" + } + ] + }, + { + "type": "dict", + "key": "sftp", + "label": "SFTP", + "collapsible": true, + "children": [ + { + "type": "text", + "key": "sftp_host", + "label": "SFTP host" + }, + { + "type": "number", + "key": "sftp_port", + "label": "SFTP port" + }, + { + "type": "text", + "key": "sftp_user", + "label": "SFTP user" + }, + { + "type": "text", + "key": "sftp_pass", + "label": "SFTP pass" + }, + { + "type": "path", + "key": "sftp_key", + "label": "SFTP user ssh key", + "multiplatform": true + }, + { + "type": "text", + "key": "sftp_key_pass", + "label": "SFTP user ssh key password" + } + ] + }, + { + "type": "dict-modifiable", + "key": "root", + "label": "Roots", + "collapsable": false, + "collapsable_key": false, + "object_type": "text" + } + ] + } } - } ] } diff --git a/openpype/settings/entities/schemas/projects_schema/schema_project_tvpaint.json b/openpype/settings/entities/schemas/projects_schema/schema_project_tvpaint.json index 368141813f..8286ed1193 100644 --- a/openpype/settings/entities/schemas/projects_schema/schema_project_tvpaint.json +++ b/openpype/settings/entities/schemas/projects_schema/schema_project_tvpaint.json @@ -5,6 +5,11 @@ "label": "TVPaint", "is_file": true, "children": [ + { + "type": "boolean", + "key": "stop_timer_on_application_exit", + "label": "Stop timer on application exit" + }, { "type": "dict", "collapsible": true, diff --git a/openpype/settings/entities/schemas/projects_schema/schemas/schema_global_publish.json b/openpype/settings/entities/schemas/projects_schema/schemas/schema_global_publish.json index e59d22aa89..c50f383f02 100644 --- a/openpype/settings/entities/schemas/projects_schema/schemas/schema_global_publish.json +++ b/openpype/settings/entities/schemas/projects_schema/schemas/schema_global_publish.json @@ -24,13 +24,22 @@ } ] }, + { + "type": "schema_template", + "name": "template_publish_plugin", + "template_data": [ + { + "key": "ValidateVersion", + "label": "Validate Version" + } + ] + }, { "type": "dict", - "collapsible": true, - "checkbox_key": "enabled", - "key": "ValidateVersion", - "label": "Validate Version", + "label": "Validate Intent", + "key": "ValidateIntent", "is_group": true, + "checkbox_key": "enabled", "children": [ { "type": "boolean", @@ -38,9 +47,43 @@ "label": "Enabled" }, { - "type": "boolean", - "key": "optional", - "label": "Optional" + "type": "label", + "label": "Validate if Publishing intent was selected. It is possible to disable validation for specific publishing context with profiles." 
+ }, + { + "type": "list", + "collapsible": true, + "key": "profiles", + "object_type": { + "type": "dict", + "children": [ + { + "key": "hosts", + "label": "Host names", + "type": "hosts-enum", + "multiselection": true + }, + { + "key": "task_types", + "label": "Task types", + "type": "task-types-enum" + }, + { + "key": "tasks", + "label": "Task names", + "type": "list", + "object_type": "text" + }, + { + "type": "separator" + }, + { + "key": "validate", + "label": "Validate", + "type": "boolean" + } + ] + } } ] }, diff --git a/openpype/settings/entities/schemas/projects_schema/schemas/schema_maya_load.json b/openpype/settings/entities/schemas/projects_schema/schemas/schema_maya_load.json index 0b09d08700..7c87644817 100644 --- a/openpype/settings/entities/schemas/projects_schema/schemas/schema_maya_load.json +++ b/openpype/settings/entities/schemas/projects_schema/schemas/schema_maya_load.json @@ -47,9 +47,14 @@ }, { "type": "color", - "label": "Maya Scene:", + "label": "Maya Ascii:", "key": "mayaAscii" }, + { + "type": "color", + "label": "Maya Scene:", + "key": "mayaScene" + }, { "type": "color", "label": "Set Dress:", diff --git a/openpype/settings/entities/schemas/projects_schema/schemas/schema_maya_publish.json b/openpype/settings/entities/schemas/projects_schema/schemas/schema_maya_publish.json index 89cd30aed0..26ebfb2bd7 100644 --- a/openpype/settings/entities/schemas/projects_schema/schemas/schema_maya_publish.json +++ b/openpype/settings/entities/schemas/projects_schema/schemas/schema_maya_publish.json @@ -28,6 +28,16 @@ "type": "label", "label": "Validators" }, + { + "type": "schema_template", + "name": "template_publish_plugin", + "template_data": [ + { + "key": "ValidateInstanceInContext", + "label": "Validate Instance In Context" + } + ] + }, { "type": "schema_template", "name": "template_publish_plugin", @@ -82,6 +92,32 @@ ] }, + { + "type": "dict", + "collapsible": true, + "key": "ValidateLoadedPlugin", + "label": "Validate Loaded Plugin", + "checkbox_key": "enabled", + "children": [ + { + "type": "boolean", + "key": "enabled", + "label": "Enabled" + }, + { + "type": "boolean", + "key": "whitelist_native_plugins", + "label": "Whitelist Maya Native Plugins" + }, + { + "type": "list", + "key": "authorized_plugins", + "label": "Authorized plugins", + "object_type": "text" + } + ] + }, + { "type": "dict", "collapsible": true, diff --git a/openpype/settings/entities/schemas/projects_schema/schemas/schema_nuke_publish.json b/openpype/settings/entities/schemas/projects_schema/schemas/schema_nuke_publish.json index c73453f8aa..74b2592d29 100644 --- a/openpype/settings/entities/schemas/projects_schema/schemas/schema_nuke_publish.json +++ b/openpype/settings/entities/schemas/projects_schema/schemas/schema_nuke_publish.json @@ -50,6 +50,16 @@ "type": "label", "label": "Validators" }, + { + "type": "schema_template", + "name": "template_publish_plugin", + "template_data": [ + { + "key": "ValidateInstanceInContext", + "label": "Validate Instance In Context" + } + ] + }, { "type": "schema_template", "name": "template_publish_plugin", diff --git a/openpype/settings/entities/schemas/system_schema/schema_general.json b/openpype/settings/entities/schemas/system_schema/schema_general.json index fe5a8d8203..51a58a6e27 100644 --- a/openpype/settings/entities/schemas/system_schema/schema_general.json +++ b/openpype/settings/entities/schemas/system_schema/schema_general.json @@ -40,6 +40,76 @@ { "type": "splitter" }, + { + "type": "dict", + "key": "disk_mapping", + "label": "Disk mapping", + 
"is_group": true, + "use_label_wrap": false, + "collapsible": false, + "children": [ + { + "key": "windows", + "label": "Windows", + "type": "list", + "object_type": { + "type": "list-strict", + "key": "item", + "object_types": [ + { + "label": "Source", + "type": "path" + }, + { + "label": "Destination", + "type": "path" + } + ] + } + }, + { + "key": "linux", + "label": "Linux", + "type": "list", + "object_type": { + "type": "list-strict", + "key": "item", + "object_types": [ + { + "label": "Source", + "type": "path" + }, + { + "label": "Destination", + "type": "path" + } + ] + } + }, + { + "key": "darwin", + "label": "MacOS", + "type": "list", + "object_type": { + "type": "list-strict", + "key": "item", + "object_types": [ + { + "label": "Source", + "type": "path" + }, + { + "label": "Destination", + "type": "path" + } + ] + } + } + ] + }, + { + "type": "splitter" + }, { "type": "path", "key": "openpype_path", diff --git a/openpype/settings/handlers.py b/openpype/settings/handlers.py index 288fc76801..c59e2bc542 100644 --- a/openpype/settings/handlers.py +++ b/openpype/settings/handlers.py @@ -168,7 +168,7 @@ class CacheValues: class MongoSettingsHandler(SettingsHandler): """Settings handler that use mongo for storing and loading of settings.""" - global_general_keys = ("openpype_path", "admin_password") + global_general_keys = ("openpype_path", "admin_password", "disk_mapping") def __init__(self): # Get mongo connection diff --git a/openpype/tools/project_manager/project_manager/__init__.py b/openpype/tools/project_manager/project_manager/__init__.py index 49ade4a989..6e44afd841 100644 --- a/openpype/tools/project_manager/project_manager/__init__.py +++ b/openpype/tools/project_manager/project_manager/__init__.py @@ -1,9 +1,11 @@ __all__ = ( "IDENTIFIER_ROLE", + "PROJECT_NAME_ROLE", "HierarchyView", "ProjectModel", + "ProjectProxyFilter", "CreateProjectDialog", "HierarchyModel", @@ -20,12 +22,14 @@ __all__ = ( from .constants import ( - IDENTIFIER_ROLE + IDENTIFIER_ROLE, + PROJECT_NAME_ROLE ) from .widgets import CreateProjectDialog from .view import HierarchyView from .model import ( ProjectModel, + ProjectProxyFilter, HierarchyModel, HierarchySelectionModel, diff --git a/openpype/tools/project_manager/project_manager/constants.py b/openpype/tools/project_manager/project_manager/constants.py index 67dea79e59..7ca4aa9492 100644 --- a/openpype/tools/project_manager/project_manager/constants.py +++ b/openpype/tools/project_manager/project_manager/constants.py @@ -17,6 +17,9 @@ ITEM_TYPE_ROLE = QtCore.Qt.UserRole + 5 # Item has opened editor (per column) EDITOR_OPENED_ROLE = QtCore.Qt.UserRole + 6 +# Role for project model +PROJECT_NAME_ROLE = QtCore.Qt.UserRole + 7 + # Allowed symbols for any name NAME_ALLOWED_SYMBOLS = "a-zA-Z0-9_" NAME_REGEX = re.compile("^[" + NAME_ALLOWED_SYMBOLS + "]*$") diff --git a/openpype/tools/project_manager/project_manager/model.py b/openpype/tools/project_manager/project_manager/model.py index 7036b65f87..5b6ed78b50 100644 --- a/openpype/tools/project_manager/project_manager/model.py +++ b/openpype/tools/project_manager/project_manager/model.py @@ -9,7 +9,8 @@ from .constants import ( DUPLICATED_ROLE, HIERARCHY_CHANGE_ABLE_ROLE, REMOVED_ROLE, - EDITOR_OPENED_ROLE + EDITOR_OPENED_ROLE, + PROJECT_NAME_ROLE ) from .style import ResourceCache @@ -29,7 +30,7 @@ class ProjectModel(QtGui.QStandardItemModel): def __init__(self, dbcon, *args, **kwargs): self.dbcon = dbcon - self._project_names = set() + self._items_by_name = {} super(ProjectModel, 
self).__init__(*args, **kwargs) @@ -37,29 +38,62 @@ class ProjectModel(QtGui.QStandardItemModel): """Reload projects.""" self.dbcon.Session["AVALON_PROJECT"] = None - project_items = [] + new_project_items = [] - none_project = QtGui.QStandardItem("< Select Project >") - none_project.setData(None) - project_items.append(none_project) + if None not in self._items_by_name: + none_project = QtGui.QStandardItem("< Select Project >") + self._items_by_name[None] = none_project + new_project_items.append(none_project) + project_docs = self.dbcon.projects( + projection={"name": 1}, + only_active=True + ) project_names = set() + for project_doc in project_docs: + project_name = project_doc.get("name") + if not project_name: + continue - for doc in sorted( - self.dbcon.projects(projection={"name": 1}, only_active=True), - key=lambda x: x["name"] - ): + project_names.add(project_name) + if project_name not in self._items_by_name: + project_item = QtGui.QStandardItem(project_name) + project_item.setData(project_name, PROJECT_NAME_ROLE) - project_name = doc.get("name") - if project_name: - project_names.add(project_name) - project_items.append(QtGui.QStandardItem(project_name)) + self._items_by_name[project_name] = project_item + new_project_items.append(project_item) - self.clear() + root_item = self.invisibleRootItem() + for project_name in tuple(self._items_by_name.keys()): + if project_name is None or project_name in project_names: + continue + project_item = self._items_by_name.pop(project_name) + root_item.removeRow(project_item.row()) - self._project_names = project_names + if new_project_items: + root_item.appendRows(new_project_items) - self.invisibleRootItem().appendRows(project_items) + +class ProjectProxyFilter(QtCore.QSortFilterProxyModel): + """Filters default project item.""" + def __init__(self, *args, **kwargs): + super(ProjectProxyFilter, self).__init__(*args, **kwargs) + self._filter_default = False + + def set_filter_default(self, enabled=True): + """Set if filtering of default item is enabled.""" + if enabled == self._filter_default: + return + self._filter_default = enabled + self.invalidateFilter() + + def filterAcceptsRow(self, row, parent): + if not self._filter_default: + return True + + model = self.sourceModel() + source_index = model.index(row, self.filterKeyColumn(), parent) + return source_index.data(PROJECT_NAME_ROLE) is not None class HierarchySelectionModel(QtCore.QItemSelectionModel): diff --git a/openpype/tools/project_manager/project_manager/window.py b/openpype/tools/project_manager/project_manager/window.py index 79eb9651e9..a19031ceda 100644 --- a/openpype/tools/project_manager/project_manager/window.py +++ b/openpype/tools/project_manager/project_manager/window.py @@ -2,15 +2,17 @@ from Qt import QtWidgets, QtCore, QtGui from . 
import ( ProjectModel, + ProjectProxyFilter, HierarchyModel, HierarchySelectionModel, HierarchyView, - CreateProjectDialog + CreateProjectDialog, + PROJECT_NAME_ROLE ) -from openpype.style import load_stylesheet from .style import ResourceCache +from openpype.style import load_stylesheet from openpype.lib import is_admin_password_required from openpype.widgets import PasswordDialog @@ -35,9 +37,6 @@ class ProjectManagerWindow(QtWidgets.QWidget): self._password_dialog = None self._user_passed = False - # keep track of the current project PM is viewing - self._current_project = None - self.setWindowTitle("OpenPype Project Manager") self.setWindowIcon(QtGui.QIcon(resources.get_openpype_icon_filepath())) @@ -50,11 +49,15 @@ class ProjectManagerWindow(QtWidgets.QWidget): dbcon = AvalonMongoDB() project_model = ProjectModel(dbcon) + project_proxy = ProjectProxyFilter() + project_proxy.setSourceModel(project_model) + project_proxy.setDynamicSortFilter(True) + project_combobox = QtWidgets.QComboBox(project_widget) project_combobox.setSizeAdjustPolicy( QtWidgets.QComboBox.AdjustToContents ) - project_combobox.setModel(project_model) + project_combobox.setModel(project_proxy) project_combobox.setRootModelIndex(QtCore.QModelIndex()) style_delegate = QtWidgets.QStyledItemDelegate() project_combobox.setItemDelegate(style_delegate) @@ -72,6 +75,7 @@ class ProjectManagerWindow(QtWidgets.QWidget): "Create Starting Folders", project_widget ) + create_folders_btn.setEnabled(False) project_layout = QtWidgets.QHBoxLayout(project_widget) project_layout.setContentsMargins(0, 0, 0, 0) @@ -147,6 +151,7 @@ class ProjectManagerWindow(QtWidgets.QWidget): add_task_btn.clicked.connect(self._on_add_task) self._project_model = project_model + self._project_proxy_model = project_proxy self.hierarchy_view = hierarchy_view self.hierarchy_model = hierarchy_model @@ -165,8 +170,17 @@ class ProjectManagerWindow(QtWidgets.QWidget): self.setStyleSheet(load_stylesheet()) def _set_project(self, project_name=None): + self._create_folders_btn.setEnabled(project_name is not None) + self._project_proxy_model.set_filter_default(project_name is not None) self.hierarchy_view.set_project(project_name) + def _current_project(self): + row = self._project_combobox.currentIndex() + if row < 0: + return None + index = self._project_proxy_model.index(row, 0) + return index.data(PROJECT_NAME_ROLE) + def showEvent(self, event): super(ProjectManagerWindow, self).showEvent(event) @@ -185,6 +199,7 @@ class ProjectManagerWindow(QtWidgets.QWidget): project_name = self._project_combobox.currentText() self._project_model.refresh() + self._project_proxy_model.sort(0, QtCore.Qt.AscendingOrder) if self._project_combobox.count() == 0: return self._set_project() @@ -194,12 +209,12 @@ class ProjectManagerWindow(QtWidgets.QWidget): if row >= 0: self._project_combobox.setCurrentIndex(row) - self._set_project(self._project_combobox.currentText()) + selected_project = self._current_project() + self._set_project(selected_project) def _on_project_change(self): - if self._project_combobox.currentIndex() != 0: - self._current_project = self._project_combobox.currentText() - self._set_project(self._current_project) + selected_project = self._current_project() + self._set_project(selected_project) def _on_project_refresh(self): self.refresh_projects() @@ -214,7 +229,8 @@ class ProjectManagerWindow(QtWidgets.QWidget): self.hierarchy_view.add_task() def _on_create_folders(self): - if not self._current_project: + project_name = self._current_project() + if not 
project_name: return qm = QtWidgets.QMessageBox @@ -225,11 +241,11 @@ class ProjectManagerWindow(QtWidgets.QWidget): if ans == qm.Yes: try: # Get paths based on presets - basic_paths = get_project_basic_paths(self._current_project) + basic_paths = get_project_basic_paths(project_name) if not basic_paths: pass # Invoking OpenPype API to create the project folders - create_project_folders(basic_paths, self._current_project) + create_project_folders(basic_paths, project_name) except Exception as exc: self.log.warning( "Cannot create starting folders: {}".format(exc), @@ -246,11 +262,9 @@ class ProjectManagerWindow(QtWidgets.QWidget): if dialog.result() != 1: return - self._current_project = dialog.project_name - self.show_message( - "Created project \"{}\"".format(self._current_project) - ) - self.refresh_projects(self._current_project) + project_name = dialog.project_name + self.show_message("Created project \"{}\"".format(project_name)) + self.refresh_projects(project_name) def _show_password_dialog(self): if self._password_dialog: diff --git a/openpype/tools/settings/settings/constants.py b/openpype/tools/settings/settings/constants.py new file mode 100644 index 0000000000..5c20bf1afe --- /dev/null +++ b/openpype/tools/settings/settings/constants.py @@ -0,0 +1,16 @@ +from Qt import QtCore + + +DEFAULT_PROJECT_LABEL = "< Default >" +PROJECT_NAME_ROLE = QtCore.Qt.UserRole + 1 +PROJECT_IS_ACTIVE_ROLE = QtCore.Qt.UserRole + 2 +PROJECT_IS_SELECTED_ROLE = QtCore.Qt.UserRole + 3 + + +__all__ = ( + "DEFAULT_PROJECT_LABEL", + + "PROJECT_NAME_ROLE", + "PROJECT_IS_ACTIVE_ROLE", + "PROJECT_IS_SELECTED_ROLE" +) diff --git a/openpype/tools/settings/settings/widgets.py b/openpype/tools/settings/settings/widgets.py index a461f3e675..710884e9e5 100644 --- a/openpype/tools/settings/settings/widgets.py +++ b/openpype/tools/settings/settings/widgets.py @@ -7,6 +7,12 @@ from avalon.mongodb import ( ) from openpype.settings.lib import get_system_settings +from .constants import ( + DEFAULT_PROJECT_LABEL, + PROJECT_NAME_ROLE, + PROJECT_IS_ACTIVE_ROLE, + PROJECT_IS_SELECTED_ROLE +) class SettingsLineEdit(QtWidgets.QLineEdit): @@ -602,10 +608,63 @@ class NiceCheckbox(QtWidgets.QFrame): return super(NiceCheckbox, self).mouseReleaseEvent(event) -class ProjectListModel(QtGui.QStandardItemModel): - sort_role = QtCore.Qt.UserRole + 10 - filter_role = QtCore.Qt.UserRole + 11 - selected_role = QtCore.Qt.UserRole + 12 +class ProjectModel(QtGui.QStandardItemModel): + def __init__(self, only_active, *args, **kwargs): + super(ProjectModel, self).__init__(*args, **kwargs) + + self.dbcon = None + + self._only_active = only_active + self._default_item = None + self._items_by_name = {} + + def set_dbcon(self, dbcon): + self.dbcon = dbcon + + def refresh(self): + new_items = [] + if self._default_item is None: + item = QtGui.QStandardItem(DEFAULT_PROJECT_LABEL) + item.setData(None, PROJECT_NAME_ROLE) + item.setData(True, PROJECT_IS_ACTIVE_ROLE) + item.setData(False, PROJECT_IS_SELECTED_ROLE) + new_items.append(item) + self._default_item = item + + project_names = set() + if self.dbcon is not None: + for project_doc in self.dbcon.projects( + projection={"name": 1, "data.active": 1}, + only_active=self._only_active + ): + project_name = project_doc["name"] + project_names.add(project_name) + if project_name in self._items_by_name: + item = self._items_by_name[project_name] + else: + item = QtGui.QStandardItem(project_name) + + self._items_by_name[project_name] = item + new_items.append(item) + + is_active = project_doc.get("data", 
{}).get("active", True) + item.setData(project_name, PROJECT_NAME_ROLE) + item.setData(is_active, PROJECT_IS_ACTIVE_ROLE) + item.setData(False, PROJECT_IS_SELECTED_ROLE) + + if not is_active: + font = item.font() + font.setItalic(True) + item.setFont(font) + + root_item = self.invisibleRootItem() + for project_name in tuple(self._items_by_name.keys()): + if project_name not in project_names: + item = self._items_by_name.pop(project_name) + root_item.removeRow(item.row()) + + if new_items: + root_item.appendRows(new_items) class ProjectListView(QtWidgets.QListView): @@ -618,19 +677,36 @@ class ProjectListView(QtWidgets.QListView): super(ProjectListView, self).mouseReleaseEvent(event) -class ProjectListSortFilterProxy(QtCore.QSortFilterProxyModel): - +class ProjectSortFilterProxy(QtCore.QSortFilterProxyModel): def __init__(self, *args, **kwargs): - super(ProjectListSortFilterProxy, self).__init__(*args, **kwargs) + super(ProjectSortFilterProxy, self).__init__(*args, **kwargs) self._enable_filter = True + def lessThan(self, left_index, right_index): + if left_index.data(PROJECT_NAME_ROLE) is None: + return True + + if right_index.data(PROJECT_NAME_ROLE) is None: + return False + + left_is_active = left_index.data(PROJECT_IS_ACTIVE_ROLE) + right_is_active = right_index.data(PROJECT_IS_ACTIVE_ROLE) + if right_is_active == left_is_active: + return super(ProjectSortFilterProxy, self).lessThan( + left_index, right_index + ) + + if left_is_active: + return True + return False + def filterAcceptsRow(self, source_row, source_parent): if not self._enable_filter: return True index = self.sourceModel().index(source_row, 0, source_parent) is_active = bool(index.data(self.filterRole())) - is_selected = bool(index.data(ProjectListModel.selected_role)) + is_selected = bool(index.data(PROJECT_IS_SELECTED_ROLE)) return is_active or is_selected @@ -643,7 +719,6 @@ class ProjectListSortFilterProxy(QtCore.QSortFilterProxyModel): class ProjectListWidget(QtWidgets.QWidget): - default = "< Default >" project_changed = QtCore.Signal() def __init__(self, parent, only_active=False): @@ -657,13 +732,10 @@ class ProjectListWidget(QtWidgets.QWidget): label_widget = QtWidgets.QLabel("Projects") project_list = ProjectListView(self) - project_model = ProjectListModel() - project_proxy = ProjectListSortFilterProxy() - - project_proxy.setFilterRole(ProjectListModel.filter_role) - project_proxy.setSortRole(ProjectListModel.sort_role) - project_proxy.setSortCaseSensitivity(QtCore.Qt.CaseInsensitive) + project_model = ProjectModel(only_active) + project_proxy = ProjectSortFilterProxy() + project_proxy.setFilterRole(PROJECT_IS_ACTIVE_ROLE) project_proxy.setSourceModel(project_model) project_list.setModel(project_proxy) @@ -693,13 +765,14 @@ class ProjectListWidget(QtWidgets.QWidget): project_list.left_mouse_released_at.connect(self.on_item_clicked) + self._default_project_item = None + self.project_list = project_list self.project_proxy = project_proxy self.project_model = project_model self.inactive_chk = inactive_chk self.dbcon = None - self._only_active = only_active def on_item_clicked(self, new_index): new_project_name = new_index.data(QtCore.Qt.DisplayRole) @@ -746,12 +819,12 @@ class ProjectListWidget(QtWidgets.QWidget): return not self._parent.entity.has_unsaved_changes def project_name(self): - if self.current_project == self.default: + if self.current_project == DEFAULT_PROJECT_LABEL: return None return self.current_project def select_default_project(self): - self.select_project(self.default) + 
self.select_project(DEFAULT_PROJECT_LABEL)

     def select_project(self, project_name):
         model = self.project_model
@@ -759,10 +832,10 @@
         found_items = model.findItems(project_name)
         if not found_items:
-            found_items = model.findItems(self.default)
+            found_items = model.findItems(DEFAULT_PROJECT_LABEL)

         index = model.indexFromItem(found_items[0])
-        model.setData(index, True, ProjectListModel.selected_role)
+        model.setData(index, True, PROJECT_IS_SELECTED_ROLE)

         index = proxy.mapFromSource(index)
@@ -777,9 +850,6 @@
             selected_project = index.data(QtCore.Qt.DisplayRole)
             break

-        model = self.project_model
-        model.clear()
-
         mongo_url = os.environ["OPENPYPE_MONGO"]

         # Force uninstall of whole avalon connection if url does not match
@@ -797,35 +867,8 @@
             self.dbcon = None
             self.current_project = None

-        items = [(self.default, True)]
-
-        if self.dbcon:
-
-            for doc in self.dbcon.projects(
-                projection={"name": 1, "data.active": 1},
-                only_active=self._only_active
-            ):
-                items.append(
-                    (doc["name"], doc.get("data", {}).get("active", True))
-                )
-
-        for project_name, is_active in items:
-
-            row = QtGui.QStandardItem(project_name)
-            row.setData(is_active, ProjectListModel.filter_role)
-            row.setData(False, ProjectListModel.selected_role)
-
-            if is_active:
-                row.setData(project_name, ProjectListModel.sort_role)
-
-            else:
-                row.setData("~" + project_name, ProjectListModel.sort_role)
-
-                font = row.font()
-                font.setItalic(True)
-                row.setFont(font)
-
-            model.appendRow(row)
+        self.project_model.set_dbcon(self.dbcon)
+        self.project_model.refresh()

         self.project_proxy.sort(0)
diff --git a/openpype/tools/tray/pype_tray.py b/openpype/tools/tray/pype_tray.py
index 35b254513f..3050e206ce 100644
--- a/openpype/tools/tray/pype_tray.py
+++ b/openpype/tools/tray/pype_tray.py
@@ -17,6 +17,11 @@ from openpype.api import (
 from openpype.lib import get_pype_execute_args
 from openpype.modules import TrayModulesManager
 from openpype import style
+from openpype.settings import (
+    SystemSettings,
+    ProjectSettings,
+    DefaultsNotDefined
+)

 from .pype_info_widget import PypeInfoWidget
@@ -114,6 +119,54 @@ class TrayManager:

         self.main_thread_timer = main_thread_timer

+        # For storing missing settings dialog
+        self._settings_validation_dialog = None
+
+        self.execute_in_main_thread(self._startup_validations)
+
+    def _startup_validations(self):
+        """Run possible startup validations."""
+        self._validate_settings_defaults()
+
+    def _validate_settings_defaults(self):
+        valid = True
+        try:
+            SystemSettings()
+            ProjectSettings()
+
+        except DefaultsNotDefined:
+            valid = False
+
+        if valid:
+            return
+
+        title = "Settings miss default values"
+        msg = (
+            "Your OpenPype will not work as expected! \n"
+            "Some default values in settings are missing. \n\n"
+            "Please contact OpenPype team."
+ ) + msg_box = QtWidgets.QMessageBox( + QtWidgets.QMessageBox.Warning, + title, + msg, + QtWidgets.QMessageBox.Ok, + flags=QtCore.Qt.Dialog + ) + icon = QtGui.QIcon(resources.get_openpype_icon_filepath()) + msg_box.setWindowIcon(icon) + msg_box.setStyleSheet(style.load_stylesheet()) + msg_box.buttonClicked.connect(self._post_validate_settings_defaults) + + self._settings_validation_dialog = msg_box + + msg_box.show() + + def _post_validate_settings_defaults(self): + widget = self._settings_validation_dialog + self._settings_validation_dialog = None + widget.deleteLater() + def show_tray_message(self, title, message, icon=None, msecs=None): """Show tray message. @@ -236,6 +289,7 @@ class SystemTrayIcon(QtWidgets.QSystemTrayIcon): self._click_timer = click_timer self._doubleclick = False + self._click_pos = None def _click_timer_timeout(self): self._click_timer.stop() @@ -248,13 +302,17 @@ class SystemTrayIcon(QtWidgets.QSystemTrayIcon): self._show_context_menu() def _show_context_menu(self): - pos = QtGui.QCursor().pos() + pos = self._click_pos + self._click_pos = None + if pos is None: + pos = QtGui.QCursor().pos() self.contextMenu().popup(pos) def on_systray_activated(self, reason): # show contextMenu if left click if reason == QtWidgets.QSystemTrayIcon.Trigger: if self.tray_man.doubleclick_callback: + self._click_pos = QtGui.QCursor().pos() self._click_timer.start() else: self._show_context_menu() diff --git a/openpype/version.py b/openpype/version.py index 0e52014a56..3a589bac75 100644 --- a/openpype/version.py +++ b/openpype/version.py @@ -1,3 +1,3 @@ # -*- coding: utf-8 -*- """Package declaring Pype version.""" -__version__ = "3.4.1" +__version__ = "3.5.0-nightly.6" diff --git a/poetry.lock b/poetry.lock index e011b781c9..10b049cd0a 100644 --- a/poetry.lock +++ b/poetry.lock @@ -144,6 +144,22 @@ python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*" [package.dependencies] pytz = ">=2015.7" +[[package]] +name = "bcrypt" +version = "3.2.0" +description = "Modern password hashing for your software and your servers" +category = "main" +optional = false +python-versions = ">=3.6" + +[package.dependencies] +cffi = ">=1.1" +six = ">=1.4.1" + +[package.extras] +tests = ["pytest (>=3.2.1,!=3.3.0)"] +typecheck = ["mypy"] + [[package]] name = "blessed" version = "1.18.0" @@ -313,6 +329,19 @@ category = "dev" optional = false python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*" +[[package]] +name = "dropbox" +version = "11.20.0" +description = "Official Dropbox API Client" +category = "main" +optional = false +python-versions = "*" + +[package.dependencies] +requests = ">=2.16.2" +six = ">=1.12.0" +stone = ">=2" + [[package]] name = "enlighten" version = "1.10.1" @@ -704,6 +733,25 @@ python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*" [package.dependencies] pyparsing = ">=2.0.2" +[[package]] +name = "paramiko" +version = "2.7.2" +description = "SSH2 protocol library" +category = "main" +optional = false +python-versions = "*" + +[package.dependencies] +bcrypt = ">=3.1.3" +cryptography = ">=2.5" +pynacl = ">=1.0.1" + +[package.extras] +all = ["pyasn1 (>=0.1.7)", "pynacl (>=1.0.1)", "bcrypt (>=3.1.3)", "invoke (>=1.3)", "gssapi (>=1.4.1)", "pywin32 (>=2.1.8)"] +ed25519 = ["pynacl (>=1.0.1)", "bcrypt (>=3.1.3)"] +gssapi = ["pyasn1 (>=0.1.7)", "gssapi (>=1.4.1)", "pywin32 (>=2.1.8)"] +invoke = ["invoke (>=1.3)"] + [[package]] name = "parso" version = "0.8.2" @@ -749,6 +797,14 @@ importlib-metadata = {version = ">=0.12", markers = "python_version < \"3.8\""} 
[package.extras] dev = ["pre-commit", "tox"] +[[package]] +name = "ply" +version = "3.11" +description = "Python Lex & Yacc" +category = "main" +optional = false +python-versions = "*" + [[package]] name = "prefixed" version = "0.3.2" @@ -888,6 +944,22 @@ srv = ["dnspython (>=1.16.0,<1.17.0)"] tls = ["ipaddress"] zstd = ["zstandard"] +[[package]] +name = "pynacl" +version = "1.4.0" +description = "Python binding to the Networking and Cryptography (NaCl) library" +category = "main" +optional = false +python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*" + +[package.dependencies] +cffi = ">=1.4.1" +six = "*" + +[package.extras] +docs = ["sphinx (>=1.6.5)", "sphinx-rtd-theme"] +tests = ["pytest (>=3.2.1,!=3.3.0)", "hypothesis (>=3.27.0)"] + [[package]] name = "pynput" version = "1.7.3" @@ -977,6 +1049,17 @@ category = "main" optional = false python-versions = ">=3.5" +[[package]] +name = "pysftp" +version = "0.2.9" +description = "A friendly face on SFTP" +category = "main" +optional = false +python-versions = "*" + +[package.dependencies] +paramiko = ">=1.17" + [[package]] name = "pytest" version = "6.2.4" @@ -1069,7 +1152,7 @@ python-versions = "*" [[package]] name = "pywin32" -version = "300" +version = "301" description = "Python for Window Extensions" category = "main" optional = false @@ -1341,6 +1424,18 @@ sphinxcontrib-serializinghtml = "*" lint = ["flake8"] test = ["pytest", "sqlalchemy", "whoosh", "sphinx"] +[[package]] +name = "stone" +version = "3.2.1" +description = "Stone is an interface description language (IDL) for APIs." +category = "main" +optional = false +python-versions = "*" + +[package.dependencies] +ply = ">=3.4" +six = ">=1.3.0" + [[package]] name = "termcolor" version = "1.1.0" @@ -1466,7 +1561,7 @@ testing = ["pytest (>=4.6)", "pytest-checkdocs (>=1.2.3)", "pytest-flake8", "pyt [metadata] lock-version = "1.1" python-versions = "3.7.*" -content-hash = "8875d530ae66f9763b5b0cb84d9d35edc184ef5c141b63d38bf1ff5a1226e556" +content-hash = "ff2bfa35a7304378917a0c25d7d7af9f81a130288d95789bdf7429f071e80b69" [metadata.files] acre = [] @@ -1553,6 +1648,15 @@ babel = [ {file = "Babel-2.9.1-py2.py3-none-any.whl", hash = "sha256:ab49e12b91d937cd11f0b67cb259a57ab4ad2b59ac7a3b41d6c06c0ac5b0def9"}, {file = "Babel-2.9.1.tar.gz", hash = "sha256:bc0c176f9f6a994582230df350aa6e05ba2ebe4b3ac317eab29d9be5d2768da0"}, ] +bcrypt = [ + {file = "bcrypt-3.2.0-cp36-abi3-macosx_10_9_x86_64.whl", hash = "sha256:c95d4cbebffafcdd28bd28bb4e25b31c50f6da605c81ffd9ad8a3d1b2ab7b1b6"}, + {file = "bcrypt-3.2.0-cp36-abi3-manylinux1_x86_64.whl", hash = "sha256:63d4e3ff96188e5898779b6057878fecf3f11cfe6ec3b313ea09955d587ec7a7"}, + {file = "bcrypt-3.2.0-cp36-abi3-manylinux2010_x86_64.whl", hash = "sha256:cd1ea2ff3038509ea95f687256c46b79f5fc382ad0aa3664d200047546d511d1"}, + {file = "bcrypt-3.2.0-cp36-abi3-manylinux2014_aarch64.whl", hash = "sha256:cdcdcb3972027f83fe24a48b1e90ea4b584d35f1cc279d76de6fc4b13376239d"}, + {file = "bcrypt-3.2.0-cp36-abi3-win32.whl", hash = "sha256:a67fb841b35c28a59cebed05fbd3e80eea26e6d75851f0574a9273c80f3e9b55"}, + {file = "bcrypt-3.2.0-cp36-abi3-win_amd64.whl", hash = "sha256:81fec756feff5b6818ea7ab031205e1d323d8943d237303baca2c5f9c7846f34"}, + {file = "bcrypt-3.2.0.tar.gz", hash = "sha256:5b93c1726e50a93a033c36e5ca7fdcd29a5c7395af50a6892f5d9e7c6cfbfb29"}, +] blessed = [ {file = "blessed-1.18.0-py2.py3-none-any.whl", hash = "sha256:5b5e2f0563d5a668c282f3f5946f7b1abb70c85829461900e607e74d7725106e"}, {file = "blessed-1.18.0.tar.gz", hash = 
"sha256:1312879f971330a1b7f2c6341f2ae7e2cbac244bfc9d0ecfbbecd4b0293bc755"}, @@ -1582,24 +1686,36 @@ cffi = [ {file = "cffi-1.14.5-cp36-cp36m-manylinux1_i686.whl", hash = "sha256:48e1c69bbacfc3d932221851b39d49e81567a4d4aac3b21258d9c24578280058"}, {file = "cffi-1.14.5-cp36-cp36m-manylinux1_x86_64.whl", hash = "sha256:69e395c24fc60aad6bb4fa7e583698ea6cc684648e1ffb7fe85e3c1ca131a7d5"}, {file = "cffi-1.14.5-cp36-cp36m-manylinux2014_aarch64.whl", hash = "sha256:9e93e79c2551ff263400e1e4be085a1210e12073a31c2011dbbda14bda0c6132"}, + {file = "cffi-1.14.5-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:24ec4ff2c5c0c8f9c6b87d5bb53555bf267e1e6f70e52e5a9740d32861d36b6f"}, + {file = "cffi-1.14.5-cp36-cp36m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3c3f39fa737542161d8b0d680df2ec249334cd70a8f420f71c9304bd83c3cbed"}, + {file = "cffi-1.14.5-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:681d07b0d1e3c462dd15585ef5e33cb021321588bebd910124ef4f4fb71aef55"}, {file = "cffi-1.14.5-cp36-cp36m-win32.whl", hash = "sha256:58e3f59d583d413809d60779492342801d6e82fefb89c86a38e040c16883be53"}, {file = "cffi-1.14.5-cp36-cp36m-win_amd64.whl", hash = "sha256:005a36f41773e148deac64b08f233873a4d0c18b053d37da83f6af4d9087b813"}, {file = "cffi-1.14.5-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:2894f2df484ff56d717bead0a5c2abb6b9d2bf26d6960c4604d5c48bbc30ee73"}, {file = "cffi-1.14.5-cp37-cp37m-manylinux1_i686.whl", hash = "sha256:0857f0ae312d855239a55c81ef453ee8fd24136eaba8e87a2eceba644c0d4c06"}, {file = "cffi-1.14.5-cp37-cp37m-manylinux1_x86_64.whl", hash = "sha256:cd2868886d547469123fadc46eac7ea5253ea7fcb139f12e1dfc2bbd406427d1"}, {file = "cffi-1.14.5-cp37-cp37m-manylinux2014_aarch64.whl", hash = "sha256:35f27e6eb43380fa080dccf676dece30bef72e4a67617ffda586641cd4508d49"}, + {file = "cffi-1.14.5-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:06d7cd1abac2ffd92e65c0609661866709b4b2d82dd15f611e602b9b188b0b69"}, + {file = "cffi-1.14.5-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:0f861a89e0043afec2a51fd177a567005847973be86f709bbb044d7f42fc4e05"}, + {file = "cffi-1.14.5-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:cc5a8e069b9ebfa22e26d0e6b97d6f9781302fe7f4f2b8776c3e1daea35f1adc"}, {file = "cffi-1.14.5-cp37-cp37m-win32.whl", hash = "sha256:9ff227395193126d82e60319a673a037d5de84633f11279e336f9c0f189ecc62"}, {file = "cffi-1.14.5-cp37-cp37m-win_amd64.whl", hash = "sha256:9cf8022fb8d07a97c178b02327b284521c7708d7c71a9c9c355c178ac4bbd3d4"}, {file = "cffi-1.14.5-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:8b198cec6c72df5289c05b05b8b0969819783f9418e0409865dac47288d2a053"}, {file = "cffi-1.14.5-cp38-cp38-manylinux1_i686.whl", hash = "sha256:ad17025d226ee5beec591b52800c11680fca3df50b8b29fe51d882576e039ee0"}, {file = "cffi-1.14.5-cp38-cp38-manylinux1_x86_64.whl", hash = "sha256:6c97d7350133666fbb5cf4abdc1178c812cb205dc6f41d174a7b0f18fb93337e"}, {file = "cffi-1.14.5-cp38-cp38-manylinux2014_aarch64.whl", hash = "sha256:8ae6299f6c68de06f136f1f9e69458eae58f1dacf10af5c17353eae03aa0d827"}, + {file = "cffi-1.14.5-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:04c468b622ed31d408fea2346bec5bbffba2cc44226302a0de1ade9f5ea3d373"}, + {file = "cffi-1.14.5-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:06db6321b7a68b2bd6df96d08a5adadc1fa0e8f419226e25b2a5fbf6ccc7350f"}, + {file = 
"cffi-1.14.5-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:293e7ea41280cb28c6fcaaa0b1aa1f533b8ce060b9e701d78511e1e6c4a1de76"}, {file = "cffi-1.14.5-cp38-cp38-win32.whl", hash = "sha256:b85eb46a81787c50650f2392b9b4ef23e1f126313b9e0e9013b35c15e4288e2e"}, {file = "cffi-1.14.5-cp38-cp38-win_amd64.whl", hash = "sha256:1f436816fc868b098b0d63b8920de7d208c90a67212546d02f84fe78a9c26396"}, {file = "cffi-1.14.5-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:1071534bbbf8cbb31b498d5d9db0f274f2f7a865adca4ae429e147ba40f73dea"}, {file = "cffi-1.14.5-cp39-cp39-manylinux1_i686.whl", hash = "sha256:9de2e279153a443c656f2defd67769e6d1e4163952b3c622dcea5b08a6405322"}, {file = "cffi-1.14.5-cp39-cp39-manylinux1_x86_64.whl", hash = "sha256:6e4714cc64f474e4d6e37cfff31a814b509a35cb17de4fb1999907575684479c"}, {file = "cffi-1.14.5-cp39-cp39-manylinux2014_aarch64.whl", hash = "sha256:158d0d15119b4b7ff6b926536763dc0714313aa59e320ddf787502c70c4d4bee"}, + {file = "cffi-1.14.5-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1bf1ac1984eaa7675ca8d5745a8cb87ef7abecb5592178406e55858d411eadc0"}, + {file = "cffi-1.14.5-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:df5052c5d867c1ea0b311fb7c3cd28b19df469c056f7fdcfe88c7473aa63e333"}, + {file = "cffi-1.14.5-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:24a570cd11895b60829e941f2613a4f79df1a27344cbbb82164ef2e0116f09c7"}, {file = "cffi-1.14.5-cp39-cp39-win32.whl", hash = "sha256:afb29c1ba2e5a3736f1c301d9d0abe3ec8b86957d04ddfa9d7a6a42b9367e396"}, {file = "cffi-1.14.5-cp39-cp39-win_amd64.whl", hash = "sha256:f2d45f97ab6bb54753eab54fffe75aaf3de4ff2341c9daee1987ee1837636f1d"}, {file = "cffi-1.14.5.tar.gz", hash = "sha256:fd78e5fee591709f32ef6edb9a015b4aa1a5022598e36227500c8f4e02328d9c"}, @@ -1692,8 +1808,10 @@ cryptography = [ {file = "cryptography-3.4.7-cp36-abi3-win_amd64.whl", hash = "sha256:de4e5f7f68220d92b7637fc99847475b59154b7a1b3868fb7385337af54ac9ca"}, {file = "cryptography-3.4.7-pp36-pypy36_pp73-manylinux2010_x86_64.whl", hash = "sha256:26965837447f9c82f1855e0bc8bc4fb910240b6e0d16a664bb722df3b5b06873"}, {file = "cryptography-3.4.7-pp36-pypy36_pp73-manylinux2014_x86_64.whl", hash = "sha256:eb8cc2afe8b05acbd84a43905832ec78e7b3873fb124ca190f574dca7389a87d"}, + {file = "cryptography-3.4.7-pp37-pypy37_pp73-macosx_10_10_x86_64.whl", hash = "sha256:b01fd6f2737816cb1e08ed4807ae194404790eac7ad030b34f2ce72b332f5586"}, {file = "cryptography-3.4.7-pp37-pypy37_pp73-manylinux2010_x86_64.whl", hash = "sha256:7ec5d3b029f5fa2b179325908b9cd93db28ab7b85bb6c1db56b10e0b54235177"}, {file = "cryptography-3.4.7-pp37-pypy37_pp73-manylinux2014_x86_64.whl", hash = "sha256:ee77aa129f481be46f8d92a1a7db57269a2f23052d5f2433b4621bb457081cc9"}, + {file = "cryptography-3.4.7-pp37-pypy37_pp73-win_amd64.whl", hash = "sha256:bf40af59ca2465b24e54f671b2de2c59257ddc4f7e5706dbd6930e26823668d3"}, {file = "cryptography-3.4.7.tar.gz", hash = "sha256:3d10de8116d25649631977cb37da6cbdd2d6fa0e0281d014a5b7d337255ca713"}, ] cx-freeze = [ @@ -1730,6 +1848,11 @@ docutils = [ {file = "docutils-0.16-py2.py3-none-any.whl", hash = "sha256:0c5b78adfbf7762415433f5515cd5c9e762339e23369dbe8000d84a4bf4ab3af"}, {file = "docutils-0.16.tar.gz", hash = "sha256:c2de3a60e9e7d07be26b7f2b00ca0309c207e06c100f9cc2a94931fc75a478fc"}, ] +dropbox = [ + {file = "dropbox-11.20.0-py2-none-any.whl", hash = "sha256:0926aab25445fe78b0284e0b86f4126ec4e5e2bf6cd2ac8562002008a21073b8"}, + {file = "dropbox-11.20.0-py3-none-any.whl", hash = 
"sha256:f2106aa566f9e3c175879c226c60b7089a39099b228061acbb7258670f6b859c"}, + {file = "dropbox-11.20.0.tar.gz", hash = "sha256:1aa351ec8bbb11cf3560e731b81d25f39c7edcb5fa92c06c5d68866cb9f90d54"}, +] enlighten = [ {file = "enlighten-1.10.1-py2.py3-none-any.whl", hash = "sha256:3d6c3eec8cf3eb626ee7b65eddc1b3e904d01f4547a2b9fe7f1da8892a0297e8"}, {file = "enlighten-1.10.1.tar.gz", hash = "sha256:3391916586364aedced5d6926482b48745e4948f822de096d32258ba238ea984"}, @@ -1852,12 +1975,22 @@ log4mongo = [ {file = "log4mongo-1.7.0.tar.gz", hash = "sha256:dc374617206162a0b14167fbb5feac01dbef587539a235dadba6200362984a68"}, ] markupsafe = [ + {file = "MarkupSafe-2.0.1-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:d8446c54dc28c01e5a2dbac5a25f071f6653e6e40f3a8818e8b45d790fe6ef53"}, + {file = "MarkupSafe-2.0.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:36bc903cbb393720fad60fc28c10de6acf10dc6cc883f3e24ee4012371399a38"}, + {file = "MarkupSafe-2.0.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2d7d807855b419fc2ed3e631034685db6079889a1f01d5d9dac950f764da3dad"}, + {file = "MarkupSafe-2.0.1-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:add36cb2dbb8b736611303cd3bfcee00afd96471b09cda130da3581cbdc56a6d"}, + {file = "MarkupSafe-2.0.1-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:168cd0a3642de83558a5153c8bd34f175a9a6e7f6dc6384b9655d2697312a646"}, + {file = "MarkupSafe-2.0.1-cp310-cp310-win32.whl", hash = "sha256:99df47edb6bda1249d3e80fdabb1dab8c08ef3975f69aed437cb69d0a5de1e28"}, + {file = "MarkupSafe-2.0.1-cp310-cp310-win_amd64.whl", hash = "sha256:e0f138900af21926a02425cf736db95be9f4af72ba1bb21453432a07f6082134"}, {file = "MarkupSafe-2.0.1-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:f9081981fe268bd86831e5c75f7de206ef275defcb82bc70740ae6dc507aee51"}, {file = "MarkupSafe-2.0.1-cp36-cp36m-manylinux1_i686.whl", hash = "sha256:0955295dd5eec6cb6cc2fe1698f4c6d84af2e92de33fbcac4111913cd100a6ff"}, {file = "MarkupSafe-2.0.1-cp36-cp36m-manylinux1_x86_64.whl", hash = "sha256:0446679737af14f45767963a1a9ef7620189912317d095f2d9ffa183a4d25d2b"}, {file = "MarkupSafe-2.0.1-cp36-cp36m-manylinux2010_i686.whl", hash = "sha256:f826e31d18b516f653fe296d967d700fddad5901ae07c622bb3705955e1faa94"}, {file = "MarkupSafe-2.0.1-cp36-cp36m-manylinux2010_x86_64.whl", hash = "sha256:fa130dd50c57d53368c9d59395cb5526eda596d3ffe36666cd81a44d56e48872"}, {file = "MarkupSafe-2.0.1-cp36-cp36m-manylinux2014_aarch64.whl", hash = "sha256:905fec760bd2fa1388bb5b489ee8ee5f7291d692638ea5f67982d968366bef9f"}, + {file = "MarkupSafe-2.0.1-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bf5d821ffabf0ef3533c39c518f3357b171a1651c1ff6827325e4489b0e46c3c"}, + {file = "MarkupSafe-2.0.1-cp36-cp36m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:0d4b31cc67ab36e3392bbf3862cfbadac3db12bdd8b02a2731f509ed5b829724"}, + {file = "MarkupSafe-2.0.1-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:baa1a4e8f868845af802979fcdbf0bb11f94f1cb7ced4c4b8a351bb60d108145"}, {file = "MarkupSafe-2.0.1-cp36-cp36m-win32.whl", hash = "sha256:6c4ca60fa24e85fe25b912b01e62cb969d69a23a5d5867682dd3e80b5b02581d"}, {file = "MarkupSafe-2.0.1-cp36-cp36m-win_amd64.whl", hash = "sha256:b2f4bf27480f5e5e8ce285a8c8fd176c0b03e93dcc6646477d4630e83440c6a9"}, {file = 
"MarkupSafe-2.0.1-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:0717a7390a68be14b8c793ba258e075c6f4ca819f15edfc2a3a027c823718567"}, @@ -1866,14 +1999,21 @@ markupsafe = [ {file = "MarkupSafe-2.0.1-cp37-cp37m-manylinux2010_i686.whl", hash = "sha256:d7f9850398e85aba693bb640262d3611788b1f29a79f0c93c565694658f4071f"}, {file = "MarkupSafe-2.0.1-cp37-cp37m-manylinux2010_x86_64.whl", hash = "sha256:6a7fae0dd14cf60ad5ff42baa2e95727c3d81ded453457771d02b7d2b3f9c0c2"}, {file = "MarkupSafe-2.0.1-cp37-cp37m-manylinux2014_aarch64.whl", hash = "sha256:b7f2d075102dc8c794cbde1947378051c4e5180d52d276987b8d28a3bd58c17d"}, + {file = "MarkupSafe-2.0.1-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e9936f0b261d4df76ad22f8fee3ae83b60d7c3e871292cd42f40b81b70afae85"}, + {file = "MarkupSafe-2.0.1-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:2a7d351cbd8cfeb19ca00de495e224dea7e7d919659c2841bbb7f420ad03e2d6"}, + {file = "MarkupSafe-2.0.1-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:60bf42e36abfaf9aff1f50f52644b336d4f0a3fd6d8a60ca0d054ac9f713a864"}, {file = "MarkupSafe-2.0.1-cp37-cp37m-win32.whl", hash = "sha256:a30e67a65b53ea0a5e62fe23682cfe22712e01f453b95233b25502f7c61cb415"}, {file = "MarkupSafe-2.0.1-cp37-cp37m-win_amd64.whl", hash = "sha256:611d1ad9a4288cf3e3c16014564df047fe08410e628f89805e475368bd304914"}, + {file = "MarkupSafe-2.0.1-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:5bb28c636d87e840583ee3adeb78172efc47c8b26127267f54a9c0ec251d41a9"}, {file = "MarkupSafe-2.0.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:be98f628055368795d818ebf93da628541e10b75b41c559fdf36d104c5787066"}, {file = "MarkupSafe-2.0.1-cp38-cp38-manylinux1_i686.whl", hash = "sha256:1d609f577dc6e1aa17d746f8bd3c31aa4d258f4070d61b2aa5c4166c1539de35"}, {file = "MarkupSafe-2.0.1-cp38-cp38-manylinux1_x86_64.whl", hash = "sha256:7d91275b0245b1da4d4cfa07e0faedd5b0812efc15b702576d103293e252af1b"}, {file = "MarkupSafe-2.0.1-cp38-cp38-manylinux2010_i686.whl", hash = "sha256:01a9b8ea66f1658938f65b93a85ebe8bc016e6769611be228d797c9d998dd298"}, {file = "MarkupSafe-2.0.1-cp38-cp38-manylinux2010_x86_64.whl", hash = "sha256:47ab1e7b91c098ab893b828deafa1203de86d0bc6ab587b160f78fe6c4011f75"}, {file = "MarkupSafe-2.0.1-cp38-cp38-manylinux2014_aarch64.whl", hash = "sha256:97383d78eb34da7e1fa37dd273c20ad4320929af65d156e35a5e2d89566d9dfb"}, + {file = "MarkupSafe-2.0.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6fcf051089389abe060c9cd7caa212c707e58153afa2c649f00346ce6d260f1b"}, + {file = "MarkupSafe-2.0.1-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:5855f8438a7d1d458206a2466bf82b0f104a3724bf96a1c781ab731e4201731a"}, + {file = "MarkupSafe-2.0.1-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:3dd007d54ee88b46be476e293f48c85048603f5f516008bee124ddd891398ed6"}, {file = "MarkupSafe-2.0.1-cp38-cp38-win32.whl", hash = "sha256:023cb26ec21ece8dc3907c0e8320058b2e0cb3c55cf9564da612bc325bed5e64"}, {file = "MarkupSafe-2.0.1-cp38-cp38-win_amd64.whl", hash = "sha256:984d76483eb32f1bcb536dc27e4ad56bba4baa70be32fa87152832cdd9db0833"}, {file = "MarkupSafe-2.0.1-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:2ef54abee730b502252bcdf31b10dacb0a416229b72c18b19e24a4509f273d26"}, @@ -1883,6 +2023,9 @@ markupsafe = [ {file = 
"MarkupSafe-2.0.1-cp39-cp39-manylinux2010_i686.whl", hash = "sha256:4efca8f86c54b22348a5467704e3fec767b2db12fc39c6d963168ab1d3fc9135"}, {file = "MarkupSafe-2.0.1-cp39-cp39-manylinux2010_x86_64.whl", hash = "sha256:ab3ef638ace319fa26553db0624c4699e31a28bb2a835c5faca8f8acf6a5a902"}, {file = "MarkupSafe-2.0.1-cp39-cp39-manylinux2014_aarch64.whl", hash = "sha256:f8ba0e8349a38d3001fae7eadded3f6606f0da5d748ee53cc1dab1d6527b9509"}, + {file = "MarkupSafe-2.0.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c47adbc92fc1bb2b3274c4b3a43ae0e4573d9fbff4f54cd484555edbf030baf1"}, + {file = "MarkupSafe-2.0.1-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:37205cac2a79194e3750b0af2a5720d95f786a55ce7df90c3af697bfa100eaac"}, + {file = "MarkupSafe-2.0.1-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:1f2ade76b9903f39aa442b4aadd2177decb66525062db244b35d71d0ee8599b6"}, {file = "MarkupSafe-2.0.1-cp39-cp39-win32.whl", hash = "sha256:10f82115e21dc0dfec9ab5c0223652f7197feb168c940f3ef61563fc2d6beb74"}, {file = "MarkupSafe-2.0.1-cp39-cp39-win_amd64.whl", hash = "sha256:693ce3f9e70a6cf7d2fb9e6c9d8b204b6b39897a2c4a1aa65728d5ac97dcc1d8"}, {file = "MarkupSafe-2.0.1.tar.gz", hash = "sha256:594c67807fb16238b30c44bdf74f36c02cdf22d1c8cda91ef8a0ed8dabf5620a"}, @@ -1935,6 +2078,10 @@ packaging = [ {file = "packaging-20.9-py2.py3-none-any.whl", hash = "sha256:67714da7f7bc052e064859c05c595155bd1ee9f69f76557e21f051443c20947a"}, {file = "packaging-20.9.tar.gz", hash = "sha256:5b327ac1320dc863dca72f4514ecc086f31186744b84a230374cc1fd776feae5"}, ] +paramiko = [ + {file = "paramiko-2.7.2-py2.py3-none-any.whl", hash = "sha256:4f3e316fef2ac628b05097a637af35685183111d4bc1b5979bd397c2ab7b5898"}, + {file = "paramiko-2.7.2.tar.gz", hash = "sha256:7f36f4ba2c0d81d219f4595e35f70d56cc94f9ac40a6acdf51d6ca210ce65035"}, +] parso = [ {file = "parso-0.8.2-py2.py3-none-any.whl", hash = "sha256:a8c4922db71e4fdb90e0d0bc6e50f9b273d3397925e5e60a717e719201778d22"}, {file = "parso-0.8.2.tar.gz", hash = "sha256:12b83492c6239ce32ff5eed6d3639d6a536170723c6f3f1506869f1ace413398"}, @@ -1983,6 +2130,10 @@ pluggy = [ {file = "pluggy-0.13.1-py2.py3-none-any.whl", hash = "sha256:966c145cd83c96502c3c3868f50408687b38434af77734af1e9ca461a4081d2d"}, {file = "pluggy-0.13.1.tar.gz", hash = "sha256:15b2acde666561e1298d71b523007ed7364de07029219b604cf808bfa1c765b0"}, ] +ply = [ + {file = "ply-3.11-py2.py3-none-any.whl", hash = "sha256:096f9b8350b65ebd2fd1346b12452efe5b9607f7482813ffca50c22722a807ce"}, + {file = "ply-3.11.tar.gz", hash = "sha256:00c7c1aaa88358b9c765b6d3000c6eec0ba42abca5351b095321aef446081da3"}, +] prefixed = [ {file = "prefixed-0.3.2-py2.py3-none-any.whl", hash = "sha256:5e107306462d63f2f03c529dbf11b0026fdfec621a9a008ca639d71de22995c3"}, {file = "prefixed-0.3.2.tar.gz", hash = "sha256:ca48277ba5fa8346dd4b760847da930c7b84416387c39e93affef086add2c029"}, @@ -2006,9 +2157,13 @@ protobuf = [ {file = "protobuf-3.17.3-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:2ae692bb6d1992afb6b74348e7bb648a75bb0d3565a3f5eea5bec8f62bd06d87"}, {file = "protobuf-3.17.3-cp38-cp38-manylinux2014_aarch64.whl", hash = "sha256:99938f2a2d7ca6563c0ade0c5ca8982264c484fdecf418bd68e880a7ab5730b1"}, {file = "protobuf-3.17.3-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:6902a1e4b7a319ec611a7345ff81b6b004b36b0d2196ce7a748b3493da3d226d"}, + {file = "protobuf-3.17.3-cp38-cp38-win32.whl", hash = 
"sha256:59e5cf6b737c3a376932fbfb869043415f7c16a0cf176ab30a5bbc419cd709c1"}, + {file = "protobuf-3.17.3-cp38-cp38-win_amd64.whl", hash = "sha256:ebcb546f10069b56dc2e3da35e003a02076aaa377caf8530fe9789570984a8d2"}, {file = "protobuf-3.17.3-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:4ffbd23640bb7403574f7aff8368e2aeb2ec9a5c6306580be48ac59a6bac8bde"}, {file = "protobuf-3.17.3-cp39-cp39-manylinux2014_aarch64.whl", hash = "sha256:26010f693b675ff5a1d0e1bdb17689b8b716a18709113288fead438703d45539"}, {file = "protobuf-3.17.3-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:e76d9686e088fece2450dbc7ee905f9be904e427341d289acbe9ad00b78ebd47"}, + {file = "protobuf-3.17.3-cp39-cp39-win32.whl", hash = "sha256:a38bac25f51c93e4be4092c88b2568b9f407c27217d3dd23c7a57fa522a17554"}, + {file = "protobuf-3.17.3-cp39-cp39-win_amd64.whl", hash = "sha256:85d6303e4adade2827e43c2b54114d9a6ea547b671cb63fafd5011dc47d0e13d"}, {file = "protobuf-3.17.3-py2.py3-none-any.whl", hash = "sha256:2bfb815216a9cd9faec52b16fd2bfa68437a44b67c56bee59bc3926522ecb04e"}, {file = "protobuf-3.17.3.tar.gz", hash = "sha256:72804ea5eaa9c22a090d2803813e280fb273b62d5ae497aaf3553d141c4fdd7b"}, ] @@ -2144,6 +2299,26 @@ pymongo = [ {file = "pymongo-3.11.4-py2.7-macosx-10.14-intel.egg", hash = "sha256:506a6dab4c7ffdcacdf0b8e70bd20eb2e77fa994519547c9d88d676400fcad58"}, {file = "pymongo-3.11.4.tar.gz", hash = "sha256:539d4cb1b16b57026999c53e5aab857fe706e70ae5310cc8c232479923f932e6"}, ] +pynacl = [ + {file = "PyNaCl-1.4.0-cp27-cp27m-macosx_10_10_x86_64.whl", hash = "sha256:ea6841bc3a76fa4942ce00f3bda7d436fda21e2d91602b9e21b7ca9ecab8f3ff"}, + {file = "PyNaCl-1.4.0-cp27-cp27m-manylinux1_x86_64.whl", hash = "sha256:d452a6746f0a7e11121e64625109bc4468fc3100452817001dbe018bb8b08514"}, + {file = "PyNaCl-1.4.0-cp27-cp27m-win32.whl", hash = "sha256:2fe0fc5a2480361dcaf4e6e7cea00e078fcda07ba45f811b167e3f99e8cff574"}, + {file = "PyNaCl-1.4.0-cp27-cp27m-win_amd64.whl", hash = "sha256:f8851ab9041756003119368c1e6cd0b9c631f46d686b3904b18c0139f4419f80"}, + {file = "PyNaCl-1.4.0-cp27-cp27mu-manylinux1_x86_64.whl", hash = "sha256:7757ae33dae81c300487591c68790dfb5145c7d03324000433d9a2c141f82af7"}, + {file = "PyNaCl-1.4.0-cp35-abi3-macosx_10_10_x86_64.whl", hash = "sha256:757250ddb3bff1eecd7e41e65f7f833a8405fede0194319f87899690624f2122"}, + {file = "PyNaCl-1.4.0-cp35-abi3-manylinux1_x86_64.whl", hash = "sha256:30f9b96db44e09b3304f9ea95079b1b7316b2b4f3744fe3aaecccd95d547063d"}, + {file = "PyNaCl-1.4.0-cp35-abi3-win32.whl", hash = "sha256:4e10569f8cbed81cb7526ae137049759d2a8d57726d52c1a000a3ce366779634"}, + {file = "PyNaCl-1.4.0-cp35-abi3-win_amd64.whl", hash = "sha256:c914f78da4953b33d4685e3cdc7ce63401247a21425c16a39760e282075ac4a6"}, + {file = "PyNaCl-1.4.0-cp35-cp35m-win32.whl", hash = "sha256:06cbb4d9b2c4bd3c8dc0d267416aaed79906e7b33f114ddbf0911969794b1cc4"}, + {file = "PyNaCl-1.4.0-cp35-cp35m-win_amd64.whl", hash = "sha256:511d269ee845037b95c9781aa702f90ccc36036f95d0f31373a6a79bd8242e25"}, + {file = "PyNaCl-1.4.0-cp36-cp36m-win32.whl", hash = "sha256:11335f09060af52c97137d4ac54285bcb7df0cef29014a1a4efe64ac065434c4"}, + {file = "PyNaCl-1.4.0-cp36-cp36m-win_amd64.whl", hash = "sha256:cd401ccbc2a249a47a3a1724c2918fcd04be1f7b54eb2a5a71ff915db0ac51c6"}, + {file = "PyNaCl-1.4.0-cp37-cp37m-win32.whl", hash = "sha256:8122ba5f2a2169ca5da936b2e5a511740ffb73979381b4229d9188f6dcb22f1f"}, + {file = "PyNaCl-1.4.0-cp37-cp37m-win_amd64.whl", hash = "sha256:537a7ccbea22905a0ab36ea58577b39d1fa9b1884869d173b5cf111f006f689f"}, + {file = 
"PyNaCl-1.4.0-cp38-cp38-win32.whl", hash = "sha256:9c4a7ea4fb81536c1b1f5cc44d54a296f96ae78c1ebd2311bd0b60be45a48d96"}, + {file = "PyNaCl-1.4.0-cp38-cp38-win_amd64.whl", hash = "sha256:7c6092102219f59ff29788860ccb021e80fffd953920c4a8653889c029b2d420"}, + {file = "PyNaCl-1.4.0.tar.gz", hash = "sha256:54e9a2c849c742006516ad56a88f5c74bf2ce92c9f67435187c3c5953b346505"}, +] pynput = [ {file = "pynput-1.7.3-py2.py3-none-any.whl", hash = "sha256:fea5777454f896bd79d35393088cd29a089f3b2da166f0848a922b1d5a807d4f"}, {file = "pynput-1.7.3-py3.8.egg", hash = "sha256:6626e8ea9ca482bb5628a7169e1193824e382c4ad3053e40f4f24f41ee7b41c9"}, @@ -2151,6 +2326,7 @@ pynput = [ ] pyobjc-core = [ {file = "pyobjc-core-7.3.tar.gz", hash = "sha256:5081aedf8bb40aac1a8ad95adac9e44e148a882686ded614adf46bb67fd67574"}, + {file = "pyobjc_core-7.3-1-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:a1f1e6b457127cbf2b5bd2b94520a7c89fb590b739911eadb2b0499a3a5b0e6f"}, {file = "pyobjc_core-7.3-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:4e93ad769a20b908778fe950f62a843a6d8f0fa71996e5f3cc9fab5ae7d17771"}, {file = "pyobjc_core-7.3-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:9f63fd37bbf3785af4ddb2f86cad5ca81c62cfc7d1c0099637ca18343c3656c1"}, {file = "pyobjc_core-7.3-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:e9b1311f72f2e170742a7ee3a8149f52c35158dc024a21e88d6f1e52ba5d718b"}, @@ -2159,6 +2335,7 @@ pyobjc-core = [ ] pyobjc-framework-cocoa = [ {file = "pyobjc-framework-Cocoa-7.3.tar.gz", hash = "sha256:b18d05e7a795a3455ad191c3e43d6bfa673c2a4fd480bb1ccf57191051b80b7e"}, + {file = "pyobjc_framework_Cocoa-7.3-1-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:1e31376806e5de883a1d7c7c87d9ff2a8b09fc05d267e0dfce6e42409fb70c67"}, {file = "pyobjc_framework_Cocoa-7.3-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:9edffdfa6dd1f71f21b531c3e61fdd3e4d5d3bf6c5a528c98e88828cd60bac11"}, {file = "pyobjc_framework_Cocoa-7.3-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:35a6340437a4e0109a302150b7d1f6baf57004ccf74834f9e6062fcafe2fd8d7"}, {file = "pyobjc_framework_Cocoa-7.3-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:7c3886f2608ab3ed02482f8b2ebf9f782b324c559e84b52cfd92dba8a1109872"}, @@ -2167,6 +2344,7 @@ pyobjc-framework-cocoa = [ ] pyobjc-framework-quartz = [ {file = "pyobjc-framework-Quartz-7.3.tar.gz", hash = "sha256:98812844c34262def980bdf60923a875cd43428a8375b6fd53bd2cd800eccf0b"}, + {file = "pyobjc_framework_Quartz-7.3-1-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:1139bc6874c0f8b58f0b8602015e0994198bc506a6bcec1071208de32b55ed26"}, {file = "pyobjc_framework_Quartz-7.3-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:1ef18f5a16511ded65980bf4f5983ea5d35c88224dbad1b3112abd29c60413ea"}, {file = "pyobjc_framework_Quartz-7.3-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:3b41eec8d4b10c7c7e011e2f9051367f5499ef315ba52dfbae573c3a2e05469c"}, {file = "pyobjc_framework_Quartz-7.3-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:2c65456ed045dfe1711d0298734e5a3ad670f8c770f7eb3b19979256c388bdd2"}, @@ -2212,6 +2390,9 @@ pyqt5-sip = [ pyrsistent = [ {file = "pyrsistent-0.17.3.tar.gz", hash = "sha256:2e636185d9eb976a18a8a8e96efce62f2905fea90041958d8cc2a189756ebf3e"}, ] +pysftp = [ + {file = "pysftp-0.2.9.tar.gz", hash = "sha256:fbf55a802e74d663673400acd92d5373c1c7ee94d765b428d9f977567ac4854a"}, +] pytest = [ {file = "pytest-6.2.4-py3-none-any.whl", hash = "sha256:91ef2131a9bd6be8f76f1f08eac5c5317221d6ad1e143ae03894b862e8976890"}, {file = "pytest-6.2.4.tar.gz", hash = 
"sha256:50bcad0a0b9c5a72c8e4e7c9855a3ad496ca6a881a3641b4260605450772c54b"}, @@ -2240,16 +2421,16 @@ pytz = [ {file = "pytz-2021.1.tar.gz", hash = "sha256:83a4a90894bf38e243cf052c8b58f381bfe9a7a483f6a9cab140bc7f702ac4da"}, ] pywin32 = [ - {file = "pywin32-300-cp35-cp35m-win32.whl", hash = "sha256:1c204a81daed2089e55d11eefa4826c05e604d27fe2be40b6bf8db7b6a39da63"}, - {file = "pywin32-300-cp35-cp35m-win_amd64.whl", hash = "sha256:350c5644775736351b77ba68da09a39c760d75d2467ecec37bd3c36a94fbed64"}, - {file = "pywin32-300-cp36-cp36m-win32.whl", hash = "sha256:a3b4c48c852d4107e8a8ec980b76c94ce596ea66d60f7a697582ea9dce7e0db7"}, - {file = "pywin32-300-cp36-cp36m-win_amd64.whl", hash = "sha256:27a30b887afbf05a9cbb05e3ffd43104a9b71ce292f64a635389dbad0ed1cd85"}, - {file = "pywin32-300-cp37-cp37m-win32.whl", hash = "sha256:d7e8c7efc221f10d6400c19c32a031add1c4a58733298c09216f57b4fde110dc"}, - {file = "pywin32-300-cp37-cp37m-win_amd64.whl", hash = "sha256:8151e4d7a19262d6694162d6da85d99a16f8b908949797fd99c83a0bfaf5807d"}, - {file = "pywin32-300-cp38-cp38-win32.whl", hash = "sha256:fbb3b1b0fbd0b4fc2a3d1d81fe0783e30062c1abed1d17c32b7879d55858cfae"}, - {file = "pywin32-300-cp38-cp38-win_amd64.whl", hash = "sha256:60a8fa361091b2eea27f15718f8eb7f9297e8d51b54dbc4f55f3d238093d5190"}, - {file = "pywin32-300-cp39-cp39-win32.whl", hash = "sha256:638b68eea5cfc8def537e43e9554747f8dee786b090e47ead94bfdafdb0f2f50"}, - {file = "pywin32-300-cp39-cp39-win_amd64.whl", hash = "sha256:b1609ce9bd5c411b81f941b246d683d6508992093203d4eb7f278f4ed1085c3f"}, + {file = "pywin32-301-cp35-cp35m-win32.whl", hash = "sha256:93367c96e3a76dfe5003d8291ae16454ca7d84bb24d721e0b74a07610b7be4a7"}, + {file = "pywin32-301-cp35-cp35m-win_amd64.whl", hash = "sha256:9635df6998a70282bd36e7ac2a5cef9ead1627b0a63b17c731312c7a0daebb72"}, + {file = "pywin32-301-cp36-cp36m-win32.whl", hash = "sha256:c866f04a182a8cb9b7855de065113bbd2e40524f570db73ef1ee99ff0a5cc2f0"}, + {file = "pywin32-301-cp36-cp36m-win_amd64.whl", hash = "sha256:dafa18e95bf2a92f298fe9c582b0e205aca45c55f989937c52c454ce65b93c78"}, + {file = "pywin32-301-cp37-cp37m-win32.whl", hash = "sha256:98f62a3f60aa64894a290fb7494bfa0bfa0a199e9e052e1ac293b2ad3cd2818b"}, + {file = "pywin32-301-cp37-cp37m-win_amd64.whl", hash = "sha256:fb3b4933e0382ba49305cc6cd3fb18525df7fd96aa434de19ce0878133bf8e4a"}, + {file = "pywin32-301-cp38-cp38-win32.whl", hash = "sha256:88981dd3cfb07432625b180f49bf4e179fb8cbb5704cd512e38dd63636af7a17"}, + {file = "pywin32-301-cp38-cp38-win_amd64.whl", hash = "sha256:8c9d33968aa7fcddf44e47750e18f3d034c3e443a707688a008a2e52bbef7e96"}, + {file = "pywin32-301-cp39-cp39-win32.whl", hash = "sha256:595d397df65f1b2e0beaca63a883ae6d8b6df1cdea85c16ae85f6d2e648133fe"}, + {file = "pywin32-301-cp39-cp39-win_amd64.whl", hash = "sha256:87604a4087434cd814ad8973bd47d6524bd1fa9e971ce428e76b62a5e0860fdf"}, ] pywin32-ctypes = [ {file = "pywin32-ctypes-0.2.0.tar.gz", hash = "sha256:24ffc3b341d457d48e8922352130cf2644024a4ff09762a2261fd34c36ee5942"}, @@ -2339,6 +2520,11 @@ sphinxcontrib-websupport = [ {file = "sphinxcontrib-websupport-1.2.4.tar.gz", hash = "sha256:4edf0223a0685a7c485ae5a156b6f529ba1ee481a1417817935b20bde1956232"}, {file = "sphinxcontrib_websupport-1.2.4-py2.py3-none-any.whl", hash = "sha256:6fc9287dfc823fe9aa432463edd6cea47fa9ebbf488d7f289b322ffcfca075c7"}, ] +stone = [ + {file = "stone-3.2.1-py2-none-any.whl", hash = "sha256:2a50866528f60cc7cedd010def733e8ae9d581d17f967278a08059bffaea3c57"}, + {file = "stone-3.2.1-py3-none-any.whl", hash = 
"sha256:76235137c09ee88aa53e8c1e666819f6c20ac8064c4ac6c4ee4194eac0e3b7af"}, + {file = "stone-3.2.1.tar.gz", hash = "sha256:9bc78b40143b4ef33bf569e515408c2996ffebefbb1a897616ebe8aa6f2d7e75"}, +] termcolor = [ {file = "termcolor-1.1.0.tar.gz", hash = "sha256:1d6d69ce66211143803fbc56652b41d73b4a400a2891d7bf7a1cdf4c02de613b"}, ] diff --git a/pyproject.toml b/pyproject.toml index e376986606..fb47c7eff6 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -55,7 +55,7 @@ speedcopy = "^2.1" six = "^1.15" semver = "^2.13.0" # for version resolution wsrpc_aiohttp = "^3.1.1" # websocket server -pywin32 = { version = "300", markers = "sys_platform == 'win32'" } +pywin32 = { version = "301", markers = "sys_platform == 'win32'" } jinxed = [ { version = "^1.0.1", markers = "sys_platform == 'darwin'" }, { version = "^1.0.1", markers = "sys_platform == 'linux'" } @@ -63,6 +63,8 @@ jinxed = [ python3-xlib = { version="*", markers = "sys_platform == 'linux'"} enlighten = "^1.9.0" slack-sdk = "^3.6.0" +pysftp = "^0.2.9" +dropbox = "^11.20.0" [tool.poetry.dev-dependencies] flake8 = "^3.7" diff --git a/repos/avalon-core b/repos/avalon-core index 8aee68fa10..4b80f81e66 160000 --- a/repos/avalon-core +++ b/repos/avalon-core @@ -1 +1 @@ -Subproject commit 8aee68fa10ab4d79be1a91e7728a609748e7c3c6 +Subproject commit 4b80f81e66aca593784be8b299110a0b6541276f diff --git a/start.py b/start.py index f3adabd942..a4c58b8fc7 100644 --- a/start.py +++ b/start.py @@ -102,7 +102,6 @@ import subprocess import site from pathlib import Path - # OPENPYPE_ROOT is variable pointing to build (or code) directory # WARNING `OPENPYPE_ROOT` must be defined before igniter import # - igniter changes cwd which cause that filepath of this script won't lead @@ -190,6 +189,7 @@ else: import igniter # noqa: E402 from igniter import BootstrapRepos # noqa: E402 from igniter.tools import ( + get_openpype_global_settings, get_openpype_path_from_db, validate_mongo_connection ) # noqa @@ -275,6 +275,34 @@ def run(arguments: list, env: dict = None) -> int: return p.returncode +def run_disk_mapping_commands(mongo_url): + """ Run disk mapping command + + Used to map shared disk for OP to pull codebase. + """ + settings = get_openpype_global_settings(mongo_url) + + low_platform = platform.system().lower() + disk_mapping = settings.get("disk_mapping") + if not disk_mapping: + return + + mappings = disk_mapping.get(low_platform) or [] + for source, destination in mappings: + args = ["subst", destination.rstrip('/'), source.rstrip('/')] + _print("disk mapping args:: {}".format(args)) + try: + output = subprocess.Popen(args) + if output.returncode and output.returncode != 0: + exc_msg = "Executing args was not successful: \"{}\"".format( + args) + + raise RuntimeError(exc_msg) + except TypeError: + _print("Error in mapping drive") + raise + + def set_avalon_environments(): """Set avalon specific environments. 
@@ -403,15 +432,24 @@ def _validate_thirdparty_binaries(): raise RuntimeError(error_msg.format("FFmpeg")) # Validate existence of OpenImageIO (not on MacOs) - if low_platform != "darwin": + oiio_tool_path = None + if low_platform == "linux": + oiio_tool_path = os.path.join( + binary_vendors_dir, + "oiio", + low_platform, + "bin", + "oiiotool" + ) + elif low_platform == "windows": oiio_tool_path = os.path.join( binary_vendors_dir, "oiio", low_platform, "oiiotool" ) - if not is_tool(oiio_tool_path): - raise RuntimeError(error_msg.format("OpenImageIO")) + if oiio_tool_path is not None and not is_tool(oiio_tool_path): + raise RuntimeError(error_msg.format("OpenImageIO")) def _process_arguments() -> tuple: @@ -886,6 +924,9 @@ def boot(): os.environ["OPENPYPE_MONGO"] = openpype_mongo os.environ["OPENPYPE_DATABASE_NAME"] = "openpype" # name of Pype database + _print(">>> run disk mapping command ...") + run_disk_mapping_commands(openpype_mongo) + # Get openpype path from database and set it to environment so openpype can # find its versions there and bootstrap them. openpype_path = get_openpype_path_from_db(openpype_mongo) diff --git a/tests/README.md b/tests/README.md index e69de29bb2..6317b2ab3c 100644 --- a/tests/README.md +++ b/tests/README.md @@ -0,0 +1,25 @@ +Automatic tests for OpenPype +============================ +Structure: +- integration - end-to-end tests, slow (see README.md in the integration folder for more info) + - openpype/modules/MODULE_NAME - structure follows the directory structure of the code base + - fixture - sample data `(MongoDB dumps, test files etc.)` + - `tests.py` - single or more pytest files for MODULE_NAME +- unit - quick unit tests + - MODULE_NAME + - fixture + - `tests.py` + +How to run: +---------- +- a single test class can be run directly via PyCharm and its pytest runner +- OR +- use the OpenPype command 'runtests' from the command line +-- `${OPENPYPE_ROOT}/start.py runtests` + +By default, this command will run all tests in ${OPENPYPE_ROOT}/tests. + +A specific location can be provided to this command as an argument, either as an absolute path or a path relative to ${OPENPYPE_ROOT} +(eg. `${OPENPYPE_ROOT}/start.py runtests ../tests/integration` will trigger only tests in the `integration` folder). + +See `${OPENPYPE_ROOT}/cli.py:runtests` for other arguments. diff --git a/tests/__init__.py b/tests/__init__.py new file mode 100644 index 0000000000..e69de29bb2 diff --git a/tests/integration/README.md b/tests/integration/README.md new file mode 100644 index 0000000000..81c07ec50c --- /dev/null +++ b/tests/integration/README.md @@ -0,0 +1,38 @@ +Integration tests for OpenPype +============================= +Contains end-to-end tests for automatic testing of OP. + +Should run a headless publish on all hosts to check basic publish use cases automatically +and limit regression issues. + +How to create a test for publishing from a host +------------------------------------------ +- Extend PublishTest +- Use the `resources\test_data.zip` skeleton file as a template for testing input data +- Put the workfile into `test_data.zip/input/workfile` +- If you require other than the base DB dumps, provide them in `test_data.zip/input/dumps` +-- (Check the commented code in `db_handler.py` for how to dump a specific DB. Currently all collections will be dumped.)
+- Implement `last_workfile_path` +- `startup_scripts` - must point the host to a startup script saved into `test_data.zip/input/startup` + -- The script must contain something like +``` +import openpype +import pyblish.util +from avalon import api, HOST + +api.install(HOST) +pyblish.util.publish() + +EXIT_APP (command to exit host) +``` +(The install and publish methods must be triggered only AFTER the host app is fully initialized!) +- Zip `test_data.zip`, name it with a descriptive name, upload it to Google Drive, right click - `Get link`, copy the hash id +- Put this hash id and zip file name into TEST_FILES [(HASH_ID, FILE_NAME, MD5_OPTIONAL)]. If you want to check the MD5 of the downloaded +file, provide the md5 value of the zipped file. +- Implement any assert checks you need in the extended class +- Run the test class manually (via PyCharm or a pytest runner (TODO)) +- If you want the test to compare expected files to the published ones, set PERSIST to True and run the test manually + -- Locate the temporary `publish` subfolder of the temporary folder (found in the debugging console log) + -- Copy the whole folder content into a .zip file in the `expected` subfolder + -- By default tests compare only the structure of `expected` and the published output (eg. if you want to save space, replace published files with empty files, but with the expected names!) + -- Zip and upload again, change PERSIST to False \ No newline at end of file diff --git a/tests/integration/hosts/maya/test_publish_in_maya.py b/tests/integration/hosts/maya/test_publish_in_maya.py new file mode 100644 index 0000000000..1babf30029 --- /dev/null +++ b/tests/integration/hosts/maya/test_publish_in_maya.py @@ -0,0 +1,103 @@ +import pytest +import os +import shutil + +from tests.lib.testing_classes import PublishTest + + +class TestPublishInMaya(PublishTest): + """Basic test case for publishing in Maya + + Shouldn't be run standalone, only via the 'runtests' pype command! + + Uses generic TestCase to prepare fixtures for test data, testing DBs, + env vars. + + Opens Maya, runs publish on the prepared workfile. + + Then checks the content of the DB (if subset, version, representations were + created). + Checks the tmp folder if all expected files were published. + + """ + PERSIST = True + + TEST_FILES = [ + ("1pOwjA_VVBc6ooTZyFxtAwLS2KZHaBlkY", "test_maya_publish.zip", "") + ] + + APP = "maya" + APP_VARIANT = "2019" + + APP_NAME = "{}/{}".format(APP, APP_VARIANT) + + TIMEOUT = 120 # publish timeout + + @pytest.fixture(scope="module") + def last_workfile_path(self, download_test_data): + """Get last_workfile_path from source data. + + Maya expects the workfile in a proper folder, so a copy is done first.
+ """ + src_path = os.path.join(download_test_data, + "input", + "workfile", + "test_project_test_asset_TestTask_v001.mb") + dest_folder = os.path.join(download_test_data, + self.PROJECT, + self.ASSET, + "work", + self.TASK) + os.makedirs(dest_folder) + dest_path = os.path.join(dest_folder, + "test_project_test_asset_TestTask_v001.mb") + shutil.copy(src_path, dest_path) + + yield dest_path + + @pytest.fixture(scope="module") + def startup_scripts(self, monkeypatch_session, download_test_data): + """Points Maya to userSetup file from input data""" + startup_path = os.path.join(download_test_data, + "input", + "startup") + original_pythonpath = os.environ.get("PYTHONPATH") + monkeypatch_session.setenv("PYTHONPATH", + "{}{}{}".format(startup_path, + os.pathsep, + original_pythonpath)) + + def test_db_asserts(self, dbcon, publish_finished): + """Host and input data dependent expected results in DB.""" + print("test_db_asserts") + assert 5 == dbcon.count_documents({"type": "version"}), \ + "Not expected no of versions" + + assert 0 == dbcon.count_documents({"type": "version", + "name": {"$ne": 1}}), \ + "Only versions with 1 expected" + + assert 1 == dbcon.count_documents({"type": "subset", + "name": "modelMain"}), \ + "modelMain subset must be present" + + assert 1 == dbcon.count_documents({"type": "subset", + "name": "workfileTest_task"}), \ + "workfileTest_task subset must be present" + + assert 11 == dbcon.count_documents({"type": "representation"}), \ + "Not expected no of representations" + + assert 2 == dbcon.count_documents({"type": "representation", + "context.subset": "modelMain", + "context.ext": "abc"}), \ + "Not expected no of representations with ext 'abc'" + + assert 2 == dbcon.count_documents({"type": "representation", + "context.subset": "modelMain", + "context.ext": "ma"}), \ + "Not expected no of representations with ext 'abc'" + + +if __name__ == "__main__": + test_case = TestPublishInMaya() diff --git a/tests/lib/README.md b/tests/lib/README.md new file mode 100644 index 0000000000..0384cd2ff0 --- /dev/null +++ b/tests/lib/README.md @@ -0,0 +1,46 @@ +Automatic testing +----------------- +Folder for libs and tooling for automatic testing. + +- db_handler.py - class for preparation of test DB + - dumps DB(s) to BSON (mongodump) + - loads dump(s) to new DB (mongorestore) + - loads sql file(s) to DB (mongoimport) + - deletes test DB + +- file_handler.py - class to download test data from GDrive + - downloads data from (list) of files from GDrive + - check file integrity with MD5 hash + - unzips if zip + +- testing_wrapper.py - base class to use for testing + - all env var necessary for running (OPENPYPE_MONGO ...) + - implements reusable fixtures to: + - load test data (uses `file_handler`) + - prepare DB (uses `db_handler`) + - modify temporarily env vars for testing + + Should be used as a skeleton to create new test cases. + + +Test data +--------- +Each class implementing `TestCase` can provide test file(s) by adding them to +TEST_FILES ('GDRIVE_FILE_ID', 'ACTUAL_FILE_NAME', 'MD5HASH') + +GDRIVE_FILE_ID can be pulled from shareable link from Google Drive app. 
+ +Currently it is expected that the test file will be a zip file with this structure: +- expected - expected files (not implemented yet) +- input + - data - test data (workfiles, images etc) + - dumps - folder for BSON dumps (from `mongodump`) + - env_vars + env_vars.json - dictionary with environment variables {key:value} + + - json - json files to load with `mongoimport` (human readable) + + +Example +------- +See `tests\unit\openpype\modules\sync_server\test_site_operations.py` for example usage of implemented classes. \ No newline at end of file diff --git a/tests/lib/__init__.py b/tests/lib/__init__.py new file mode 100644 index 0000000000..e69de29bb2 diff --git a/tests/lib/db_handler.py b/tests/lib/db_handler.py new file mode 100644 index 0000000000..9be70895da --- /dev/null +++ b/tests/lib/db_handler.py @@ -0,0 +1,232 @@ +""" + Helper class for automatic testing, provides dump and restore via command + line utilities. + + Expects mongodump, mongoimport and mongorestore to be present on PATH +""" +import os +import pymongo +import subprocess + + +class DBHandler: + + def __init__(self, uri=None, host=None, port=None, + user=None, password=None): + """Provide 'uri' or the rest of the separate credentials.""" + self.uri = None + if uri: + self.uri = uri + if host: + if all([user, password]): + host = "{}:{}@{}".format(user, password, host) + self.uri = 'mongodb://{}:{}'.format(host, port or 27017) + + assert self.uri, "Must have uri to MongoDB" + self.client = pymongo.MongoClient(self.uri) + self.db = None + + def setup_empty(self, name): + # not much sense + self.db = self.client[name] + + def setup_from_sql(self, db_name, sql_dir, collection=None, + drop=True, mode=None): + """ + Restores 'db_name' from 'sql_dir'. + + Works with a directory of .json files; + if the 'collection' arg is empty, the name + of each .json file is used as the name of the target collection. + + Args: + db_name (str): target DB name + sql_dir (str): folder with json files + collection (str): if all sql files are meant for a single coll. + drop (bool): True to drop the whole collection first + mode (str): "insert" - fails on duplicates + "upsert" - modifies existing + "merge" - updates existing + "delete" - removes matching documents from DB + """ + if not os.path.exists(sql_dir): + raise RuntimeError( + "Backup folder {} doesn't exist".format(sql_dir)) + + for (dirpath, _dirnames, filenames) in os.walk(sql_dir): + for file_name in filenames: + sql_url = os.path.join(dirpath, file_name) + query = self._import_query(self.uri, sql_url, + db_name=db_name, + collection=collection, + drop=drop, + mode=mode) + + print("mongoimport query:: {}".format(query)) + subprocess.run(query) + + def setup_from_sql_file(self, db_name, sql_url, + collection=None, drop=True, mode=None): + """ + Restores 'db_name' from 'sql_url'. + + Works with a single .json file. + If the 'collection' arg is empty, the name + of the .json file is used as the name of the target collection.
+ + Args: + db_name (str): target DB name + sql_url (str): path to the json file + collection (str): name of target collection + drop (bool): True to drop the collection first + mode (str): "insert" - fails on duplicates + "upsert" - modifies existing + "merge" - updates existing + "delete" - removes matching documents from DB + """ + if not os.path.exists(sql_url): + raise RuntimeError( + "Sql file {} doesn't exist".format(sql_url)) + + query = self._import_query(self.uri, sql_url, + db_name=db_name, + collection=collection, + drop=drop, + mode=mode) + + print("mongoimport query:: {}".format(query)) + subprocess.run(query) + + def setup_from_dump(self, db_name, dump_dir, overwrite=False, + collection=None, db_name_out=None): + """ + Restores 'db_name' from 'dump_dir'. + + Works with BSON folders exported by mongodump + + Args: + db_name (str): source DB name + dump_dir (str): folder with dumped subfolders + overwrite (bool): True to overwrite the target + collection (str): name of source collection + db_name_out (str): name of target DB, if empty restores to + source 'db_name' + """ + db_name_out = db_name_out or db_name + if self._db_exists(db_name_out) and not overwrite: + raise RuntimeError("DB {} already exists.".format(db_name_out) + + " Run with overwrite=True") + + dir_path = os.path.join(dump_dir, db_name) + if not os.path.exists(dir_path): + raise RuntimeError( + "Backup folder {} doesn't exist".format(dir_path)) + + query = self._restore_query(self.uri, dump_dir, + db_name=db_name, db_name_out=db_name_out, + collection=collection) + print("mongorestore query:: {}".format(query)) + subprocess.run(query) + + def teardown(self, db_name): + """Drops 'db_name' if exists.""" + if not self._db_exists(db_name): + print("{} doesn't exist".format(db_name)) + return + + print("Dropping {} database".format(db_name)) + self.client.drop_database(db_name) + + def backup_to_dump(self, db_name, dump_dir, overwrite=False): + """ + Helper method for running mongodump for specific 'db_name' + """ + if not self._db_exists(db_name) and not overwrite: + raise RuntimeError("DB {} doesn't exist".format(db_name)) + + dir_path = os.path.join(dump_dir, db_name) + if os.path.exists(dir_path) and not overwrite: + raise RuntimeError("Backup already exists, " + "run with overwrite=True") + + query = self._dump_query(self.uri, dump_dir, db_name=db_name) + print("Mongodump query:: {}".format(query)) + subprocess.run(query) + + def _db_exists(self, db_name): + return db_name in self.client.list_database_names() + + def _dump_query(self, uri, output_path, db_name=None, collection=None): + """Prepares dump query based on 'db_name' or 'collection'.""" + db_part = coll_part = "" + if db_name: + db_part = "--db={}".format(db_name) + if collection: + if not db_name: + raise ValueError("db_name must be present") + coll_part = "--nsInclude={}.{}".format(db_name, collection) + query = "\"{}\" --uri=\"{}\" --out={} {} {}".format( + "mongodump", uri, output_path, db_part, coll_part + ) + + return query + + def _restore_query(self, uri, dump_dir, + db_name=None, db_name_out=None, + collection=None, drop=True): + """Prepares query for mongorestore based on arguments""" + db_part = coll_part = drop_part = "" + if db_name: + db_part = "--nsInclude={}.* --nsFrom={}.*".format(db_name, db_name) + if collection: + assert db_name, "Must provide db name too" + db_part = "--nsInclude={}.{} --nsFrom={}.{}".format(db_name, + collection, + db_name, + collection) + if drop: + drop_part = "--drop" + + if db_name_out: + db_part += " --nsTo={}.*".format(db_name_out)
+ + query = "\"{}\" --uri=\"{}\" --dir=\"{}\" {} {} {}".format( + "mongorestore", uri, dump_dir, db_part, coll_part, drop_part + ) + + return query + + def _import_query(self, uri, sql_url, + db_name=None, + collection=None, drop=True, mode=None): + + db_part = coll_part = drop_part = mode_part = "" + if db_name: + db_part = "--db {}".format(db_name) + if collection: + assert db_name, "Must provide db name too" + coll_part = "--collection {}".format(collection) + if drop: + drop_part = "--drop" + if mode: + mode_part = "--mode {}".format(mode) + + query = \ + "\"{}\" --legacy --uri=\"{}\" --file=\"{}\" {} {} {} {}".format( + "mongoimport", uri, sql_url, + db_part, coll_part, drop_part, mode_part) + + return query + +# handler = DBHandler(uri="mongodb://localhost:27017") +# +# backup_dir = "c:\\projects\\dumps" +# # +# handler.backup_to_dump("openpype", backup_dir, True) +# # handler.setup_from_dump("test_db", backup_dir, True) +# # handler.setup_from_sql_file("test_db", "c:\\projects\\sql\\item.sql", +# # collection="test_project", +# # drop=False, mode="upsert") +# handler.setup_from_sql("test_db", "c:\\projects\\sql", +# collection="test_project", +# drop=False, mode="upsert") diff --git a/tests/lib/file_handler.py b/tests/lib/file_handler.py new file mode 100644 index 0000000000..ee3abc6ecb --- /dev/null +++ b/tests/lib/file_handler.py @@ -0,0 +1,259 @@ +import enlighten +import os +import re +import urllib +from urllib.parse import urlparse +import urllib.request +import urllib.error +import itertools +import hashlib +import tarfile +import zipfile + + +USER_AGENT = "openpype" + + +class RemoteFileHandler: + """Download file from url, might be a GDrive shareable link""" + + IMPLEMENTED_ZIP_FORMATS = ['zip', 'tar', 'tgz', + 'tar.gz', 'tar.xz', 'tar.bz2'] + + @staticmethod + def calculate_md5(fpath, chunk_size=1024 * 1024): + md5 = hashlib.md5() + with open(fpath, 'rb') as f: + for chunk in iter(lambda: f.read(chunk_size), b''): + md5.update(chunk) + return md5.hexdigest() + + @staticmethod + def check_md5(fpath, md5, **kwargs): + return md5 == RemoteFileHandler.calculate_md5(fpath, **kwargs) + + @staticmethod + def check_integrity(fpath, md5=None): + if not os.path.isfile(fpath): + return False + if md5 is None: + return True + return RemoteFileHandler.check_md5(fpath, md5) + + @staticmethod + def download_url( + url, root, filename=None, + md5=None, max_redirect_hops=3 + ): + """Download a file from a url and place it in root. + Args: + url (str): URL to download file from + root (str): Directory to place downloaded file in + filename (str, optional): Name to save the file under. + If None, use the basename of the URL + md5 (str, optional): MD5 checksum of the download.
+ If None, do not check + max_redirect_hops (int, optional): Maximum number of redirect + hops allowed + """ + root = os.path.expanduser(root) + if not filename: + filename = os.path.basename(url) + fpath = os.path.join(root, filename) + + os.makedirs(root, exist_ok=True) + + # check if file is already present locally + if RemoteFileHandler.check_integrity(fpath, md5): + print('Using downloaded and verified file: ' + fpath) + return + + # expand redirect chain if needed + url = RemoteFileHandler._get_redirect_url(url, + max_hops=max_redirect_hops) + + # check if file is located on Google Drive + file_id = RemoteFileHandler._get_google_drive_file_id(url) + if file_id is not None: + return RemoteFileHandler.download_file_from_google_drive( + file_id, root, filename, md5) + + # download the file + try: + print('Downloading ' + url + ' to ' + fpath) + RemoteFileHandler._urlretrieve(url, fpath) + except (urllib.error.URLError, IOError) as e: + if url[:5] == 'https': + url = url.replace('https:', 'http:') + print('Failed download. Trying https -> http instead.' + ' Downloading ' + url + ' to ' + fpath) + RemoteFileHandler._urlretrieve(url, fpath) + else: + raise e + + # check integrity of downloaded file + if not RemoteFileHandler.check_integrity(fpath, md5): + raise RuntimeError("File not found or corrupted.") + + @staticmethod + def download_file_from_google_drive(file_id, root, + filename=None, + md5=None): + """Download a Google Drive file and place it in root. + Args: + file_id (str): id of file to be downloaded + root (str): Directory to place downloaded file in + filename (str, optional): Name to save the file under. + If None, use the id of the file. + md5 (str, optional): MD5 checksum of the download. + If None, do not check + """ + # Based on https://stackoverflow.com/questions/38511444/python-download-files-from-google-drive-using-url # noqa + import requests + url = "https://docs.google.com/uc?export=download" + + root = os.path.expanduser(root) + if not filename: + filename = file_id + fpath = os.path.join(root, filename) + + os.makedirs(root, exist_ok=True) + + if os.path.isfile(fpath) and RemoteFileHandler.check_integrity(fpath, + md5): + print('Using downloaded and verified file: ' + fpath) + else: + session = requests.Session() + + response = session.get(url, params={'id': file_id}, stream=True) + token = RemoteFileHandler._get_confirm_token(response) + + if token: + params = {'id': file_id, 'confirm': token} + response = session.get(url, params=params, stream=True) + + response_content_generator = response.iter_content(32768) + first_chunk = None + while not first_chunk: # filter out keep-alive new chunks + first_chunk = next(response_content_generator) + + if RemoteFileHandler._quota_exceeded(first_chunk): + msg = ( + f"The daily quota of the file {filename} is exceeded and " + f"it can't be downloaded. This is a limitation of " + f"Google Drive and can only be overcome by trying " + f"again later."
+ ) + raise RuntimeError(msg) + + RemoteFileHandler._save_response_content( + itertools.chain((first_chunk, ), + response_content_generator), fpath) + response.close() + + @staticmethod + def unzip(path, destination_path=None): + if not destination_path: + destination_path = os.path.dirname(path) + + _, archive_type = os.path.splitext(path) + archive_type = archive_type.lstrip('.') + + if archive_type in ['zip']: + print("Unzipping {}->{}".format(path, destination_path)) + zip_file = zipfile.ZipFile(path) + zip_file.extractall(destination_path) + zip_file.close() + + elif archive_type in [ + 'tar', 'tgz', 'gz', 'xz', 'bz2', 'tar.gz', 'tar.xz', 'tar.bz2' + ]: + print("Unzipping {}->{}".format(path, destination_path)) + if archive_type == 'tar': + tar_type = 'r:' + elif archive_type.endswith('xz'): + tar_type = 'r:xz' + elif archive_type.endswith('gz'): + tar_type = 'r:gz' + elif archive_type.endswith('bz2'): + tar_type = 'r:bz2' + else: + tar_type = 'r:*' + try: + tar_file = tarfile.open(path, tar_type) + except tarfile.ReadError: + raise SystemExit("corrupted archive") + tar_file.extractall(destination_path) + tar_file.close() + + @staticmethod + def _urlretrieve(url, filename, chunk_size=1024 * 1024): + with open(filename, "wb") as fh: + with urllib.request.urlopen( + urllib.request.Request(url, + headers={"User-Agent": USER_AGENT})) \ + as response: + for chunk in iter(lambda: response.read(chunk_size), b""): + if not chunk: + break + fh.write(chunk) + + @staticmethod + def _get_redirect_url(url, max_hops): + initial_url = url + headers = {"User-Agent": USER_AGENT} + + for _ in range(max_hops + 1): + with urllib.request.urlopen( + urllib.request.Request(url, headers=headers, method="HEAD")) as response: + if response.url == url or response.url is None: + return url + + url = response.url + else: + raise RecursionError( + f"Request to {initial_url} exceeded {max_hops} redirects. " + f"The last redirect points to {url}."
+ ) + + @staticmethod + def _get_confirm_token(response): + for key, value in response.cookies.items(): + if key.startswith('download_warning'): + return value + + return None + + @staticmethod + def _save_response_content( + response_gen, destination, + ): + with open(destination, "wb") as f: + pbar = enlighten.Counter( + total=None, desc="Save content", units="%", color="green") + for chunk in response_gen: + if chunk: # filter out keep-alive new chunks + f.write(chunk) + pbar.update(len(chunk)) + + pbar.close() + + @staticmethod + def _quota_exceeded(first_chunk): + try: + return "Google Drive - Quota exceeded" in first_chunk.decode() + except UnicodeDecodeError: + return False + + @staticmethod + def _get_google_drive_file_id(url): + parts = urlparse(url) + + if re.match(r"(drive|docs)[.]google[.]com", parts.netloc) is None: + return None + + match = re.match(r"/file/d/(?P<id>[^/]*)", parts.path) + if match is None: + return None + + return match.group("id") diff --git a/tests/lib/testing_classes.py b/tests/lib/testing_classes.py new file mode 100644 index 0000000000..1832efb7ed --- /dev/null +++ b/tests/lib/testing_classes.py @@ -0,0 +1,263 @@ +"""Testing classes for module testing and publishing in hosts.""" +import os +import sys +import six +import json +import pytest +import tempfile +import shutil +import glob + +from tests.lib.db_handler import DBHandler +from tests.lib.file_handler import RemoteFileHandler + + +class BaseTest: + """Empty base test class""" + + +class ModuleUnitTest(BaseTest): + """Generic test class for testing modules + + Use PERSIST==True to keep temporary folder and DB prepared for + debugging or preparation of test files. + + Implemented fixtures: + monkeypatch_session - fixture for env vars with session scope + download_test_data - tmp folder with extracted data from GDrive + env_var - sets env vars from input file + db_setup - prepares avalon AND openpype DBs for testing from + binary dumps from input data + dbcon - returns DBConnection to AvalonDB + dbcon_openpype - returns DBConnection for OpenpypeMongoDB + + """ + PERSIST = False # True to not purge temporary folder nor test DB + + TEST_OPENPYPE_MONGO = "mongodb://localhost:27017" + TEST_DB_NAME = "test_db" + TEST_PROJECT_NAME = "test_project" + TEST_OPENPYPE_NAME = "test_openpype" + + TEST_FILES = [] + + PROJECT = "test_project" + ASSET = "test_asset" + TASK = "test_task" + + @pytest.fixture(scope='session') + def monkeypatch_session(self): + """Monkeypatch can't be used with module or session scoped fixtures.""" + from _pytest.monkeypatch import MonkeyPatch + m = MonkeyPatch() + yield m + m.undo() + + @pytest.fixture(scope="module") + def download_test_data(self): + tmpdir = tempfile.mkdtemp() + for test_file in self.TEST_FILES: + file_id, file_name, md5 = test_file + + f_name, ext = os.path.splitext(file_name) + + RemoteFileHandler.download_file_from_google_drive(file_id, + str(tmpdir), + file_name) + + if ext.lstrip('.') in RemoteFileHandler.IMPLEMENTED_ZIP_FORMATS: + RemoteFileHandler.unzip(os.path.join(tmpdir, file_name)) + print("Temporary folder created:: {}".format(tmpdir)) + yield tmpdir + + if not self.PERSIST: + print("Removing {}".format(tmpdir)) + shutil.rmtree(tmpdir) + + @pytest.fixture(scope="module") + def env_var(self, monkeypatch_session, download_test_data): + """Sets temporary env vars from json file.""" + env_url = os.path.join(download_test_data, "input", + "env_vars", "env_var.json") + if not os.path.exists(env_url): + raise ValueError("Env variable file {} doesn't
exist". + format(env_url)) + + env_dict = {} + try: + with open(env_url) as json_file: + env_dict = json.load(json_file) + except ValueError: + print("{} doesn't contain valid JSON") + six.reraise(*sys.exc_info()) + + for key, value in env_dict.items(): + all_vars = globals() + all_vars.update(vars(ModuleUnitTest)) # TODO check + value = value.format(**all_vars) + print("Setting {}:{}".format(key, value)) + monkeypatch_session.setenv(key, str(value)) + import openpype + + openpype_root = os.path.dirname(os.path.dirname(openpype.__file__)) + # ?? why 2 of those + monkeypatch_session.setenv("OPENPYPE_ROOT", openpype_root) + monkeypatch_session.setenv("OPENPYPE_REPOS_ROOT", openpype_root) + + @pytest.fixture(scope="module") + def db_setup(self, download_test_data, env_var, monkeypatch_session): + """Restore prepared MongoDB dumps into selected DB.""" + backup_dir = os.path.join(download_test_data, "input", "dumps") + + uri = os.environ.get("OPENPYPE_MONGO") + db_handler = DBHandler(uri) + db_handler.setup_from_dump(self.TEST_DB_NAME, backup_dir, True, + db_name_out=self.TEST_DB_NAME) + + db_handler.setup_from_dump("openpype", backup_dir, True, + db_name_out=self.TEST_OPENPYPE_NAME) + + yield db_handler + + if not self.PERSIST: + db_handler.teardown(self.TEST_DB_NAME) + db_handler.teardown(self.TEST_OPENPYPE_NAME) + + @pytest.fixture(scope="module") + def dbcon(self, db_setup): + """Provide test database connection. + + Database prepared from dumps with 'db_setup' fixture. + """ + from avalon.api import AvalonMongoDB + dbcon = AvalonMongoDB() + dbcon.Session["AVALON_PROJECT"] = self.TEST_PROJECT_NAME + yield dbcon + + @pytest.fixture(scope="module") + def dbcon_openpype(self, db_setup): + """Provide test database connection for OP settings. + + Database prepared from dumps with 'db_setup' fixture. + """ + from openpype.lib import OpenPypeMongoConnection + mongo_client = OpenPypeMongoConnection.get_mongo_client() + yield mongo_client[self.TEST_OPENPYPE_NAME]["settings"] + + +class PublishTest(ModuleUnitTest): + """Test class for publishing in hosts. + + Implemented fixtures: + launched_app - launches APP with last_workfile_path + publish_finished - waits until publish is finished, host must + kill its process when finished publishing. Includes timeout + which raises ValueError + + Not implemented: + last_workfile_path - returns path to testing workfile + startup_scripts - provide script for setup in host + + Implemented tests: + test_folder_structure_same - compares published and expected + subfolders if they contain same files. 
Compares only on file + presence + + TODO: implement test on file size, file content + """ + + APP = "" + APP_VARIANT = "" + + APP_NAME = "{}/{}".format(APP, APP_VARIANT) + + TIMEOUT = 120 # publish timeout + + @pytest.fixture(scope="module") + def last_workfile_path(self, download_test_data): + raise NotImplementedError + + @pytest.fixture(scope="module") + def startup_scripts(self, monkeypatch_session, download_test_data): + raise NotImplementedError + + @pytest.fixture(scope="module") + def launched_app(self, dbcon, download_test_data, last_workfile_path, + startup_scripts): + """Launch host app""" + # set publishing folders + root_key = "config.roots.work.{}".format("windows") # TEMP + dbcon.update_one( + {"type": "project"}, + {"$set": + { + root_key: download_test_data + }} + ) + + # set schema - for integrate_new + from openpype import PACKAGE_DIR + # Path to OpenPype's schema + schema_path = os.path.join( + os.path.dirname(PACKAGE_DIR), + "schema" + ) + os.environ["AVALON_SCHEMA"] = schema_path + + import openpype + openpype.install() + os.environ["OPENPYPE_EXECUTABLE"] = sys.executable + from openpype.lib import ApplicationManager + + application_manager = ApplicationManager() + data = { + "last_workfile_path": last_workfile_path, + "start_last_workfile": True, + "project_name": self.PROJECT, + "asset_name": self.ASSET, + "task_name": self.TASK + } + + yield application_manager.launch(self.APP_NAME, **data) + + @pytest.fixture(scope="module") + def publish_finished(self, dbcon, launched_app, download_test_data): + """Dummy fixture waiting for publish to finish""" + import time + time_start = time.time() + while launched_app.poll() is None: + time.sleep(0.5) + if time.time() - time_start > self.TIMEOUT: + raise ValueError("Timeout reached") + + # some clean exit test possible? + print("Publish finished") + yield True + + def test_folder_structure_same(self, dbcon, publish_finished, + download_test_data): + """Check if expected and published subfolders contain same files. + + Compares only presence, not size nor content! 
+ """ + published_dir_base = download_test_data + published_dir = os.path.join(published_dir_base, + self.PROJECT, + self.TASK, + "**") + expected_dir_base = os.path.join(published_dir_base, + "expected") + expected_dir = os.path.join(expected_dir_base, + self.PROJECT, + self.TASK, + "**") + + published = set(f.replace(published_dir_base, '') for f in + glob.glob(published_dir, recursive=True) if + f != published_dir_base and os.path.exists(f)) + expected = set(f.replace(expected_dir_base, '') for f in + glob.glob(expected_dir, recursive=True) if + f != expected_dir_base and os.path.exists(f)) + + not_matched = expected.difference(published) + assert not not_matched, "Missing {} files".format(not_matched) diff --git a/tests/resources/test_data.zip b/tests/resources/test_data.zip new file mode 100644 index 0000000000..0faab86b37 Binary files /dev/null and b/tests/resources/test_data.zip differ diff --git a/tests/igniter/test_bootstrap_repos.py b/tests/unit/igniter/test_bootstrap_repos.py similarity index 100% rename from tests/igniter/test_bootstrap_repos.py rename to tests/unit/igniter/test_bootstrap_repos.py diff --git a/tests/igniter/test_tools.py b/tests/unit/igniter/test_tools.py similarity index 100% rename from tests/igniter/test_tools.py rename to tests/unit/igniter/test_tools.py diff --git a/tests/openpype/lib/test_user_settings.py b/tests/unit/openpype/lib/test_user_settings.py similarity index 100% rename from tests/openpype/lib/test_user_settings.py rename to tests/unit/openpype/lib/test_user_settings.py diff --git a/tests/unit/openpype/modules/sync_server/test_site_operations.py b/tests/unit/openpype/modules/sync_server/test_site_operations.py new file mode 100644 index 0000000000..6a861100a4 --- /dev/null +++ b/tests/unit/openpype/modules/sync_server/test_site_operations.py @@ -0,0 +1,134 @@ +"""Test file for Sync Server, tests site operations add_site, remove_site. + + File: + creates temporary directory and downloads .zip file from GDrive + unzips .zip file + uses content of .zip file (MongoDB's dumps) to import to new databases + with use of 'monkeypatch_session' modifies required env vars + temporarily + runs battery of tests checking that site operation for Sync Server + module are working + removes temporary folder + removes temporary databases (?) 
+""" +import pytest + +from tests.lib.testing_classes import ModuleUnitTest +from bson.objectid import ObjectId + + +class TestSiteOperation(ModuleUnitTest): + + REPRESENTATION_ID = "60e578d0c987036c6a7b741d" + + TEST_FILES = [("1eCwPljuJeOI8A3aisfOIBKKjcmIycTEt", + "test_site_operations.zip", '')] + + @pytest.fixture(scope="module") + def setup_sync_server_module(self, dbcon): + """Get sync_server_module from ModulesManager""" + from openpype.modules import ModulesManager + + manager = ModulesManager() + sync_server = manager.modules_by_name["sync_server"] + yield sync_server + + @pytest.mark.usefixtures("dbcon") + def test_project_created(self, dbcon): + assert ['test_project'] == dbcon.database.collection_names(False) + + @pytest.mark.usefixtures("dbcon") + def test_objects_imported(self, dbcon): + count_obj = len(list(dbcon.database[self.TEST_PROJECT_NAME].find({}))) + assert 15 == count_obj + + @pytest.mark.usefixtures("setup_sync_server_module") + def test_add_site(self, dbcon, setup_sync_server_module): + """Adds 'test_site', checks that added, + checks that doesn't duplicate.""" + query = { + "_id": ObjectId(self.REPRESENTATION_ID) + } + + ret = dbcon.database[self.TEST_PROJECT_NAME].find(query) + + assert 1 == len(list(ret)), \ + "Single {} must be in DB".format(self.REPRESENTATION_ID) + + setup_sync_server_module.add_site(self.TEST_PROJECT_NAME, + self.REPRESENTATION_ID, + site_name='test_site') + + ret = list(dbcon.database[self.TEST_PROJECT_NAME].find(query)) + + assert 1 == len(ret), \ + "Single {} must be in DB".format(self.REPRESENTATION_ID) + + ret = ret.pop() + site_names = [site["name"] for site in ret["files"][0]["sites"]] + assert 'test_site' in site_names, "Site name wasn't added" + + @pytest.mark.usefixtures("setup_sync_server_module") + def test_add_site_again(self, dbcon, setup_sync_server_module): + """Depends on test_add_site, must throw exception.""" + with pytest.raises(ValueError): + setup_sync_server_module.add_site(self.TEST_PROJECT_NAME, + self.REPRESENTATION_ID, + site_name='test_site') + + @pytest.mark.usefixtures("setup_sync_server_module") + def test_add_site_again_force(self, dbcon, setup_sync_server_module): + """Depends on test_add_site, must not throw exception.""" + setup_sync_server_module.add_site(self.TEST_PROJECT_NAME, + self.REPRESENTATION_ID, + site_name='test_site', force=True) + + query = { + "_id": ObjectId(self.REPRESENTATION_ID) + } + + ret = list(dbcon.database[self.TEST_PROJECT_NAME].find(query)) + + assert 1 == len(ret), \ + "Single {} must be in DB".format(self.REPRESENTATION_ID) + + @pytest.mark.usefixtures("setup_sync_server_module") + def test_remove_site(self, dbcon, setup_sync_server_module): + """Depends on test_add_site, must remove 'test_site'.""" + setup_sync_server_module.remove_site(self.TEST_PROJECT_NAME, + self.REPRESENTATION_ID, + site_name='test_site') + + query = { + "_id": ObjectId(self.REPRESENTATION_ID) + } + + ret = list(dbcon.database[self.TEST_PROJECT_NAME].find(query)) + + assert 1 == len(ret), \ + "Single {} must be in DB".format(self.REPRESENTATION_ID) + + ret = ret.pop() + site_names = [site["name"] for site in ret["files"][0]["sites"]] + + assert 'test_site' not in site_names, "Site name wasn't removed" + + @pytest.mark.usefixtures("setup_sync_server_module") + def test_remove_site_again(self, dbcon, setup_sync_server_module): + """Depends on test_add_site, must trow exception""" + with pytest.raises(ValueError): + setup_sync_server_module.remove_site(self.TEST_PROJECT_NAME, + self.REPRESENTATION_ID, + 
site_name='test_site') + + query = { + "_id": ObjectId(self.REPRESENTATION_ID) + } + + ret = list(dbcon.database[self.TEST_PROJECT_NAME].find(query)) + + assert 1 == len(ret), \ + "Single {} must be in DB".format(self.REPRESENTATION_ID) + + +test_case = TestSiteOperation() diff --git a/website/docs/admin_settings_system.md b/website/docs/admin_settings_system.md index 80154356af..6057ed0830 100644 --- a/website/docs/admin_settings_system.md +++ b/website/docs/admin_settings_system.md @@ -21,6 +21,9 @@ as a naive barier to prevent artists from accidental setting changes. **`Environment`** - Globally applied environment variables that will be appended to any OpenPype process in the studio. +**`Disk mapping`** - Platform-dependent configuration for mapping virtual disk(s) on an artist's OpenPype machine before OP starts up. +Uses the `subst` command; if the volume character configured in the `Destination` field already exists, no re-mapping is done for that character (volume). + **`Versions Repository`** - Location where automatic update mechanism searches for zip files with OpenPype update packages. To read more about preparing OpenPype for automatic updates go to [Admin Distribute docs](admin_distribute#2-openpype-codebase) diff --git a/website/docs/assets/settings/settings_system_general.png b/website/docs/assets/settings/settings_system_general.png index 4c1452d0d4..d04586205d 100644 Binary files a/website/docs/assets/settings/settings_system_general.png and b/website/docs/assets/settings/settings_system_general.png differ diff --git a/website/docs/assets/site_sync_project_sftp_settings.png b/website/docs/assets/site_sync_project_sftp_settings.png new file mode 100644 index 0000000000..79267ba1d7 Binary files /dev/null and b/website/docs/assets/site_sync_project_sftp_settings.png differ diff --git a/website/docs/assets/site_sync_sftp_project_setting_not_forced.png b/website/docs/assets/site_sync_sftp_project_setting_not_forced.png new file mode 100644 index 0000000000..7bd735b46c Binary files /dev/null and b/website/docs/assets/site_sync_sftp_project_setting_not_forced.png differ diff --git a/website/docs/assets/site_sync_sftp_settings_local.png b/website/docs/assets/site_sync_sftp_settings_local.png new file mode 100644 index 0000000000..45125da1a8 Binary files /dev/null and b/website/docs/assets/site_sync_sftp_settings_local.png differ diff --git a/website/docs/assets/site_sync_sftp_system.png b/website/docs/assets/site_sync_sftp_system.png new file mode 100644 index 0000000000..6e8e125f95 Binary files /dev/null and b/website/docs/assets/site_sync_sftp_system.png differ diff --git a/website/docs/module_site_sync.md b/website/docs/module_site_sync.md index 6ee6660048..b0604ed3cf 100644 --- a/website/docs/module_site_sync.md +++ b/website/docs/module_site_sync.md @@ -106,6 +106,53 @@ To get working connection to Google Drive there are some necessary steps: - add new site back in OpenPype Settings, name as you want, provider needs to be 'gdrive' - distribute credentials file via shared mounted disk location +### SFTP + +The SFTP provider is used to connect to an SFTP server. Currently authentication with `user:password` or `user:ssh key` is implemented. +Please provide only one combination; don't forget to provide the password for the ssh key if the key was created with a passphrase. + +(SFTP connections can be a bit finicky; use FileZilla or WinSCP for testing the connection, it will be much faster.) + +Beware that the ssh key is expected in OpenSSH format (`.pem`), not Putty format (`.ppk`)!
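+
+Under the hood the provider uses the `pysftp` library (added as a dependency in this PR), so you can sanity-check the credentials you are about to put into the settings with a few lines of Python. A minimal sketch, assuming a hypothetical host `sftp.example.com`, user `op_user` and the `/upload` destination folder; pass either `password` or `private_key` (plus `private_key_pass` for a passphrase-protected key), not both:
+
+```python
+import pysftp
+
+# password-based login
+with pysftp.Connection("sftp.example.com", username="op_user",
+                       password="secret") as sftp:
+    print(sftp.listdir("/upload"))  # destination folder from the settings
+
+# key-based login (OpenSSH .pem key, optionally with a passphrase)
+with pysftp.Connection("sftp.example.com", username="op_user",
+                       private_key="~/.ssh/id_rsa.pem",
+                       private_key_pass="key_passphrase") as sftp:
+    print(sftp.listdir("/upload"))
+```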
+ +#### How to set up an SFTP site + +- Enable the Site Sync module in Settings +- Add a site with the SFTP provider + +![Enable syncing and create site](assets/site_sync_sftp_system.png) + +- In Project settings enable Site Sync (on the default project - all projects will be synced - or on a specific project) +- Configure the SFTP connection and the destination folder on the SFTP server (`/upload` in the screenshot) + +![SFTP connection](assets/site_sync_project_sftp_settings.png) + +- if you want to force syncing between the local and the SFTP site for all users, use the combination `active site: local`, `remote site: NAME_OF_SFTP_SITE` + +- if you want to allow only specific users to use SFTP syncing (external users, not located in the office), use `active site: studio`, `remote site: studio`. + +![Select active and remote site on a project](assets/site_sync_sftp_project_setting_not_forced.png) + +- Each artist can then decide and configure syncing from their local site to SFTP via `Local Settings` + +![Select active and remote site on a project](assets/site_sync_sftp_settings_local.png) + ### Custom providers If a studio needs to use other services for cloud storage, or want to implement totally different storage providers, they can do so by writing their own provider plugin. We're working on a developer documentation, however, for now we recommend looking at `abstract_provider.py`and `gdrive.py` inside `openpype/modules/sync_server/providers` and using it as a template. diff --git a/website/docs/pype2/admin_presets_plugins.md b/website/docs/pype2/admin_presets_plugins.md index 797995d2b7..9c838d4a64 100644 --- a/website/docs/pype2/admin_presets_plugins.md +++ b/website/docs/pype2/admin_presets_plugins.md @@ -469,6 +469,7 @@ maya outliner colours for various families "camera": [0.447, 0.312, 1.0], "fbx": [1.0, 0.931, 0.312], "mayaAscii": [0.312, 1.0, 0.747], + "mayaScene": [0.312, 1.0, 0.747], "setdress": [0.312, 1.0, 0.747], "layout": [0.312, 1.0, 0.747], "vdbcache": [0.312, 1.0, 0.428],