Merge branch 'develop' into feature/settings_style_preparation

This commit is contained in:
iLLiCiTiT 2021-11-16 12:17:56 +01:00
commit 1f3b1c9d00
63 changed files with 2961 additions and 1002 deletions

View file

@ -33,7 +33,7 @@ jobs:
id: version
if: steps.version_type.outputs.type != 'skip'
run: |
RESULT=$(python ./tools/ci_tools.py --nightly)
RESULT=$(python ./tools/ci_tools.py --nightly --github_token ${{ secrets.GITHUB_TOKEN }})
echo ::set-output name=next_tag::$RESULT

View file

@ -1,54 +1,61 @@
# Changelog
## [3.6.0-nightly.5](https://github.com/pypeclub/OpenPype/tree/HEAD)
## [3.6.0](https://github.com/pypeclub/OpenPype/tree/3.6.0) (2021-11-15)
[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.5.0...HEAD)
[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.5.0...3.6.0)
**🆕 New features**
- Add validate active site button to sync queue on a project [\#2176](https://github.com/pypeclub/OpenPype/pull/2176)
- Maya : Colorspace configuration [\#2170](https://github.com/pypeclub/OpenPype/pull/2170)
- Blender: Added support for audio [\#2168](https://github.com/pypeclub/OpenPype/pull/2168)
- Flame: a host basic integration [\#2165](https://github.com/pypeclub/OpenPype/pull/2165)
- Houdini: simple HDA workflow [\#2072](https://github.com/pypeclub/OpenPype/pull/2072)
**🚀 Enhancements**
- Tools: Subset manager in OpenPype [\#2243](https://github.com/pypeclub/OpenPype/pull/2243)
- General: Skip module directories without init file [\#2239](https://github.com/pypeclub/OpenPype/pull/2239)
- General: Static interfaces [\#2238](https://github.com/pypeclub/OpenPype/pull/2238)
- Style: Fix transparent image in style [\#2235](https://github.com/pypeclub/OpenPype/pull/2235)
- Add a "following workfile versioning" option on publish [\#2225](https://github.com/pypeclub/OpenPype/pull/2225)
- Modules: Module can add cli commands [\#2224](https://github.com/pypeclub/OpenPype/pull/2224)
- Webpublisher: Separate webpublisher logic [\#2222](https://github.com/pypeclub/OpenPype/pull/2222)
- Add both side availability on Site Sync sites to Loader [\#2220](https://github.com/pypeclub/OpenPype/pull/2220)
- Tools: Center loader and library loader on show [\#2219](https://github.com/pypeclub/OpenPype/pull/2219)
- Maya : Validate shape zero [\#2212](https://github.com/pypeclub/OpenPype/pull/2212)
- Maya : validate unique names [\#2211](https://github.com/pypeclub/OpenPype/pull/2211)
- Tools: OpenPype stylesheet in workfiles tool [\#2208](https://github.com/pypeclub/OpenPype/pull/2208)
- Add alternative sites for Site Sync [\#2206](https://github.com/pypeclub/OpenPype/pull/2206)
- Ftrack: Replace Queue with deque in event handlers logic [\#2204](https://github.com/pypeclub/OpenPype/pull/2204)
- Tools: New select context dialog [\#2200](https://github.com/pypeclub/OpenPype/pull/2200)
- Maya : Validate mesh ngons [\#2199](https://github.com/pypeclub/OpenPype/pull/2199)
- Dirmap in Nuke [\#2198](https://github.com/pypeclub/OpenPype/pull/2198)
- Delivery: Check 'frame' key in template for sequence delivery [\#2196](https://github.com/pypeclub/OpenPype/pull/2196)
- Settings: Site sync project settings improvement [\#2193](https://github.com/pypeclub/OpenPype/pull/2193)
- Usage of tools code [\#2185](https://github.com/pypeclub/OpenPype/pull/2185)
- Settings: Dictionary based on project roots [\#2184](https://github.com/pypeclub/OpenPype/pull/2184)
- Subset name: Be able to pass asset document to get subset name [\#2179](https://github.com/pypeclub/OpenPype/pull/2179)
- Tools: Experimental tools [\#2167](https://github.com/pypeclub/OpenPype/pull/2167)
- Loader: Refactor and use OpenPype stylesheets [\#2166](https://github.com/pypeclub/OpenPype/pull/2166)
- Add loader for linked smart objects in photoshop [\#2149](https://github.com/pypeclub/OpenPype/pull/2149)
**🐛 Bug fixes**
- Ftrack: Sync project ftrack id cache issue [\#2250](https://github.com/pypeclub/OpenPype/pull/2250)
- Ftrack: Session creation and Prepare project [\#2245](https://github.com/pypeclub/OpenPype/pull/2245)
- Added queue for studio processing in PS [\#2237](https://github.com/pypeclub/OpenPype/pull/2237)
- Python 2: Unicode to string conversion [\#2236](https://github.com/pypeclub/OpenPype/pull/2236)
- Fix - enum for color coding in PS [\#2234](https://github.com/pypeclub/OpenPype/pull/2234)
- Pyblish Tool: Fix targets handling [\#2232](https://github.com/pypeclub/OpenPype/pull/2232)
- Ftrack: Base event fix of 'get\_project\_from\_entity' method [\#2214](https://github.com/pypeclub/OpenPype/pull/2214)
- Maya : multiple subsets review broken [\#2210](https://github.com/pypeclub/OpenPype/pull/2210)
- Fix - different command used for Linux and Mac OS [\#2207](https://github.com/pypeclub/OpenPype/pull/2207)
- Tools: Workfiles tool don't use avalon widgets [\#2205](https://github.com/pypeclub/OpenPype/pull/2205)
- Ftrack: Fill missing ftrack id on mongo project [\#2203](https://github.com/pypeclub/OpenPype/pull/2203)
- Project Manager: Fix copying of tasks [\#2191](https://github.com/pypeclub/OpenPype/pull/2191)
- StandalonePublisher: Source validator don't expect representations [\#2190](https://github.com/pypeclub/OpenPype/pull/2190)
- Blender: Fix trying to pack an image when the shader node has no texture [\#2183](https://github.com/pypeclub/OpenPype/pull/2183)
- MacOS: Launching of applications may cause Permissions error [\#2175](https://github.com/pypeclub/OpenPype/pull/2175)
- Maya: review viewport settings [\#2177](https://github.com/pypeclub/OpenPype/pull/2177)
- Maya: Aspect ratio [\#2174](https://github.com/pypeclub/OpenPype/pull/2174)
- Blender: Fix 'Deselect All' with object not in 'Object Mode' [\#2163](https://github.com/pypeclub/OpenPype/pull/2163)
- Maya: Fix hotbox broken by scriptsmenu [\#2151](https://github.com/pypeclub/OpenPype/pull/2151)
- Added validator for source files for Standalone Publisher [\#2138](https://github.com/pypeclub/OpenPype/pull/2138)
**Merged pull requests:**
### 📖 Documentation
- Settings: Site sync project settings improvement [\#2193](https://github.com/pypeclub/OpenPype/pull/2193)
- Add validate active site button to sync queue on a project [\#2176](https://github.com/pypeclub/OpenPype/pull/2176)
- Bump pillow from 8.2.0 to 8.3.2 [\#2162](https://github.com/pypeclub/OpenPype/pull/2162)
- Add command line way of running site sync server [\#2188](https://github.com/pypeclub/OpenPype/pull/2188)
## [3.5.0](https://github.com/pypeclub/OpenPype/tree/3.5.0) (2021-10-17)
@ -63,8 +70,6 @@
- Added project and task into context change message in Maya [\#2131](https://github.com/pypeclub/OpenPype/pull/2131)
- Add ExtractBurnin to photoshop review [\#2124](https://github.com/pypeclub/OpenPype/pull/2124)
- PYPE-1218 - changed namespace to contain subset name in Maya [\#2114](https://github.com/pypeclub/OpenPype/pull/2114)
- Added running configurable disk mapping command before start of OP [\#2091](https://github.com/pypeclub/OpenPype/pull/2091)
- SFTP provider [\#2073](https://github.com/pypeclub/OpenPype/pull/2073)
**🚀 Enhancements**
@ -72,17 +77,10 @@
- Settings: Updated readme for entity types in settings [\#2132](https://github.com/pypeclub/OpenPype/pull/2132)
- Nuke: unified clip loader [\#2128](https://github.com/pypeclub/OpenPype/pull/2128)
- Settings UI: Project model refreshing and sorting [\#2104](https://github.com/pypeclub/OpenPype/pull/2104)
- Create Read From Rendered - Disable Relative paths by default [\#2093](https://github.com/pypeclub/OpenPype/pull/2093)
- Added choosing different dirmap mapping if workfile synched locally [\#2088](https://github.com/pypeclub/OpenPype/pull/2088)
- General: Remove IdleManager module [\#2084](https://github.com/pypeclub/OpenPype/pull/2084)
- Tray UI: Message box about missing settings defaults [\#2080](https://github.com/pypeclub/OpenPype/pull/2080)
- Tray UI: Show menu where first click happened [\#2079](https://github.com/pypeclub/OpenPype/pull/2079)
- Global: add global validators to settings [\#2078](https://github.com/pypeclub/OpenPype/pull/2078)
- Use CRF for burnin when available [\#2070](https://github.com/pypeclub/OpenPype/pull/2070)
- Project manager: Filter first item after selection of project [\#2069](https://github.com/pypeclub/OpenPype/pull/2069)
**🐛 Bug fixes**
- Maya: Fix hotbox broken by scriptsmenu [\#2151](https://github.com/pypeclub/OpenPype/pull/2151)
- Maya: fix model publishing [\#2130](https://github.com/pypeclub/OpenPype/pull/2130)
- Fix - oiiotool wasn't recognized even if present [\#2129](https://github.com/pypeclub/OpenPype/pull/2129)
- General: Disk mapping group [\#2120](https://github.com/pypeclub/OpenPype/pull/2120)
@ -90,21 +88,6 @@
- Add startup script for Houdini Core. [\#2110](https://github.com/pypeclub/OpenPype/pull/2110)
- TVPaint: Behavior name of loop also accept repeat [\#2109](https://github.com/pypeclub/OpenPype/pull/2109)
- Ftrack: Project settings save custom attributes skip unknown attributes [\#2103](https://github.com/pypeclub/OpenPype/pull/2103)
- Blender: Fix NoneType error when animation\_data is missing for a rig [\#2101](https://github.com/pypeclub/OpenPype/pull/2101)
- Fix broken import in sftp provider [\#2100](https://github.com/pypeclub/OpenPype/pull/2100)
- Global: Fix docstring on publish plugin extract review [\#2097](https://github.com/pypeclub/OpenPype/pull/2097)
- Delivery Action Files Sequence fix [\#2096](https://github.com/pypeclub/OpenPype/pull/2096)
- General: Cloud mongo ca certificate issue [\#2095](https://github.com/pypeclub/OpenPype/pull/2095)
- TVPaint: Creator use context from workfile [\#2087](https://github.com/pypeclub/OpenPype/pull/2087)
- Blender: fix texture missing when publishing blend files [\#2085](https://github.com/pypeclub/OpenPype/pull/2085)
- General: Startup validations oiio tool path fix on linux [\#2083](https://github.com/pypeclub/OpenPype/pull/2083)
- Deadline: Collect deadline server does not check existence of deadline key [\#2082](https://github.com/pypeclub/OpenPype/pull/2082)
- Blender: fixed Curves with modifiers in Rigs [\#2081](https://github.com/pypeclub/OpenPype/pull/2081)
- Nuke UI scaling [\#2077](https://github.com/pypeclub/OpenPype/pull/2077)
**Merged pull requests:**
- Bump pywin32 from 300 to 301 [\#2086](https://github.com/pypeclub/OpenPype/pull/2086)
## [3.4.1](https://github.com/pypeclub/OpenPype/tree/3.4.1) (2021-09-23)

View file

@ -57,6 +57,17 @@ def tray(debug=False):
PypeCommands().launch_tray(debug)
@PypeCommands.add_modules
@main.group(help="Run command line arguments of OpenPype modules")
@click.pass_context
def module(ctx):
"""Module specific commands created dynamically.
These commands are generated by the currently loaded addons/modules.
"""
pass
@main.command()
@click.option("-d", "--debug", is_flag=True, help="Print debug messages")
@click.option("--ftrack-url", envvar="FTRACK_SERVER",
@ -147,7 +158,9 @@ def extractenvironments(output_json_path, project, asset, task, app):
@click.option("-d", "--debug", is_flag=True, help="Print debug messages")
@click.option("-t", "--targets", help="Targets module", default=None,
multiple=True)
def publish(debug, paths, targets):
@click.option("-g", "--gui", is_flag=True,
help="Show Publish UI", default=False)
def publish(debug, paths, targets, gui):
"""Start CLI publishing.
Publish collects json from paths provided as an argument.
@ -155,7 +168,7 @@ def publish(debug, paths, targets):
"""
if debug:
os.environ['OPENPYPE_DEBUG'] = '3'
PypeCommands.publish(list(paths), targets)
PypeCommands.publish(list(paths), targets, gui)
@main.command()
@ -346,3 +359,28 @@ def run(script):
def runtests(folder, mark, pyargs):
"""Run all automatic tests after proper initialization via start.py"""
PypeCommands().run_tests(folder, mark, pyargs)
@main.command()
@click.option("-d", "--debug",
is_flag=True, help=("Run process in debug mode"))
@click.option("-a", "--active_site", required=True,
help="Name of active stie")
def syncserver(debug, active_site):
"""Run sync site server in background.
Some Site Sync use cases need to expose site to another one.
For example if majority of artists work in studio, they are not using
SS at all, but if you want to expose published assets to 'studio' site
to SFTP for only a couple of artists, some background process must
mark published assets to live on multiple sites (they might be
physically in same location - mounted shared disk).
Process mimics OP Tray with specific 'active_site' name, all
configuration for this "dummy" user comes from Setting or Local
Settings (configured by starting OP Tray with env
var OPENPYPE_LOCAL_ID set to 'active_site'.
"""
if debug:
os.environ['OPENPYPE_DEBUG'] = '3'
PypeCommands().syncserver(active_site)
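For context, a minimal sketch of how this command pairs with the `OPENPYPE_LOCAL_ID` override added to `get_local_site_id` later in this commit; the import path and the 'sftp' site name are illustrative assumptions, not part of the diff:

```python
import os

from openpype.pype_commands import PypeCommands  # assumed import path

# Pretend to be a dedicated "sftp" user: Local Settings stored under this
# id are applied, and the sync server treats 'sftp' as the active site.
os.environ["OPENPYPE_LOCAL_ID"] = "sftp"
PypeCommands().syncserver("sftp")
```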

View file

@ -20,11 +20,16 @@ log = PypeLogger.get_logger("WebServer")
class RestApiResource:
"""Resource carrying needed info and Avalon DB connection for publish."""
def __init__(self, server_manager, executable, upload_dir):
def __init__(self, server_manager, executable, upload_dir,
studio_task_queue=None):
self.server_manager = server_manager
self.upload_dir = upload_dir
self.executable = executable
if studio_task_queue is None:
studio_task_queue = collections.deque()
self.studio_task_queue = studio_task_queue
self.dbcon = AvalonMongoDB()
self.dbcon.install()
@ -182,8 +187,6 @@ class WebpublisherBatchPublishEndpoint(_RestApiEndpoint):
msg = "Non existent OpenPype executable {}".format(openpype_app)
raise RuntimeError(msg)
# for postprocessing in host, currently only PS
output = {}
log.info("WebpublisherBatchPublishEndpoint called")
content = await request.json()
@ -203,7 +206,10 @@ class WebpublisherBatchPublishEndpoint(_RestApiEndpoint):
# would change
# - targets argument is not used in 'remotepublishfromapp'
"targets": None
}
},
# does publish need to be handled by a queue, e.g. only a
# single process running concurrently?
"add_to_queue": True
}
]
@ -219,19 +225,20 @@ class WebpublisherBatchPublishEndpoint(_RestApiEndpoint):
"targets": ["filespublish"]
}
add_to_queue = False
if content.get("studio_processing"):
log.info("Post processing called")
batch_data = parse_json(os.path.join(batch_path, "manifest.json"))
if not batch_data:
raise ValueError(
"Cannot parse batch meta in {} folder".format(batch_path))
"Cannot parse batch manifest in {}".format(batch_path))
task_dir_name = batch_data["tasks"][0]
task_data = parse_json(os.path.join(batch_path, task_dir_name,
"manifest.json"))
if not task_data:
raise ValueError(
"Cannot parse batch meta in {} folder".format(task_data))
"Cannot parse task manifest in {}".format(task_data))
for process_filter in studio_processing_filters:
filter_extensions = process_filter.get("extensions") or []
@ -244,6 +251,7 @@ class WebpublisherBatchPublishEndpoint(_RestApiEndpoint):
add_args.update(
process_filter.get("arguments") or {}
)
add_to_queue = process_filter["add_to_queue"]
break
args = [
@ -263,11 +271,14 @@ class WebpublisherBatchPublishEndpoint(_RestApiEndpoint):
args.append(value)
log.info("args:: {}".format(args))
if add_to_queue:
log.debug("Adding to queue")
self.resource.studio_task_queue.append(args)
else:
subprocess.call(args)
subprocess.call(args)
return Response(
status=200,
body=self.resource.encode(output),
content_type="application/json"
)

View file

@ -1,8 +1,10 @@
import collections
import time
import os
from datetime import datetime
import requests
import json
import subprocess
from openpype.lib import PypeLogger
@ -31,10 +33,13 @@ def run_webserver(*args, **kwargs):
port = kwargs.get("port") or 8079
server_manager = webserver_module.create_new_server_manager(port, host)
webserver_url = server_manager.url
# queue for remotepublishfromapp tasks
studio_task_queue = collections.deque()
resource = RestApiResource(server_manager,
upload_dir=kwargs["upload_dir"],
executable=kwargs["executable"])
executable=kwargs["executable"],
studio_task_queue=studio_task_queue)
projects_endpoint = WebpublisherProjectsEndpoint(resource)
server_manager.add_route(
"GET",
@ -88,6 +93,10 @@ def run_webserver(*args, **kwargs):
if time.time() - last_reprocessed > 20:
reprocess_failed(kwargs["upload_dir"], webserver_url)
last_reprocessed = time.time()
if studio_task_queue:
args = studio_task_queue.popleft()
subprocess.call(args) # blocking call
time.sleep(1.0)
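Taken together with the `RestApiResource` change above, this is the pattern at work; a simplified, hypothetical sketch (names are illustrative) of the single-consumer queue that serializes studio processing jobs:

```python
import collections
import subprocess
import time

# Endpoints act as producers, the webserver loop is the only consumer,
# so at most one studio-processing subprocess runs at a time.
studio_task_queue = collections.deque()


def enqueue_studio_job(args):
    # called from an endpoint handler when "add_to_queue" is True
    studio_task_queue.append(args)


def run_loop():
    while True:
        if studio_task_queue:
            args = studio_task_queue.popleft()
            subprocess.call(args)  # blocking call, jobs run serially
        time.sleep(1.0)
```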

View file

@ -522,6 +522,11 @@ def get_local_site_id():
Identifier is created if it does not exist yet.
"""
# override local id from environment
# used for background syncing
if os.environ.get("OPENPYPE_LOCAL_ID"):
return os.environ["OPENPYPE_LOCAL_ID"]
registry = OpenPypeSettingsRegistry()
try:
return registry.get_item("localId")
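A small illustration of the new early return, assuming `get_local_site_id` is importable from `openpype.lib`:

```python
import os

from openpype.lib import get_local_site_id  # assumed import location

# With the env var set, the registry lookup below is never reached.
os.environ["OPENPYPE_LOCAL_ID"] = "background_site"
print(get_local_site_id())  # -> "background_site"
```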

View file

@ -22,6 +22,9 @@ def import_filepath(filepath, module_name=None):
if module_name is None:
module_name = os.path.splitext(os.path.basename(filepath))[0]
# Make sure it is not 'unicode' in Python 2
module_name = str(module_name)
# Prepare module object where content of file will be parsed
module = types.ModuleType(module_name)
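For reference, a standalone sketch of the pattern this hunk belongs to; a hypothetical helper, not the actual body of `import_filepath`:

```python
import os
import types


def load_module_from_path(filepath, module_name=None):
    """Create an empty module object and execute the file's source in it."""
    if module_name is None:
        module_name = os.path.splitext(os.path.basename(filepath))[0]
    # Make sure it is not 'unicode' in Python 2
    module_name = str(module_name)

    # Prepare module object where content of file will be parsed
    module = types.ModuleType(module_name)
    module.__file__ = filepath

    with open(filepath) as stream:
        source = stream.read()
    exec(compile(source, filepath, "exec"), module.__dict__)
    return module
```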

View file

@ -22,6 +22,10 @@ OpenPype modules should contain separated logic of specific kind of implementati
- `__init__` should not be overridden and `initialize` should not do time consuming part but only prepare base data about module
- also keep in mind that they may be initialized in headless mode
- connection with other modules is made with help of interfaces
- `cli` method - adds cli commands specific to the module
- command line arguments are handled using the `click` python module
- the `cli` method should expect a single argument, which is a click group; any group specific methods can be called on it (e.g. `add_command` to add another click group as a child, see `ExampleAddon` and the sketch below this list)
- module cli commands can then be triggered with `./openpype_console module <module_name> <command> *args`
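A minimal sketch of a module exposing cli commands; it mirrors the docstring example added to `OpenPypeModule.cli` in this commit, and the module name and command are illustrative only:

```python
import click

from openpype.modules import OpenPypeModule


class ExampleModule(OpenPypeModule):
    name = "example"

    def initialize(self, modules_settings):
        self.enabled = True

    def cli(self, module_click_group):
        # attach this module's command group to the shared "module" group
        module_click_group.add_command(cli_main)


@click.group(ExampleModule.name, help="Example module commands.")
def cli_main():
    pass


@cli_main.command()
def mycommand():
    """Triggered with: ./openpype_console module example mycommand"""
    print("my_command")
```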
## Addon class `OpenPypeAddOn`
- inherits from `OpenPypeModule` but is enabled by default and doesn't have to implement `initialize` and `connect_with_modules` methods
@ -140,4 +144,4 @@ class ClockifyModule(
### TrayModulesManager
- inherits from `ModulesManager`
- has specific implementation for Pype Tray tool and handle `ITrayModule` methods
- has specific implementation for Pype Tray tool and handle `ITrayModule` methods

View file

@ -107,12 +107,9 @@ class _InterfacesClass(_ModuleClass):
if attr_name in ("__path__", "__file__"):
return None
# Fake Interface if is not missing
self.__attributes__[attr_name] = type(
attr_name,
(MissingInteface, ),
{}
)
raise ImportError((
"cannot import name '{}' from 'openpype_interfaces'"
).format(attr_name))
return self.__attributes__[attr_name]
@ -212,54 +209,17 @@ def _load_interfaces():
_InterfacesClass(modules_key)
)
log = PypeLogger.get_logger("InterfacesLoader")
from . import interfaces
dirpaths = get_module_dirs()
interface_paths = []
interface_paths.append(
os.path.join(get_default_modules_dir(), "interfaces.py")
)
for dirpath in dirpaths:
if not os.path.exists(dirpath):
for attr_name in dir(interfaces):
attr = getattr(interfaces, attr_name)
if (
not inspect.isclass(attr)
or attr is OpenPypeInterface
or not issubclass(attr, OpenPypeInterface)
):
continue
for filename in os.listdir(dirpath):
if filename in ("__pycache__", ):
continue
full_path = os.path.join(dirpath, filename)
if not os.path.isdir(full_path):
continue
interfaces_path = os.path.join(full_path, "interfaces.py")
if os.path.exists(interfaces_path):
interface_paths.append(interfaces_path)
for full_path in interface_paths:
if not os.path.exists(full_path):
continue
try:
# Prepare module object where content of file will be parsed
module = import_filepath(full_path)
except Exception:
log.warning(
"Failed to load path: \"{0}\"".format(full_path),
exc_info=True
)
continue
for attr_name in dir(module):
attr = getattr(module, attr_name)
if (
not inspect.isclass(attr)
or attr is OpenPypeInterface
or not issubclass(attr, OpenPypeInterface)
):
continue
setattr(openpype_interfaces, attr_name, attr)
setattr(openpype_interfaces, attr_name, attr)
def load_modules(force=False):
@ -333,6 +293,15 @@ def _load_modules():
# - check manifest and content of manifest
try:
if os.path.isdir(fullpath):
# Module without init file can't be used as OpenPype module
# because the module class could not be imported
init_file = os.path.join(fullpath, "__init__.py")
if not os.path.exists(init_file):
log.info((
"Skipping module directory because of"
" missing \"__init__.py\" file. \"{}\""
).format(fullpath))
continue
import_module_from_dirpath(dirpath, filename, modules_key)
elif ext in (".py", ):
@ -369,14 +338,6 @@ class OpenPypeInterface:
pass
class MissingInteface(OpenPypeInterface):
"""Class representing missing interface class.
Used when interface is not available from currently registered paths.
"""
pass
@six.add_metaclass(ABCMeta)
class OpenPypeModule:
"""Base class of pype module.
@ -431,6 +392,28 @@ class OpenPypeModule:
"""
return {}
def cli(self, module_click_group):
"""Add commands to click group.
The best practice is to create a click group for the whole module, which
is used to separate commands.
class MyPlugin(OpenPypeModule):
...
def cli(self, module_click_group):
module_click_group.add_command(cli_main)
@click.group(<module name>, help="<Any help shown in cmd>")
def cli_main():
pass
@cli_main.command()
def mycommand():
print("my_command")
"""
pass
class OpenPypeAddOn(OpenPypeModule):
# Enable Addon by default

View file

@ -194,6 +194,7 @@ class SyncToAvalonEvent(BaseEvent):
ftrack_id = proj["data"].get("ftrackId")
if ftrack_id is None:
ftrack_id = self._update_project_ftrack_id()
proj["data"]["ftrackId"] = ftrack_id
self._avalon_ents_by_ftrack_id[ftrack_id] = proj
for ent in ents:
ftrack_id = ent["data"].get("ftrackId")
@ -584,6 +585,10 @@ class SyncToAvalonEvent(BaseEvent):
continue
ftrack_id = ftrack_id[0]
# Skip deleted projects
if action == "remove" and entityType == "show":
return True
# task modified, collect parent id of task, handle separately
if entity_type.lower() == "task":
changes = ent_info.get("changes") or {}

View file

@ -1,8 +1,10 @@
import os
import json
import collections
from openpype.modules import OpenPypeModule
import click
from openpype.modules import OpenPypeModule
from openpype_interfaces import (
ITrayModule,
IPluginPaths,
@ -224,8 +226,8 @@ class FtrackModule(
if not project_name:
return
attributes_changes = changes.get("attributes")
if not attributes_changes:
new_attr_values = new_value.get("attributes")
if not new_attr_values:
return
import ftrack_api
@ -275,7 +277,7 @@ class FtrackModule(
failed = {}
missing = {}
for key, value in attributes_changes.items():
for key, value in new_attr_values.items():
if key not in ca_keys:
continue
@ -349,12 +351,24 @@ class FtrackModule(
if "server_url" not in session_kwargs:
session_kwargs["server_url"] = self.ftrack_url
if "api_key" not in session_kwargs or "api_user" not in session_kwargs:
api_key = session_kwargs.get("api_key")
api_user = session_kwargs.get("api_user")
# First look into environments
# - both OpenPype tray and ftrack event server should have set them
# - ftrack event server may crash when it tries to load credentials
# from keyring
if not api_key or not api_user:
api_key = os.environ.get("FTRACK_API_KEY")
api_user = os.environ.get("FTRACK_API_USER")
if not api_key or not api_user:
from .lib import credentials
cred = credentials.get_credentials()
session_kwargs["api_user"] = cred.get("username")
session_kwargs["api_key"] = cred.get("api_key")
api_user = cred.get("username")
api_key = cred.get("api_key")
session_kwargs["api_user"] = api_user
session_kwargs["api_key"] = api_key
return ftrack_api.Session(**session_kwargs)
def tray_init(self):
@ -409,3 +423,62 @@ class FtrackModule(
return 0
hours_logged = (task_entity["time_logged"] / 60) / 60
return hours_logged
def get_credentials(self):
# type: () -> tuple
"""Get local Ftrack credentials."""
from .lib import credentials
cred = credentials.get_credentials(self.ftrack_url)
return cred.get("username"), cred.get("api_key")
def cli(self, click_group):
click_group.add_command(cli_main)
@click.group(FtrackModule.name, help="Ftrack module related commands.")
def cli_main():
pass
@cli_main.command()
@click.option("-d", "--debug", is_flag=True, help="Print debug messages")
@click.option("--ftrack-url", envvar="FTRACK_SERVER",
help="Ftrack server url")
@click.option("--ftrack-user", envvar="FTRACK_API_USER",
help="Ftrack api user")
@click.option("--ftrack-api-key", envvar="FTRACK_API_KEY",
help="Ftrack api key")
@click.option("--legacy", is_flag=True,
help="run event server without mongo storing")
@click.option("--clockify-api-key", envvar="CLOCKIFY_API_KEY",
help="Clockify API key.")
@click.option("--clockify-workspace", envvar="CLOCKIFY_WORKSPACE",
help="Clockify workspace")
def eventserver(
debug,
ftrack_url,
ftrack_user,
ftrack_api_key,
legacy,
clockify_api_key,
clockify_workspace
):
"""Launch ftrack event server.
This should ideally be run by a system service (such as systemd or upstart
on linux, or a Windows service).
"""
if debug:
os.environ["OPENPYPE_DEBUG"] = "3"
from .ftrack_server.event_server_cli import run_event_server
return run_event_server(
ftrack_url,
ftrack_user,
ftrack_api_key,
legacy,
clockify_api_key,
clockify_workspace
)

View file

@ -570,9 +570,15 @@ class BaseHandler(object):
if low_entity_type == "assetversion":
asset = entity["asset"]
parent = None
if asset:
parent = asset["parent"]
if parent:
if parent:
if parent.entity_type.lower() == "project":
return parent
if "project" in parent:
return parent["project"]
project_data = entity["link"][0]

View file

@ -0,0 +1,23 @@
# -*- coding: utf-8 -*-
"""Collect default Deadline server."""
import pyblish.api
import os
class CollectLocalFtrackCreds(pyblish.api.ContextPlugin):
"""Collect default Royal Render path."""
order = pyblish.api.CollectorOrder + 0.01
label = "Collect local ftrack credentials"
targets = ["rr_control"]
def process(self, context):
if os.getenv("FTRACK_API_USER") and os.getenv("FTRACK_API_KEY") and \
os.getenv("FTRACK_SERVER"):
return
ftrack_module = context.data["openPypeModules"]["ftrack"]
if ftrack_module.enabled:
creds = ftrack_module.get_credentials()
os.environ["FTRACK_API_USER"] = creds[0]
os.environ["FTRACK_API_KEY"] = creds[1]
os.environ["FTRACK_SERVER"] = ftrack_module.ftrack_url

View file

@ -0,0 +1,6 @@
from .royal_render_module import RoyalRenderModule
__all__ = (
"RoyalRenderModule",
)

View file

@ -0,0 +1,199 @@
# -*- coding: utf-8 -*-
"""Wrapper around Royal Render API."""
import sys
import os
from openpype.settings import get_project_settings
from openpype.lib.local_settings import OpenPypeSettingsRegistry
from openpype.lib import PypeLogger, run_subprocess
from .rr_job import RRJob, SubmitFile, SubmitterParameter
log = PypeLogger.get_logger("RoyalRender")
class Api:
_settings = None
RR_SUBMIT_CONSOLE = 1
RR_SUBMIT_API = 2
def __init__(self, settings, project=None):
self._settings = settings
self._initialize_rr(project)
def _initialize_rr(self, project=None):
# type: (str) -> None
"""Initialize RR Path.
Args:
project (str, Optional): Project name to set RR api in
context.
"""
if project:
project_settings = get_project_settings(project)
rr_path = (
project_settings
["royalrender"]
["rr_paths"]
)
else:
rr_path = (
self._settings
["modules"]
["royalrender"]
["rr_path"]
["default"]
)
os.environ["RR_ROOT"] = rr_path
self._rr_path = rr_path
def _get_rr_bin_path(self, rr_root=None):
# type: (str) -> str
"""Get path to RR bin folder."""
rr_root = rr_root or self._rr_path
is_64bit_python = sys.maxsize > 2 ** 32
rr_bin_path = ""
if sys.platform.lower() == "win32":
rr_bin_path = "/bin/win64"
if not is_64bit_python:
# we are using 32bit python
rr_bin_path = "/bin/win"
rr_bin_path = rr_bin_path.replace(
"/", os.path.sep
)
if sys.platform.lower() == "darwin":
rr_bin_path = "/bin/mac64"
if not is_64bit_python:
rr_bin_path = "/bin/mac"
if sys.platform.lower() == "linux":
rr_bin_path = "/bin/lx64"
return os.path.join(rr_root, rr_bin_path)
def _initialize_module_path(self):
# type: () -> None
"""Set RR modules for Python."""
# default for linux
rr_bin = self._get_rr_bin_path()
rr_module_path = os.path.join(rr_bin, "lx64/lib")
if sys.platform.lower() == "win32":
rr_module_path = rr_bin
rr_module_path = rr_module_path.replace(
"/", os.path.sep
)
if sys.platform.lower() == "darwin":
rr_module_path = os.path.join(rr_bin, "lib/python/27")
sys.path.append(os.path.join(self._rr_path, rr_module_path))
def create_submission(self, jobs, submitter_attributes, file_name=None):
# type: (list[RRJob], list[SubmitterParameter], str) -> SubmitFile
"""Create jobs submission file.
Args:
jobs (list): List of :class:`RRJob`
submitter_attributes (list): List of submitter attributes
:class:`SubmitterParameter` for whole submission batch.
file_name (str, optional): File path to write data to.
Returns:
str: XML data of job submission files.
"""
raise NotImplementedError
def submit_file(self, file, mode=RR_SUBMIT_CONSOLE):
# type: (SubmitFile, int) -> None
if mode == self.RR_SUBMIT_CONSOLE:
self._submit_using_console(file)
return
# RR v7 supports only Python 2.7 so we bail out in fear
# until there is support for Python 3 😰
raise NotImplementedError(
"Submission via RoyalRender API is not supported yet")
# self._submit_using_api(file)
def _submit_using_console(self, file):
# type: (SubmitFile) -> bool
rr_console = os.path.join(
self._get_rr_bin_path(),
"rrSubmitterconsole"
)
if sys.platform.lower() == "darwin":
if "/bin/mac64" in rr_console:
rr_console = rr_console.replace("/bin/mac64", "/bin/mac")
if sys.platform.lower() == "win32":
if "/bin/win64" in rr_console:
rr_console = rr_console.replace("/bin/win64", "/bin/win")
rr_console += ".exe"
args = [rr_console, file]
run_subprocess(" ".join(args), logger=log)
def _submit_using_api(self, file):
# type: (SubmitFile) -> None
"""Use RR API to submit jobs.
Args:
file (SubmitFile): Submit jobs definition.
Throws:
RoyalRenderException: When something fails.
"""
self._initialize_module_path()
import libpyRR2 as rrLib # noqa
from rrJob import getClass_JobBasics # noqa
import libpyRR2 as _RenderAppBasic # noqa
tcp = rrLib._rrTCP("") # noqa
rr_server = tcp.getRRServer()
if len(rr_server) == 0:
log.warning("Could not get RR server address.")
# TODO: Port is hardcoded in RR? If not, move it to Settings
if not tcp.setServer(rr_server, 7773):
log.error(
"Can not set RR server: {}".format(tcp.errorMessage()))
raise RoyalRenderException(tcp.errorMessage())
# TODO: This need UI and better handling of username/password.
# We can't store password in keychain as it is pulled multiple
# times and users on linux must enter keychain password every time.
# Probably best way until we setup our own user management would be
# to encrypt password and save it to json locally. Not bulletproof
# but at least it is not stored in plaintext.
reg = OpenPypeSettingsRegistry()
try:
rr_user = reg.get_item("rr_username")
rr_password = reg.get_item("rr_password")
except ValueError:
# user has no rr credentials set
pass
else:
# login to RR
tcp.setLogin(rr_user, rr_password)
job = getClass_JobBasics()
renderer = _RenderAppBasic()
# iterate over SubmitFile, set _JobBasic (job) and renderer
# and feed it to jobSubmitNew()
# not implemented yet
job.renderer = renderer
tcp.jobSubmitNew(job)
class RoyalRenderException(Exception):
"""Exception used in various error states coming from RR."""
pass

View file

@ -0,0 +1,23 @@
# -*- coding: utf-8 -*-
"""Collect default Deadline server."""
import pyblish.api
class CollectDefaultRRPath(pyblish.api.ContextPlugin):
"""Collect default Royal Render path."""
order = pyblish.api.CollectorOrder + 0.01
label = "Default Royal Render Path"
def process(self, context):
try:
rr_module = context.data.get(
"openPypeModules")["royalrender"]
except AttributeError:
msg = "Cannot get OpenPype Royal Render module."
self.log.error(msg)
raise AssertionError(msg)
# get default royal render path from royal render module
self.log.debug(rr_module.rr_paths)
context.data["defaultRRPath"] = rr_module.rr_paths["default"] # noqa: E501

View file

@ -0,0 +1,49 @@
# -*- coding: utf-8 -*-
import pyblish.api
class CollectRRPathFromInstance(pyblish.api.InstancePlugin):
"""Collect RR Path from instance."""
order = pyblish.api.CollectorOrder
label = "Royal Render Path from the Instance"
families = ["rendering"]
def process(self, instance):
instance.data["rrPath"] = self._collect_rr_path(instance)
self.log.info(
"Using {} for submission.".format(instance.data["rrPath"]))
@staticmethod
def _collect_rr_path(render_instance):
# type: (pyblish.api.Instance) -> str
"""Get Royal Render path from render instance."""
rr_settings = (
render_instance.context.data
["system_settings"]
["modules"]
["royalrender"]
)
try:
default_servers = rr_settings["rr_paths"]
project_servers = (
render_instance.context.data
["project_settings"]
["royalrender"]
["rr_paths"]
)
rr_servers = {
k: default_servers[k]
for k in project_servers
if k in default_servers
}
except AttributeError:
# Handle situation where we had only one path for Royal Render.
return render_instance.context.data["defaultRRPath"]
return rr_servers[
list(rr_servers.keys())[
int(render_instance.data.get("rrPaths"))
]
]

View file

@ -0,0 +1,209 @@
# -*- coding: utf-8 -*-
"""Collect sequences from Royal Render Job."""
import os
import re
import copy
import json
from pprint import pformat
import pyblish.api
from avalon import api
def collect(root,
regex=None,
exclude_regex=None,
frame_start=None,
frame_end=None):
"""Collect sequence collections in root"""
from avalon.vendor import clique
files = []
for filename in os.listdir(root):
# Must have extension
ext = os.path.splitext(filename)[1]
if not ext:
continue
# Only files
if not os.path.isfile(os.path.join(root, filename)):
continue
# Include and exclude regex
if regex and not re.search(regex, filename):
continue
if exclude_regex and re.search(exclude_regex, filename):
continue
files.append(filename)
# Match collections
# Support filenames like: projectX_shot01_0010.tiff with this regex
pattern = r"(?P<index>(?P<padding>0*)\d+)\.\D+\d?$"
collections, remainder = clique.assemble(files,
patterns=[pattern],
minimum_items=1)
# Ignore any remainders
if remainder:
print("Skipping remainder {}".format(remainder))
# Exclude any frames outside start and end frame.
for collection in collections:
for index in list(collection.indexes):
if frame_start is not None and index < frame_start:
collection.indexes.discard(index)
continue
if frame_end is not None and index > frame_end:
collection.indexes.discard(index)
continue
# Keep only collections that have at least a single frame
collections = [c for c in collections if c.indexes]
return collections
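A hypothetical usage of the helper above; the folder, regex and frame range are placeholders:

```python
# Gather e.g. "shot010_beauty.1001.exr" ... "shot010_beauty.1050.exr"
found = collect(
    root="/renders/shot010",
    regex=r"\.exr$",
    frame_start=1001,
    frame_end=1050,
)
for collection in found:
    print(collection.head, len(collection.indexes))
```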
class CollectSequencesFromJob(pyblish.api.ContextPlugin):
"""Gather file sequences from job directory.
When "OPENPYPE_PUBLISH_DATA" environment variable is set these paths
(folders or .json files) are parsed for image sequences. Otherwise the
current working directory is searched for file sequences.
"""
order = pyblish.api.CollectorOrder
targets = ["rr_control"]
label = "Collect Rendered Frames"
def process(self, context):
if os.environ.get("OPENPYPE_PUBLISH_DATA"):
self.log.debug(os.environ.get("OPENPYPE_PUBLISH_DATA"))
paths = os.environ["OPENPYPE_PUBLISH_DATA"].split(os.pathsep)
self.log.info("Collecting paths: {}".format(paths))
else:
cwd = context.get("workspaceDir", os.getcwd())
paths = [cwd]
for path in paths:
self.log.info("Loading: {}".format(path))
if path.endswith(".json"):
# Search using .json configuration
with open(path, "r") as f:
try:
data = json.load(f)
except Exception as exc:
self.log.error("Error loading json: "
"{} - Exception: {}".format(path, exc))
raise
cwd = os.path.dirname(path)
root_override = data.get("root")
if root_override:
if os.path.isabs(root_override):
root = root_override
else:
root = os.path.join(cwd, root_override)
else:
root = cwd
metadata = data.get("metadata")
if metadata:
session = metadata.get("session")
if session:
self.log.info("setting session using metadata")
api.Session.update(session)
os.environ.update(session)
else:
# Search in directory
data = {}
root = path
self.log.info("Collecting: {}".format(root))
regex = data.get("regex")
if regex:
self.log.info("Using regex: {}".format(regex))
collections = collect(root=root,
regex=regex,
exclude_regex=data.get("exclude_regex"),
frame_start=data.get("frameStart"),
frame_end=data.get("frameEnd"))
self.log.info("Found collections: {}".format(collections))
if data.get("subset") and len(collections) > 1:
self.log.error("Forced subset can only work with a single "
"found sequence")
raise RuntimeError("Invalid sequence")
fps = data.get("fps", 25)
# Get family from the data
families = data.get("families", ["render"])
if "render" not in families:
families.append("render")
if "ftrack" not in families:
families.append("ftrack")
if "review" not in families:
families.append("review")
for collection in collections:
instance = context.create_instance(str(collection))
self.log.info("Collection: %s" % list(collection))
# Ensure each instance gets a unique reference to the data
data = copy.deepcopy(data)
# If no subset provided, get it from collection's head
subset = data.get("subset", collection.head.rstrip("_. "))
# If no start or end frame provided, get it from collection
indices = list(collection.indexes)
start = data.get("frameStart", indices[0])
end = data.get("frameEnd", indices[-1])
# root = os.path.normpath(root)
# self.log.info("Source: {}}".format(data.get("source", "")))
ext = list(collection)[0].split('.')[-1]
instance.data.update({
"name": str(collection),
"family": families[0], # backwards compatibility / pyblish
"families": list(families),
"subset": subset,
"asset": data.get("asset", api.Session["AVALON_ASSET"]),
"stagingDir": root,
"frameStart": start,
"frameEnd": end,
"fps": fps,
"source": data.get('source', '')
})
instance.append(collection)
instance.context.data['fps'] = fps
if "representations" not in instance.data:
instance.data["representations"] = []
representation = {
'name': ext,
'ext': '{}'.format(ext),
'files': list(collection),
"stagingDir": root,
"anatomy_template": "render",
"fps": fps,
"tags": ['review']
}
instance.data["representations"].append(representation)
if data.get('user'):
context.data["user"] = data['user']
self.log.debug("Collected instance:\n"
"{}".format(pformat(instance.data)))

View file

@ -0,0 +1,46 @@
# -*- coding: utf-8 -*-
"""Module providing support for Royal Render."""
import os
import openpype.modules
from openpype.modules import OpenPypeModule
from openpype_interfaces import IPluginPaths
class RoyalRenderModule(OpenPypeModule, IPluginPaths):
"""Class providing basic Royal Render implementation logic."""
name = "royalrender"
@property
def api(self):
if not self._api:
# import royal render modules
from . import api as rr_api
self._api = rr_api.Api(self.settings)
return self._api
def __init__(self, manager, settings):
# type: (openpype.modules.base.ModulesManager, dict) -> None
self.rr_paths = {}
self._api = None
self.settings = settings
super(RoyalRenderModule, self).__init__(manager, settings)
def initialize(self, module_settings):
# type: (dict) -> None
rr_settings = module_settings[self.name]
self.enabled = rr_settings["enabled"]
self.rr_paths = rr_settings.get("rr_paths")
@staticmethod
def get_plugin_paths():
# type: () -> dict
"""Royal Render plugin paths.
Returns:
dict: Dictionary of plugin paths for RR.
"""
current_dir = os.path.dirname(os.path.abspath(__file__))
return {
"publish": [os.path.join(current_dir, "plugins", "publish")]
}

View file

@ -0,0 +1,256 @@
# -*- coding: utf-8 -*-
"""Python wrapper for RoyalRender XML job file."""
from xml.dom import minidom as md
import attr
from collections import namedtuple, OrderedDict
CustomAttribute = namedtuple("CustomAttribute", ["name", "value"])
@attr.s
class RRJob:
"""Mapping of Royal Render job file to a data class."""
# Required
# --------
# Name of your render application. Same as in the render config file.
# (Maya, Softimage)
Software = attr.ib() # type: str
# The OS the scene was created on, all texture paths are set on
# that OS. Possible values are windows, linux, osx
SceneOS = attr.ib() # type: str
# Renderer you use. Same as in the render config file
# (VRay, Mental Ray, Arnold)
Renderer = attr.ib() # type: str
# Version you want to render with. (5.11, 2010, 12)
Version = attr.ib() # type: str
# Name of the scene file with full path.
SceneName = attr.ib() # type: str
# Is the job enabled for submission?
# enabled by default
IsActive = attr.ib() # type: str
# Sequence settings of this job
SeqStart = attr.ib() # type: int
SeqEnd = attr.ib() # type: int
SeqStep = attr.ib() # type: int
SeqFileOffset = attr.ib() # type: int
# If you specify ImageDir, then ImageFilename has no path. If you do
# NOT specify ImageDir, then ImageFilename has to include the path.
# Same for ImageExtension.
# Important: Do not forget any _ or . in front or after the frame
# numbering. Usually ImageExtension always starts with a . (.tga, .exr)
ImageDir = attr.ib() # type: str
ImageFilename = attr.ib() # type: str
ImageExtension = attr.ib() # type: str
# Some applications always add a . or _ in front of the frame number.
# Set this variable to that character. The user can then change
# the filename at the rrSubmitter and the submitter keeps
# track of this character.
ImagePreNumberLetter = attr.ib() # type: str
# If you render a single file, e.g. Quicktime or Avi, then you have to
# set this value. Videos have to be rendered at once on one client.
ImageSingleOutputFile = attr.ib(default="false") # type: str
# Semi-Required (required for some render applications)
# -----------------------------------------------------
# The database of your scene file. In Maya and XSI called "project",
# in Lightwave "content dir"
SceneDatabaseDir = attr.ib(default=None) # type: str
# Required if you want to split frames on multiple clients
ImageWidth = attr.ib(default=None) # type: int
ImageHeight = attr.ib(default=None) # type: int
Camera = attr.ib(default=None) # type: str
Layer = attr.ib(default=None) # type: str
Channel = attr.ib(default=None) # type: str
# Optional
# --------
# Used for the RR render license function.
# E.g. If you render with mentalRay, then add mentalRay. If you render
# with Nuke and you use Furnace plugins in your comp, add Furnace.
# TODO: determine how this works for multiple plugins
RequiredPlugins = attr.ib(default=None) # type: str
# Frame Padding of the frame number in the rendered filename.
# Some render config files are setting the padding at render time.
ImageFramePadding = attr.ib(default=None) # type: str
# Some render applications support overriding the image format at
# the render commandline.
OverrideImageFormat = attr.ib(default=None) # type: str
# rrControl can display the name of additional channels that are
# rendered. Each channel requires these two values. ChannelFilename
# contains the full path.
ChannelFilename = attr.ib(default=None) # type: str
ChannelExtension = attr.ib(default=None) # type: str
# A value between 0 and 255. Each job gets the Pre ID attached as small
# letter to the main ID. A new main ID is generated for every machine
# for every 5/1000s.
PreID = attr.ib(default=None) # type: int
# When the job is received by the server, the server checks for other
# jobs sent from this machine. If a job with the PreID was found, then
# this jobs waits for the other job. Note: This flag can be used multiple
# times to wait for multiple jobs.
WaitForPreID = attr.ib(default=None) # type: int
# List of submitter options per job
# list item must be of `SubmitterParameter` type
SubmitterParameters = attr.ib(factory=list) # type: list
# List of Custom job attributes
# Royal Render supports custom attributes in format <CustomFoo> or
# <CustomSomeOtherAttr>
# list item must be of `CustomAttribute` named tuple
CustomAttributes = attr.ib(factory=list) # type: list
# Additional information for subsequent publish script and
# for better display in rrControl
UserName = attr.ib(default=None) # type: str
CustomSeQName = attr.ib(default=None) # type: str
CustomSHotName = attr.ib(default=None) # type: str
CustomVersionName = attr.ib(default=None) # type: str
CustomUserInfo = attr.ib(default=None) # type: str
SubmitMachine = attr.ib(default=None) # type: str
Color_ID = attr.ib(default=2) # type: int
RequiredLicenses = attr.ib(default=None) # type: str
# Additional frame info
Priority = attr.ib(default=50) # type: int
TotalFrames = attr.ib(default=None) # type: int
Tiled = attr.ib(default=None) # type: str
class SubmitterParameter:
"""Wrapper for Submitter Parameters."""
def __init__(self, parameter, *args):
# type: (str, list) -> None
self._parameter = parameter
self._values = args
def serialize(self):
# type: () -> str
"""Serialize submitter parameter as a string value.
This can later be used as a text node in the job xml file.
Returns:
str: concatenated string of parameter values.
"""
return '"{param}={val}"'.format(
param=self._parameter, val="~".join(self._values))
@attr.s
class SubmitFile:
"""Class wrapping Royal Render submission XML file."""
# Syntax version of the submission file.
syntax_version = attr.ib(default="6.0") # type: str
# Delete submission file after processing
DeleteXML = attr.ib(default=1) # type: int
# List of submitter options per job
# list item must be of `SubmitterParameter` type
SubmitterParameters = attr.ib(factory=list) # type: list
# List of jobs in submission batch.
# list item must be of type `RRJob`
Jobs = attr.ib(factory=list) # type: list
@staticmethod
def _process_submitter_parameters(parameters, dom, append_to):
# type: (list[SubmitterParameter], md.Document, md.Element) -> None
"""Take list of :class:`SubmitterParameter` and process it as XML.
This will take :class:`SubmitterParameter`, create XML element
for them and convert value to Royal Render compatible string
(options and values separated by ~)
Args:
parameters (list of SubmitterParameter): List of parameters.
dom (xml.dom.minidom.Document): XML Document
append_to (xml.dom.minidom.Element): Element to append to.
"""
for param in parameters:
if not isinstance(param, SubmitterParameter):
raise AttributeError(
"{} is not of type `SubmitterParameter`".format(param))
xml_parameter = dom.createElement("SubmitterParameter")
xml_parameter.appendChild(dom.createTextNode(param.serialize()))
append_to.appendChild(xml_parameter)
def serialize(self):
# type: () -> str
"""Return all data serialized as XML.
Returns:
str: XML data as string.
"""
def filter_data(a, v):
"""Skip private attributes."""
if a.name.startswith("_"):
return False
if v is None:
return False
return True
root = md.Document()
# root element: <RR_Job_File syntax_version="6.0">
job_file = root.createElement('RR_Job_File')
job_file.setAttribute("syntax_version", self.syntax_version)
# handle Submitter Parameters for batch
# <SubmitterParameter>foo=bar~baz~goo</SubmitterParameter>
self._process_submitter_parameters(
self.SubmitterParameters, root, job_file)
for job in self.Jobs: # type: RRJob
if not isinstance(job, RRJob):
raise AttributeError(
"{} is not of type `SubmitterParameter`".format(job))
xml_job = root.createElement("Job")
# handle Submitter Parameters for job
self._process_submitter_parameters(
job.SubmitterParameters, root, xml_job
)
job_custom_attributes = job.CustomAttributes
serialized_job = attr.asdict(
job, dict_factory=OrderedDict, filter=filter_data)
serialized_job.pop("CustomAttributes")
serialized_job.pop("SubmitterParameters")
for custom_attr in job_custom_attributes: # type: CustomAttribute
serialized_job["Custom{}".format(
custom_attr.name)] = custom_attr.value
for item, value in serialized_job.items():
xml_attr = root.createElement(item)
xml_attr.appendChild(
root.createTextNode(str(value))
)
xml_job.appendChild(xml_attr)
return root.toprettyxml(indent="\t")
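A minimal, hypothetical usage sketch of the classes above; the job values are placeholders, not a real render job:

```python
job = RRJob(
    Software="Maya",
    SceneOS="windows",
    Renderer="Arnold",
    Version="2022",
    SceneName="C:/projects/shot010/scene.ma",
    IsActive="true",
    SeqStart=1001,
    SeqEnd=1050,
    SeqStep=1,
    SeqFileOffset=0,
    ImageDir="C:/projects/shot010/renders",
    ImageFilename="shot010_beauty.",
    ImageExtension=".exr",
    ImagePreNumberLetter=".",
)

submission = SubmitFile(
    SubmitterParameters=[SubmitterParameter("OverrideImageFormat", "exr")],
    Jobs=[job],
)
# serialize() returns the whole <RR_Job_File> document as a string,
# ready to be written to disk and passed to rrSubmitterconsole.
xml_data = submission.serialize()
```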

View file

@ -0,0 +1,5 @@
## OpenPype RoyalRender integration plugins
### Installation
Copy the content of this folder to your `RR_ROOT` (the location of your studio-wide RoyalRender installation).

View file

@ -0,0 +1,170 @@
# -*- coding: utf-8 -*-
"""This is RR control plugin that runs on the job by user interaction.
It asks user for context to publish, getting it from OpenPype. In order to
run it needs `OPENPYPE_ROOT` to be set to know where to execute OpenPype.
"""
import rr # noqa
import rrGlobal # noqa
import subprocess
import os
import glob
import platform
import tempfile
import json
class OpenPypeContextSelector:
"""Class to handle publishing context determination in RR."""
def __init__(self):
self.job = rr.getJob()
self.context = None
self.openpype_executable = "openpype_gui"
if platform.system().lower() == "windows":
self.openpype_executable = "{}.exe".format(
self.openpype_executable)
op_path = os.environ.get("OPENPYPE_ROOT")
print("initializing ... {}".format(op_path))
if not op_path:
print("Warning: OpenPype root is not found.")
if platform.system().lower() == "windows":
print(" * trying to find OpenPype on local computer.")
op_path = os.path.join(
os.environ.get("PROGRAMFILES"),
"OpenPype", "openpype_console.exe"
)
if os.path.exists(op_path):
print(" - found OpenPype installation {}".format(op_path))
else:
# try to find in user local context
op_path = os.path.join(
os.environ.get("LOCALAPPDATA"),
"Programs",
"OpenPype", "openpype_console.exe"
)
if os.path.exists(op_path):
print(
" - found OpenPype installation {}".format(
op_path))
else:
raise Exception("Error: OpenPype was not found.")
self.openpype_root = op_path
# TODO: this should try to find metadata file. Either using
# jobs custom attributes or using environment variable
# or just using plain existence of file.
# self.context = self._process_metadata_file()
def _process_metadata_file(self):
"""Find and process metadata file.
Try to find metadata json file in job folder to get context from.
Returns:
dict: Context from metadata json file.
"""
image_dir = self.job.imageDir
metadata_files = glob.glob(
"{}{}*_metadata.json".format(image_dir, os.path.sep))
if not metadata_files:
return {}
raise NotImplementedError(
"Processing existing metadata not implemented yet.")
def process_job(self):
"""Process selected job.
This should process the selected job. If the context can be determined
automatically, no UI will be shown and publishing will proceed
directly.
"""
if not self.context:
self.show()
self.context["user"] = self.job.userName
self.run_publish()
def show(self):
"""Show UI for context selection.
Because of RR UI limitations, this must be done using OpenPype
itself.
"""
tf = tempfile.NamedTemporaryFile(delete=False)
context_file = tf.name
op_args = [os.path.join(self.openpype_root, self.openpype_executable),
"contextselection", tf.name]
tf.close()
print(">>> running {}".format(" ".join(op_args)))
subprocess.call(op_args)
with open(context_file, "r") as cf:
self.context = json.load(cf)
os.unlink(context_file)
print(">>> context: {}".format(self.context))
if not self.context or \
not self.context.get("project") or \
not self.context.get("asset") or \
not self.context.get("task"):
self._show_rr_warning("Context selection failed.")
return
# self.context["app_name"] = self.job.renderer.name
self.context["app_name"] = "maya/2020"
@staticmethod
def _show_rr_warning(text):
warning_dialog = rrGlobal.getGenericUI()
warning_dialog.addItem(rrGlobal.genUIType.label, "infoLabel", "")
warning_dialog.setText("infoLabel", text)
warning_dialog.addItem(
rrGlobal.genUIType.layoutH, "btnLayout", "")
warning_dialog.addItem(
rrGlobal.genUIType.closeButton, "Ok", "btnLayout")
warning_dialog.execute()
del warning_dialog
def run_publish(self):
"""Run publish process."""
env = {'AVALON_PROJECT': str(self.context.get("project")),
"AVALON_ASSET": str(self.context.get("asset")),
"AVALON_TASK": str(self.context.get("task")),
"AVALON_APP_NAME": str(self.context.get("app_name"))}
print(">>> setting environment:")
for k, v in env.items():
print(" {}: {}".format(k, v))
args = [os.path.join(self.openpype_root, self.openpype_executable),
'publish', '-t', "rr_control", "--gui",
os.path.join(self.job.imageDir,
os.path.dirname(self.job.imageFileName))
]
print(">>> running {}".format(" ".join(args)))
orig = os.environ.copy()
orig.update(env)
try:
subprocess.call(args, env=orig)
except subprocess.CalledProcessError as e:
self._show_rr_warning(" Publish failed [ {} ]".format(
e.returncode
))
print("running selector")
selector = OpenPypeContextSelector()
selector.process_job()

View file

@ -1,30 +0,0 @@
from abc import abstractmethod
from openpype.modules import OpenPypeInterface
class ISettingsChangeListener(OpenPypeInterface):
"""Module has plugin paths to return.
Expected result is dictionary with keys "publish", "create", "load" or
"actions" and values as list or string.
{
"publish": ["path/to/publish_plugins"]
}
"""
@abstractmethod
def on_system_settings_save(
self, old_value, new_value, changes, new_value_metadata
):
pass
@abstractmethod
def on_project_settings_save(
self, old_value, new_value, changes, project_name, new_value_metadata
):
pass
@abstractmethod
def on_project_anatomy_save(
self, old_value, new_value, changes, project_name, new_value_metadata
):
pass

View file

@ -24,25 +24,19 @@ class DropboxHandler(AbstractProvider):
)
return
provider_presets = self.presets.get(self.CODE)
if not provider_presets:
msg = "Sync Server: No provider presets for {}".format(self.CODE)
log.info(msg)
return
token = self.presets[self.CODE].get("token", "")
token = self.presets.get("token", "")
if not token:
msg = "Sync Server: No access token for dropbox provider"
log.info(msg)
return
team_folder_name = self.presets[self.CODE].get("team_folder_name", "")
team_folder_name = self.presets.get("team_folder_name", "")
if not team_folder_name:
msg = "Sync Server: No team folder name for dropbox provider"
log.info(msg)
return
acting_as_member = self.presets[self.CODE].get("acting_as_member", "")
acting_as_member = self.presets.get("acting_as_member", "")
if not acting_as_member:
msg = (
"Sync Server: No acting member for dropbox provider"
@ -51,13 +45,15 @@ class DropboxHandler(AbstractProvider):
return
self.dbx = None
try:
self.dbx = self._get_service(
token, acting_as_member, team_folder_name
)
except Exception as e:
log.info("Could not establish dropbox object: {}".format(e))
return
if self.presets["enabled"]:
try:
self.dbx = self._get_service(
token, acting_as_member, team_folder_name
)
except Exception as e:
log.info("Could not establish dropbox object: {}".format(e))
return
super(AbstractProvider, self).__init__()
@ -101,12 +97,12 @@ class DropboxHandler(AbstractProvider):
},
# roots could be overriden only on Project level, User cannot
{
"key": "roots",
"key": "root",
"label": "Roots",
"type": "dict-roots",
"object_type": {
"type": "path",
"multiplatform": True,
"multiplatform": False,
"multipath": False
}
}
@ -169,7 +165,7 @@ class DropboxHandler(AbstractProvider):
Returns:
(boolean)
"""
return self.dbx is not None
return self.presets["enabled"] and self.dbx is not None
@classmethod
def get_configurable_items(cls):

View file

@ -73,13 +73,7 @@ class GDriveHandler(AbstractProvider):
format(site_name))
return
provider_presets = self.presets.get(self.CODE)
if not provider_presets:
msg = "Sync Server: No provider presets for {}".format(self.CODE)
log.info(msg)
return
cred_path = self.presets[self.CODE].get("credentials_url", {}).\
cred_path = self.presets.get("credentials_url", {}).\
get(platform.system().lower()) or ''
if not os.path.exists(cred_path):
msg = "Sync Server: No credentials for gdrive provider " + \
@ -87,10 +81,12 @@ class GDriveHandler(AbstractProvider):
log.info(msg)
return
self.service = self._get_gd_service(cred_path)
self.service = None
if self.presets["enabled"]:
self.service = self._get_gd_service(cred_path)
self._tree = tree
self.active = True
self._tree = tree
self.active = True
def is_active(self):
"""
@ -98,7 +94,7 @@ class GDriveHandler(AbstractProvider):
Returns:
(boolean)
"""
return self.service is not None
return self.presets["enabled"] and self.service is not None
@classmethod
def get_system_settings_schema(cls):
@ -125,18 +121,20 @@ class GDriveHandler(AbstractProvider):
editable = [
# credentials could be overriden on Project or User level
{
'key': "credentials_url",
'label': "Credentials url",
'type': 'text'
"type": "path",
"key": "credentials_url",
"label": "Credentials url",
"multiplatform": True,
"placeholder": "Credentials url"
},
# roots could be overridden only on Project level, User cannot
{
"key": "roots",
"key": "root",
"label": "Roots",
"type": "dict-roots",
"object_type": {
"type": "path",
"multiplatform": True,
"multiplatform": False,
"multipath": False
}
}

View file

@ -50,7 +50,7 @@ class LocalDriveHandler(AbstractProvider):
# for non 'studio' sites, 'studio' is configured in Anatomy
editable = [
{
"key": "roots",
"key": "root",
"label": "Roots",
"type": "dict-roots",
"object_type": {
@ -73,7 +73,7 @@ class LocalDriveHandler(AbstractProvider):
"""
editable = [
{
'key': "roots",
'key': "root",
'label': "Roots",
'type': 'dict'
}
@ -89,6 +89,7 @@ class LocalDriveHandler(AbstractProvider):
if not os.path.isfile(source_path):
raise FileNotFoundError("Source file {} doesn't exist."
.format(source_path))
if overwrite:
thread = threading.Thread(target=self._copy,
args=(source_path, target_path))
@ -181,7 +182,10 @@ class LocalDriveHandler(AbstractProvider):
def _copy(self, source_path, target_path):
print("copying {}->{}".format(source_path, target_path))
shutil.copy(source_path, target_path)
try:
shutil.copy(source_path, target_path)
except shutil.SameFileError:
print("same files, skipping")
def _mark_progress(self, collection, file, representation, server, site,
source_path, target_path, direction):

View file

@ -1,8 +1,6 @@
import os
import os.path
import time
import sys
import six
import threading
import platform
@ -14,6 +12,7 @@ log = Logger().get_logger("SyncServer")
pysftp = None
try:
import pysftp
import paramiko
except (ImportError, SyntaxError):
pass
@ -37,7 +36,6 @@ class SFTPHandler(AbstractProvider):
def __init__(self, project_name, site_name, tree=None, presets=None):
self.presets = None
self.active = False
self.project_name = project_name
self.site_name = site_name
self.root = None
@ -49,22 +47,15 @@ class SFTPHandler(AbstractProvider):
format(site_name))
return
provider_presets = self.presets.get(self.CODE)
if not provider_presets:
msg = "Sync Server: No provider presets for {}".format(self.CODE)
log.warning(msg)
return
# store to instance for reconnect
self.sftp_host = provider_presets["sftp_host"]
self.sftp_port = provider_presets["sftp_port"]
self.sftp_user = provider_presets["sftp_user"]
self.sftp_pass = provider_presets["sftp_pass"]
self.sftp_key = provider_presets["sftp_key"]
self.sftp_key_pass = provider_presets["sftp_key_pass"]
self.sftp_host = presets["sftp_host"]
self.sftp_port = presets["sftp_port"]
self.sftp_user = presets["sftp_user"]
self.sftp_pass = presets["sftp_pass"]
self.sftp_key = presets["sftp_key"]
self.sftp_key_pass = presets["sftp_key_pass"]
self._tree = None
self.active = True
@property
def conn(self):
@ -80,7 +71,7 @@ class SFTPHandler(AbstractProvider):
Returns:
(boolean)
"""
return self.conn is not None
return self.presets["enabled"] and self.conn is not None
@classmethod
def get_system_settings_schema(cls):
@ -108,7 +99,7 @@ class SFTPHandler(AbstractProvider):
editable = [
# credentials could be overridden on Project or User level
{
'key': "sftp_server",
'key': "sftp_host",
'label': "SFTP host name",
'type': 'text'
},
@ -130,7 +121,8 @@ class SFTPHandler(AbstractProvider):
{
'key': "sftp_key",
'label': "SFTP user ssh key",
'type': 'path'
'type': 'path',
"multiplatform": True
},
{
'key': "sftp_key_pass",
@ -139,12 +131,12 @@ class SFTPHandler(AbstractProvider):
},
# roots could be overridden only on Project level, User cannot
{
"key": "roots",
"key": "root",
"label": "Roots",
"type": "dict-roots",
"object_type": {
"type": "path",
"multiplatform": True,
"multiplatform": False,
"multipath": False
}
}
@ -176,7 +168,8 @@ class SFTPHandler(AbstractProvider):
{
'key': "sftp_key",
'label': "SFTP user ssh key",
'type': 'path'
'type': 'path',
"multiplatform": True
},
{
'key': "sftp_key_pass",
@ -199,7 +192,7 @@ class SFTPHandler(AbstractProvider):
Format is important for usage of python's format ** approach
"""
# roots cannot be locally overridden
return self.presets['root']
return self.presets['roots']
def get_tree(self):
"""
@ -426,7 +419,10 @@ class SFTPHandler(AbstractProvider):
if self.sftp_key_pass:
conn_params['private_key_pass'] = self.sftp_key_pass
return pysftp.Connection(**conn_params)
try:
return pysftp.Connection(**conn_params)
except paramiko.ssh_exception.SSHException:
log.warning("Couldn't connect", exc_info=True)
def _mark_progress(self, collection, file, representation, server, site,
source_path, target_path, direction):

View file

@ -80,6 +80,10 @@ async def upload(module, collection, file, representation, provider_name,
remote_site_name,
True
)
module.handle_alternate_site(collection, representation, remote_site_name,
file["_id"], file_id)
return file_id
@ -131,6 +135,10 @@ async def download(module, collection, file, representation, provider_name,
local_site,
True
)
module.handle_alternate_site(collection, representation, local_site,
file["_id"], file_id)
return file_id
@ -246,6 +254,7 @@ class SyncServerThread(threading.Thread):
asyncio.ensure_future(self.check_shutdown(), loop=self.loop)
asyncio.ensure_future(self.sync_loop(), loop=self.loop)
log.info("Sync Server Started")
self.loop.run_forever()
except Exception:
log.warning(

View file

@ -109,6 +109,7 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
# some parts of code need to run sequentially, not in async
self.lock = None
self._sync_system_settings = None
# settings for all enabled projects for sync
self._sync_project_settings = None
self.sync_server_thread = None # asyncio requires new thread
@ -152,9 +153,9 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
if not site_name:
site_name = self.DEFAULT_SITE
self.reset_provider_for_file(collection,
representation_id,
site_name=site_name, force=force)
self.reset_site_on_representation(collection,
representation_id,
site_name=site_name, force=force)
# public facing API
def remove_site(self, collection, representation_id, site_name,
@ -176,10 +177,10 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
if not self.get_sync_project_setting(collection):
raise ValueError("Project not configured")
self.reset_provider_for_file(collection,
representation_id,
site_name=site_name,
remove=True)
self.reset_site_on_representation(collection,
representation_id,
site_name=site_name,
remove=True)
if remove_local_files:
self._remove_local_file(collection, representation_id, site_name)
@ -314,8 +315,8 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
"""
log.info("Pausing SyncServer for {}".format(representation_id))
self._paused_representations.add(representation_id)
self.reset_provider_for_file(collection, representation_id,
site_name=site_name, pause=True)
self.reset_site_on_representation(collection, representation_id,
site_name=site_name, pause=True)
def unpause_representation(self, collection, representation_id, site_name):
"""
@ -334,8 +335,8 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
except KeyError:
pass
# self.paused_representations is not persistent
self.reset_provider_for_file(collection, representation_id,
site_name=site_name, pause=False)
self.reset_site_on_representation(collection, representation_id,
site_name=site_name, pause=False)
def is_representation_paused(self, representation_id,
check_parents=False, project_name=None):
@ -769,6 +770,58 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
enabled_projects.append(project_name)
return enabled_projects
def handle_alternate_site(self, collection, representation, processed_site,
file_id, synced_file_id):
"""
For special use cases where one site vendors another.
Current use case is sftp site vendoring (exposing) same data as
regular site (studio). Each site is accessible for different
audience. 'studio' for artists in a studio, 'sftp' for externals.
Change of file status on one site actually means the same change on
the 'alternate' site. (E.g. artists publish to 'studio', 'sftp' is using
the same location >> the file is accessible on the 'sftp' site right
away.)
Args:
collection (str): name of project
representation (dict)
processed_site (str): real site_name of published/uploaded file
file_id (ObjectId): DB id of file handled
synced_file_id (str): id of the created file returned
by provider
"""
sites = self.sync_system_settings.get("sites", {})
sites[self.DEFAULT_SITE] = {"provider": "local_drive",
"alternative_sites": []}
alternate_sites = []
for site_name, site_info in sites.items():
conf_alternative_sites = site_info.get("alternative_sites", [])
if processed_site in conf_alternative_sites:
alternate_sites.append(site_name)
continue
if processed_site == site_name and conf_alternative_sites:
alternate_sites.extend(conf_alternative_sites)
continue
alternate_sites = set(alternate_sites)
for alt_site in alternate_sites:
query = {
"_id": representation["_id"]
}
elem = {"name": alt_site,
"created_dt": datetime.now(),
"id": synced_file_id}
self.log.debug("Adding alternate {} to {}".format(
alt_site, representation["_id"]))
self._add_site(collection, query,
[representation], elem,
alt_site, file_id=file_id, force=True)
""" End of Public API """
def get_local_file_path(self, collection, site_name, file_path):
@ -799,12 +852,19 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
def tray_init(self):
"""
Actual initialization of Sync Server.
Actual initialization of Sync Server for Tray.
Called when tray is initialized, it checks if module should be
enabled. If not, no initialization necessary.
"""
# import only in tray, because of Python2 hosts
self.server_init()
from .tray.app import SyncServerWindow
self.widget = SyncServerWindow(self)
def server_init(self):
"""Actual initialization of Sync Server."""
# import only in tray or Python3, because of Python2 hosts
from .sync_server import SyncServerThread
if not self.enabled:
@ -816,6 +876,7 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
return
self.lock = threading.Lock()
self.sync_server_thread = SyncServerThread(self)
def tray_start(self):
@ -829,6 +890,9 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
Returns:
None
"""
self.server_start()
def server_start(self):
if self.sync_project_settings and self.enabled:
self.sync_server_thread.start()
else:
@ -841,6 +905,9 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
Called from Module Manager
"""
self.server_exit()
def server_exit(self):
if not self.sync_server_thread:
return
@ -850,6 +917,7 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
log.info("Stopping sync server server")
self.sync_server_thread.is_running = False
self.sync_server_thread.stop()
log.info("Sync server stopped")
except Exception:
log.warning(
"Error has happened during Killing sync server",
@ -892,6 +960,14 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
return self._connection
@property
def sync_system_settings(self):
if self._sync_system_settings is None:
self._sync_system_settings = get_system_settings()["modules"].\
get("sync_server")
return self._sync_system_settings
@property
def sync_project_settings(self):
if self._sync_project_settings is None:
@ -977,9 +1053,7 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
(dict): {'studio': {'provider':'local_drive'...},
'MY_LOCAL': {'provider':....}}
"""
sys_sett = get_system_settings()
sync_sett = sys_sett["modules"].get("sync_server")
sync_sett = self.sync_system_settings
project_enabled = True
if project_name:
project_enabled = project_name in self.get_enabled_projects()
@ -1037,10 +1111,9 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
if provider:
return provider
sys_sett = get_system_settings()
sync_sett = sys_sett["modules"].get("sync_server")
for site, detail in sync_sett.get("sites", {}).items():
sites[site] = detail.get("provider")
sync_sett = self.sync_system_settings
for conf_site, detail in sync_sett.get("sites", {}).items():
sites[conf_site] = detail.get("provider")
return sites.get(site, 'N/A')
@ -1319,9 +1392,9 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
return -1, None
def reset_provider_for_file(self, collection, representation_id,
side=None, file_id=None, site_name=None,
remove=False, pause=None, force=False):
def reset_site_on_representation(self, collection, representation_id,
side=None, file_id=None, site_name=None,
remove=False, pause=None, force=False):
"""
Reset information about synchronization for particular 'file_id'
and provider.
@ -1407,9 +1480,12 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
update = {
"$set": {"files.$[f].sites.$[s]": elem}
}
if not isinstance(file_id, ObjectId):
file_id = ObjectId(file_id)
arr_filter = [
{'s.name': site_name},
{'f._id': ObjectId(file_id)}
{'f._id': file_id}
]
self._update_site(collection, query, update, arr_filter)

View file

@ -154,7 +154,7 @@ class SyncProjectListWidget(QtWidgets.QWidget):
selected_index.isValid() and \
not self._selection_changed:
mode = QtCore.QItemSelectionModel.Select | \
QtCore.QItemSelectionModel.Rows
QtCore.QItemSelectionModel.Rows
self.project_list.selectionModel().select(selected_index, mode)
if self.current_project:
@ -489,7 +489,7 @@ class _SyncRepresentationWidget(QtWidgets.QWidget):
format(check_progress))
continue
self.sync_server.reset_provider_for_file(
self.sync_server.reset_site_on_representation(
self.model.project,
representation_id,
site_name=site_name,
@ -872,7 +872,7 @@ class SyncRepresentationDetailWidget(_SyncRepresentationWidget):
format(check_progress))
continue
self.sync_server.reset_provider_for_file(
self.sync_server.reset_site_on_representation(
self.model.project,
self.representation_id,
site_name=site_name,

View file

@ -1,10 +1,7 @@
import os
import platform
from openpype.modules import OpenPypeModule
from openpype_interfaces import (
ITimersManager,
ITrayService
)
from openpype_interfaces import ITrayService
from avalon.api import AvalonMongoDB

View file

@ -8,14 +8,15 @@ in global space here until are required or used.
"""
import os
import click
from openpype.modules import (
JsonFilesSettingsDef,
OpenPypeAddOn
OpenPypeAddOn,
ModulesManager
)
# Import interface defined by this addon to be able find other addons using it
from openpype_interfaces import (
IExampleInterface,
IPluginPaths,
ITrayAction
)
@ -75,19 +76,6 @@ class ExampleAddon(OpenPypeAddOn, IPluginPaths, ITrayAction):
self._create_dialog()
def connect_with_modules(self, enabled_modules):
"""Method where you should find connected modules.
It is triggered by OpenPype modules manager at the best possible time.
Some addons and modules may require connecting with other modules
before their main logic is executed, so changes would require restarting
the whole process.
"""
self._connected_modules = []
for module in enabled_modules:
if isinstance(module, IExampleInterface):
self._connected_modules.append(module)
def _create_dialog(self):
# Don't recreate dialog if already exists
if self._dialog is not None:
@ -106,8 +94,6 @@ class ExampleAddon(OpenPypeAddOn, IPluginPaths, ITrayAction):
"""
# Make sure dialog is created
self._create_dialog()
# Change value of dialog by current state
self._dialog.set_connected_modules(self.get_connected_modules())
# Show dialog
self._dialog.open()
@ -130,3 +116,32 @@ class ExampleAddon(OpenPypeAddOn, IPluginPaths, ITrayAction):
return {
"publish": [os.path.join(current_dir, "plugins", "publish")]
}
def cli(self, click_group):
click_group.add_command(cli_main)
@click.group(ExampleAddon.name, help="Example addon dynamic cli commands.")
def cli_main():
pass
@cli_main.command()
def nothing():
"""Does nothing but print a message."""
print("You've triggered \"nothing\" command.")
@cli_main.command()
def show_dialog():
"""Show ExampleAddon dialog.
We don't have direct access to the addon through the cli so we have to
create it again.
"""
from openpype.tools.utils.lib import qt_app_context
manager = ModulesManager()
example_addon = manager.modules_by_name[ExampleAddon.name]
with qt_app_context():
example_addon.show_dialog()

View file

@ -1,28 +0,0 @@
""" Using interfaces is one way of connecting multiple OpenPype Addons/Modules.
Interfaces must be in `interfaces.py` file (or folder). Interfaces should not
import module logic or other module in global namespace. That is because
all of them must be imported before all OpenPype AddOns and Modules.
Ideally they should just define abstract and helper methods. If interface
require any logic or connection it should be defined in module.
Keep in mind that attributes and methods will be added to other addon
attributes and methods so they should be unique and ideally contain
addon name in it's name.
"""
from abc import abstractmethod
from openpype.modules import OpenPypeInterface
class IExampleInterface(OpenPypeInterface):
"""Example interface of addon."""
_example_module = None
def get_example_module(self):
return self._example_module
@abstractmethod
def example_method_of_example_interface(self):
pass

View file

@ -9,7 +9,8 @@ class MyExampleDialog(QtWidgets.QDialog):
self.setWindowTitle("Connected modules")
label_widget = QtWidgets.QLabel(self)
msg = "This is example dialog of example addon."
label_widget = QtWidgets.QLabel(msg, self)
ok_btn = QtWidgets.QPushButton("OK", self)
btns_layout = QtWidgets.QHBoxLayout()
@ -28,12 +29,3 @@ class MyExampleDialog(QtWidgets.QDialog):
def _on_ok_clicked(self):
self.done(1)
def set_connected_modules(self, connected_modules):
if connected_modules:
message = "\n".join(connected_modules)
else:
message = (
"Other enabled modules/addons are not using my interface."
)
self._label_widget.setText(message)

View file

@ -263,3 +263,31 @@ class ITrayService(ITrayModule):
"""Change icon of an QAction to orange circle."""
if self.menu_action:
self.menu_action.setIcon(self.get_icon_idle())
class ISettingsChangeListener(OpenPypeInterface):
"""Module has plugin paths to return.
Expected result is dictionary with keys "publish", "create", "load" or
"actions" and values as list or string.
{
"publish": ["path/to/publish_plugins"]
}
"""
@abstractmethod
def on_system_settings_save(
self, old_value, new_value, changes, new_value_metadata
):
pass
@abstractmethod
def on_project_settings_save(
self, old_value, new_value, changes, project_name, new_value_metadata
):
pass
@abstractmethod
def on_project_anatomy_save(
self, old_value, new_value, changes, project_name, new_value_metadata
):
pass
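
For illustration, a minimal sketch of a module reacting to settings changes through this interface could look like the following. The module name and the base-class contract beyond the three callbacks (`name`, `initialize`) are assumptions; the callback signatures come from `ISettingsChangeListener` above.

```python
# Minimal sketch only: module name and print-based handling are illustrative.
from openpype.modules import OpenPypeModule
from openpype_interfaces import ISettingsChangeListener


class SettingsAuditModule(OpenPypeModule, ISettingsChangeListener):
    name = "settings_audit"

    def initialize(self, modules_settings):
        # enabled unconditionally for this sketch
        self.enabled = True

    def on_system_settings_save(
        self, old_value, new_value, changes, new_value_metadata
    ):
        print("System settings were saved.")

    def on_project_settings_save(
        self, old_value, new_value, changes, project_name, new_value_metadata
    ):
        print("Project settings saved for project:", project_name)

    def on_project_anatomy_save(
        self, old_value, new_value, changes, project_name, new_value_metadata
    ):
        print("Project anatomy saved for project:", project_name)
```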

View file

@ -38,6 +38,8 @@ class CollectAnatomyInstanceData(pyblish.api.ContextPlugin):
order = pyblish.api.CollectorOrder + 0.49
label = "Collect Anatomy Instance data"
follow_workfile_version = False
def process(self, context):
self.log.info("Collecting anatomy data for all instances.")
@ -213,7 +215,10 @@ class CollectAnatomyInstanceData(pyblish.api.ContextPlugin):
context_asset_doc = context.data["assetEntity"]
for instance in context:
version_number = instance.data.get("version")
if self.follow_workfile_version:
version_number = context.data.get("version")
else:
version_number = instance.data.get("version")
# If version is not specified for instance or context
if version_number is None:
# TODO we should be able to change default version by studio

View file

@ -1029,29 +1029,8 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin):
"""
local_site = 'studio' # default
remote_site = None
sync_server_presets = None
if (instance.context.data["system_settings"]
["modules"]
["sync_server"]
["enabled"]):
sync_server_presets = (instance.context.data["project_settings"]
["global"]
["sync_server"])
local_site_id = openpype.api.get_local_site_id()
if sync_server_presets["enabled"]:
local_site = sync_server_presets["config"].\
get("active_site", "studio").strip()
if local_site == 'local':
local_site = local_site_id
remote_site = sync_server_presets["config"].get("remote_site")
if remote_site == local_site:
remote_site = None
if remote_site == 'local':
remote_site = local_site_id
always_accesible = []
sync_project_presets = None
rec = {
"_id": io.ObjectId(),
@ -1066,12 +1045,93 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin):
if sites:
rec["sites"] = sites
else:
system_sync_server_presets = (
instance.context.data["system_settings"]
["modules"]
["sync_server"])
log.debug("system_sett:: {}".format(system_sync_server_presets))
if system_sync_server_presets["enabled"]:
sync_project_presets = (
instance.context.data["project_settings"]
["global"]
["sync_server"])
if sync_project_presets and sync_project_presets["enabled"]:
local_site, remote_site = self._get_sites(sync_project_presets)
always_accesible = sync_project_presets["config"]. \
get("always_accessible_on", [])
already_attached_sites = {}
meta = {"name": local_site, "created_dt": datetime.now()}
rec["sites"] = [meta]
already_attached_sites[meta["name"]] = meta["created_dt"]
if remote_site:
if sync_project_presets and sync_project_presets["enabled"]:
# add remote
meta = {"name": remote_site.strip()}
rec["sites"].append(meta)
already_attached_sites[meta["name"]] = None
# add skeleton for site where it should be always synced to
for always_on_site in always_accesible:
if always_on_site not in already_attached_sites.keys():
meta = {"name": always_on_site.strip()}
rec["sites"].append(meta)
already_attached_sites[meta["name"]] = None
# add alternative sites
rec = self._add_alternative_sites(system_sync_server_presets,
already_attached_sites,
rec)
log.debug("final sites:: {}".format(rec["sites"]))
return rec
def _get_sites(self, sync_project_presets):
"""Returns tuple (local_site, remote_site)"""
local_site_id = openpype.api.get_local_site_id()
local_site = sync_project_presets["config"]. \
get("active_site", "studio").strip()
if local_site == 'local':
local_site = local_site_id
remote_site = sync_project_presets["config"].get("remote_site")
if remote_site == local_site:
remote_site = None
if remote_site == 'local':
remote_site = local_site_id
return local_site, remote_site
def _add_alternative_sites(self,
system_sync_server_presets,
already_attached_sites,
rec):
"""Loop through all configured sites and add alternatives.
See SyncServerModule.handle_alternate_site
"""
conf_sites = system_sync_server_presets.get("sites", {})
for site_name, site_info in conf_sites.items():
alt_sites = set(site_info.get("alternative_sites", []))
already_attached_keys = list(already_attached_sites.keys())
for added_site in already_attached_keys:
if added_site in alt_sites:
if site_name in already_attached_keys:
continue
meta = {"name": site_name}
real_created = already_attached_sites[added_site]
# alt site inherits state of 'created_dt'
if real_created:
meta["created_dt"] = real_created
rec["sites"].append(meta)
already_attached_sites[meta["name"]] = real_created
return rec

View file

@ -21,8 +21,9 @@ class ValidateVersion(pyblish.api.InstancePlugin):
if latest_version is not None:
msg = (
"Version `{0}` that you are trying to publish, already exists"
" in the database. Version in database: `{1}`. Please version "
"up your workfile to a higher version number than: `{1}`."
).format(version, latest_version)
"Version `{0}` from instance `{1}` that you are trying to"
" publish, already exists in the database. Version in"
" database: `{2}`. Please version up your workfile to a higher"
" version number than: `{2}`."
).format(version, instance.data["name"], latest_version)
assert (int(version) > int(latest_version)), msg

View file

@ -41,6 +41,25 @@ class PypeCommands:
user_role = "manager"
settings.main(user_role)
@staticmethod
def add_modules(click_func):
"""Modules/Addons can add their cli commands dynamically."""
from openpype.modules import ModulesManager
manager = ModulesManager()
log = PypeLogger.get_logger("AddModulesCLI")
for module in manager.modules:
try:
module.cli(click_func)
except Exception:
log.warning(
"Failed to add cli command for module \"{}\"".format(
module.name
)
)
return click_func
@staticmethod
def launch_eventservercli(*args):
from openpype_modules.ftrack.ftrack_server.event_server_cli import (
@ -60,7 +79,7 @@ class PypeCommands:
standalonepublish.main()
@staticmethod
def publish(paths, targets=None):
def publish(paths, targets=None, gui=False):
"""Start headless publishing.
Publish uses JSON files from the passed paths argument.
@ -69,20 +88,35 @@ class PypeCommands:
paths (list): Paths to jsons.
targets (string): What module should be targeted
(to choose validator for example)
gui (bool): Show publish UI.
Raises:
RuntimeError: When there is no path to process.
"""
if not any(paths):
raise RuntimeError("No publish paths specified")
from openpype.modules import ModulesManager
from openpype import install, uninstall
from openpype.api import Logger
from openpype.tools.utils.host_tools import show_publish
from openpype.tools.utils.lib import qt_app_context
# Register target and host
import pyblish.api
import pyblish.util
log = Logger.get_logger()
install()
manager = ModulesManager()
publish_paths = manager.collect_plugin_paths()["publish"]
for path in publish_paths:
pyblish.api.register_plugin_path(path)
if not any(paths):
raise RuntimeError("No publish paths specified")
env = get_app_environments_for_context(
os.environ["AVALON_PROJECT"],
os.environ["AVALON_ASSET"],
@ -91,32 +125,39 @@ class PypeCommands:
)
os.environ.update(env)
log = Logger.get_logger()
install()
pyblish.api.register_target("filesequence")
pyblish.api.register_host("shell")
if targets:
for target in targets:
print(f"setting target: {target}")
pyblish.api.register_target(target)
else:
pyblish.api.register_target("filesequence")
os.environ["OPENPYPE_PUBLISH_DATA"] = os.pathsep.join(paths)
log.info("Running publish ...")
# Error exit as soon as any error occurs.
error_format = "Failed {plugin.__name__}: {error} -- {error.traceback}"
plugins = pyblish.api.discover()
print("Using plugins:")
for plugin in plugins:
print(plugin)
for result in pyblish.util.publish_iter():
if result["error"]:
log.error(error_format.format(**result))
uninstall()
sys.exit(1)
if gui:
with qt_app_context():
show_publish()
else:
# Error exit as soon as any error occurs.
error_format = ("Failed {plugin.__name__}: "
"{error} -- {error.traceback}")
for result in pyblish.util.publish_iter():
if result["error"]:
log.error(error_format.format(**result))
# uninstall()
sys.exit(1)
log.info("Publish finished.")
uninstall()
@staticmethod
def remotepublishfromapp(project, batch_dir, host, user, targets=None):
@ -339,3 +380,28 @@ class PypeCommands:
cmd = "pytest {} {} {}".format(folder, mark_str, pyargs_str)
print("Running {}".format(cmd))
subprocess.run(cmd)
def syncserver(self, active_site):
"""Start running sync_server in background."""
import signal
os.environ["OPENPYPE_LOCAL_ID"] = active_site
def signal_handler(sig, frame):
print("You pressed Ctrl+C. Process ended.")
sync_server_module.server_exit()
sys.exit(0)
signal.signal(signal.SIGINT, signal_handler)
signal.signal(signal.SIGTERM, signal_handler)
from openpype.modules import ModulesManager
manager = ModulesManager()
sync_server_module = manager.modules_by_name["sync_server"]
sync_server_module.server_init()
sync_server_module.server_start()
import time
while True:
time.sleep(1.0)

View file

@ -1,5 +1,8 @@
{
"publish": {
"CollectAnatomyInstanceData": {
"follow_workfile_version": false
},
"ValidateEditorialAssetName": {
"enabled": true,
"optional": false
@ -315,10 +318,11 @@
},
"project_folder_structure": "{\"__project_root__\": {\"prod\": {}, \"resources\": {\"footage\": {\"plates\": {}, \"offline\": {}}, \"audio\": {}, \"art_dept\": {}}, \"editorial\": {}, \"assets[ftrack.Library]\": {\"characters[ftrack]\": {}, \"locations[ftrack]\": {}}, \"shots[ftrack.Sequence]\": {\"scripts\": {}, \"editorial[ftrack.Folder]\": {}}}}",
"sync_server": {
"enabled": true,
"enabled": false,
"config": {
"retry_cnt": "3",
"loop_delay": "60",
"always_accessible_on": [],
"active_site": "studio",
"remote_site": "studio"
},

View file

@ -30,8 +30,7 @@
},
"ExtractReview": {
"jpg_options": {
"tags": [
]
"tags": []
},
"mov_options": {
"tags": [

View file

@ -167,6 +167,16 @@
"ffmpeg": 48
}
},
"royalrender": {
"enabled": false,
"rr_paths": {
"default": {
"windows": "",
"darwin": "",
"linux": ""
}
}
},
"log_viewer": {
"enabled": true
},

View file

@ -762,6 +762,17 @@ class SyncServerProviders(DictConditionalEntity):
enum_children = []
for provider_code, configurables in system_settings_schema.items():
# any site could be exposed (vendored) by a different site
# e.g. studio site content could be mapped to an sftp site, a single file
# accessible via 2 different protocols (sites)
configurables.append(
{
"type": "list",
"key": "alternative_sites",
"label": "Alternative sites",
"object_type": "text"
}
)
label = provider_code_to_label.get(provider_code) or provider_code
enum_children.append({

View file

@ -67,7 +67,17 @@
"type": "list",
"key": "color_code",
"label": "Color codes for layers",
"object_type": "text"
"type": "enum",
"multiselection": true,
"enum_items": [
{ "red": "red" },
{ "orange": "orange" },
{ "yellowColor": "yellow" },
{ "grain": "green" },
{ "blue": "blue" },
{ "violet": "violet" },
{ "gray": "gray" }
]
},
{
"type": "list",

View file

@ -26,113 +26,33 @@
"key": "loop_delay",
"label": "Loop Delay"
},
{
"type": "list",
"key": "always_accessible_on",
"label": "Always accessible on sites",
"object_type": "text"
},
{
"type": "splitter"
},
{
"type": "text",
"key": "active_site",
"label": "Active Site"
"label": "User Default Active Site"
},
{
"type": "text",
"key": "remote_site",
"label": "Remote Site"
"label": "User Default Remote Site"
}
]
},
{
"type": "dict-modifiable",
"type": "sync-server-sites",
"collapsible": true,
"key": "sites",
"label": "Sites",
"collapsible_key": false,
"object_type": {
"type": "dict",
"children": [
{
"type": "dict",
"key": "gdrive",
"label": "Google Drive",
"collapsible": true,
"children": [
{
"type": "path",
"key": "credentials_url",
"label": "Credentials url",
"multiplatform": true
}
]
},
{
"type": "dict",
"key": "dropbox",
"label": "Dropbox",
"collapsible": true,
"children": [
{
"type": "text",
"key": "token",
"label": "Access Token"
},
{
"type": "text",
"key": "team_folder_name",
"label": "Team Folder Name"
},
{
"type": "text",
"key": "acting_as_member",
"label": "Acting As Member"
}
]
},
{
"type": "dict",
"key": "sftp",
"label": "SFTP",
"collapsible": true,
"children": [
{
"type": "text",
"key": "sftp_host",
"label": "SFTP host"
},
{
"type": "number",
"key": "sftp_port",
"label": "SFTP port"
},
{
"type": "text",
"key": "sftp_user",
"label": "SFTP user"
},
{
"type": "text",
"key": "sftp_pass",
"label": "SFTP pass"
},
{
"type": "path",
"key": "sftp_key",
"label": "SFTP user ssh key",
"multiplatform": true
},
{
"type": "text",
"key": "sftp_key_pass",
"label": "SFTP user ssh key password"
}
]
},
{
"type": "dict-modifiable",
"key": "root",
"label": "Roots",
"collapsable": false,
"collapsable_key": false,
"object_type": "text"
}
]
}
"collapsible_key": false
}
]
}

View file

@ -4,6 +4,20 @@
"key": "publish",
"label": "Publish plugins",
"children": [
{
"type": "dict",
"collapsible": true,
"key": "CollectAnatomyInstanceData",
"label": "Collect Anatomy Instance Data",
"is_group": true,
"children": [
{
"type": "boolean",
"key": "follow_workfile_version",
"label": "Follow workfile version"
}
]
},
{
"type": "dict",
"collapsible": true,

View file

@ -180,6 +180,31 @@
}
]
},
{
"type": "dict",
"key": "royalrender",
"label": "Royal Render",
"require_restart": true,
"collapsible": true,
"checkbox_key": "enabled",
"children": [
{
"type": "boolean",
"key": "enabled",
"label": "Enabled"
},
{
"type": "dict-modifiable",
"object_type": {
"type": "path",
"multiplatform": true
},
"key": "rr_paths",
"required_keys": ["default"],
"label": "Royal Render Root Paths"
}
]
},
{
"type": "dict",
"key": "log_viewer",

View file

@ -765,6 +765,11 @@ QScrollBar::add-page:vertical, QScrollBar::sub-page:vertical {
border: 1px solid {color:border};
border-radius: 0.1em;
}
/* Subset Manager */
#SubsetManagerDetailsText {}
#SubsetManagerDetailsText[state="invalid"] {
border: 1px solid #ff0000;
}
/* Python console interpreter */
#PythonInterpreterOutput, #PythonCodeEditor {

View file

@ -104,8 +104,8 @@ class TestPerformance():
"name": "mb",
"parent": {"oid": '{}'.format(id)},
"data": {
"path": "C:\\projects\\test_performance\\Assets\\Cylinder\\publish\\workfile\\workfileLookdev\\{}\\{}".format(version_str, file_name), # noqa
"template": "{root[work]}\\{project[name]}\\{hierarchy}\\{asset}\\publish\\{family}\\{subset}\\v{version:0>3}\\{project[code]}_{asset}_{subset}_v{version:0>3}<_{output}><.{frame:0>4}>.{representation}" # noqa
"path": "C:\\projects\\test_performance\\Assets\\Cylinder\\publish\\workfile\\workfileLookdev\\{}\\{}".format(version_str, file_name), # noqa: E501
"template": "{root[work]}\\{project[name]}\\{hierarchy}\\{asset}\\publish\\{family}\\{subset}\\v{version:0>3}\\{project[code]}_{asset}_{subset}_v{version:0>3}<_{output}><.{frame:0>4}>.{representation}" # noqa: E501
},
"type": "representation",
"schema": "openpype:representation-2.0"
@ -188,21 +188,21 @@ class TestPerformance():
create_files=False):
ret = [
{
"path": "{root[work]}" + "{root[work]}/test_performance/Assets/Cylinder/publish/workfile/workfileLookdev/v{:03d}/test_Cylinder_A_workfileLookdev_v{:03d}.dat".format(i, i), #noqa
"path": "{root[work]}" + "{root[work]}/test_performance/Assets/Cylinder/publish/workfile/workfileLookdev/v{:03d}/test_Cylinder_A_workfileLookdev_v{:03d}.dat".format(i, i), # noqa: E501
"_id": '{}'.format(file_id),
"hash": "temphash",
"sites": self.get_sites(self.MAX_NUMBER_OF_SITES),
"size": random.randint(0, self.MAX_FILE_SIZE_B)
},
{
"path": "{root[work]}" + "/test_performance/Assets/Cylinder/publish/workfile/workfileLookdev/v{:03d}/test_Cylinder_B_workfileLookdev_v{:03d}.dat".format(i, i), #noqa
"path": "{root[work]}" + "/test_performance/Assets/Cylinder/publish/workfile/workfileLookdev/v{:03d}/test_Cylinder_B_workfileLookdev_v{:03d}.dat".format(i, i), # noqa: E501
"_id": '{}'.format(file_id2),
"hash": "temphash",
"sites": self.get_sites(self.MAX_NUMBER_OF_SITES),
"size": random.randint(0, self.MAX_FILE_SIZE_B)
},
{
"path": "{root[work]}" + "/test_performance/Assets/Cylinder/publish/workfile/workfileLookdev/v{:03d}/test_Cylinder_C_workfileLookdev_v{:03d}.dat".format(i, i), #noqa
"path": "{root[work]}" + "/test_performance/Assets/Cylinder/publish/workfile/workfileLookdev/v{:03d}/test_Cylinder_C_workfileLookdev_v{:03d}.dat".format(i, i), # noqa: E501
"_id": '{}'.format(file_id3),
"hash": "temphash",
"sites": self.get_sites(self.MAX_NUMBER_OF_SITES),
@ -223,8 +223,8 @@ class TestPerformance():
ret = {}
ret['{}'.format(file_id)] = {
"path": "{root[work]}" +
"/test_performance/Assets/Cylinder/publish/workfile/workfileLookdev/" #noqa
"v{:03d}/test_CylinderA_workfileLookdev_v{:03d}.mb".format(i, i), # noqa
"/test_performance/Assets/Cylinder/publish/workfile/workfileLookdev/" # noqa: E501
"v{:03d}/test_CylinderA_workfileLookdev_v{:03d}.mb".format(i, i), # noqa: E501
"hash": "temphash",
"sites": ["studio"],
"size": 87236
@ -232,16 +232,16 @@ class TestPerformance():
ret['{}'.format(file_id2)] = {
"path": "{root[work]}" +
"/test_performance/Assets/Cylinder/publish/workfile/workfileLookdev/" #noqa
"v{:03d}/test_CylinderB_workfileLookdev_v{:03d}.mb".format(i, i), # noqa
"/test_performance/Assets/Cylinder/publish/workfile/workfileLookdev/" # noqa: E501
"v{:03d}/test_CylinderB_workfileLookdev_v{:03d}.mb".format(i, i), # noqa: E501
"hash": "temphash",
"sites": ["studio"],
"size": 87236
}
ret['{}'.format(file_id3)] = {
"path": "{root[work]}" +
"/test_performance/Assets/Cylinder/publish/workfile/workfileLookdev/" #noqa
"v{:03d}/test_CylinderC_workfileLookdev_v{:03d}.mb".format(i, i), # noqa
"/test_performance/Assets/Cylinder/publish/workfile/workfileLookdev/" # noqa: E501
"v{:03d}/test_CylinderC_workfileLookdev_v{:03d}.mb".format(i, i), # noqa: E501
"hash": "temphash",
"sites": ["studio"],
"size": 87236

View file

@ -243,9 +243,9 @@ class SubsetsModel(TreeModel, BaseRepresentationModel):
# update availability on active site when version changes
if self.sync_server.enabled and version:
site = self.active_site
query = self._repre_per_version_pipeline([version["_id"]],
site)
self.active_site,
self.remote_site)
docs = list(self.dbcon.aggregate(query))
if docs:
repre = docs.pop()
@ -801,47 +801,63 @@ class SubsetsModel(TreeModel, BaseRepresentationModel):
{"$unwind": "$files"},
{'$addFields': {
'order_local': {
'$filter': {'input': '$files.sites', 'as': 'p',
'cond': {'$eq': ['$$p.name', active_site]}
}}
'$filter': {
'input': '$files.sites', 'as': 'p',
'cond': {'$eq': ['$$p.name', active_site]}
}
}
}},
{'$addFields': {
'order_remote': {
'$filter': {'input': '$files.sites', 'as': 'p',
'cond': {'$eq': ['$$p.name', remote_site]}
}}
'$filter': {
'input': '$files.sites', 'as': 'p',
'cond': {'$eq': ['$$p.name', remote_site]}
}
}
}},
{'$addFields': {
'progress_local': {"$arrayElemAt": [{
'$cond': [{'$size': "$order_local.progress"},
"$order_local.progress",
# if exists created_dt count is as available
{'$cond': [
{'$size': "$order_local.created_dt"},
[1],
[0]
]}
]}, 0]}
'$cond': [
{'$size': "$order_local.progress"},
"$order_local.progress",
# if exists created_dt count is as available
{'$cond': [
{'$size': "$order_local.created_dt"},
[1],
[0]
]}
]},
0
]}
}},
{'$addFields': {
'progress_remote': {"$arrayElemAt": [{
'$cond': [{'$size': "$order_remote.progress"},
"$order_remote.progress",
# if exists created_dt count is as available
{'$cond': [
{'$size': "$order_remote.created_dt"},
[1],
[0]
]}
]}, 0]}
'$cond': [
{'$size': "$order_remote.progress"},
"$order_remote.progress",
# if exists created_dt count is as available
{'$cond': [
{'$size': "$order_remote.created_dt"},
[1],
[0]
]}
]},
0
]}
}},
{'$group': { # first group by repre
'_id': '$_id',
'parent': {'$first': '$parent'},
'avail_ratio_local': {'$first': {
'$divide': [{'$sum': "$progress_local"}, {'$sum': 1}]}},
'avail_ratio_remote': {'$first': {
'$divide': [{'$sum': "$progress_remote"}, {'$sum': 1}]}}
'avail_ratio_local': {
'$first': {
'$divide': [{'$sum': "$progress_local"}, {'$sum': 1}]
}
},
'avail_ratio_remote': {
'$first': {
'$divide': [{'$sum': "$progress_remote"}, {'$sum': 1}]
}
}
}},
{'$group': { # second group by parent, eg version_id
'_id': '$parent',

View file

@ -0,0 +1,19 @@
Subset manager
--------------
Simple UI showing the list of created subsets that will be published via Pyblish.
Useful for applications (Photoshop, AfterEffects, TVPaint, Harmony) which
store metadata about instances hidden from the user.

This UI allows listing all created subsets and removing them if needed (in
case the user doesn't want to publish them anymore, is using the workfile as
a starting file for a different task and the instances should be completely
different, etc.).

Host is expected to implement (see the sketch below):
- `list_instances` - returns a list of dictionaries (instances); each must
  contain a unique uuid field.
  example:
  ```[{"uuid":"15","active":true,"subset":"imageBG","family":"image","id":"pyblish.avalon.instance","asset":"Town"}]```
- `remove_instance(instance)` - removes the instance from the file's metadata;
  instance is a dictionary with a uuid field.
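
A minimal host-side sketch of these two functions, assuming an in-memory list as a stand-in for the host's real metadata storage (only the function names and the uuid requirement come from the description above):

```python
import uuid

# Stand-in for host metadata storage (e.g. layer metadata in Photoshop).
_INSTANCES = [
    {
        "uuid": str(uuid.uuid4()),
        "active": True,
        "subset": "imageBG",
        "family": "image",
        "id": "pyblish.avalon.instance",
        "asset": "Town",
    }
]


def list_instances():
    """Return list of instance dictionaries stored in the workfile."""
    return list(_INSTANCES)


def remove_instance(instance):
    """Remove instance (matched by its unique 'uuid') from workfile metadata."""
    _INSTANCES[:] = [
        item for item in _INSTANCES
        if item.get("uuid") != instance.get("uuid")
    ]
```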

View file

@ -0,0 +1,9 @@
from .window import (
show,
SubsetManagerWindow
)
__all__ = (
"show",
"SubsetManagerWindow"
)

View file

@ -0,0 +1,52 @@
import uuid
from Qt import QtCore, QtGui
from avalon import api
ITEM_ID_ROLE = QtCore.Qt.UserRole + 1
class InstanceModel(QtGui.QStandardItemModel):
def __init__(self, *args, **kwargs):
super(InstanceModel, self).__init__(*args, **kwargs)
self._instances_by_item_id = {}
def get_instance_by_id(self, item_id):
return self._instances_by_item_id.get(item_id)
def refresh(self):
self.clear()
self._instances_by_item_id = {}
instances = None
host = api.registered_host()
list_instances = getattr(host, "list_instances", None)
if list_instances:
instances = list_instances()
if not instances:
return
items = []
for instance_data in instances:
item_id = str(uuid.uuid4())
label = instance_data.get("label") or instance_data["subset"]
item = QtGui.QStandardItem(label)
item.setEnabled(True)
item.setEditable(False)
item.setData(item_id, ITEM_ID_ROLE)
items.append(item)
self._instances_by_item_id[item_id] = instance_data
if items:
self.invisibleRootItem().appendRows(items)
def headerData(self, section, orientation, role):
if role == QtCore.Qt.DisplayRole and section == 0:
return "Instance"
return super(InstanceModel, self).headerData(
section, orientation, role
)

View file

@ -0,0 +1,110 @@
import json
from Qt import QtWidgets, QtCore
class InstanceDetail(QtWidgets.QWidget):
save_triggered = QtCore.Signal()
def __init__(self, parent=None):
super(InstanceDetail, self).__init__(parent)
details_widget = QtWidgets.QPlainTextEdit(self)
details_widget.setObjectName("SubsetManagerDetailsText")
save_btn = QtWidgets.QPushButton("Save", self)
self._block_changes = False
self._editable = False
self._item_id = None
layout = QtWidgets.QVBoxLayout(self)
layout.setContentsMargins(0, 0, 0, 0)
layout.addWidget(details_widget, 1)
layout.addWidget(save_btn, 0, QtCore.Qt.AlignRight)
save_btn.clicked.connect(self._on_save_clicked)
details_widget.textChanged.connect(self._on_text_change)
self._details_widget = details_widget
self._save_btn = save_btn
self.set_editable(False)
def _on_save_clicked(self):
if self.is_valid():
self.save_triggered.emit()
def set_editable(self, enabled=True):
self._editable = enabled
self.update_state()
def update_state(self, valid=None):
editable = self._editable
if not self._item_id:
editable = False
self._save_btn.setVisible(editable)
self._details_widget.setReadOnly(not editable)
if valid is None:
valid = self.is_valid()
self._save_btn.setEnabled(valid)
self._set_invalid_detail(valid)
def _set_invalid_detail(self, valid):
state = ""
if not valid:
state = "invalid"
current_state = self._details_widget.property("state")
if current_state != state:
self._details_widget.setProperty("state", state)
self._details_widget.style().polish(self._details_widget)
def set_details(self, container, item_id):
self._item_id = item_id
text = "Nothing selected"
if item_id:
try:
text = json.dumps(container, indent=4)
except Exception:
text = str(container)
self._block_changes = True
self._details_widget.setPlainText(text)
self._block_changes = False
self.update_state()
def instance_data_from_text(self):
try:
jsoned = json.loads(self._details_widget.toPlainText())
except Exception:
jsoned = None
return jsoned
def item_id(self):
return self._item_id
def is_valid(self):
if not self._item_id:
return True
value = self._details_widget.toPlainText()
valid = False
try:
jsoned = json.loads(value)
if jsoned and isinstance(jsoned, dict):
valid = True
except Exception:
pass
return valid
def _on_text_change(self):
if self._block_changes or not self._item_id:
return
valid = self.is_valid()
self.update_state(valid)

View file

@ -0,0 +1,218 @@
import os
import sys
from Qt import QtWidgets, QtCore
from avalon import api
from avalon.vendor import qtawesome
from openpype import style
from openpype.tools.utils.lib import (
iter_model_rows,
qt_app_context
)
from openpype.tools.utils.models import RecursiveSortFilterProxyModel
from .model import (
InstanceModel,
ITEM_ID_ROLE
)
from .widgets import InstanceDetail
module = sys.modules[__name__]
module.window = None
class SubsetManagerWindow(QtWidgets.QDialog):
def __init__(self, parent=None):
super(SubsetManagerWindow, self).__init__(parent=parent)
self.setWindowTitle("Subset Manager 0.1")
self.setObjectName("SubsetManager")
if not parent:
self.setWindowFlags(
self.windowFlags() | QtCore.Qt.WindowStaysOnTopHint
)
self.resize(780, 430)
# Trigger refresh on first called show
self._first_show = True
left_side_widget = QtWidgets.QWidget(self)
# Header part
header_widget = QtWidgets.QWidget(left_side_widget)
# Filter input
filter_input = QtWidgets.QLineEdit(header_widget)
filter_input.setPlaceholderText("Filter subsets..")
# Refresh button
icon = qtawesome.icon("fa.refresh", color="white")
refresh_btn = QtWidgets.QPushButton(header_widget)
refresh_btn.setIcon(icon)
header_layout = QtWidgets.QHBoxLayout(header_widget)
header_layout.setContentsMargins(0, 0, 0, 0)
header_layout.addWidget(filter_input)
header_layout.addWidget(refresh_btn)
# Instances view
view = QtWidgets.QTreeView(left_side_widget)
view.setIndentation(0)
view.setContextMenuPolicy(QtCore.Qt.CustomContextMenu)
model = InstanceModel(view)
proxy = RecursiveSortFilterProxyModel()
proxy.setSourceModel(model)
proxy.setFilterCaseSensitivity(QtCore.Qt.CaseInsensitive)
view.setModel(proxy)
left_side_layout = QtWidgets.QVBoxLayout(left_side_widget)
left_side_layout.setContentsMargins(0, 0, 0, 0)
left_side_layout.addWidget(header_widget)
left_side_layout.addWidget(view)
details_widget = InstanceDetail(self)
layout = QtWidgets.QHBoxLayout(self)
layout.addWidget(left_side_widget, 0)
layout.addWidget(details_widget, 1)
filter_input.textChanged.connect(proxy.setFilterFixedString)
refresh_btn.clicked.connect(self._on_refresh_clicked)
view.clicked.connect(self._on_activated)
view.customContextMenuRequested.connect(self.on_context_menu)
details_widget.save_triggered.connect(self._on_save)
self._model = model
self._proxy = proxy
self._view = view
self._details_widget = details_widget
self._refresh_btn = refresh_btn
def _on_refresh_clicked(self):
self.refresh()
def _on_activated(self, index):
container = None
item_id = None
if index.isValid():
item_id = index.data(ITEM_ID_ROLE)
container = self._model.get_instance_by_id(item_id)
self._details_widget.set_details(container, item_id)
def _on_save(self):
host = api.registered_host()
if not hasattr(host, "save_instances"):
print("BUG: Host does not have \"save_instances\" method")
return
current_index = self._view.selectionModel().currentIndex()
if not current_index.isValid():
return
item_id = current_index.data(ITEM_ID_ROLE)
if item_id != self._details_widget.item_id():
return
item_data = self._details_widget.instance_data_from_text()
new_instances = []
for index in iter_model_rows(self._model, 0):
_item_id = index.data(ITEM_ID_ROLE)
if _item_id == item_id:
instance_data = item_data
else:
instance_data = self._model.get_instance_by_id(_item_id)
new_instances.append(instance_data)
host.save_instances(new_instances)
def on_context_menu(self, point):
point_index = self._view.indexAt(point)
item_id = point_index.data(ITEM_ID_ROLE)
instance_data = self._model.get_instance_by_id(item_id)
if instance_data is None:
return
# Prepare menu
menu = QtWidgets.QMenu(self)
actions = []
host = api.registered_host()
if hasattr(host, "remove_instance"):
action = QtWidgets.QAction("Remove instance", menu)
action.setData(host.remove_instance)
actions.append(action)
if hasattr(host, "select_instance"):
action = QtWidgets.QAction("Select instance", menu)
action.setData(host.select_instance)
actions.append(action)
if not actions:
actions.append(QtWidgets.QAction("* Nothing to do", menu))
for action in actions:
menu.addAction(action)
# Show menu under mouse
global_point = self._view.mapToGlobal(point)
action = menu.exec_(global_point)
if not action or not action.data():
return
# Process action
# TODO catch exceptions
function = action.data()
function(instance_data)
# Reset modified data
self.refresh()
def refresh(self):
self._details_widget.set_details(None, None)
self._model.refresh()
host = api.registered_host()
dev_mode = os.environ.get("AVALON_DEVELOP_MODE") or ""
editable = False
if dev_mode.lower() in ("1", "yes", "true", "on"):
editable = hasattr(host, "save_instances")
self._details_widget.set_editable(editable)
def showEvent(self, *args, **kwargs):
super(SubsetManagerWindow, self).showEvent(*args, **kwargs)
if self._first_show:
self._first_show = False
self.setStyleSheet(style.load_stylesheet())
self.refresh()
def show(root=None, debug=False, parent=None):
"""Display Scene Inventory GUI
Arguments:
debug (bool, optional): Run in debug-mode,
defaults to False
parent (QtCore.QObject, optional): When provided parent the interface
to this QObject.
"""
try:
module.window.close()
del module.window
except (RuntimeError, AttributeError):
pass
with qt_app_context():
window = SubsetManagerWindow(parent)
window.show()
module.window = window
# Pull window to the front.
module.window.raise_()
module.window.activateWindow()

View file

@ -133,22 +133,20 @@ class HostToolsHelper:
def get_subset_manager_tool(self, parent):
"""Create, cache and return subset manager tool window."""
if self._subset_manager_tool is None:
from avalon.tools.subsetmanager import Window
from openpype.tools.subsetmanager import SubsetManagerWindow
subset_manager_window = Window(parent=parent or self._parent)
subset_manager_window = SubsetManagerWindow(
parent=parent or self._parent
)
self._subset_manager_tool = subset_manager_window
return self._subset_manager_tool
def show_subset_manager(self, parent=None):
"""Show tool display/remove existing created instances."""
from avalon import style
subset_manager_tool = self.get_subset_manager_tool(parent)
subset_manager_tool.show()
subset_manager_tool.setStyleSheet(style.load_stylesheet())
# Pull window to the front.
subset_manager_tool.raise_()
subset_manager_tool.activateWindow()

View file

@ -1,3 +1,3 @@
# -*- coding: utf-8 -*-
"""Package declaring Pype version."""
__version__ = "3.6.0-nightly.5"
__version__ = "3.6.0"

poetry.lock: 1172 changed lines (generated file; diff suppressed because it is too large)

View file

@ -1,6 +1,6 @@
[tool.poetry]
name = "OpenPype"
version = "3.6.0-nightly.5" # OpenPype
version = "3.6.0" # OpenPype
description = "Open VFX and Animation pipeline with support."
authors = ["OpenPype Team <info@openpype.io>"]
license = "MIT License"

View file

@ -0,0 +1,10 @@
# -*- coding: utf-8 -*-
"""Test suite for User Settings."""
# import pytest
# from openpype.modules import ModulesManager
def test_rr_job():
# manager = ModulesManager()
# rr_module = manager.modules_by_name["royalrender"]
...

Binary file not shown (new image, 27 KiB).

Binary file not shown (new image, 13 KiB).

View file

@ -27,6 +27,38 @@ To use synchronization, *Site Sync* needs to be enabled globally in **OpenPype S
![Configure module](assets/site_sync_system.png)
### Sites
By default there are two sites created for each OpenPype installation:
- **studio** - default site - usually a centralized mounted disk accessible to all artists. The studio site is used if Site Sync is disabled.
- **local** - each workstation or server running OpenPype Tray receives its own site with a unique name. A workstation refers to itself as "local", however all other sites will see it under its unique ID.

Artists can find their site ID by opening the OpenPype Info tool (click on the version number in the tray app).

Many different sites can be created and configured on the system level, and some or all can be assigned to each project.

Each OpenPype Tray app works with two sites at one time. (The sites can be the same, in which case no syncing is done.)

Sites can be configured differently on a per-project basis.

Each new site needs to be created first in `System Settings`. The most important property of a site is its Provider; select one of the prepared Providers.
#### Alternative sites
This attribute is meant for special use cases only.

One of the use cases is an sftp site vendoring (exposing) the same data as a regular site (studio). Each site is accessible to a different audience: 'studio' for artists in a studio via a shared disk, 'sftp' for externals via an sftp server with the 'studio' drive mounted.

A change of file status on one site actually means the same change occurred on the 'alternate' site too. (E.g. artists publish to 'studio'; since 'sftp' is using the same location, the file is accessible on the 'sftp' site right away, with no need to sync it at all.)
##### Example
![Configure module](assets/site_sync_system_sites.png)
An admin created a new `sftp` site which is handled by the `SFTP` provider. Somewhere in the studio an SFTP server is deployed on a machine that has access to the `studio` drive (see the settings sketch below).

Alternative sites work both ways:
- everything published to `studio` is accessible on the `sftp` site too
- everything published to `sftp` (most probably via an artist's local disk: the artist publishes locally, the representation is marked to be synced to `sftp`; immediately after it is synced, it is marked as available on `studio` too, for artists in the studio to use)
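
For orientation, this is a rough sketch of the shape of the resulting `sites` section of the Site Sync system settings, as consumed by `SyncServerModule.handle_alternate_site`. Keys and values are illustrative, not a real studio configuration.

```python
# Illustrative only: conceptual shape of the "sites" section of the
# Site Sync system settings. A file published to "studio" is also marked
# as available on "sftp" because "studio" is listed in its
# "alternative_sites" (and vice versa, per handle_alternate_site).
sites = {
    "sftp": {
        "provider": "sftp",
        "alternative_sites": ["studio"],
    },
}
```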
## Project Settings
@ -45,21 +77,6 @@ Artists can also override which site they use as active and remote if need be.
![Local overrides](assets/site_sync_local_setting.png)
## Sites
By default there are two sites created for each OpenPype installation:
- **studio** - default site - usually a centralized mounted disk accessible to all artists. Studio site is used if Site Sync is disabled.
- **local** - each workstation or server running OpenPype Tray receives its own with unique site name. Workstation refers to itself as "local"however all other sites will see it under it's unique ID.
Artists can explore their site ID by opening OpenPype Info tool by clicking on a version number in the tray app.
Many different sites can be created and configured on the system level, and some or all can be assigned to each project.
Each OpenPype Tray app works with two sites at one time. (Sites can be the same, and no synching is done in this setup).
Sites could be configured differently per project basis.
## Providers
Each site implements a so-called `provider` which handles the most common operations (list files, copy files etc.) and provides an interface to a particular type of storage (disk, gdrive, aws, etc.).
@ -140,3 +157,42 @@ Beware that ssh key expects OpenSSH format (`.pem`) not a Putty format (`.ppk`)!
If a studio needs to use other services for cloud storage, or wants to implement a totally different storage provider, it can do so by writing its own provider plugin. We're working on developer documentation; for now we recommend looking at `abstract_provider.py` and `gdrive.py` inside `openpype/modules/sync_server/providers` and using them as a template. A skeleton is sketched below.
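
As an illustration only, a skeleton of such a plugin might look roughly like this. The constructor and method names mirror the handlers touched in this changeset (dropbox, gdrive, sftp); the import path and the rest of the `AbstractProvider` contract (upload/download/listing methods) are assumptions and should be taken from `abstract_provider.py` itself.

```python
# Rough skeleton of a custom Site Sync provider. Import path assumes the
# plugin lives next to the built-in providers; the full AbstractProvider
# contract has more methods - use abstract_provider.py and gdrive.py as
# the authoritative templates.
from openpype.api import Logger
from .abstract_provider import AbstractProvider

log = Logger().get_logger("SyncServer")


class MyCloudHandler(AbstractProvider):
    CODE = "mycloud"  # provider code referenced from site settings

    def __init__(self, project_name, site_name, tree=None, presets=None):
        self.presets = presets
        self.project_name = project_name
        self.site_name = site_name
        self.service = None

        if not presets:
            log.info(
                "Sync Server: There are no presets for {}.".format(site_name)
            )
            return

        if presets.get("enabled"):
            self.service = self._connect(presets)

    def _connect(self, presets):
        """Create and return a client/session for the storage backend."""
        # TODO replace with a real SDK/client of the chosen storage
        return None

    def is_active(self):
        """Returns whether provider is enabled and a connection was made."""
        return bool(
            self.presets
            and self.presets.get("enabled")
            and self.service is not None
        )
```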
### Running Site Sync in background
By default the Site Sync server synchronizes newly published files from the artist's machine into the configured remote location.

There might be a use case where you need to synchronize between "non-artist" sites, for example between the studio site and the cloud. In this case you need to run Site Sync as a background process from a command line (via a service etc.) 24/7.

To configure the sites to which all published files should eventually be synced, you first need to configure the `project_settings/global/sync_server/config/always_accessible_on` property in Settings (per project).
![Set another non artist remote site](assets/site_sync_always_on.png)
This is an example where:
- Site Sync is enabled for a project
- the default active and remote sites are set to `studio` - e.g. the standard process: everyone is working in a studio, publishing to a shared location etc.
- (this also allows any of the artists to work remotely; they would change their active site in their own Local Settings to `local` and configure a local root. Everything the artist publishes would then be saved first into their local folder AND eventually synchronized to the `studio` site)
- everything exported must also eventually be uploaded to the `sftp` site

This eventual synchronization between the `studio` and `sftp` sites must be physically handled by the background process.
As the current implementation relies heavily on Settings and Local Settings, the background process for a specific site ('studio' for example) must first be configured via the Tray for the `syncserver` command to work.

To do this:
- run OpenPype `Tray` with the environment variable OPENPYPE_LOCAL_ID set to the name of the active (source) site. In most use cases it would be `studio` (e.g. for backing up everything published to the studio site to a different cloud site)
- start `Tray`
- check `Local ID` in the information dialog after clicking on the version number in the Tray
- open `Local Settings` in the `Tray`
- configure the necessary active site and remote site for each project
- close `Tray`
- run OpenPype from a command line with the `syncserver` command and the `--active_site` argument

This is an example of how to trigger the background syncing process where the active (source) site is `studio`.
(It is expected that OpenPype is installed on the machine and `openpype_console` is on PATH. If not, use the full path to the executable.)
```shell
openpype_console syncserver --active_site studio
```
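
For reference, the same background process can also be started programmatically; the sketch below follows the `syncserver` command implementation from this changeset (the `studio` site name is illustrative).

```python
# Sketch based on the `syncserver` CLI command: initialize the sync server
# module, start its thread and keep the process alive until interrupted.
import os
import time

from openpype.modules import ModulesManager

os.environ["OPENPYPE_LOCAL_ID"] = "studio"  # active (source) site

manager = ModulesManager()
sync_server = manager.modules_by_name["sync_server"]
sync_server.server_init()
sync_server.server_start()

try:
    while True:
        time.sleep(1.0)
except KeyboardInterrupt:
    sync_server.server_exit()
```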