Mirror of https://github.com/ynput/ayon-core.git (synced 2026-01-02 00:44:52 +01:00)

Merge branch 'develop' into feature/OP-1915_flame-ftrack-direct-link

Commit 4d7113a662
106 changed files with 8860 additions and 3598 deletions
.github/workflows/prerelease.yml (vendored): 2 changes

@@ -33,7 +33,7 @@ jobs:
id: version
if: steps.version_type.outputs.type != 'skip'
run: |
RESULT=$(python ./tools/ci_tools.py --nightly)
RESULT=$(python ./tools/ci_tools.py --nightly --github_token ${{ secrets.GITHUB_TOKEN }})

echo ::set-output name=next_tag::$RESULT
CHANGELOG.md: 144 changes

@@ -1,86 +1,99 @@
# Changelog

## [3.6.0-nightly.4](https://github.com/pypeclub/OpenPype/tree/HEAD)
## [3.6.2-nightly.1](https://github.com/pypeclub/OpenPype/tree/HEAD)

[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.5.0...HEAD)

**🆕 New features**

- Maya : Colorspace configuration [\#2170](https://github.com/pypeclub/OpenPype/pull/2170)
- Blender: Added support for audio [\#2168](https://github.com/pypeclub/OpenPype/pull/2168)
- Flame: a host basic integration [\#2165](https://github.com/pypeclub/OpenPype/pull/2165)
[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.6.1...HEAD)

**🚀 Enhancements**

- Ftrack: Replace Queue with deque in event handlers logic [\#2204](https://github.com/pypeclub/OpenPype/pull/2204)
- Maya : Validate mesh ngons [\#2199](https://github.com/pypeclub/OpenPype/pull/2199)
- Delivery: Check 'frame' key in template for sequence delivery [\#2196](https://github.com/pypeclub/OpenPype/pull/2196)
- Usage of tools code [\#2185](https://github.com/pypeclub/OpenPype/pull/2185)
- Settings: Dictionary based on project roots [\#2184](https://github.com/pypeclub/OpenPype/pull/2184)
- Subset name: Be able to pass asset document to get subset name [\#2179](https://github.com/pypeclub/OpenPype/pull/2179)
- Tools: Experimental tools [\#2167](https://github.com/pypeclub/OpenPype/pull/2167)
- Loader: Refactor and use OpenPype stylesheets [\#2166](https://github.com/pypeclub/OpenPype/pull/2166)
- Add loader for linked smart objects in photoshop [\#2149](https://github.com/pypeclub/OpenPype/pull/2149)
- Burnins: DNxHD profiles handling [\#2142](https://github.com/pypeclub/OpenPype/pull/2142)
- Tools: Single access point for host tools [\#2139](https://github.com/pypeclub/OpenPype/pull/2139)
- Tools: SceneInventory in OpenPype [\#2255](https://github.com/pypeclub/OpenPype/pull/2255)
- Tools: Tasks widget [\#2251](https://github.com/pypeclub/OpenPype/pull/2251)
- Added endpoint for configured extensions [\#2221](https://github.com/pypeclub/OpenPype/pull/2221)

**🐛 Bug fixes**

- Burnins: Support mxf metadata [\#2247](https://github.com/pypeclub/OpenPype/pull/2247)

## [3.6.1](https://github.com/pypeclub/OpenPype/tree/3.6.1) (2021-11-16)

[Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.6.1-nightly.1...3.6.1)

**🐛 Bug fixes**

- Loader doesn't allow changing of version before loading [\#2254](https://github.com/pypeclub/OpenPype/pull/2254)

## [3.6.0](https://github.com/pypeclub/OpenPype/tree/3.6.0) (2021-11-15)

[Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.6.0-nightly.6...3.6.0)

### 📖 Documentation

- Add alternative sites for Site Sync [\#2206](https://github.com/pypeclub/OpenPype/pull/2206)
- Add command line way of running site sync server [\#2188](https://github.com/pypeclub/OpenPype/pull/2188)

**🆕 New features**

- Add validate active site button to sync queue on a project [\#2176](https://github.com/pypeclub/OpenPype/pull/2176)
- Maya : Colorspace configuration [\#2170](https://github.com/pypeclub/OpenPype/pull/2170)
- Blender: Added support for audio [\#2168](https://github.com/pypeclub/OpenPype/pull/2168)

**🚀 Enhancements**

- Tools: Subset manager in OpenPype [\#2243](https://github.com/pypeclub/OpenPype/pull/2243)
- General: Skip module directories without init file [\#2239](https://github.com/pypeclub/OpenPype/pull/2239)
- General: Static interfaces [\#2238](https://github.com/pypeclub/OpenPype/pull/2238)
- Style: Fix transparent image in style [\#2235](https://github.com/pypeclub/OpenPype/pull/2235)
- Add a "following workfile versioning" option on publish [\#2225](https://github.com/pypeclub/OpenPype/pull/2225)
- Modules: Module can add cli commands [\#2224](https://github.com/pypeclub/OpenPype/pull/2224)
- Webpublisher: Separate webpublisher logic [\#2222](https://github.com/pypeclub/OpenPype/pull/2222)
- Add both side availability on Site Sync sites to Loader [\#2220](https://github.com/pypeclub/OpenPype/pull/2220)
- Tools: Center loader and library loader on show [\#2219](https://github.com/pypeclub/OpenPype/pull/2219)
- Maya : Validate shape zero [\#2212](https://github.com/pypeclub/OpenPype/pull/2212)
- Maya : validate unique names [\#2211](https://github.com/pypeclub/OpenPype/pull/2211)
- Tools: OpenPype stylesheet in workfiles tool [\#2208](https://github.com/pypeclub/OpenPype/pull/2208)
- Ftrack: Replace Queue with deque in event handlers logic [\#2204](https://github.com/pypeclub/OpenPype/pull/2204)
- Tools: New select context dialog [\#2200](https://github.com/pypeclub/OpenPype/pull/2200)
- Maya : Validate mesh ngons [\#2199](https://github.com/pypeclub/OpenPype/pull/2199)
- Dirmap in Nuke [\#2198](https://github.com/pypeclub/OpenPype/pull/2198)
- Delivery: Check 'frame' key in template for sequence delivery [\#2196](https://github.com/pypeclub/OpenPype/pull/2196)
- Settings: Site sync project settings improvement [\#2193](https://github.com/pypeclub/OpenPype/pull/2193)
- Usage of tools code [\#2185](https://github.com/pypeclub/OpenPype/pull/2185)
- Settings: Dictionary based on project roots [\#2184](https://github.com/pypeclub/OpenPype/pull/2184)
- Subset name: Be able to pass asset document to get subset name [\#2179](https://github.com/pypeclub/OpenPype/pull/2179)

**🐛 Bug fixes**

- Ftrack: Sync project ftrack id cache issue [\#2250](https://github.com/pypeclub/OpenPype/pull/2250)
- Ftrack: Session creation and Prepare project [\#2245](https://github.com/pypeclub/OpenPype/pull/2245)
- Added queue for studio processing in PS [\#2237](https://github.com/pypeclub/OpenPype/pull/2237)
- Python 2: Unicode to string conversion [\#2236](https://github.com/pypeclub/OpenPype/pull/2236)
- Fix - enum for color coding in PS [\#2234](https://github.com/pypeclub/OpenPype/pull/2234)
- Pyblish Tool: Fix targets handling [\#2232](https://github.com/pypeclub/OpenPype/pull/2232)
- Ftrack: Base event fix of 'get\_project\_from\_entity' method [\#2214](https://github.com/pypeclub/OpenPype/pull/2214)
- Maya : multiple subsets review broken [\#2210](https://github.com/pypeclub/OpenPype/pull/2210)
- Fix - different command used for Linux and Mac OS [\#2207](https://github.com/pypeclub/OpenPype/pull/2207)
- Tools: Workfiles tool don't use avalon widgets [\#2205](https://github.com/pypeclub/OpenPype/pull/2205)
- Ftrack: Fill missing ftrack id on mongo project [\#2203](https://github.com/pypeclub/OpenPype/pull/2203)
- Project Manager: Fix copying of tasks [\#2191](https://github.com/pypeclub/OpenPype/pull/2191)
- StandalonePublisher: Source validator don't expect representations [\#2190](https://github.com/pypeclub/OpenPype/pull/2190)
- Blender: Fix trying to pack an image when the shader node has no texture [\#2183](https://github.com/pypeclub/OpenPype/pull/2183)
- MacOS: Launching of applications may cause Permissions error [\#2175](https://github.com/pypeclub/OpenPype/pull/2175)
- Maya: review viewport settings [\#2177](https://github.com/pypeclub/OpenPype/pull/2177)
- Maya: Aspect ratio [\#2174](https://github.com/pypeclub/OpenPype/pull/2174)
- Blender: Fix 'Deselect All' with object not in 'Object Mode' [\#2163](https://github.com/pypeclub/OpenPype/pull/2163)
- Tools: Stylesheets are applied after tool show [\#2161](https://github.com/pypeclub/OpenPype/pull/2161)
- Maya: Collect render - fix UNC path support 🐛 [\#2158](https://github.com/pypeclub/OpenPype/pull/2158)
- Maya: Fix hotbox broken by scriptsmenu [\#2151](https://github.com/pypeclub/OpenPype/pull/2151)
- Ftrack: Ignore save warnings exception in Prepare project action [\#2150](https://github.com/pypeclub/OpenPype/pull/2150)
- Added validator for source files for Standalone Publisher [\#2138](https://github.com/pypeclub/OpenPype/pull/2138)

**Merged pull requests:**

- Settings: Site sync project settings improvement [\#2193](https://github.com/pypeclub/OpenPype/pull/2193)
- Add validate active site button to sync queue on a project [\#2176](https://github.com/pypeclub/OpenPype/pull/2176)
- Bump pillow from 8.2.0 to 8.3.2 [\#2162](https://github.com/pypeclub/OpenPype/pull/2162)
- Bump axios from 0.21.1 to 0.21.4 in /website [\#2059](https://github.com/pypeclub/OpenPype/pull/2059)

## [3.5.0](https://github.com/pypeclub/OpenPype/tree/3.5.0) (2021-10-17)

[Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.5.0-nightly.8...3.5.0)

**Deprecated:**

- Maya: Change mayaAscii family to mayaScene [\#2106](https://github.com/pypeclub/OpenPype/pull/2106)

**🆕 New features**

- Added project and task into context change message in Maya [\#2131](https://github.com/pypeclub/OpenPype/pull/2131)
- Add ExtractBurnin to photoshop review [\#2124](https://github.com/pypeclub/OpenPype/pull/2124)
- PYPE-1218 - changed namespace to contain subset name in Maya [\#2114](https://github.com/pypeclub/OpenPype/pull/2114)
- Added running configurable disk mapping command before start of OP [\#2091](https://github.com/pypeclub/OpenPype/pull/2091)
- SFTP provider [\#2073](https://github.com/pypeclub/OpenPype/pull/2073)
- Houdini: simple HDA workflow [\#2072](https://github.com/pypeclub/OpenPype/pull/2072)
- Maya: Validate setdress top group [\#2068](https://github.com/pypeclub/OpenPype/pull/2068)

**🚀 Enhancements**

- Maya: make rig validators configurable in settings [\#2137](https://github.com/pypeclub/OpenPype/pull/2137)
- Settings: Updated readme for entity types in settings [\#2132](https://github.com/pypeclub/OpenPype/pull/2132)
- Nuke: unified clip loader [\#2128](https://github.com/pypeclub/OpenPype/pull/2128)
- Settings UI: Project model refreshing and sorting [\#2104](https://github.com/pypeclub/OpenPype/pull/2104)
- Create Read From Rendered - Disable Relative paths by default [\#2093](https://github.com/pypeclub/OpenPype/pull/2093)
- Added choosing different dirmap mapping if workfile synched locally [\#2088](https://github.com/pypeclub/OpenPype/pull/2088)
- General: Remove IdleManager module [\#2084](https://github.com/pypeclub/OpenPype/pull/2084)
- Tray UI: Message box about missing settings defaults [\#2080](https://github.com/pypeclub/OpenPype/pull/2080)
- Tray UI: Show menu where first click happened [\#2079](https://github.com/pypeclub/OpenPype/pull/2079)
- Global: add global validators to settings [\#2078](https://github.com/pypeclub/OpenPype/pull/2078)
- Use CRF for burnin when available [\#2070](https://github.com/pypeclub/OpenPype/pull/2070)
- Project manager: Filter first item after selection of project [\#2069](https://github.com/pypeclub/OpenPype/pull/2069)
- Nuke: Adding `still` image family workflow [\#2064](https://github.com/pypeclub/OpenPype/pull/2064)
- Maya: validate authorized loaded plugins [\#2062](https://github.com/pypeclub/OpenPype/pull/2062)

**🐛 Bug fixes**
@@ -88,36 +101,11 @@
- Fix - oiiotool wasn't recognized even if present [\#2129](https://github.com/pypeclub/OpenPype/pull/2129)
- General: Disk mapping group [\#2120](https://github.com/pypeclub/OpenPype/pull/2120)
- Hiero: publishing effect first time makes wrong resources path [\#2115](https://github.com/pypeclub/OpenPype/pull/2115)
- Add startup script for Houdini Core. [\#2110](https://github.com/pypeclub/OpenPype/pull/2110)
- TVPaint: Behavior name of loop also accept repeat [\#2109](https://github.com/pypeclub/OpenPype/pull/2109)
- Ftrack: Project settings save custom attributes skip unknown attributes [\#2103](https://github.com/pypeclub/OpenPype/pull/2103)
- Blender: Fix NoneType error when animation\_data is missing for a rig [\#2101](https://github.com/pypeclub/OpenPype/pull/2101)
- Fix broken import in sftp provider [\#2100](https://github.com/pypeclub/OpenPype/pull/2100)
- Global: Fix docstring on publish plugin extract review [\#2097](https://github.com/pypeclub/OpenPype/pull/2097)
- Delivery Action Files Sequence fix [\#2096](https://github.com/pypeclub/OpenPype/pull/2096)
- General: Cloud mongo ca certificate issue [\#2095](https://github.com/pypeclub/OpenPype/pull/2095)
- TVPaint: Creator use context from workfile [\#2087](https://github.com/pypeclub/OpenPype/pull/2087)
- Blender: fix texture missing when publishing blend files [\#2085](https://github.com/pypeclub/OpenPype/pull/2085)
- General: Startup validations oiio tool path fix on linux [\#2083](https://github.com/pypeclub/OpenPype/pull/2083)
- Deadline: Collect deadline server does not check existence of deadline key [\#2082](https://github.com/pypeclub/OpenPype/pull/2082)
- Blender: fixed Curves with modifiers in Rigs [\#2081](https://github.com/pypeclub/OpenPype/pull/2081)
- Nuke UI scaling [\#2077](https://github.com/pypeclub/OpenPype/pull/2077)
- Maya: Fix multi-camera renders [\#2065](https://github.com/pypeclub/OpenPype/pull/2065)
- Fix Sync Queue when project disabled [\#2063](https://github.com/pypeclub/OpenPype/pull/2063)

**Merged pull requests:**

- Bump pywin32 from 300 to 301 [\#2086](https://github.com/pypeclub/OpenPype/pull/2086)

## [3.4.1](https://github.com/pypeclub/OpenPype/tree/3.4.1) (2021-09-23)

[Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.4.1-nightly.1...3.4.1)

**🐛 Bug fixes**

- Timers manger: Typo fix [\#2058](https://github.com/pypeclub/OpenPype/pull/2058)
- Hiero: Editorial fixes [\#2057](https://github.com/pypeclub/OpenPype/pull/2057)

## [3.4.0](https://github.com/pypeclub/OpenPype/tree/3.4.0) (2021-09-17)

[Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.4.0-nightly.6...3.4.0)
@@ -57,6 +57,17 @@ def tray(debug=False):
    PypeCommands().launch_tray(debug)


@PypeCommands.add_modules
@main.group(help="Run command line arguments of OpenPype modules")
@click.pass_context
def module(ctx):
    """Module specific commands created dynamically.

    These commands are generated dynamically by currently loaded addon/modules.
    """
    pass


@main.command()
@click.option("-d", "--debug", is_flag=True, help="Print debug messages")
@click.option("--ftrack-url", envvar="FTRACK_SERVER",

@@ -147,7 +158,9 @@ def extractenvironments(output_json_path, project, asset, task, app):
@click.option("-d", "--debug", is_flag=True, help="Print debug messages")
@click.option("-t", "--targets", help="Targets module", default=None,
              multiple=True)
def publish(debug, paths, targets):
@click.option("-g", "--gui", is_flag=True,
              help="Show Publish UI", default=False)
def publish(debug, paths, targets, gui):
    """Start CLI publishing.

    Publish collects json from paths provided as an argument.

@@ -155,7 +168,7 @@ def publish(debug, paths, targets):
    """
    if debug:
        os.environ['OPENPYPE_DEBUG'] = '3'
    PypeCommands.publish(list(paths), targets)
    PypeCommands.publish(list(paths), targets, gui)


@main.command()

@@ -166,7 +179,7 @@ def publish(debug, paths, targets):
@click.option("-p", "--project", help="Project")
@click.option("-t", "--targets", help="Targets", default=None,
              multiple=True)
def remotepublishfromapp(debug, project, path, host, targets=None, user=None):
def remotepublishfromapp(debug, project, path, host, user=None, targets=None):
    """Start CLI publishing.

    Publish collects json from paths provided as an argument.

@@ -174,18 +187,19 @@ def remotepublishfromapp(debug, project, path, host, targets=None, user=None):
    """
    if debug:
        os.environ['OPENPYPE_DEBUG'] = '3'
    PypeCommands.remotepublishfromapp(project, path, host, user,
                                      targets=targets)
    PypeCommands.remotepublishfromapp(
        project, path, host, user, targets=targets
    )


@main.command()
@click.argument("path")
@click.option("-d", "--debug", is_flag=True, help="Print debug messages")
@click.option("-h", "--host", help="Host")
@click.option("-u", "--user", help="User email address")
@click.option("-p", "--project", help="Project")
@click.option("-t", "--targets", help="Targets", default=None,
              multiple=True)
def remotepublish(debug, project, path, host, targets=None, user=None):
def remotepublish(debug, project, path, user=None, targets=None):
    """Start CLI publishing.

    Publish collects json from paths provided as an argument.

@@ -193,7 +207,7 @@ def remotepublish(debug, project, path, host, targets=None, user=None):
    """
    if debug:
        os.environ['OPENPYPE_DEBUG'] = '3'
    PypeCommands.remotepublish(project, path, host, user, targets=targets)
    PypeCommands.remotepublish(project, path, user, targets=targets)


@main.command()

@@ -345,3 +359,28 @@ def run(script):
def runtests(folder, mark, pyargs):
    """Run all automatic tests after proper initialization via start.py"""
    PypeCommands().run_tests(folder, mark, pyargs)


@main.command()
@click.option("-d", "--debug",
              is_flag=True, help=("Run process in debug mode"))
@click.option("-a", "--active_site", required=True,
              help="Name of active stie")
def syncserver(debug, active_site):
    """Run sync site server in background.

    Some Site Sync use cases need to expose site to another one.
    For example if majority of artists work in studio, they are not using
    SS at all, but if you want to expose published assets to 'studio' site
    to SFTP for only a couple of artists, some background process must
    mark published assets to live on multiple sites (they might be
    physically in same location - mounted shared disk).

    Process mimics OP Tray with specific 'active_site' name, all
    configuration for this "dummy" user comes from Setting or Local
    Settings (configured by starting OP Tray with env
    var OPENPYPE_LOCAL_ID set to 'active_site'.
    """
    if debug:
        os.environ['OPENPYPE_DEBUG'] = '3'
    PypeCommands().syncserver(active_site)
@@ -13,6 +13,7 @@ from pyblish import api as pyblish
from openpype.lib import any_outdated
import openpype.hosts.maya
from openpype.hosts.maya.lib import copy_workspace_mel
from openpype.lib.path_tools import HostDirmap
from . import menu, lib

log = logging.getLogger("openpype.hosts.maya")

@@ -30,7 +31,8 @@ def install():

    project_settings = get_project_settings(os.getenv("AVALON_PROJECT"))
    # process path mapping
    process_dirmap(project_settings)
    dirmap_processor = MayaDirmap("maya", project_settings)
    dirmap_processor.process_dirmap()

    pyblish.register_plugin_path(PUBLISH_PATH)
    avalon.register_plugin_path(avalon.Loader, LOAD_PATH)

@@ -60,110 +62,6 @@ def install():
    avalon.data["familiesStateToggled"] = ["imagesequence"]


def process_dirmap(project_settings):
    # type: (dict) -> None
    """Go through all paths in Settings and set them using `dirmap`.

    If artists has Site Sync enabled, take dirmap mapping directly from
    Local Settings when artist is syncing workfile locally.

    Args:
        project_settings (dict): Settings for current project.

    """
    local_mapping = _get_local_sync_dirmap(project_settings)
    if not project_settings["maya"].get("maya-dirmap") and not local_mapping:
        return

    mapping = local_mapping or \
        project_settings["maya"]["maya-dirmap"]["paths"] \
        or {}
    mapping_enabled = project_settings["maya"]["maya-dirmap"]["enabled"] \
        or bool(local_mapping)

    if not mapping or not mapping_enabled:
        return
    if mapping.get("source-path") and mapping_enabled is True:
        log.info("Processing directory mapping ...")
        cmds.dirmap(en=True)
        for k, sp in enumerate(mapping["source-path"]):
            try:
                print("{} -> {}".format(sp, mapping["destination-path"][k]))
                cmds.dirmap(m=(sp, mapping["destination-path"][k]))
                cmds.dirmap(m=(mapping["destination-path"][k], sp))
            except IndexError:
                # missing corresponding destination path
                log.error(("invalid dirmap mapping, missing corresponding"
                           " destination directory."))
                break
            except RuntimeError:
                log.error("invalid path {} -> {}, mapping not registered".format(
                    sp, mapping["destination-path"][k]
                ))
                continue


def _get_local_sync_dirmap(project_settings):
    """
    Returns dirmap if synch to local project is enabled.

    Only valid mapping is from roots of remote site to local site set in
    Local Settings.

    Args:
        project_settings (dict)
    Returns:
        dict : { "source-path": [XXX], "destination-path": [YYYY]}
    """
    import json
    mapping = {}

    if not project_settings["global"]["sync_server"]["enabled"]:
        log.debug("Site Sync not enabled")
        return mapping

    from openpype.settings.lib import get_site_local_overrides
    from openpype.modules import ModulesManager

    manager = ModulesManager()
    sync_module = manager.modules_by_name["sync_server"]

    project_name = os.getenv("AVALON_PROJECT")
    sync_settings = sync_module.get_sync_project_setting(
        os.getenv("AVALON_PROJECT"), exclude_locals=False, cached=False)
    log.debug(json.dumps(sync_settings, indent=4))

    active_site = sync_module.get_local_normalized_site(
        sync_module.get_active_site(project_name))
    remote_site = sync_module.get_local_normalized_site(
        sync_module.get_remote_site(project_name))
    log.debug("active {} - remote {}".format(active_site, remote_site))

    if active_site == "local" \
            and project_name in sync_module.get_enabled_projects()\
            and active_site != remote_site:
        overrides = get_site_local_overrides(os.getenv("AVALON_PROJECT"),
                                             active_site)
        for root_name, value in overrides.items():
            if os.path.isdir(value):
                try:
                    mapping["destination-path"] = [value]
                    mapping["source-path"] = [sync_settings["sites"]\
                                              [remote_site]\
                                              ["root"]\
                                              [root_name]]
                except IndexError:
                    # missing corresponding destination path
                    log.debug("overrides".format(overrides))
                    log.error(
                        ("invalid dirmap mapping, missing corresponding"
                         " destination directory."))
                    break

    log.debug("local sync mapping:: {}".format(mapping))
    return mapping


def uninstall():
    pyblish.deregister_plugin_path(PUBLISH_PATH)
    avalon.deregister_plugin_path(avalon.Loader, LOAD_PATH)

@@ -326,3 +224,12 @@ def before_workfile_save(workfile_path):

    workdir = os.path.dirname(workfile_path)
    copy_workspace_mel(workdir)


class MayaDirmap(HostDirmap):
    def on_enable_dirmap(self):
        cmds.dirmap(en=True)

    def dirmap_routine(self, source_path, destination_path):
        cmds.dirmap(m=(source_path, destination_path))
        cmds.dirmap(m=(destination_path, source_path))
@@ -45,9 +45,12 @@ class ExtractPlayblast(openpype.api.Extractor):
        # get cameras
        camera = instance.data['review_camera']

        override_viewport_options = (
            self.capture_preset['Viewport Options']
            ['override_viewport_options']
        )
        preset = lib.load_capture_preset(data=self.capture_preset)


        preset['camera'] = camera
        preset['start_frame'] = start
        preset['end_frame'] = end

@@ -92,6 +95,12 @@ class ExtractPlayblast(openpype.api.Extractor):

        self.log.info('using viewport preset: {}'.format(preset))

        # Update preset with current panel setting
        # if override_viewport_options is turned off
        if not override_viewport_options:
            panel_preset = capture.parse_active_view()
            preset.update(panel_preset)

        path = capture.capture(**preset)

        self.log.debug("playblast path {}".format(path))
@@ -32,6 +32,9 @@ class ExtractThumbnail(openpype.api.Extractor):
        capture_preset = (
            instance.context.data["project_settings"]['maya']['publish']['ExtractPlayblast']['capture_preset']
        )
        override_viewport_options = (
            capture_preset['Viewport Options']['override_viewport_options']
        )

        try:
            preset = lib.load_capture_preset(data=capture_preset)

@@ -86,6 +89,12 @@ class ExtractThumbnail(openpype.api.Extractor):
        # playblast and viewer
        preset['viewer'] = False

        # Update preset with current panel setting
        # if override_viewport_options is turned off
        if not override_viewport_options:
            panel_preset = capture.parse_active_view()
            preset.update(panel_preset)

        path = capture.capture(**preset)
        playblast = self._fix_playblast_output_path(path)
@@ -24,6 +24,10 @@ from openpype.api import (
    ApplicationManager
)
from openpype.tools.utils import host_tools
from openpype.lib.path_tools import HostDirmap
from openpype.settings import get_project_settings
from openpype.modules import ModulesManager

import nuke

from .utils import set_context_favorites

@@ -1795,3 +1799,69 @@ def recreate_instance(origin_node, avalon_data=None):
        dn.setInput(0, new_node)

    return new_node


class NukeDirmap(HostDirmap):
    def __init__(self, host_name, project_settings, sync_module, file_name):
        """
        Args:
            host_name (str): Nuke
            project_settings (dict): settings of current project
            sync_module (SyncServerModule): to limit reinitialization
            file_name (str): full path of referenced file from workfiles
        """
        self.host_name = host_name
        self.project_settings = project_settings
        self.file_name = file_name
        self.sync_module = sync_module

        self._mapping = None  # cache mapping

    def on_enable_dirmap(self):
        pass

    def dirmap_routine(self, source_path, destination_path):
        log.debug("{}: {}->{}".format(self.file_name,
                                      source_path, destination_path))
        source_path = source_path.lower().replace(os.sep, '/')
        destination_path = destination_path.lower().replace(os.sep, '/')
        if platform.system().lower() == "windows":
            self.file_name = self.file_name.lower().replace(
                source_path, destination_path)
        else:
            self.file_name = self.file_name.replace(
                source_path, destination_path)


class DirmapCache:
    """Caching class to get settings and sync_module easily and only once."""
    _project_settings = None
    _sync_module = None

    @classmethod
    def project_settings(cls):
        if cls._project_settings is None:
            cls._project_settings = get_project_settings(
                os.getenv("AVALON_PROJECT"))
        return cls._project_settings

    @classmethod
    def sync_module(cls):
        if cls._sync_module is None:
            cls._sync_module = ModulesManager().modules_by_name["sync_server"]
        return cls._sync_module


def dirmap_file_name_filter(file_name):
    """Nuke callback function with single full path argument.

    Checks project settings for potential mapping from source to dest.
    """
    dirmap_processor = NukeDirmap("nuke",
                                  DirmapCache.project_settings(),
                                  DirmapCache.sync_module(),
                                  file_name)
    dirmap_processor.process_dirmap()
    if os.path.exists(dirmap_processor.file_name):
        return dirmap_processor.file_name
    return file_name
@@ -6,10 +6,10 @@ from openpype.hosts.nuke.api.lib import (

import nuke
from openpype.api import Logger
from openpype.hosts.nuke.api.lib import dirmap_file_name_filter

log = Logger().get_logger(__name__)

# fix ffmpeg settings on script
nuke.addOnScriptLoad(on_script_load)

@@ -20,4 +20,6 @@ nuke.addOnScriptSave(check_inventory_versions)
# # set apply all workfile settings on script load and save
nuke.addOnScriptLoad(WorkfileSettings().set_context_settings)

nuke.addFilenameFilter(dirmap_file_name_filter)

log.info('Automatic syncing of write file knob to script version')
@@ -17,11 +17,10 @@ class ClosePS(pyblish.api.ContextPlugin):
    active = True

    hosts = ["photoshop"]
    targets = ["remotepublish"]

    def process(self, context):
        self.log.info("ClosePS")
        if not os.environ.get("IS_HEADLESS"):
            return

        stub = photoshop.stub()
        self.log.info("Shutting down PS")
@@ -21,6 +21,7 @@ class CollectRemoteInstances(pyblish.api.ContextPlugin):
    label = "Instances"
    order = pyblish.api.CollectorOrder
    hosts = ["photoshop"]
    targets = ["remotepublish"]

    # configurable by Settings
    color_code_mapping = []

@@ -28,9 +29,6 @@ class CollectRemoteInstances(pyblish.api.ContextPlugin):
    def process(self, context):
        self.log.info("CollectRemoteInstances")
        self.log.info("mapping:: {}".format(self.color_code_mapping))
        if not os.environ.get("IS_HEADLESS"):
            self.log.debug("Not headless publishing, skipping.")
            return

        # parse variant if used in webpublishing, comes from webpublisher batch
        batch_dir = os.environ.get("OPENPYPE_PUBLISH_DATA")
@@ -0,0 +1,84 @@
"""Loads batch context from json and continues in publish process.

Provides:
    context -> Loaded batch file.
"""

import os

import pyblish.api
from avalon import io
from openpype.lib.plugin_tools import (
    parse_json,
    get_batch_asset_task_info
)
from openpype.lib.remote_publish import get_webpublish_conn


class CollectBatchData(pyblish.api.ContextPlugin):
    """Collect batch data from json stored in 'OPENPYPE_PUBLISH_DATA' env dir.

    The directory must contain 'manifest.json' file where batch data should be
    stored.
    """
    # must be really early, context values are only in json file
    order = pyblish.api.CollectorOrder - 0.495
    label = "Collect batch data"
    host = ["webpublisher"]

    def process(self, context):
        batch_dir = os.environ.get("OPENPYPE_PUBLISH_DATA")

        assert batch_dir, (
            "Missing `OPENPYPE_PUBLISH_DATA`")

        assert os.path.exists(batch_dir), \
            "Folder {} doesn't exist".format(batch_dir)

        project_name = os.environ.get("AVALON_PROJECT")
        if project_name is None:
            raise AssertionError(
                "Environment `AVALON_PROJECT` was not found."
                "Could not set project `root` which may cause issues."
            )

        batch_data = parse_json(os.path.join(batch_dir, "manifest.json"))

        context.data["batchDir"] = batch_dir
        context.data["batchData"] = batch_data

        asset_name, task_name, task_type = get_batch_asset_task_info(
            batch_data["context"]
        )

        os.environ["AVALON_ASSET"] = asset_name
        io.Session["AVALON_ASSET"] = asset_name
        os.environ["AVALON_TASK"] = task_name
        io.Session["AVALON_TASK"] = task_name

        context.data["asset"] = asset_name
        context.data["task"] = task_name
        context.data["taskType"] = task_type

        self._set_ctx_path(batch_data)

    def _set_ctx_path(self, batch_data):
        dbcon = get_webpublish_conn()

        batch_id = batch_data["batch"]
        ctx_path = batch_data["context"]["path"]
        self.log.info("ctx_path: {}".format(ctx_path))
        self.log.info("batch_id: {}".format(batch_id))
        if ctx_path and batch_id:
            self.log.info("Updating log record")
            dbcon.update_one(
                {
                    "batch_id": batch_id,
                    "status": "in_progress"
                },
                {
                    "$set": {
                        "path": ctx_path
                    }
                }
            )
@@ -20,9 +20,8 @@ class CollectFPS(pyblish.api.InstancePlugin):
    hosts = ["webpublisher"]

    def process(self, instance):
        fps = instance.context.data["fps"]
        instance_fps = instance.data.get("fps")
        if instance_fps is None:
            instance.data["fps"] = instance.context.data["fps"]

        instance.data.update({
            "fps": fps
        })
        self.log.debug(f"instance.data: {pformat(instance.data)}")
@@ -1,21 +1,19 @@
"""Loads publishing context from json and continues in publish process.
"""Create instances from batch data and continues in publish process.

Requires:
    anatomy -> context["anatomy"] *(pyblish.api.CollectorOrder - 0.11)
    CollectBatchData

Provides:
    context, instances -> All data from previous publishing process.
"""

import os
import json
import clique
import tempfile

import pyblish.api
from avalon import io
import pyblish.api
from openpype.lib import prepare_template_data
from openpype.lib.plugin_tools import parse_json, get_batch_asset_task_info
from openpype.lib.plugin_tools import parse_json


class CollectPublishedFiles(pyblish.api.ContextPlugin):

@@ -28,28 +26,28 @@ class CollectPublishedFiles(pyblish.api.ContextPlugin):
    order = pyblish.api.CollectorOrder - 0.490
    label = "Collect rendered frames"
    host = ["webpublisher"]

    _context = None
    targets = ["filespublish"]

    # from Settings
    task_type_to_family = {}

    def _process_batch(self, dir_url):
        task_subfolders = [
            os.path.join(dir_url, o)
            for o in os.listdir(dir_url)
            if os.path.isdir(os.path.join(dir_url, o))]
    def process(self, context):
        batch_dir = context.data["batchDir"]
        task_subfolders = []
        for folder_name in os.listdir(batch_dir):
            full_path = os.path.join(batch_dir, folder_name)
            if os.path.isdir(full_path):
                task_subfolders.append(full_path)

        self.log.info("task_sub:: {}".format(task_subfolders))

        asset_name = context.data["asset"]
        task_name = context.data["task"]
        task_type = context.data["taskType"]
        for task_dir in task_subfolders:
            task_data = parse_json(os.path.join(task_dir,
                                                "manifest.json"))
            self.log.info("task_data:: {}".format(task_data))
            ctx = task_data["context"]

            asset, task_name, task_type = get_batch_asset_task_info(ctx)

            if task_name:
                os.environ["AVALON_TASK"] = task_name

            is_sequence = len(task_data["files"]) > 1

@@ -60,26 +58,20 @@ class CollectPublishedFiles(pyblish.api.ContextPlugin):
                is_sequence,
                extension.replace(".", ''))

            subset = self._get_subset_name(family, subset_template, task_name,
                                           task_data["variant"])
            subset = self._get_subset_name(
                family, subset_template, task_name, task_data["variant"]
            )
            version = self._get_last_version(asset_name, subset) + 1

            os.environ["AVALON_ASSET"] = asset
            io.Session["AVALON_ASSET"] = asset

            instance = self._context.create_instance(subset)
            instance.data["asset"] = asset
            instance = context.create_instance(subset)
            instance.data["asset"] = asset_name
            instance.data["subset"] = subset
            instance.data["family"] = family
            instance.data["families"] = families
            instance.data["version"] = \
                self._get_last_version(asset, subset) + 1
            instance.data["version"] = version
            instance.data["stagingDir"] = tempfile.mkdtemp()
            instance.data["source"] = "webpublisher"

            # to store logging info into DB openpype.webpublishes
            instance.data["ctx_path"] = ctx["path"]
            instance.data["batch_id"] = task_data["batch"]

            # to convert from email provided into Ftrack username
            instance.data["user_email"] = task_data["user"]

@@ -230,23 +222,3 @@ class CollectPublishedFiles(pyblish.api.ContextPlugin):
            return version[0].get("version") or 0
        else:
            return 0

    def process(self, context):
        self._context = context

        batch_dir = os.environ.get("OPENPYPE_PUBLISH_DATA")

        assert batch_dir, (
            "Missing `OPENPYPE_PUBLISH_DATA`")

        assert os.path.exists(batch_dir), \
            "Folder {} doesn't exist".format(batch_dir)

        project_name = os.environ.get("AVALON_PROJECT")
        if project_name is None:
            raise AssertionError(
                "Environment `AVALON_PROJECT` was not found."
                "Could not set project `root` which may cause issues."
            )

        self._process_batch(batch_dir)
@@ -1,38 +0,0 @@
import os

import pyblish.api
from openpype.lib import OpenPypeMongoConnection


class IntegrateContextToLog(pyblish.api.ContextPlugin):
    """ Adds context information to log document for displaying in front end"""

    label = "Integrate Context to Log"
    order = pyblish.api.IntegratorOrder - 0.1
    hosts = ["webpublisher"]

    def process(self, context):
        self.log.info("Integrate Context to Log")

        mongo_client = OpenPypeMongoConnection.get_mongo_client()
        database_name = os.environ["OPENPYPE_DATABASE_NAME"]
        dbcon = mongo_client[database_name]["webpublishes"]

        for instance in context:
            self.log.info("ctx_path: {}".format(instance.data.get("ctx_path")))
            self.log.info("batch_id: {}".format(instance.data.get("batch_id")))
            if instance.data.get("ctx_path") and instance.data.get("batch_id"):
                self.log.info("Updating log record")
                dbcon.update_one(
                    {
                        "batch_id": instance.data.get("batch_id"),
                        "status": "in_progress"
                    },
                    {"$set":
                        {
                            "path": instance.data.get("ctx_path")

                        }}
                )

                return
@@ -11,7 +11,8 @@ from avalon.api import AvalonMongoDB

from openpype.lib import OpenPypeMongoConnection
from openpype_modules.avalon_apps.rest_api import _RestApiEndpoint
from openpype.lib.plugin_tools import parse_json
from openpype.lib.remote_publish import get_task_data
from openpype.settings import get_project_settings

from openpype.lib import PypeLogger


@@ -20,11 +21,16 @@ log = PypeLogger.get_logger("WebServer")

class RestApiResource:
    """Resource carrying needed info and Avalon DB connection for publish."""
    def __init__(self, server_manager, executable, upload_dir):
    def __init__(self, server_manager, executable, upload_dir,
                 studio_task_queue=None):
        self.server_manager = server_manager
        self.upload_dir = upload_dir
        self.executable = executable

        if studio_task_queue is None:
            studio_task_queue = collections.deque().dequeu
        self.studio_task_queue = studio_task_queue

        self.dbcon = AvalonMongoDB()
        self.dbcon.install()


@@ -34,6 +40,8 @@ class RestApiResource:
            return value.isoformat()
        if isinstance(value, ObjectId):
            return str(value)
        if isinstance(value, set):
            return list(value)
        raise TypeError(value)

    @classmethod

@@ -176,71 +184,95 @@ class TaskNode(Node):
class WebpublisherBatchPublishEndpoint(_RestApiEndpoint):
    """Triggers headless publishing of batch."""
    async def post(self, request) -> Response:
        # for postprocessing in host, currently only PS
        host_map = {"photoshop": [".psd", ".psb"]}

        output = {}
        log.info("WebpublisherBatchPublishEndpoint called")
        content = await request.json()

        batch_path = os.path.join(self.resource.upload_dir,
                                  content["batch"])

        add_args = {
            "host": "webpublisher",
            "project": content["project_name"],
            "user": content["user"]
        }

        command = "remotepublish"

        if content.get("studio_processing"):
            log.info("Post processing called")

            batch_data = parse_json(os.path.join(batch_path, "manifest.json"))
            if not batch_data:
                raise ValueError(
                    "Cannot parse batch meta in {} folder".format(batch_path))
            task_dir_name = batch_data["tasks"][0]
            task_data = parse_json(os.path.join(batch_path, task_dir_name,
                                                "manifest.json"))
            if not task_data:
                raise ValueError(
                    "Cannot parse batch meta in {} folder".format(task_data))

            command = "remotepublishfromapp"
            for host, extensions in host_map.items():
                for ext in extensions:
                    for file_name in task_data["files"]:
                        if ext in file_name:
                            add_args["host"] = host
                            break

            if not add_args.get("host"):
                raise ValueError(
                    "Couldn't discern host from {}".format(task_data["files"]))

        # Validate existence of openpype executable
        openpype_app = self.resource.executable
        args = [
            openpype_app,
            command,
            batch_path
        ]

        if not openpype_app or not os.path.exists(openpype_app):
            msg = "Non existent OpenPype executable {}".format(openpype_app)
            raise RuntimeError(msg)

        log.info("WebpublisherBatchPublishEndpoint called")
        content = await request.json()

        # Each filter have extensions which are checked on first task item
        # - first filter with extensions that are on first task is used
        # - filter defines command and can extend arguments dictionary
        # This is used only if 'studio_processing' is enabled on batch
        studio_processing_filters = [
            # Photoshop filter
            {
                "extensions": [".psd", ".psb"],
                "command": "remotepublishfromapp",
                "arguments": {
                    # Command 'remotepublishfromapp' requires --host argument
                    "host": "photoshop",
                    # Make sure targets are set to None for cases that default
                    # would change
                    # - targets argument is not used in 'remotepublishfromapp'
                    "targets": ["remotepublish"]
                },
                # does publish need to be handled by a queue, eg. only
                # single process running concurrently?
                "add_to_queue": True
            }
        ]

        batch_dir = os.path.join(self.resource.upload_dir, content["batch"])

        # Default command and arguments
        command = "remotepublish"
        add_args = {
            # All commands need 'project' and 'user'
            "project": content["project_name"],
            "user": content["user"],

            "targets": ["filespublish"]
        }

        add_to_queue = False
        if content.get("studio_processing"):
            log.info("Post processing called for {}".format(batch_dir))

            task_data = get_task_data(batch_dir)

            for process_filter in studio_processing_filters:
                filter_extensions = process_filter.get("extensions") or []
                for file_name in task_data["files"]:
                    file_ext = os.path.splitext(file_name)[-1].lower()
                    if file_ext in filter_extensions:
                        # Change command
                        command = process_filter["command"]
                        # Update arguments
                        add_args.update(
                            process_filter.get("arguments") or {}
                        )
                        add_to_queue = process_filter["add_to_queue"]
                        break

        args = [
            openpype_app,
            command,
            batch_dir
        ]

        for key, value in add_args.items():
            args.append("--{}".format(key))
            args.append(value)
            # Skip key values where value is None
            if value is not None:
                args.append("--{}".format(key))
                # Extend list into arguments (targets can be a list)
                if isinstance(value, (tuple, list)):
                    args.extend(value)
                else:
                    args.append(value)

        log.info("args:: {}".format(args))
        if add_to_queue:
            log.debug("Adding to queue")
            self.resource.studio_task_queue.append(args)
        else:
            subprocess.call(args)

        subprocess.call(args)
        return Response(
            status=200,
            body=self.resource.encode(output),
            content_type="application/json"
        )


@@ -277,3 +309,36 @@ class PublishesStatusEndpoint(_RestApiEndpoint):
            body=self.resource.encode(output),
            content_type="application/json"
        )


class ConfiguredExtensionsEndpoint(_RestApiEndpoint):
    """Returns dict of extensions which have mapping to family.

    Returns:
        {
            "file_exts": [],
            "sequence_exts": []
        }
    """
    async def get(self, project_name=None) -> Response:
        sett = get_project_settings(project_name)

        configured = {
            "file_exts": set(),
            "sequence_exts": set(),
            # workfiles that could have "Studio Procesing" hardcoded for now
            "studio_exts": set(["psd", "psb", "tvpp", "tvp"])
        }
        collect_conf = sett["webpublisher"]["publish"]["CollectPublishedFiles"]
        for _, mapping in collect_conf.get("task_type_to_family", {}).items():
            for _family, config in mapping.items():
                if config["is_sequence"]:
                    configured["sequence_exts"].update(config["extensions"])
                else:
                    configured["file_exts"].update(config["extensions"])

        return Response(
            status=200,
            body=self.resource.encode(dict(configured)),
            content_type="application/json"
        )
@@ -1,8 +1,10 @@
import collections
import time
import os
from datetime import datetime
import requests
import json
import subprocess

from openpype.lib import PypeLogger


@@ -14,7 +16,8 @@ from .webpublish_routes import (
    WebpublisherHiearchyEndpoint,
    WebpublisherProjectsEndpoint,
    BatchStatusEndpoint,
    PublishesStatusEndpoint
    PublishesStatusEndpoint,
    ConfiguredExtensionsEndpoint
)


@@ -31,10 +34,13 @@ def run_webserver(*args, **kwargs):
    port = kwargs.get("port") or 8079
    server_manager = webserver_module.create_new_server_manager(port, host)
    webserver_url = server_manager.url
    # queue for remotepublishfromapp tasks
    studio_task_queue = collections.deque()

    resource = RestApiResource(server_manager,
                               upload_dir=kwargs["upload_dir"],
                               executable=kwargs["executable"])
                               executable=kwargs["executable"],
                               studio_task_queue=studio_task_queue)
    projects_endpoint = WebpublisherProjectsEndpoint(resource)
    server_manager.add_route(
        "GET",

@@ -49,6 +55,13 @@ def run_webserver(*args, **kwargs):
        hiearchy_endpoint.dispatch
    )

    configured_ext_endpoint = ConfiguredExtensionsEndpoint(resource)
    server_manager.add_route(
        "GET",
        "/api/webpublish/configured_ext/{project_name}",
        configured_ext_endpoint.dispatch
    )

    # triggers publish
    webpublisher_task_publish_endpoint = \
        WebpublisherBatchPublishEndpoint(resource)

@@ -88,6 +101,10 @@ def run_webserver(*args, **kwargs):
        if time.time() - last_reprocessed > 20:
            reprocess_failed(kwargs["upload_dir"], webserver_url)
            last_reprocessed = time.time()
        if studio_task_queue:
            args = studio_task_queue.popleft()
            subprocess.call(args)  # blocking call

        time.sleep(1.0)
|
@ -522,6 +522,11 @@ def get_local_site_id():
|
|||
|
||||
Identifier is created if does not exists yet.
|
||||
"""
|
||||
# override local id from environment
|
||||
# used for background syncing
|
||||
if os.environ.get("OPENPYPE_LOCAL_ID"):
|
||||
return os.environ["OPENPYPE_LOCAL_ID"]
|
||||
|
||||
registry = OpenPypeSettingsRegistry()
|
||||
try:
|
||||
return registry.get_item("localId")
|
||||
|
|
|
|||
|
|
@ -2,6 +2,8 @@ import json
|
|||
import logging
|
||||
import os
|
||||
import re
|
||||
import abc
|
||||
import six
|
||||
|
||||
|
||||
from .anatomy import Anatomy
|
||||
|
|
@ -196,3 +198,159 @@ def get_project_basic_paths(project_name):
|
|||
if isinstance(folder_structure, str):
|
||||
folder_structure = json.loads(folder_structure)
|
||||
return _list_path_items(folder_structure)
|
||||
|
||||
|
||||
@six.add_metaclass(abc.ABCMeta)
|
||||
class HostDirmap:
|
||||
"""
|
||||
Abstract class for running dirmap on a workfile in a host.
|
||||
|
||||
Dirmap is used to translate paths inside of host workfile from one
|
||||
OS to another. (Eg. arstist created workfile on Win, different artists
|
||||
opens same file on Linux.)
|
||||
|
||||
Expects methods to be implemented inside of host:
|
||||
on_dirmap_enabled: run host code for enabling dirmap
|
||||
do_dirmap: run host code to do actual remapping
|
||||
"""
|
||||
def __init__(self, host_name, project_settings, sync_module=None):
|
||||
self.host_name = host_name
|
||||
self.project_settings = project_settings
|
||||
self.sync_module = sync_module # to limit reinit of Modules
|
||||
|
||||
self._mapping = None # cache mapping
|
||||
|
||||
@abc.abstractmethod
|
||||
def on_enable_dirmap(self):
|
||||
"""
|
||||
Run host dependent operation for enabling dirmap if necessary.
|
||||
"""
|
||||
|
||||
@abc.abstractmethod
|
||||
def dirmap_routine(self, source_path, destination_path):
|
||||
"""
|
||||
Run host dependent remapping from source_path to destination_path
|
||||
"""
|
||||
|
||||
def process_dirmap(self):
|
||||
# type: (dict) -> None
|
||||
"""Go through all paths in Settings and set them using `dirmap`.
|
||||
|
||||
If artists has Site Sync enabled, take dirmap mapping directly from
|
||||
Local Settings when artist is syncing workfile locally.
|
||||
|
||||
Args:
|
||||
project_settings (dict): Settings for current project.
|
||||
|
||||
"""
|
||||
if not self._mapping:
|
||||
self._mapping = self.get_mappings(self.project_settings)
|
||||
if not self._mapping:
|
||||
return
|
||||
|
||||
log.info("Processing directory mapping ...")
|
||||
self.on_enable_dirmap()
|
||||
log.info("mapping:: {}".format(self._mapping))
|
||||
|
||||
for k, sp in enumerate(self._mapping["source-path"]):
|
||||
try:
|
||||
print("{} -> {}".format(sp,
|
||||
self._mapping["destination-path"][k]))
|
||||
self.dirmap_routine(sp,
|
||||
self._mapping["destination-path"][k])
|
||||
except IndexError:
|
||||
# missing corresponding destination path
log.error(("invalid dirmap mapping, missing corresponding"
" destination directory."))
break
except RuntimeError:
log.error("invalid path {} -> {}, mapping not registered".format(  # noqa: E501
sp, self._mapping["destination-path"][k]
))
continue

def get_mappings(self, project_settings):
"""Get translation from source-path to destination-path.

It checks if Site Sync is enabled and the user chose to use the local
site; in that case the configuration in Local Settings takes precedence.
"""
local_mapping = self._get_local_sync_dirmap(project_settings)
dirmap_label = "{}-dirmap".format(self.host_name)
if not self.project_settings[self.host_name].get(dirmap_label) and \
not local_mapping:
return []
mapping = local_mapping or \
self.project_settings[self.host_name][dirmap_label]["paths"] or {}
enabled = self.project_settings[self.host_name][dirmap_label]["enabled"]
mapping_enabled = enabled or bool(local_mapping)

if not mapping or not mapping_enabled or \
not mapping.get("destination-path") or \
not mapping.get("source-path"):
return []
return mapping

def _get_local_sync_dirmap(self, project_settings):
"""
Returns dirmap if sync to local project is enabled.

The only valid mapping is from the roots of the remote site to the
local site roots set in Local Settings.

Args:
project_settings (dict)
Returns:
dict : { "source-path": [XXX], "destination-path": [YYYY]}
"""
import json
mapping = {}

if not project_settings["global"]["sync_server"]["enabled"]:
log.debug("Site Sync not enabled")
return mapping

from openpype.settings.lib import get_site_local_overrides

if not self.sync_module:
from openpype.modules import ModulesManager
manager = ModulesManager()
self.sync_module = manager.modules_by_name["sync_server"]

project_name = os.getenv("AVALON_PROJECT")

active_site = self.sync_module.get_local_normalized_site(
self.sync_module.get_active_site(project_name))
remote_site = self.sync_module.get_local_normalized_site(
self.sync_module.get_remote_site(project_name))
log.debug("active {} - remote {}".format(active_site, remote_site))

if active_site == "local" \
and project_name in self.sync_module.get_enabled_projects()\
and active_site != remote_site:

sync_settings = self.sync_module.get_sync_project_setting(
os.getenv("AVALON_PROJECT"), exclude_locals=False,
cached=False)

active_overrides = get_site_local_overrides(
os.getenv("AVALON_PROJECT"), active_site)
remote_overrides = get_site_local_overrides(
os.getenv("AVALON_PROJECT"), remote_site)

log.debug("local overrides {}".format(active_overrides))
log.debug("remote overrides {}".format(remote_overrides))
for root_name, active_site_dir in active_overrides.items():
remote_site_dir = remote_overrides.get(root_name) or\
sync_settings["sites"][remote_site]["root"][root_name]
if os.path.isdir(active_site_dir):
if not mapping.get("destination-path"):
mapping["destination-path"] = []
mapping["destination-path"].append(active_site_dir)

if not mapping.get("source-path"):
mapping["source-path"] = []
mapping["source-path"].append(remote_site_dir)

log.debug("local sync mapping:: {}".format(mapping))
return mapping
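For reference, a minimal sketch of the dictionary returned above and how a host dirmap could consume it (the paths are placeholders, not taken from any project):

    mapping = {
        "source-path": ["//studio/projects"],
        "destination-path": ["C:/local/projects"],
    }
    for src, dst in zip(mapping["source-path"], mapping["destination-path"]):
        print("remap {} -> {}".format(src, dst))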
@@ -22,6 +22,9 @@ def import_filepath(filepath, module_name=None):
if module_name is None:
module_name = os.path.splitext(os.path.basename(filepath))[0]

# Make sure it is not 'unicode' in Python 2
module_name = str(module_name)

# Prepare module object where content of file will be parsed
module = types.ModuleType(module_name)
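A hedged usage sketch of import_filepath, assuming the function is importable from the library module this hunk belongs to (the path and file name are placeholders):

    # loads the file as a module named after its base name ("custom_hook")
    module = import_filepath("/studio/tools/custom_hook.py")
    print(module.__name__)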
@@ -8,6 +8,7 @@ import pyblish.api

from openpype import uninstall
from openpype.lib.mongo import OpenPypeMongoConnection
from openpype.lib.plugin_tools import parse_json


def get_webpublish_conn():

@@ -31,7 +32,8 @@ def start_webpublish_log(dbcon, batch_id, user):
"batch_id": batch_id,
"start_date": datetime.now(),
"user": user,
"status": "in_progress"
"status": "in_progress",
"progress": 0.0
}).inserted_id

@@ -157,3 +159,28 @@ def _get_close_plugin(close_plugin_name, log):
return plugin

log.warning("Close plugin not found, app might not close.")


def get_task_data(batch_dir):
"""Return parsed data from the first task manifest.json.

Used for the `remotepublishfromapp` command where the batch contains only
a single task with a publishable workfile.

Returns:
(dict)
Throws:
(ValueError) if batch or task manifest not found or broken
"""
batch_data = parse_json(os.path.join(batch_dir, "manifest.json"))
if not batch_data:
raise ValueError(
"Cannot parse batch meta in {} folder".format(batch_dir))
task_dir_name = batch_data["tasks"][0]
task_data = parse_json(os.path.join(batch_dir, task_dir_name,
"manifest.json"))
if not task_data:
raise ValueError(
"Cannot parse task meta in {} folder".format(task_dir_name))

return task_data
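A small sketch of the batch layout this helper expects; only the "tasks" key is read from the batch manifest, the task manifest is returned as-is (folder names are illustrative):

    # batch manifest:  <batch_dir>/manifest.json         -> {"tasks": ["task_01"], ...}
    # task manifest:   <batch_dir>/task_01/manifest.json -> returned verbatim as a dict
    task_data = get_task_data("/path/to/batch_dir")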
@@ -22,6 +22,10 @@ OpenPype modules should contain separated logic of specific kind of implementati
- `__init__` should not be overridden and `initialize` should not do any time consuming work, only prepare base data about the module
- also keep in mind that they may be initialized in headless mode
- connection with other modules is made with help of interfaces
- `cli` method - adds cli commands specific for the module
- command line arguments are handled using the `click` python module
- the `cli` method should expect a single argument, the click group, on which any group specific methods can be called (e.g. `add_command` to add another click group as a child, see `ExampleAddon`)
- it is possible to trigger cli commands using `./openpype_console module <module_name> <command> *args` (see the sketch below)
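A minimal sketch of how the `cli` hook and the console entry point fit together (the module and command names are illustrative, not part of the repository):

    import click
    from openpype.modules import OpenPypeModule

    class ExampleModule(OpenPypeModule):
        name = "example"

        def initialize(self, module_settings):
            self.enabled = True

        def connect_with_modules(self, enabled_modules):
            pass

        def cli(self, module_click_group):
            # attach this module's click group to the shared "module" group
            module_click_group.add_command(cli_main)

    @click.group(ExampleModule.name, help="Example module commands.")
    def cli_main():
        pass

    @cli_main.command()
    def show():
        """Invoked as: ./openpype_console module example show"""
        print("example show")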

## Addon class `OpenPypeAddOn`
- inherits from `OpenPypeModule` but is enabled by default and doesn't have to implement `initialize` and `connect_with_modules` methods

@@ -140,4 +144,4 @@ class ClockifyModule(

### TrayModulesManager
- inherits from `ModulesManager`
- has specific implementation for Pype Tray tool and handle `ITrayModule` methods
- has specific implementation for Pype Tray tool and handle `ITrayModule` methods
@@ -107,12 +107,9 @@ class _InterfacesClass(_ModuleClass):
if attr_name in ("__path__", "__file__"):
return None

# Fake Interface if is not missing
self.__attributes__[attr_name] = type(
attr_name,
(MissingInteface, ),
{}
)
raise ImportError((
"cannot import name '{}' from 'openpype_interfaces'"
).format(attr_name))

return self.__attributes__[attr_name]

@@ -212,54 +209,17 @@ def _load_interfaces():
_InterfacesClass(modules_key)
)

log = PypeLogger.get_logger("InterfacesLoader")
from . import interfaces

dirpaths = get_module_dirs()

interface_paths = []
interface_paths.append(
os.path.join(get_default_modules_dir(), "interfaces.py")
)
for dirpath in dirpaths:
if not os.path.exists(dirpath):
for attr_name in dir(interfaces):
attr = getattr(interfaces, attr_name)
if (
not inspect.isclass(attr)
or attr is OpenPypeInterface
or not issubclass(attr, OpenPypeInterface)
):
continue

for filename in os.listdir(dirpath):
if filename in ("__pycache__", ):
continue

full_path = os.path.join(dirpath, filename)
if not os.path.isdir(full_path):
continue

interfaces_path = os.path.join(full_path, "interfaces.py")
if os.path.exists(interfaces_path):
interface_paths.append(interfaces_path)

for full_path in interface_paths:
if not os.path.exists(full_path):
continue

try:
# Prepare module object where content of file will be parsed
module = import_filepath(full_path)

except Exception:
log.warning(
"Failed to load path: \"{0}\"".format(full_path),
exc_info=True
)
continue

for attr_name in dir(module):
attr = getattr(module, attr_name)
if (
not inspect.isclass(attr)
or attr is OpenPypeInterface
or not issubclass(attr, OpenPypeInterface)
):
continue
setattr(openpype_interfaces, attr_name, attr)
setattr(openpype_interfaces, attr_name, attr)


def load_modules(force=False):
@@ -333,6 +293,15 @@ def _load_modules():
# - check manifest and content of manifest
try:
if os.path.isdir(fullpath):
# Module without init file can't be used as OpenPype module
# because the module class could not be imported
init_file = os.path.join(fullpath, "__init__.py")
if not os.path.exists(init_file):
log.info((
"Skipping module directory because of"
" missing \"__init__.py\" file. \"{}\""
).format(fullpath))
continue
import_module_from_dirpath(dirpath, filename, modules_key)

elif ext in (".py", ):

@@ -369,14 +338,6 @@ class OpenPypeInterface:
pass


class MissingInteface(OpenPypeInterface):
"""Class representing missing interface class.

Used when interface is not available from currently registered paths.
"""
pass


@six.add_metaclass(ABCMeta)
class OpenPypeModule:
"""Base class of pype module.

@@ -431,6 +392,28 @@ class OpenPypeModule:
"""
return {}

def cli(self, module_click_group):
"""Add commands to click group.

The best practice is to create a click group for the whole module
which is used to separate commands.

class MyPlugin(OpenPypeModule):
...
def cli(self, module_click_group):
module_click_group.add_command(cli_main)


@click.group(<module name>, help="<Any help shown in cmd>")
def cli_main():
pass

@cli_main.command()
def mycommand():
print("my_command")
"""
pass


class OpenPypeAddOn(OpenPypeModule):
# Enable Addon by default
@@ -194,6 +194,7 @@ class SyncToAvalonEvent(BaseEvent):
ftrack_id = proj["data"].get("ftrackId")
if ftrack_id is None:
ftrack_id = self._update_project_ftrack_id()
proj["data"]["ftrackId"] = ftrack_id
self._avalon_ents_by_ftrack_id[ftrack_id] = proj
for ent in ents:
ftrack_id = ent["data"].get("ftrackId")

@@ -584,6 +585,10 @@ class SyncToAvalonEvent(BaseEvent):
continue
ftrack_id = ftrack_id[0]

# Skip deleted projects
if action == "remove" and entityType == "show":
return True

# task modified, collect parent id of task, handle separately
if entity_type.lower() == "task":
changes = ent_info.get("changes") or {}
@@ -1,8 +1,10 @@
import os
import json
import collections
from openpype.modules import OpenPypeModule

import click

from openpype.modules import OpenPypeModule
from openpype_interfaces import (
ITrayModule,
IPluginPaths,

@@ -224,8 +226,8 @@ class FtrackModule(
if not project_name:
return

attributes_changes = changes.get("attributes")
if not attributes_changes:
new_attr_values = new_value.get("attributes")
if not new_attr_values:
return

import ftrack_api
@@ -275,7 +277,7 @@ class FtrackModule(

failed = {}
missing = {}
for key, value in attributes_changes.items():
for key, value in new_attr_values.items():
if key not in ca_keys:
continue

@@ -349,12 +351,24 @@ class FtrackModule(
if "server_url" not in session_kwargs:
session_kwargs["server_url"] = self.ftrack_url

if "api_key" not in session_kwargs or "api_user" not in session_kwargs:
api_key = session_kwargs.get("api_key")
api_user = session_kwargs.get("api_user")
# First look into environments
# - both OpenPype tray and ftrack event server should have set them
# - ftrack event server may crash when credentials are tried to load
# from keyring
if not api_key or not api_user:
api_key = os.environ.get("FTRACK_API_KEY")
api_user = os.environ.get("FTRACK_API_USER")

if not api_key or not api_user:
from .lib import credentials
cred = credentials.get_credentials()
session_kwargs["api_user"] = cred.get("username")
session_kwargs["api_key"] = cred.get("api_key")
api_user = cred.get("username")
api_key = cred.get("api_key")

session_kwargs["api_user"] = api_user
session_kwargs["api_key"] = api_key
return ftrack_api.Session(**session_kwargs)
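A self-contained sketch of the credential precedence implemented above (the helper name and the stored_credentials dict are illustrative, not part of the module):

    import os

    def resolve_ftrack_credentials(session_kwargs, stored_credentials):
        """Return (api_user, api_key): kwargs first, then env vars, then stored login."""
        api_user = session_kwargs.get("api_user")
        api_key = session_kwargs.get("api_key")
        if not api_key or not api_user:
            api_key = os.environ.get("FTRACK_API_KEY")
            api_user = os.environ.get("FTRACK_API_USER")
        if not api_key or not api_user:
            api_user = stored_credentials.get("username")
            api_key = stored_credentials.get("api_key")
        return api_user, api_key

    print(resolve_ftrack_credentials({}, {"username": "jane", "api_key": "xxxx"}))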

def tray_init(self):
@@ -409,3 +423,62 @@ class FtrackModule(
return 0
hours_logged = (task_entity["time_logged"] / 60) / 60
return hours_logged

def get_credentials(self):
# type: () -> tuple
"""Get local Ftrack credentials."""
from .lib import credentials

cred = credentials.get_credentials(self.ftrack_url)
return cred.get("username"), cred.get("api_key")

def cli(self, click_group):
click_group.add_command(cli_main)


@click.group(FtrackModule.name, help="Ftrack module related commands.")
def cli_main():
pass


@cli_main.command()
@click.option("-d", "--debug", is_flag=True, help="Print debug messages")
@click.option("--ftrack-url", envvar="FTRACK_SERVER",
help="Ftrack server url")
@click.option("--ftrack-user", envvar="FTRACK_API_USER",
help="Ftrack api user")
@click.option("--ftrack-api-key", envvar="FTRACK_API_KEY",
help="Ftrack api key")
@click.option("--legacy", is_flag=True,
help="run event server without mongo storing")
@click.option("--clockify-api-key", envvar="CLOCKIFY_API_KEY",
help="Clockify API key.")
@click.option("--clockify-workspace", envvar="CLOCKIFY_WORKSPACE",
help="Clockify workspace")
def eventserver(
debug,
ftrack_url,
ftrack_user,
ftrack_api_key,
legacy,
clockify_api_key,
clockify_workspace
):
"""Launch ftrack event server.

This should ideally be run by a system service (such as systemd or
upstart on Linux, or a Windows service).
"""
if debug:
os.environ["OPENPYPE_DEBUG"] = "3"

from .ftrack_server.event_server_cli import run_event_server

return run_event_server(
ftrack_url,
ftrack_user,
ftrack_api_key,
legacy,
clockify_api_key,
clockify_workspace
)
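A usage sketch of the command defined above, following the `./openpype_console module <module_name> <command>` pattern described earlier in this changeset (the URL and credentials are placeholders and can also be supplied through the FTRACK_* environment variables):

    ./openpype_console module ftrack eventserver \
        --ftrack-url https://studio.ftrackapp.com \
        --ftrack-user event.bot \
        --ftrack-api-key xxxx-xxxx \
        --legacy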
@@ -570,9 +570,15 @@ class BaseHandler(object):

if low_entity_type == "assetversion":
asset = entity["asset"]
parent = None
if asset:
parent = asset["parent"]
if parent:

if parent:
if parent.entity_type.lower() == "project":
return parent

if "project" in parent:
return parent["project"]

project_data = entity["link"][0]
@@ -0,0 +1,23 @@
# -*- coding: utf-8 -*-
"""Collect local ftrack credentials."""
import pyblish.api
import os


class CollectLocalFtrackCreds(pyblish.api.ContextPlugin):
"""Collect local ftrack credentials and push them to the environment."""

order = pyblish.api.CollectorOrder + 0.01
label = "Collect local ftrack credentials"
targets = ["rr_control"]

def process(self, context):
if os.getenv("FTRACK_API_USER") and os.getenv("FTRACK_API_KEY") and \
os.getenv("FTRACK_SERVER"):
return
ftrack_module = context.data["openPypeModules"]["ftrack"]
if ftrack_module.enabled:
creds = ftrack_module.get_credentials()
os.environ["FTRACK_API_USER"] = creds[0]
os.environ["FTRACK_API_KEY"] = creds[1]
os.environ["FTRACK_SERVER"] = ftrack_module.ftrack_url
@@ -27,16 +27,12 @@ class CollectUsername(pyblish.api.ContextPlugin):
order = pyblish.api.CollectorOrder - 0.488
label = "Collect ftrack username"
hosts = ["webpublisher", "photoshop"]
targets = ["remotepublish", "filespublish"]

_context = None

def process(self, context):
self.log.info("CollectUsername")
# photoshop could be triggered remotely in webpublisher fashion
if os.environ["AVALON_APP"] == "photoshop":
if not os.environ.get("IS_HEADLESS"):
self.log.debug("Regular process, skipping")
return

os.environ["FTRACK_API_USER"] = os.environ["FTRACK_BOT_API_USER"]
os.environ["FTRACK_API_KEY"] = os.environ["FTRACK_BOT_API_KEY"]
@@ -0,0 +1,6 @@
from .royal_render_module import RoyalRenderModule


__all__ = (
"RoyalRenderModule",
)

199
openpype/modules/default_modules/royal_render/api.py
Normal file
@ -0,0 +1,199 @@
|
|||
# -*- coding: utf-8 -*-
|
||||
"""Wrapper around Royal Render API."""
|
||||
import sys
|
||||
import os
|
||||
|
||||
from openpype.settings import get_project_settings
|
||||
from openpype.lib.local_settings import OpenPypeSettingsRegistry
|
||||
from openpype.lib import PypeLogger, run_subprocess
|
||||
from .rr_job import RRJob, SubmitFile, SubmitterParameter
|
||||
|
||||
|
||||
log = PypeLogger.get_logger("RoyalRender")
|
||||
|
||||
|
||||
class Api:
|
||||
|
||||
_settings = None
|
||||
RR_SUBMIT_CONSOLE = 1
|
||||
RR_SUBMIT_API = 2
|
||||
|
||||
def __init__(self, settings, project=None):
|
||||
self._settings = settings
|
||||
self._initialize_rr(project)
|
||||
|
||||
def _initialize_rr(self, project=None):
|
||||
# type: (str) -> None
|
||||
"""Initialize RR Path.
|
||||
|
||||
Args:
|
||||
project (str, Optional): Project name to set RR api in
|
||||
context.
|
||||
|
||||
"""
|
||||
if project:
|
||||
project_settings = get_project_settings(project)
|
||||
rr_path = (
|
||||
project_settings
|
||||
["royalrender"]
|
||||
["rr_paths"]
|
||||
)
|
||||
else:
|
||||
rr_path = (
|
||||
self._settings
|
||||
["modules"]
|
||||
["royalrender"]
|
||||
["rr_path"]
|
||||
["default"]
|
||||
)
|
||||
os.environ["RR_ROOT"] = rr_path
|
||||
self._rr_path = rr_path
|
||||
|
||||
def _get_rr_bin_path(self, rr_root=None):
|
||||
# type: (str) -> str
|
||||
"""Get path to RR bin folder."""
|
||||
rr_root = rr_root or self._rr_path
|
||||
is_64bit_python = sys.maxsize > 2 ** 32
|
||||
|
||||
rr_bin_path = ""
|
||||
if sys.platform.lower() == "win32":
|
||||
rr_bin_path = "/bin/win64"
|
||||
if not is_64bit_python:
|
||||
# 32bit Python needs the 32bit RR binaries
|
||||
rr_bin_path = "/bin/win"
|
||||
rr_bin_path = rr_bin_path.replace(
|
||||
"/", os.path.sep
|
||||
)
|
||||
|
||||
if sys.platform.lower() == "darwin":
|
||||
rr_bin_path = "/bin/mac64"
|
||||
if not is_64bit_python:
|
||||
rr_bin_path = "/bin/mac"
|
||||
|
||||
if sys.platform.lower() == "linux":
|
||||
rr_bin_path = "/bin/lx64"
|
||||
|
||||
return os.path.join(rr_root, rr_bin_path)
|
||||
|
||||
def _initialize_module_path(self):
|
||||
# type: () -> None
|
||||
"""Set RR modules for Python."""
|
||||
# default for linux
|
||||
rr_bin = self._get_rr_bin_path()
|
||||
rr_module_path = os.path.join(rr_bin, "lx64/lib")
|
||||
|
||||
if sys.platform.lower() == "win32":
|
||||
rr_module_path = rr_bin
|
||||
rr_module_path = rr_module_path.replace(
|
||||
"/", os.path.sep
|
||||
)
|
||||
|
||||
if sys.platform.lower() == "darwin":
|
||||
rr_module_path = os.path.join(rr_bin, "lib/python/27")
|
||||
|
||||
sys.path.append(os.path.join(self._rr_path, rr_module_path))
|
||||
|
||||
def create_submission(self, jobs, submitter_attributes, file_name=None):
|
||||
# type: (list[RRJob], list[SubmitterParameter], str) -> SubmitFile
|
||||
"""Create jobs submission file.
|
||||
|
||||
Args:
|
||||
jobs (list): List of :class:`RRJob`
|
||||
submitter_attributes (list): List of submitter attributes
|
||||
:class:`SubmitterParameter` for whole submission batch.
|
||||
file_name (str), optional): File path to write data to.
|
||||
|
||||
Returns:
|
||||
str: XML data of job submission files.
|
||||
|
||||
"""
|
||||
raise NotImplementedError
|
||||
|
||||
def submit_file(self, file, mode=RR_SUBMIT_CONSOLE):
|
||||
# type: (SubmitFile, int) -> None
|
||||
if mode == self.RR_SUBMIT_CONSOLE:
|
||||
return self._submit_using_console(file)
|
||||
|
||||
# RR v7 supports only Python 2.7 so we bail out in fear
|
||||
# until there is support for Python 3 😰
|
||||
raise NotImplementedError(
|
||||
"Submission via RoyalRender API is not supported yet")
|
||||
# self._submit_using_api(file)
|
||||
|
||||
def _submit_using_console(self, file):
|
||||
# type: (SubmitFile) -> bool
|
||||
rr_console = os.path.join(
|
||||
self._get_rr_bin_path(),
|
||||
"rrSubmitterconsole"
|
||||
)
|
||||
|
||||
if sys.platform.lower() == "darwin":
|
||||
if "/bin/mac64" in rr_console:
|
||||
rr_console = rr_console.replace("/bin/mac64", "/bin/mac")
|
||||
|
||||
if sys.platform.lower() == "win32":
|
||||
if "/bin/win64" in rr_console:
|
||||
rr_console = rr_console.replace("/bin/win64", "/bin/win")
|
||||
rr_console += ".exe"
|
||||
|
||||
args = [rr_console, file]
|
||||
run_subprocess(" ".join(args), logger=log)
|
||||
|
||||
def _submit_using_api(self, file):
|
||||
# type: (SubmitFile) -> None
|
||||
"""Use RR API to submit jobs.
|
||||
|
||||
Args:
|
||||
file (SubmitFile): Submit jobs definition.
|
||||
|
||||
Throws:
|
||||
RoyalRenderException: When something fails.
|
||||
|
||||
"""
|
||||
self._initialize_module_path()
|
||||
import libpyRR2 as rrLib # noqa
|
||||
from rrJob import getClass_JobBasics # noqa
|
||||
import libpyRR2 as _RenderAppBasic # noqa
|
||||
|
||||
tcp = rrLib._rrTCP("") # noqa
|
||||
rr_server = tcp.getRRServer()
|
||||
|
||||
if len(rr_server) != 0:
|
||||
log.info("Got RR IP address {}".format(rr_server))
|
||||
|
||||
# TODO: Port is hardcoded in RR? If not, move it to Settings
|
||||
if not tcp.setServer(rr_server, 7773):
|
||||
log.error(
|
||||
"Can not set RR server: {}".format(tcp.errorMessage()))
|
||||
raise RoyalRenderException(tcp.errorMessage())
|
||||
|
||||
# TODO: This need UI and better handling of username/password.
|
||||
# We can't store password in keychain as it is pulled multiple
|
||||
# times and users on linux must enter keychain password every time.
|
||||
# Probably best way until we setup our own user management would be
|
||||
# to encrypt password and save it to json locally. Not bulletproof
|
||||
# but at least it is not stored in plaintext.
|
||||
reg = OpenPypeSettingsRegistry()
|
||||
try:
|
||||
rr_user = reg.get_item("rr_username")
|
||||
rr_password = reg.get_item("rr_password")
|
||||
except ValueError:
|
||||
# user has no rr credentials set
|
||||
pass
|
||||
else:
|
||||
# login to RR
|
||||
tcp.setLogin(rr_user, rr_password)
|
||||
|
||||
job = getClass_JobBasics()
|
||||
renderer = _RenderAppBasic()
|
||||
|
||||
# iterate over SubmitFile, set _JobBasic (job) and renderer
|
||||
# and feed it to jobSubmitNew()
|
||||
# not implemented yet
|
||||
job.renderer = renderer
|
||||
tcp.jobSubmitNew(job)
|
||||
|
||||
|
||||
class RoyalRenderException(Exception):
|
||||
"""Exception used in various error states coming from RR."""
|
||||
pass
|
||||
|
|
@@ -0,0 +1,23 @@
# -*- coding: utf-8 -*-
"""Collect default Royal Render path."""
import pyblish.api


class CollectDefaultRRPath(pyblish.api.ContextPlugin):
"""Collect default Royal Render path."""

order = pyblish.api.CollectorOrder + 0.01
label = "Default Royal Render Path"

def process(self, context):
try:
rr_module = context.data.get(
"openPypeModules")["royalrender"]
except AttributeError:
msg = "Cannot get OpenPype Royal Render module."
self.log.error(msg)
raise AssertionError(msg)

# get default Royal Render path from the royalrender module
self.log.debug(rr_module.rr_paths)
context.data["defaultRRPath"] = rr_module.rr_paths["default"]  # noqa: E501
@ -0,0 +1,49 @@
|
|||
# -*- coding: utf-8 -*-
|
||||
import pyblish.api
|
||||
|
||||
|
||||
class CollectRRPathFromInstance(pyblish.api.InstancePlugin):
|
||||
"""Collect RR Path from instance."""
|
||||
|
||||
order = pyblish.api.CollectorOrder
|
||||
label = "Royal Render Path from the Instance"
|
||||
families = ["rendering"]
|
||||
|
||||
def process(self, instance):
|
||||
instance.data["rrPath"] = self._collect_rr_path(instance)
|
||||
self.log.info(
|
||||
"Using {} for submission.".format(instance.data["rrPath"]))
|
||||
|
||||
@staticmethod
|
||||
def _collect_rr_path(render_instance):
|
||||
# type: (pyblish.api.Instance) -> str
|
||||
"""Get Royal Render path from render instance."""
|
||||
rr_settings = (
|
||||
render_instance.context.data
|
||||
["system_settings"]
|
||||
["modules"]
|
||||
["royalrender"]
|
||||
)
|
||||
try:
|
||||
default_servers = rr_settings["rr_paths"]
|
||||
project_servers = (
|
||||
render_instance.context.data
|
||||
["project_settings"]
|
||||
["royalrender"]
|
||||
["rr_paths"]
|
||||
)
|
||||
rr_servers = {
|
||||
k: default_servers[k]
|
||||
for k in project_servers
|
||||
if k in default_servers
|
||||
}
|
||||
|
||||
except AttributeError:
|
||||
# Handle situation where only one path was configured.
|
||||
return render_instance.context.data["defaultRRPath"]
|
||||
|
||||
return rr_servers[
|
||||
list(rr_servers.keys())[
|
||||
int(render_instance.data.get("rrPaths"))
|
||||
]
|
||||
]
|
||||
|
|
@@ -0,0 +1,209 @@
# -*- coding: utf-8 -*-
"""Collect sequences from Royal Render Job."""
import os
import re
import copy
import json
from pprint import pformat

import pyblish.api
from avalon import api


def collect(root,
regex=None,
exclude_regex=None,
frame_start=None,
frame_end=None):
"""Collect sequence collections in root"""

from avalon.vendor import clique

files = []
for filename in os.listdir(root):

# Must have extension
ext = os.path.splitext(filename)[1]
if not ext:
continue

# Only files
if not os.path.isfile(os.path.join(root, filename)):
continue

# Include and exclude regex
if regex and not re.search(regex, filename):
continue
if exclude_regex and re.search(exclude_regex, filename):
continue

files.append(filename)

# Match collections
# Support filenames like: projectX_shot01_0010.tiff with this regex
pattern = r"(?P<index>(?P<padding>0*)\d+)\.\D+\d?$"
collections, remainder = clique.assemble(files,
patterns=[pattern],
minimum_items=1)

# Ignore any remainders
if remainder:
print("Skipping remainder {}".format(remainder))

# Exclude any frames outside start and end frame.
for collection in collections:
for index in list(collection.indexes):
if frame_start is not None and index < frame_start:
collection.indexes.discard(index)
continue
if frame_end is not None and index > frame_end:
collection.indexes.discard(index)
continue

# Keep only collections that have at least a single frame
collections = [c for c in collections if c.indexes]

return collections
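A small usage sketch of the collect() helper above (the directory, pattern and frame range are made up for illustration):

    # e.g. /renders/shot01 containing shot01_0001.exr ... shot01_0100.exr
    collections = collect(
        root="/renders/shot01",
        regex=r"\.exr$",
        frame_start=1,
        frame_end=50,
    )
    for collection in collections:
        print(collection.head, len(collection.indexes), "frames")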
|
||||
|
||||
|
||||
class CollectSequencesFromJob(pyblish.api.ContextPlugin):
|
||||
"""Gather file sequences from job directory.
|
||||
|
||||
When "OPENPYPE_PUBLISH_DATA" environment variable is set these paths
|
||||
(folders or .json files) are parsed for image sequences. Otherwise the
|
||||
current working directory is searched for file sequences.
|
||||
|
||||
"""
|
||||
order = pyblish.api.CollectorOrder
|
||||
targets = ["rr_control"]
|
||||
label = "Collect Rendered Frames"
|
||||
|
||||
def process(self, context):
|
||||
if os.environ.get("OPENPYPE_PUBLISH_DATA"):
|
||||
self.log.debug(os.environ.get("OPENPYPE_PUBLISH_DATA"))
|
||||
paths = os.environ["OPENPYPE_PUBLISH_DATA"].split(os.pathsep)
|
||||
self.log.info("Collecting paths: {}".format(paths))
|
||||
else:
|
||||
cwd = context.get("workspaceDir", os.getcwd())
|
||||
paths = [cwd]
|
||||
|
||||
for path in paths:
|
||||
|
||||
self.log.info("Loading: {}".format(path))
|
||||
|
||||
if path.endswith(".json"):
|
||||
# Search using .json configuration
|
||||
with open(path, "r") as f:
|
||||
try:
|
||||
data = json.load(f)
|
||||
except Exception as exc:
|
||||
self.log.error("Error loading json: "
|
||||
"{} - Exception: {}".format(path, exc))
|
||||
raise
|
||||
|
||||
cwd = os.path.dirname(path)
|
||||
root_override = data.get("root")
|
||||
if root_override:
|
||||
if os.path.isabs(root_override):
|
||||
root = root_override
|
||||
else:
|
||||
root = os.path.join(cwd, root_override)
|
||||
else:
|
||||
root = cwd
|
||||
|
||||
metadata = data.get("metadata")
|
||||
if metadata:
|
||||
session = metadata.get("session")
|
||||
if session:
|
||||
self.log.info("setting session using metadata")
|
||||
api.Session.update(session)
|
||||
os.environ.update(session)
|
||||
|
||||
else:
|
||||
# Search in directory
|
||||
data = {}
|
||||
root = path
|
||||
|
||||
self.log.info("Collecting: {}".format(root))
|
||||
regex = data.get("regex")
|
||||
if regex:
|
||||
self.log.info("Using regex: {}".format(regex))
|
||||
|
||||
collections = collect(root=root,
|
||||
regex=regex,
|
||||
exclude_regex=data.get("exclude_regex"),
|
||||
frame_start=data.get("frameStart"),
|
||||
frame_end=data.get("frameEnd"))
|
||||
|
||||
self.log.info("Found collections: {}".format(collections))
|
||||
|
||||
if data.get("subset") and len(collections) > 1:
|
||||
self.log.error("Forced subset can only work with a single "
|
||||
"found sequence")
|
||||
raise RuntimeError("Invalid sequence")
|
||||
|
||||
fps = data.get("fps", 25)
|
||||
|
||||
# Get family from the data
|
||||
families = data.get("families", ["render"])
|
||||
if "render" not in families:
|
||||
families.append("render")
|
||||
if "ftrack" not in families:
|
||||
families.append("ftrack")
|
||||
if "review" not in families:
|
||||
families.append("review")
|
||||
|
||||
for collection in collections:
|
||||
instance = context.create_instance(str(collection))
|
||||
self.log.info("Collection: %s" % list(collection))
|
||||
|
||||
# Ensure each instance gets a unique reference to the data
|
||||
data = copy.deepcopy(data)
|
||||
|
||||
# If no subset provided, get it from collection's head
|
||||
subset = data.get("subset", collection.head.rstrip("_. "))
|
||||
|
||||
# If no start or end frame provided, get it from collection
|
||||
indices = list(collection.indexes)
|
||||
start = data.get("frameStart", indices[0])
|
||||
end = data.get("frameEnd", indices[-1])
|
||||
|
||||
# root = os.path.normpath(root)
|
||||
# self.log.info("Source: {}}".format(data.get("source", "")))
|
||||
|
||||
ext = list(collection)[0].split('.')[-1]
|
||||
|
||||
instance.data.update({
|
||||
"name": str(collection),
|
||||
"family": families[0], # backwards compatibility / pyblish
|
||||
"families": list(families),
|
||||
"subset": subset,
|
||||
"asset": data.get("asset", api.Session["AVALON_ASSET"]),
|
||||
"stagingDir": root,
|
||||
"frameStart": start,
|
||||
"frameEnd": end,
|
||||
"fps": fps,
|
||||
"source": data.get('source', '')
|
||||
})
|
||||
instance.append(collection)
|
||||
instance.context.data['fps'] = fps
|
||||
|
||||
if "representations" not in instance.data:
|
||||
instance.data["representations"] = []
|
||||
|
||||
representation = {
|
||||
'name': ext,
|
||||
'ext': '{}'.format(ext),
|
||||
'files': list(collection),
|
||||
"stagingDir": root,
|
||||
"anatomy_template": "render",
|
||||
"fps": fps,
|
||||
"tags": ['review']
|
||||
}
|
||||
instance.data["representations"].append(representation)
|
||||
|
||||
if data.get('user'):
|
||||
context.data["user"] = data['user']
|
||||
|
||||
self.log.debug("Collected instance:\n"
|
||||
"{}".format(pformat(instance.data)))
|
||||
|
|
@ -0,0 +1,46 @@
|
|||
# -*- coding: utf-8 -*-
|
||||
"""Module providing support for Royal Render."""
|
||||
import os
|
||||
import openpype.modules
|
||||
from openpype.modules import OpenPypeModule
|
||||
from openpype_interfaces import IPluginPaths
|
||||
|
||||
|
||||
class RoyalRenderModule(OpenPypeModule, IPluginPaths):
|
||||
"""Class providing basic Royal Render implementation logic."""
|
||||
name = "royalrender"
|
||||
|
||||
@property
|
||||
def api(self):
|
||||
if not self._api:
|
||||
# import royal render modules
|
||||
from . import api as rr_api
|
||||
self._api = rr_api.Api(self.settings)
|
||||
|
||||
return self._api
|
||||
|
||||
def __init__(self, manager, settings):
|
||||
# type: (openpype.modules.base.ModulesManager, dict) -> None
|
||||
self.rr_paths = {}
|
||||
self._api = None
|
||||
self.settings = settings
|
||||
super(RoyalRenderModule, self).__init__(manager, settings)
|
||||
|
||||
def initialize(self, module_settings):
|
||||
# type: (dict) -> None
|
||||
rr_settings = module_settings[self.name]
|
||||
self.enabled = rr_settings["enabled"]
|
||||
self.rr_paths = rr_settings.get("rr_paths")
|
||||
|
||||
@staticmethod
|
||||
def get_plugin_paths():
|
||||
# type: () -> dict
|
||||
"""Royal Render plugin paths.
|
||||
|
||||
Returns:
|
||||
dict: Dictionary of plugin paths for RR.
|
||||
"""
|
||||
current_dir = os.path.dirname(os.path.abspath(__file__))
|
||||
return {
|
||||
"publish": [os.path.join(current_dir, "plugins", "publish")]
|
||||
}
|
||||
256
openpype/modules/default_modules/royal_render/rr_job.py
Normal file
256
openpype/modules/default_modules/royal_render/rr_job.py
Normal file
|
|
@ -0,0 +1,256 @@
|
|||
# -*- coding: utf-8 -*-
|
||||
"""Python wrapper for RoyalRender XML job file."""
|
||||
from xml.dom import minidom as md
|
||||
import attr
|
||||
from collections import namedtuple, OrderedDict
|
||||
|
||||
|
||||
CustomAttribute = namedtuple("CustomAttribute", ["name", "value"])
|
||||
|
||||
|
||||
@attr.s
|
||||
class RRJob:
|
||||
"""Mapping of Royal Render job file to a data class."""
|
||||
|
||||
# Required
|
||||
# --------
|
||||
|
||||
# Name of your render application. Same as in the render config file.
|
||||
# (Maya, Softimage)
|
||||
Software = attr.ib() # type: str
|
||||
|
||||
# The OS the scene was created on, all texture paths are set on
|
||||
# that OS. Possible values are windows, linux, osx
|
||||
SceneOS = attr.ib() # type: str
|
||||
|
||||
# Renderer you use. Same as in the render config file
|
||||
# (VRay, Mental Ray, Arnold)
|
||||
Renderer = attr.ib() # type: str
|
||||
|
||||
# Version you want to render with. (5.11, 2010, 12)
|
||||
Version = attr.ib() # type: str
|
||||
|
||||
# Name of the scene file with full path.
|
||||
SceneName = attr.ib() # type: str
|
||||
|
||||
# Is the job enabled for submission?
|
||||
# enabled by default
|
||||
IsActive = attr.ib() # type: str
|
||||
|
||||
# Sequence settings of this job
|
||||
SeqStart = attr.ib() # type: int
|
||||
SeqEnd = attr.ib() # type: int
|
||||
SeqStep = attr.ib() # type: int
|
||||
SeqFileOffset = attr.ib() # type: int
|
||||
|
||||
# If you specify ImageDir, then ImageFilename has no path. If you do
|
||||
# NOT specify ImageDir, then ImageFilename has to include the path.
|
||||
# Same for ImageExtension.
|
||||
# Important: Do not forget any _ or . in front or after the frame
|
||||
# numbering. Usually ImageExtension always starts with a . (.tga, .exr)
|
||||
ImageDir = attr.ib() # type: str
|
||||
ImageFilename = attr.ib() # type: str
|
||||
ImageExtension = attr.ib() # type: str
|
||||
|
||||
# Some applications always add a . or _ in front of the frame number.
|
||||
# Set this variable to that character. The user can then change
|
||||
# the filename at the rrSubmitter and the submitter keeps
|
||||
# track of this character.
|
||||
ImagePreNumberLetter = attr.ib() # type: str
|
||||
|
||||
# If you render a single file, e.g. Quicktime or Avi, then you have to
|
||||
# set this value. Videos have to be rendered at once on one client.
|
||||
ImageSingleOutputFile = attr.ib(default="false") # type: str
|
||||
|
||||
# Semi-Required (required for some render applications)
|
||||
# -----------------------------------------------------
|
||||
|
||||
# The database of your scene file. In Maya and XSI called "project",
|
||||
# in Lightwave "content dir"
|
||||
SceneDatabaseDir = attr.ib(default=None) # type: str
|
||||
|
||||
# Required if you want to split frames on multiple clients
|
||||
ImageWidth = attr.ib(default=None) # type: int
|
||||
ImageHeight = attr.ib(default=None) # type: int
|
||||
Camera = attr.ib(default=None) # type: str
|
||||
Layer = attr.ib(default=None) # type: str
|
||||
Channel = attr.ib(default=None) # type: str
|
||||
|
||||
# Optional
|
||||
# --------
|
||||
|
||||
# Used for the RR render license function.
|
||||
# E.g. If you render with mentalRay, then add mentalRay. If you render
|
||||
# with Nuke and you use Furnace plugins in your comp, add Furnace.
|
||||
# TODO: determine how this work for multiple plugins
|
||||
RequiredPlugins = attr.ib(default=None) # type: str
|
||||
|
||||
# Frame Padding of the frame number in the rendered filename.
|
||||
# Some render config files are setting the padding at render time.
|
||||
ImageFramePadding = attr.ib(default=None) # type: str
|
||||
|
||||
# Some render applications support overriding the image format at
|
||||
# the render commandline.
|
||||
OverrideImageFormat = attr.ib(default=None) # type: str
|
||||
|
||||
# rrControl can display the name of additional channels that are
|
||||
# rendered. Each channel requires these two values. ChannelFilename
|
||||
# contains the full path.
|
||||
ChannelFilename = attr.ib(default=None) # type: str
|
||||
ChannelExtension = attr.ib(default=None) # type: str
|
||||
|
||||
# A value between 0 and 255. Each job gets the Pre ID attached as small
|
||||
# letter to the main ID. A new main ID is generated for every machine
|
||||
# for every 5/1000s.
|
||||
PreID = attr.ib(default=None) # type: int
|
||||
|
||||
# When the job is received by the server, the server checks for other
|
||||
# jobs send from this machine. If a job with the PreID was found, then
|
||||
# this jobs waits for the other job. Note: This flag can be used multiple
|
||||
# times to wait for multiple jobs.
|
||||
WaitForPreID = attr.ib(default=None) # type: int
|
||||
|
||||
# List of submitter options per job
|
||||
# list item must be of `SubmitterParameter` type
|
||||
SubmitterParameters = attr.ib(factory=list) # type: list
|
||||
|
||||
# List of Custom job attributes
|
||||
# Royal Render support custom attributes in format <CustomFoo> or
|
||||
# <CustomSomeOtherAttr>
|
||||
# list item must be of `CustomAttribute` named tuple
|
||||
CustomAttributes = attr.ib(factory=list) # type: list
|
||||
|
||||
# Additional information for subsequent publish script and
|
||||
# for better display in rrControl
|
||||
UserName = attr.ib(default=None) # type: str
|
||||
CustomSeQName = attr.ib(default=None) # type: str
|
||||
CustomSHotName = attr.ib(default=None) # type: str
|
||||
CustomVersionName = attr.ib(default=None) # type: str
|
||||
CustomUserInfo = attr.ib(default=None) # type: str
|
||||
SubmitMachine = attr.ib(default=None) # type: str
|
||||
Color_ID = attr.ib(default=2) # type: int
|
||||
|
||||
RequiredLicenses = attr.ib(default=None) # type: str
|
||||
|
||||
# Additional frame info
|
||||
Priority = attr.ib(default=50) # type: int
|
||||
TotalFrames = attr.ib(default=None) # type: int
|
||||
Tiled = attr.ib(default=None) # type: str
|
||||
|
||||
|
||||
class SubmitterParameter:
|
||||
"""Wrapper for Submitter Parameters."""
|
||||
def __init__(self, parameter, *args):
|
||||
# type: (str, list) -> None
|
||||
self._parameter = parameter
|
||||
self._values = args
|
||||
|
||||
def serialize(self):
|
||||
# type: () -> str
|
||||
"""Serialize submitter parameter as a string value.
|
||||
|
||||
This can be later on used as text node in job xml file.
|
||||
|
||||
Returns:
|
||||
str: concatenated string of parameter values.
|
||||
|
||||
"""
|
||||
return '"{param}={val}"'.format(
|
||||
param=self._parameter, val="~".join(self._values))
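A quick sketch of what serialize() produces; the parameter name and values are made up:

    param = SubmitterParameter("PPAssignUsers", "1", "jdoe")
    print(param.serialize())  # prints "PPAssignUsers=1~jdoe" (the quotes are part of the value)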
|
||||
|
||||
|
||||
@attr.s
|
||||
class SubmitFile:
|
||||
"""Class wrapping Royal Render submission XML file."""
|
||||
|
||||
# Syntax version of the submission file.
|
||||
syntax_version = attr.ib(default="6.0") # type: str
|
||||
|
||||
# Delete submission file after processing
|
||||
DeleteXML = attr.ib(default=1) # type: int
|
||||
|
||||
# List of submitter options per job
|
||||
# list item must be of `SubmitterParameter` type
|
||||
SubmitterParameters = attr.ib(factory=list) # type: list
|
||||
|
||||
# List of jobs in submission batch.
|
||||
# list item must be of type `RRJob`
|
||||
Jobs = attr.ib(factory=list) # type: list
|
||||
|
||||
@staticmethod
|
||||
def _process_submitter_parameters(parameters, dom, append_to):
|
||||
# type: (list[SubmitterParameter], md.Document, md.Element) -> None
|
||||
"""Take list of :class:`SubmitterParameter` and process it as XML.
|
||||
|
||||
This will take :class:`SubmitterParameter`, create XML element
|
||||
for them and convert value to Royal Render compatible string
|
||||
(options and values separated by ~)
|
||||
|
||||
Args:
|
||||
parameters (list of SubmitterParameter): List of parameters.
|
||||
dom (xml.dom.minidom.Document): XML Document
|
||||
append_to (xml.dom.minidom.Element): Element to append to.
|
||||
|
||||
"""
|
||||
for param in parameters:
|
||||
if not isinstance(param, SubmitterParameter):
|
||||
raise AttributeError(
|
||||
"{} is not of type `SubmitterParameter`".format(param))
|
||||
xml_parameter = dom.createElement("SubmitterParameter")
|
||||
xml_parameter.appendChild(dom.createTextNode(param.serialize()))
|
||||
append_to.appendChild(xml_parameter)
|
||||
|
||||
def serialize(self):
|
||||
# type: () -> str
|
||||
"""Return all data serialized as XML.
|
||||
|
||||
Returns:
|
||||
str: XML data as string.
|
||||
|
||||
"""
|
||||
def filter_data(a, v):
|
||||
"""Skip private attributes."""
|
||||
if a.name.startswith("_"):
|
||||
return False
|
||||
if v is None:
|
||||
return False
|
||||
return True
|
||||
|
||||
root = md.Document()
|
||||
# root element: <RR_Job_File syntax_version="6.0">
|
||||
job_file = root.createElement('RR_Job_File')
|
||||
job_file.setAttribute("syntax_version", self.syntax_version)
|
||||
|
||||
# handle Submitter Parameters for batch
|
||||
# <SubmitterParameter>foo=bar~baz~goo</SubmitterParameter>
|
||||
self._process_submitter_parameters(
|
||||
self.SubmitterParameters, root, job_file)
|
||||
|
||||
for job in self.Jobs: # type: RRJob
|
||||
if not isinstance(job, RRJob):
|
||||
raise AttributeError(
|
||||
"{} is not of type `SubmitterParameter`".format(job))
|
||||
xml_job = root.createElement("Job")
|
||||
# handle Submitter Parameters for job
|
||||
self._process_submitter_parameters(
|
||||
job.SubmitterParameters, root, xml_job
|
||||
)
|
||||
job_custom_attributes = job.CustomAttributes
|
||||
|
||||
serialized_job = attr.asdict(
|
||||
job, dict_factory=OrderedDict, filter=filter_data)
|
||||
serialized_job.pop("CustomAttributes")
|
||||
serialized_job.pop("SubmitterParameters")
|
||||
|
||||
for custom_attr in job_custom_attributes: # type: CustomAttribute
|
||||
serialized_job["Custom{}".format(
|
||||
custom_attr.name)] = custom_attr.value
|
||||
|
||||
for item, value in serialized_job.items():
|
||||
xml_attr = root.createElement(item)
|
||||
xml_attr.appendChild(
|
||||
root.createTextNode(str(value))
|
||||
)
|
||||
xml_job.appendChild(xml_attr)
|
||||
|
||||
return root.toprettyxml(indent="\t")
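A hedged sketch of how the classes above fit together; every value is a placeholder and only the required RRJob fields are filled in:

    job = RRJob(
        Software="Maya",
        SceneOS="windows",
        Renderer="Arnold",
        Version="2020",
        SceneName="C:/projects/shot01/scene.ma",
        IsActive="true",
        SeqStart=1001,
        SeqEnd=1100,
        SeqStep=1,
        SeqFileOffset=0,
        ImageDir="C:/renders/shot01",
        ImageFilename="shot01.",
        ImageExtension=".exr",
        ImagePreNumberLetter=".",
    )
    submission = SubmitFile(
        SubmitterParameters=[SubmitterParameter("Priority", "50")],
        Jobs=[job],
    )
    with open("rr_submission.xml", "w") as f:
        f.write(submission.serialize())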
|
||||
|
|
@@ -0,0 +1,5 @@
## OpenPype RoyalRender integration plugins

### Installation

Copy the content of this folder to your `RR_ROOT` (the place where the studio-wide RoyalRender installation lives).
@ -0,0 +1,170 @@
|
|||
# -*- coding: utf-8 -*-
|
||||
"""This is RR control plugin that runs on the job by user interaction.
|
||||
|
||||
It asks user for context to publish, getting it from OpenPype. In order to
|
||||
run it needs `OPENPYPE_ROOT` to be set to know where to execute OpenPype.
|
||||
|
||||
"""
|
||||
import rr # noqa
|
||||
import rrGlobal # noqa
|
||||
import subprocess
|
||||
import os
|
||||
import glob
|
||||
import platform
|
||||
import tempfile
|
||||
import json
|
||||
|
||||
|
||||
class OpenPypeContextSelector:
|
||||
"""Class to handle publishing context determination in RR."""
|
||||
|
||||
def __init__(self):
|
||||
self.job = rr.getJob()
|
||||
self.context = None
|
||||
|
||||
self.openpype_executable = "openpype_gui"
|
||||
if platform.system().lower() == "windows":
|
||||
self.openpype_executable = "{}.exe".format(
|
||||
self.openpype_executable)
|
||||
|
||||
op_path = os.environ.get("OPENPYPE_ROOT")
|
||||
print("initializing ... {}".format(op_path))
|
||||
if not op_path:
|
||||
print("Warning: OpenPype root is not found.")
|
||||
|
||||
if platform.system().lower() == "windows":
|
||||
print(" * trying to find OpenPype on local computer.")
|
||||
op_path = os.path.join(
|
||||
os.environ.get("PROGRAMFILES"),
|
||||
"OpenPype", "openpype_console.exe"
|
||||
)
|
||||
if os.path.exists(op_path):
|
||||
print(" - found OpenPype installation {}".format(op_path))
|
||||
else:
|
||||
# try to find in user local context
|
||||
op_path = os.path.join(
|
||||
os.environ.get("LOCALAPPDATA"),
|
||||
"Programs",
|
||||
"OpenPype", "openpype_console.exe"
|
||||
)
|
||||
if os.path.exists(op_path):
|
||||
print(
|
||||
" - found OpenPype installation {}".format(
|
||||
op_path))
|
||||
else:
|
||||
raise Exception("Error: OpenPype was not found.")
|
||||
|
||||
self.openpype_root = op_path
|
||||
|
||||
# TODO: this should try to find metadata file. Either using
|
||||
# jobs custom attributes or using environment variable
|
||||
# or just using plain existence of file.
|
||||
# self.context = self._process_metadata_file()
|
||||
|
||||
def _process_metadata_file(self):
|
||||
"""Find and process metadata file.
|
||||
|
||||
Try to find metadata json file in job folder to get context from.
|
||||
|
||||
Returns:
|
||||
dict: Context from metadata json file.
|
||||
|
||||
"""
|
||||
image_dir = self.job.imageDir
|
||||
metadata_files = glob.glob(
|
||||
"{}{}*_metadata.json".format(image_dir, os.path.sep))
|
||||
if not metadata_files:
|
||||
return {}
|
||||
|
||||
raise NotImplementedError(
|
||||
"Processing existing metadata not implemented yet.")
|
||||
|
||||
def process_job(self):
|
||||
"""Process selected job.
|
||||
|
||||
This should process selected job. If context can be determined
|
||||
automatically, no UI will be show and publishing will directly
|
||||
proceed.
|
||||
"""
|
||||
if not self.context:
|
||||
self.show()
|
||||
|
||||
self.context["user"] = self.job.userName
|
||||
self.run_publish()
|
||||
|
||||
def show(self):
|
||||
"""Show UI for context selection.
|
||||
|
||||
Because of RR UI limitations, this must be done using OpenPype
|
||||
itself.
|
||||
|
||||
"""
|
||||
tf = tempfile.NamedTemporaryFile(delete=False)
|
||||
context_file = tf.name
|
||||
op_args = [os.path.join(self.openpype_root, self.openpype_executable),
|
||||
"contextselection", tf.name]
|
||||
|
||||
tf.close()
|
||||
print(">>> running {}".format(" ".join(op_args)))
|
||||
|
||||
subprocess.call(op_args)
|
||||
|
||||
with open(context_file, "r") as cf:
|
||||
self.context = json.load(cf)
|
||||
|
||||
os.unlink(context_file)
|
||||
print(">>> context: {}".format(self.context))
|
||||
|
||||
if not self.context or \
|
||||
not self.context.get("project") or \
|
||||
not self.context.get("asset") or \
|
||||
not self.context.get("task"):
|
||||
self._show_rr_warning("Context selection failed.")
|
||||
return
|
||||
|
||||
# self.context["app_name"] = self.job.renderer.name
|
||||
self.context["app_name"] = "maya/2020"
|
||||
|
||||
@staticmethod
|
||||
def _show_rr_warning(text):
|
||||
warning_dialog = rrGlobal.getGenericUI()
|
||||
warning_dialog.addItem(rrGlobal.genUIType.label, "infoLabel", "")
|
||||
warning_dialog.setText("infoLabel", text)
|
||||
warning_dialog.addItem(
|
||||
rrGlobal.genUIType.layoutH, "btnLayout", "")
|
||||
warning_dialog.addItem(
|
||||
rrGlobal.genUIType.closeButton, "Ok", "btnLayout")
|
||||
warning_dialog.execute()
|
||||
del warning_dialog
|
||||
|
||||
def run_publish(self):
|
||||
"""Run publish process."""
|
||||
env = {'AVALON_PROJECT': str(self.context.get("project")),
|
||||
"AVALON_ASSET": str(self.context.get("asset")),
|
||||
"AVALON_TASK": str(self.context.get("task")),
|
||||
"AVALON_APP_NAME": str(self.context.get("app_name"))}
|
||||
|
||||
print(">>> setting environment:")
|
||||
for k, v in env.items():
|
||||
print(" {}: {}".format(k, v))
|
||||
|
||||
args = [os.path.join(self.openpype_root, self.openpype_executable),
|
||||
'publish', '-t', "rr_control", "--gui",
|
||||
os.path.join(self.job.imageDir,
|
||||
os.path.dirname(self.job.imageFileName))
|
||||
]
|
||||
|
||||
print(">>> running {}".format(" ".join(args)))
|
||||
orig = os.environ.copy()
|
||||
orig.update(env)
|
||||
try:
|
||||
subprocess.call(args, env=orig)
|
||||
except subprocess.CalledProcessError as e:
|
||||
self._show_rr_warning(" Publish failed [ {} ]".format(
|
||||
e.returncode
|
||||
))
|
||||
|
||||
|
||||
print("running selector")
|
||||
selector = OpenPypeContextSelector()
|
||||
selector.process_job()
|
||||
|
|
@@ -1,30 +0,0 @@
from abc import abstractmethod
from openpype.modules import OpenPypeInterface


class ISettingsChangeListener(OpenPypeInterface):
"""Module has plugin paths to return.

Expected result is dictionary with keys "publish", "create", "load" or
"actions" and values as list or string.
{
"publish": ["path/to/publish_plugins"]
}
"""
@abstractmethod
def on_system_settings_save(
self, old_value, new_value, changes, new_value_metadata
):
pass

@abstractmethod
def on_project_settings_save(
self, old_value, new_value, changes, project_name, new_value_metadata
):
pass

@abstractmethod
def on_project_anatomy_save(
self, old_value, new_value, changes, project_name, new_value_metadata
):
pass
@ -24,25 +24,19 @@ class DropboxHandler(AbstractProvider):
|
|||
)
|
||||
return
|
||||
|
||||
provider_presets = self.presets.get(self.CODE)
|
||||
if not provider_presets:
|
||||
msg = "Sync Server: No provider presets for {}".format(self.CODE)
|
||||
log.info(msg)
|
||||
return
|
||||
|
||||
token = self.presets[self.CODE].get("token", "")
|
||||
token = self.presets.get("token", "")
|
||||
if not token:
|
||||
msg = "Sync Server: No access token for dropbox provider"
|
||||
log.info(msg)
|
||||
return
|
||||
|
||||
team_folder_name = self.presets[self.CODE].get("team_folder_name", "")
|
||||
team_folder_name = self.presets.get("team_folder_name", "")
|
||||
if not team_folder_name:
|
||||
msg = "Sync Server: No team folder name for dropbox provider"
|
||||
log.info(msg)
|
||||
return
|
||||
|
||||
acting_as_member = self.presets[self.CODE].get("acting_as_member", "")
|
||||
acting_as_member = self.presets.get("acting_as_member", "")
|
||||
if not acting_as_member:
|
||||
msg = (
|
||||
"Sync Server: No acting member for dropbox provider"
|
||||
|
|
@ -51,13 +45,15 @@ class DropboxHandler(AbstractProvider):
|
|||
return
|
||||
|
||||
self.dbx = None
|
||||
try:
|
||||
self.dbx = self._get_service(
|
||||
token, acting_as_member, team_folder_name
|
||||
)
|
||||
except Exception as e:
|
||||
log.info("Could not establish dropbox object: {}".format(e))
|
||||
return
|
||||
|
||||
if self.presets["enabled"]:
|
||||
try:
|
||||
self.dbx = self._get_service(
|
||||
token, acting_as_member, team_folder_name
|
||||
)
|
||||
except Exception as e:
|
||||
log.info("Could not establish dropbox object: {}".format(e))
|
||||
return
|
||||
|
||||
super(AbstractProvider, self).__init__()
|
||||
|
||||
|
|
@ -101,12 +97,12 @@ class DropboxHandler(AbstractProvider):
|
|||
},
|
||||
# roots could be overridden only on Project level, User cannot
|
||||
{
|
||||
"key": "roots",
|
||||
"key": "root",
|
||||
"label": "Roots",
|
||||
"type": "dict-roots",
|
||||
"object_type": {
|
||||
"type": "path",
|
||||
"multiplatform": True,
|
||||
"multiplatform": False,
|
||||
"multipath": False
|
||||
}
|
||||
}
|
||||
|
|
@ -169,7 +165,7 @@ class DropboxHandler(AbstractProvider):
|
|||
Returns:
|
||||
(boolean)
|
||||
"""
|
||||
return self.dbx is not None
|
||||
return self.presets["enabled"] and self.dbx is not None
|
||||
|
||||
@classmethod
|
||||
def get_configurable_items(cls):
|
||||
|
|
|
|||
|
|
@ -73,13 +73,7 @@ class GDriveHandler(AbstractProvider):
|
|||
format(site_name))
|
||||
return
|
||||
|
||||
provider_presets = self.presets.get(self.CODE)
|
||||
if not provider_presets:
|
||||
msg = "Sync Server: No provider presets for {}".format(self.CODE)
|
||||
log.info(msg)
|
||||
return
|
||||
|
||||
cred_path = self.presets[self.CODE].get("credentials_url", {}).\
|
||||
cred_path = self.presets.get("credentials_url", {}).\
|
||||
get(platform.system().lower()) or ''
|
||||
if not os.path.exists(cred_path):
|
||||
msg = "Sync Server: No credentials for gdrive provider " + \
|
||||
|
|
@ -87,10 +81,12 @@ class GDriveHandler(AbstractProvider):
|
|||
log.info(msg)
|
||||
return
|
||||
|
||||
self.service = self._get_gd_service(cred_path)
|
||||
self.service = None
|
||||
if self.presets["enabled"]:
|
||||
self.service = self._get_gd_service(cred_path)
|
||||
|
||||
self._tree = tree
|
||||
self.active = True
|
||||
self._tree = tree
|
||||
self.active = True
|
||||
|
||||
def is_active(self):
|
||||
"""
|
||||
|
|
@ -98,7 +94,7 @@ class GDriveHandler(AbstractProvider):
|
|||
Returns:
|
||||
(boolean)
|
||||
"""
|
||||
return self.service is not None
|
||||
return self.presets["enabled"] and self.service is not None
|
||||
|
||||
@classmethod
|
||||
def get_system_settings_schema(cls):
|
||||
|
|
@ -125,18 +121,20 @@ class GDriveHandler(AbstractProvider):
|
|||
editable = [
|
||||
# credentials could be overriden on Project or User level
|
||||
{
|
||||
'key': "credentials_url",
|
||||
'label': "Credentials url",
|
||||
'type': 'text'
|
||||
"type": "path",
|
||||
"key": "credentials_url",
|
||||
"label": "Credentials url",
|
||||
"multiplatform": True,
|
||||
"placeholder": "Credentials url"
|
||||
},
|
||||
# roots could be overridden only on Project level, User cannot
|
||||
{
|
||||
"key": "roots",
|
||||
"key": "root",
|
||||
"label": "Roots",
|
||||
"type": "dict-roots",
|
||||
"object_type": {
|
||||
"type": "path",
|
||||
"multiplatform": True,
|
||||
"multiplatform": False,
|
||||
"multipath": False
|
||||
}
|
||||
}
|
||||
|
|
|
|||
|
|
@ -50,7 +50,7 @@ class LocalDriveHandler(AbstractProvider):
|
|||
# for non 'studio' sites, 'studio' is configured in Anatomy
|
||||
editable = [
|
||||
{
|
||||
"key": "roots",
|
||||
"key": "root",
|
||||
"label": "Roots",
|
||||
"type": "dict-roots",
|
||||
"object_type": {
|
||||
|
|
@ -73,7 +73,7 @@ class LocalDriveHandler(AbstractProvider):
|
|||
"""
|
||||
editable = [
|
||||
{
|
||||
'key': "roots",
|
||||
'key': "root",
|
||||
'label': "Roots",
|
||||
'type': 'dict'
|
||||
}
|
||||
|
|
@ -89,6 +89,7 @@ class LocalDriveHandler(AbstractProvider):
|
|||
if not os.path.isfile(source_path):
|
||||
raise FileNotFoundError("Source file {} doesn't exist."
|
||||
.format(source_path))
|
||||
|
||||
if overwrite:
|
||||
thread = threading.Thread(target=self._copy,
|
||||
args=(source_path, target_path))
|
||||
|
|
@ -181,7 +182,10 @@ class LocalDriveHandler(AbstractProvider):
|
|||
|
||||
def _copy(self, source_path, target_path):
|
||||
print("copying {}->{}".format(source_path, target_path))
|
||||
shutil.copy(source_path, target_path)
|
||||
try:
|
||||
shutil.copy(source_path, target_path)
|
||||
except shutil.SameFileError:
|
||||
print("same files, skipping")
|
||||
|
||||
def _mark_progress(self, collection, file, representation, server, site,
|
||||
source_path, target_path, direction):
|
||||
|
|
|
|||
|
|
@@ -1,8 +1,6 @@
import os
import os.path
import time
import sys
import six
import threading
import platform

@@ -14,6 +12,7 @@ log = Logger().get_logger("SyncServer")
pysftp = None
try:
    import pysftp
    import paramiko
except (ImportError, SyntaxError):
    pass

@@ -37,7 +36,6 @@ class SFTPHandler(AbstractProvider):

    def __init__(self, project_name, site_name, tree=None, presets=None):
        self.presets = None
        self.active = False
        self.project_name = project_name
        self.site_name = site_name
        self.root = None

@@ -49,22 +47,15 @@ class SFTPHandler(AbstractProvider):
                        format(site_name))
            return

        provider_presets = self.presets.get(self.CODE)
        if not provider_presets:
            msg = "Sync Server: No provider presets for {}".format(self.CODE)
            log.warning(msg)
            return

        # store to instance for reconnect
        self.sftp_host = provider_presets["sftp_host"]
        self.sftp_port = provider_presets["sftp_port"]
        self.sftp_user = provider_presets["sftp_user"]
        self.sftp_pass = provider_presets["sftp_pass"]
        self.sftp_key = provider_presets["sftp_key"]
        self.sftp_key_pass = provider_presets["sftp_key_pass"]
        self.sftp_host = presets["sftp_host"]
        self.sftp_port = presets["sftp_port"]
        self.sftp_user = presets["sftp_user"]
        self.sftp_pass = presets["sftp_pass"]
        self.sftp_key = presets["sftp_key"]
        self.sftp_key_pass = presets["sftp_key_pass"]

        self._tree = None
        self.active = True

    @property
    def conn(self):

@@ -80,7 +71,7 @@ class SFTPHandler(AbstractProvider):
        Returns:
            (boolean)
        """
        return self.conn is not None
        return self.presets["enabled"] and self.conn is not None

    @classmethod
    def get_system_settings_schema(cls):

@@ -108,7 +99,7 @@ class SFTPHandler(AbstractProvider):
        editable = [
            # credentials could be overriden on Project or User level
            {
                'key': "sftp_server",
                'key': "sftp_host",
                'label': "SFTP host name",
                'type': 'text'
            },

@@ -130,7 +121,8 @@ class SFTPHandler(AbstractProvider):
            {
                'key': "sftp_key",
                'label': "SFTP user ssh key",
                'type': 'path'
                'type': 'path',
                "multiplatform": True
            },
            {
                'key': "sftp_key_pass",

@@ -139,12 +131,12 @@ class SFTPHandler(AbstractProvider):
            },
            # roots could be overriden only on Project leve, User cannot
            {
                "key": "roots",
                "key": "root",
                "label": "Roots",
                "type": "dict-roots",
                "object_type": {
                    "type": "path",
                    "multiplatform": True,
                    "multiplatform": False,
                    "multipath": False
                }
            }

@@ -176,7 +168,8 @@ class SFTPHandler(AbstractProvider):
            {
                'key': "sftp_key",
                'label': "SFTP user ssh key",
                'type': 'path'
                'type': 'path',
                "multiplatform": True
            },
            {
                'key': "sftp_key_pass",

@@ -199,7 +192,7 @@ class SFTPHandler(AbstractProvider):
        Format is importing for usage of python's format ** approach
        """
        # roots cannot be locally overridden
        return self.presets['root']
        return self.presets['roots']

    def get_tree(self):
        """

@@ -426,7 +419,10 @@ class SFTPHandler(AbstractProvider):
        if self.sftp_key_pass:
            conn_params['private_key_pass'] = self.sftp_key_pass

        return pysftp.Connection(**conn_params)
        try:
            return pysftp.Connection(**conn_params)
        except paramiko.ssh_exception.SSHException:
            log.warning("Couldn't connect", exc_info=True)

    def _mark_progress(self, collection, file, representation, server, site,
                       source_path, target_path, direction):

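The hunk above wraps `pysftp.Connection` in a try/except for `paramiko.ssh_exception.SSHException` so a failed handshake is only logged. A hedged sketch of how such connection parameters are typically assembled (key names and the `make_connection` helper are illustrative; the real provider builds `conn_params` elsewhere in the class):

import pysftp
import paramiko


def make_connection(host, port, user, password=None, key=None, key_pass=None):
    """Return pysftp.Connection or None when the SSH handshake fails."""
    conn_params = {"host": host, "port": port, "username": user}
    if password:
        conn_params["password"] = password
    if key:
        conn_params["private_key"] = key
    if key_pass:
        conn_params["private_key_pass"] = key_pass

    try:
        return pysftp.Connection(**conn_params)
    except paramiko.ssh_exception.SSHException:
        print("Couldn't connect")
        return None
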
@@ -80,6 +80,10 @@ async def upload(module, collection, file, representation, provider_name,
            remote_site_name,
            True
        )

        module.handle_alternate_site(collection, representation, remote_site_name,
                                     file["_id"], file_id)

        return file_id


@@ -131,6 +135,10 @@ async def download(module, collection, file, representation, provider_name,
            local_site,
            True
        )

        module.handle_alternate_site(collection, representation, local_site,
                                     file["_id"], file_id)

        return file_id


@@ -246,6 +254,7 @@ class SyncServerThread(threading.Thread):

            asyncio.ensure_future(self.check_shutdown(), loop=self.loop)
            asyncio.ensure_future(self.sync_loop(), loop=self.loop)
            log.info("Sync Server Started")
            self.loop.run_forever()
        except Exception:
            log.warning(

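The `SyncServerThread` change above schedules `sync_loop` alongside `check_shutdown` on the thread's own event loop. A small sketch of that pattern, a dedicated asyncio loop running inside a `threading.Thread` (class and coroutine names are illustrative):

import asyncio
import threading


class LoopThread(threading.Thread):
    """Run an asyncio event loop in its own thread until stopped."""

    def __init__(self):
        super().__init__()
        self.loop = asyncio.new_event_loop()

    async def work(self):
        # placeholder coroutine standing in for sync_loop / check_shutdown
        while True:
            await asyncio.sleep(1)

    def run(self):
        asyncio.set_event_loop(self.loop)
        asyncio.ensure_future(self.work(), loop=self.loop)
        self.loop.run_forever()

    def stop(self):
        # schedule loop.stop() safely from the caller's thread
        self.loop.call_soon_threadsafe(self.loop.stop)
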
@@ -109,6 +109,7 @@ class SyncServerModule(OpenPypeModule, ITrayModule):

        # some parts of code need to run sequentially, not in async
        self.lock = None
        self._sync_system_settings = None
        # settings for all enabled projects for sync
        self._sync_project_settings = None
        self.sync_server_thread = None  # asyncio requires new thread

@@ -152,9 +153,9 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
        if not site_name:
            site_name = self.DEFAULT_SITE

        self.reset_provider_for_file(collection,
                                     representation_id,
                                     site_name=site_name, force=force)
        self.reset_site_on_representation(collection,
                                          representation_id,
                                          site_name=site_name, force=force)

    # public facing API
    def remove_site(self, collection, representation_id, site_name,

@@ -176,10 +177,10 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
        if not self.get_sync_project_setting(collection):
            raise ValueError("Project not configured")

        self.reset_provider_for_file(collection,
                                     representation_id,
                                     site_name=site_name,
                                     remove=True)
        self.reset_site_on_representation(collection,
                                          representation_id,
                                          site_name=site_name,
                                          remove=True)
        if remove_local_files:
            self._remove_local_file(collection, representation_id, site_name)

@@ -314,8 +315,8 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
        """
        log.info("Pausing SyncServer for {}".format(representation_id))
        self._paused_representations.add(representation_id)
        self.reset_provider_for_file(collection, representation_id,
                                     site_name=site_name, pause=True)
        self.reset_site_on_representation(collection, representation_id,
                                          site_name=site_name, pause=True)

    def unpause_representation(self, collection, representation_id, site_name):
        """

@@ -334,8 +335,8 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
        except KeyError:
            pass
        # self.paused_representations is not persistent
        self.reset_provider_for_file(collection, representation_id,
                                     site_name=site_name, pause=False)
        self.reset_site_on_representation(collection, representation_id,
                                          site_name=site_name, pause=False)

    def is_representation_paused(self, representation_id,
                                 check_parents=False, project_name=None):

@@ -769,6 +770,58 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
                enabled_projects.append(project_name)

        return enabled_projects

    def handle_alternate_site(self, collection, representation, processed_site,
                              file_id, synced_file_id):
        """
            For special use cases where one site vendors another.

            Current use case is sftp site vendoring (exposing) same data as
            regular site (studio). Each site is accessible for different
            audience. 'studio' for artists in a studio, 'sftp' for externals.

            Change of file status on one site actually means same change on
            'alternate' site. (eg. artists publish to 'studio', 'sftp' is using
            same location >> file is accesible on 'sftp' site right away.

            Args:
                collection (str): name of project
                representation (dict)
                processed_site (str): real site_name of published/uploaded file
                file_id (ObjectId): DB id of file handled
                synced_file_id (str): id of the created file returned
                    by provider
        """
        sites = self.sync_system_settings.get("sites", {})
        sites[self.DEFAULT_SITE] = {"provider": "local_drive",
                                    "alternative_sites": []}

        alternate_sites = []
        for site_name, site_info in sites.items():
            conf_alternative_sites = site_info.get("alternative_sites", [])
            if processed_site in conf_alternative_sites:
                alternate_sites.append(site_name)
                continue
            if processed_site == site_name and conf_alternative_sites:
                alternate_sites.extend(conf_alternative_sites)
                continue

        alternate_sites = set(alternate_sites)

        for alt_site in alternate_sites:
            query = {
                "_id": representation["_id"]
            }
            elem = {"name": alt_site,
                    "created_dt": datetime.now(),
                    "id": synced_file_id}

            self.log.debug("Adding alternate {} to {}".format(
                alt_site, representation["_id"]))
            self._add_site(collection, query,
                           [representation], elem,
                           alt_site, file_id=file_id, force=True)

    """ End of Public API """

    def get_local_file_path(self, collection, site_name, file_path):

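The docstring above describes alternate-site vendoring; the loop below it resolves which extra sites should receive the same state change. A standalone sketch of that resolution, assuming a `sites` dict shaped like the sync server system settings (`{site_name: {"alternative_sites": [...]}}`):

def resolve_alternate_sites(sites, processed_site):
    """Return site names that mirror 'processed_site'."""
    alternate_sites = []
    for site_name, site_info in sites.items():
        conf_alternatives = site_info.get("alternative_sites", [])
        # 'processed_site' is listed as an alternative of this site
        if processed_site in conf_alternatives:
            alternate_sites.append(site_name)
            continue
        # 'processed_site' itself declares alternatives
        if processed_site == site_name and conf_alternatives:
            alternate_sites.extend(conf_alternatives)
    return set(alternate_sites)


# example: 'sftp' vendors whatever lands on 'studio'
sites = {"sftp": {"alternative_sites": ["studio"]}}
print(resolve_alternate_sites(sites, "studio"))  # {'sftp'}
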
@@ -799,12 +852,19 @@ class SyncServerModule(OpenPypeModule, ITrayModule):

    def tray_init(self):
        """
            Actual initialization of Sync Server.
            Actual initialization of Sync Server for Tray.

            Called when tray is initialized, it checks if module should be
            enabled. If not, no initialization necessary.
        """
        # import only in tray, because of Python2 hosts
        self.server_init()

        from .tray.app import SyncServerWindow
        self.widget = SyncServerWindow(self)

    def server_init(self):
        """Actual initialization of Sync Server."""
        # import only in tray or Python3, because of Python2 hosts
        from .sync_server import SyncServerThread

        if not self.enabled:

@@ -816,6 +876,7 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
            return

        self.lock = threading.Lock()

        self.sync_server_thread = SyncServerThread(self)

    def tray_start(self):

@@ -829,6 +890,9 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
        Returns:
            None
        """
        self.server_start()

    def server_start(self):
        if self.sync_project_settings and self.enabled:
            self.sync_server_thread.start()
        else:

@@ -841,6 +905,9 @@ class SyncServerModule(OpenPypeModule, ITrayModule):

        Called from Module Manager
        """
        self.server_exit()

    def server_exit(self):
        if not self.sync_server_thread:
            return

@@ -850,6 +917,7 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
            log.info("Stopping sync server server")
            self.sync_server_thread.is_running = False
            self.sync_server_thread.stop()
            log.info("Sync server stopped")
        except Exception:
            log.warning(
                "Error has happened during Killing sync server",

@@ -892,6 +960,14 @@ class SyncServerModule(OpenPypeModule, ITrayModule):

        return self._connection

    @property
    def sync_system_settings(self):
        if self._sync_system_settings is None:
            self._sync_system_settings = get_system_settings()["modules"].\
                get("sync_server")

        return self._sync_system_settings

    @property
    def sync_project_settings(self):
        if self._sync_project_settings is None:

@@ -977,9 +1053,7 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
            (dict): {'studio': {'provider':'local_drive'...},
                     'MY_LOCAL': {'provider':....}}
        """
        sys_sett = get_system_settings()
        sync_sett = sys_sett["modules"].get("sync_server")

        sync_sett = self.sync_system_settings
        project_enabled = True
        if project_name:
            project_enabled = project_name in self.get_enabled_projects()

@@ -1037,10 +1111,9 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
        if provider:
            return provider

        sys_sett = get_system_settings()
        sync_sett = sys_sett["modules"].get("sync_server")
        for site, detail in sync_sett.get("sites", {}).items():
            sites[site] = detail.get("provider")
        sync_sett = self.sync_system_settings
        for conf_site, detail in sync_sett.get("sites", {}).items():
            sites[conf_site] = detail.get("provider")

        return sites.get(site, 'N/A')


@@ -1319,9 +1392,9 @@ class SyncServerModule(OpenPypeModule, ITrayModule):

        return -1, None

    def reset_provider_for_file(self, collection, representation_id,
                                side=None, file_id=None, site_name=None,
                                remove=False, pause=None, force=False):
    def reset_site_on_representation(self, collection, representation_id,
                                     side=None, file_id=None, site_name=None,
                                     remove=False, pause=None, force=False):
        """
            Reset information about synchronization for particular 'file_id'
            and provider.

@@ -1407,9 +1480,12 @@ class SyncServerModule(OpenPypeModule, ITrayModule):
        update = {
            "$set": {"files.$[f].sites.$[s]": elem}
        }
        if not isinstance(file_id, ObjectId):
            file_id = ObjectId(file_id)

        arr_filter = [
            {'s.name': site_name},
            {'f._id': ObjectId(file_id)}
            {'f._id': file_id}
        ]

        self._update_site(collection, query, update, arr_filter)

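The `arr_filter` above feeds MongoDB's filtered positional operators (`$[f]`, `$[s]`); the diff only makes sure `file_id` is an `ObjectId` before it is used there. A hedged pymongo sketch of the same update shape (collection and field names follow the representation documents used here, the rest is illustrative):

from bson.objectid import ObjectId
from pymongo import MongoClient

collection = MongoClient()["avalon"]["project"]

file_id = "5f0c5e0e8c1d4b3a2f1e0d9c"  # may arrive as str or ObjectId
if not isinstance(file_id, ObjectId):
    file_id = ObjectId(file_id)

collection.update_one(
    {"type": "representation"},
    # rewrite the matching site entry of the matching file
    {"$set": {"files.$[f].sites.$[s]": {"name": "studio"}}},
    array_filters=[{"s.name": "studio"}, {"f._id": file_id}],
)
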
|
@ -4,6 +4,18 @@ from Qt import QtCore, QtWidgets, QtGui
|
|||
from openpype.lib import PypeLogger
|
||||
from . import lib
|
||||
|
||||
from openpype.tools.utils.constants import (
|
||||
LOCAL_PROVIDER_ROLE,
|
||||
REMOTE_PROVIDER_ROLE,
|
||||
LOCAL_PROGRESS_ROLE,
|
||||
REMOTE_PROGRESS_ROLE,
|
||||
LOCAL_DATE_ROLE,
|
||||
REMOTE_DATE_ROLE,
|
||||
LOCAL_FAILED_ROLE,
|
||||
REMOTE_FAILED_ROLE,
|
||||
EDIT_ICON_ROLE
|
||||
)
|
||||
|
||||
log = PypeLogger().get_logger("SyncServer")
|
||||
|
||||
|
||||
|
|
@ -14,7 +26,7 @@ class PriorityDelegate(QtWidgets.QStyledItemDelegate):
|
|||
|
||||
if option.widget.selectionModel().isSelected(index) or \
|
||||
option.state & QtWidgets.QStyle.State_MouseOver:
|
||||
edit_icon = index.data(lib.EditIconRole)
|
||||
edit_icon = index.data(EDIT_ICON_ROLE)
|
||||
if not edit_icon:
|
||||
return
|
||||
|
||||
|
|
@ -38,7 +50,7 @@ class PriorityDelegate(QtWidgets.QStyledItemDelegate):
|
|||
editor = PriorityLineEdit(
|
||||
parent,
|
||||
option.widget.selectionModel().selectedRows())
|
||||
editor.setFocus(True)
|
||||
editor.setFocus()
|
||||
return editor
|
||||
|
||||
def setModelData(self, editor, model, index):
|
||||
|
|
@ -71,19 +83,30 @@ class ImageDelegate(QtWidgets.QStyledItemDelegate):
|
|||
Prints icon of site and progress of synchronization
|
||||
"""
|
||||
|
||||
def __init__(self, parent=None):
|
||||
def __init__(self, parent=None, side=None):
|
||||
super(ImageDelegate, self).__init__(parent)
|
||||
self.icons = {}
|
||||
self.side = side
|
||||
|
||||
def paint(self, painter, option, index):
|
||||
super(ImageDelegate, self).paint(painter, option, index)
|
||||
option = QtWidgets.QStyleOptionViewItem(option)
|
||||
option.showDecorationSelected = True
|
||||
|
||||
provider = index.data(lib.ProviderRole)
|
||||
value = index.data(lib.ProgressRole)
|
||||
date_value = index.data(lib.DateRole)
|
||||
is_failed = index.data(lib.FailedRole)
|
||||
if not self.side:
|
||||
log.warning("No side provided, delegate won't work")
|
||||
return
|
||||
|
||||
if self.side == 'local':
|
||||
provider = index.data(LOCAL_PROVIDER_ROLE)
|
||||
value = index.data(LOCAL_PROGRESS_ROLE)
|
||||
date_value = index.data(LOCAL_DATE_ROLE)
|
||||
is_failed = index.data(LOCAL_FAILED_ROLE)
|
||||
else:
|
||||
provider = index.data(REMOTE_PROVIDER_ROLE)
|
||||
value = index.data(REMOTE_PROGRESS_ROLE)
|
||||
date_value = index.data(REMOTE_DATE_ROLE)
|
||||
is_failed = index.data(REMOTE_FAILED_ROLE)
|
||||
|
||||
if not self.icons.get(provider):
|
||||
resource_path = os.path.dirname(__file__)
|
||||
|
|
|
|||
|
|
@@ -1,4 +1,3 @@
from Qt import QtCore
import attr
import abc
import six

@@ -19,14 +18,6 @@ STATUS = {

DUMMY_PROJECT = "No project configured"

ProviderRole = QtCore.Qt.UserRole + 2
ProgressRole = QtCore.Qt.UserRole + 4
DateRole = QtCore.Qt.UserRole + 6
FailedRole = QtCore.Qt.UserRole + 8
HeaderNameRole = QtCore.Qt.UserRole + 10
FullItemRole = QtCore.Qt.UserRole + 12
EditIconRole = QtCore.Qt.UserRole + 14


@six.add_metaclass(abc.ABCMeta)
class AbstractColumnFilter:

@@ -161,7 +152,7 @@ def translate_provider_for_icon(sync_server, project, site):
    return sync_server.get_provider_for_site(site=site)


def get_item_by_id(model, object_id):
def get_value_from_id_by_role(model, object_id, role):
    """Return value from item with 'object_id' with 'role'."""
    index = model.get_index(object_id)
    item = model.data(index, FullItemRole)
    return item
    return model.data(index, role)

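`get_value_from_id_by_role` replaces fetching the whole item just to read one attribute. A short sketch of how callers use it, assuming a model that exposes `get_index(object_id)` and the shared role constants (the role value shown is illustrative):

from Qt import QtCore

STATUS_ROLE = QtCore.Qt.UserRole + 201  # illustrative value


def get_value_from_id_by_role(model, object_id, role):
    """Return the value stored under 'role' for the row with 'object_id'."""
    index = model.get_index(object_id)
    return model.data(index, role)


# caller no longer needs the full item, only the status:
# status = get_value_from_id_by_role(model, representation_id, STATUS_ROLE)
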
@@ -13,6 +13,23 @@ from openpype.api import get_local_site_id

from . import lib

from openpype.tools.utils.constants import (
    LOCAL_PROVIDER_ROLE,
    REMOTE_PROVIDER_ROLE,
    LOCAL_PROGRESS_ROLE,
    REMOTE_PROGRESS_ROLE,
    HEADER_NAME_ROLE,
    EDIT_ICON_ROLE,
    LOCAL_DATE_ROLE,
    REMOTE_DATE_ROLE,
    LOCAL_FAILED_ROLE,
    REMOTE_FAILED_ROLE,
    STATUS_ROLE,
    PATH_ROLE,
    ERROR_ROLE,
    TRIES_ROLE
)

log = PypeLogger().get_logger("SyncServer")


@@ -68,10 +85,68 @@ class _SyncRepresentationModel(QtCore.QAbstractTableModel):
            if orientation == Qt.Horizontal:
                return self.COLUMN_LABELS[section][1]

        if role == lib.HeaderNameRole:
        if role == HEADER_NAME_ROLE:
            if orientation == Qt.Horizontal:
                return self.COLUMN_LABELS[section][0]  # return name

    def data(self, index, role):
        item = self._data[index.row()]

        header_value = self._header[index.column()]
        if role == LOCAL_PROVIDER_ROLE:
            return item.local_provider

        if role == REMOTE_PROVIDER_ROLE:
            return item.remote_provider

        if role == LOCAL_PROGRESS_ROLE:
            return item.local_progress

        if role == REMOTE_PROGRESS_ROLE:
            return item.remote_progress

        if role == LOCAL_DATE_ROLE:
            if item.created_dt:
                return pretty_timestamp(item.created_dt)

        if role == REMOTE_DATE_ROLE:
            if item.sync_dt:
                return pretty_timestamp(item.sync_dt)

        if role == LOCAL_FAILED_ROLE:
            return item.status == lib.STATUS[2] and \
                item.local_progress < 1

        if role == REMOTE_FAILED_ROLE:
            return item.status == lib.STATUS[2] and \
                item.remote_progress < 1

        if role in (Qt.DisplayRole, Qt.EditRole):
            # because of ImageDelegate
            if header_value in ['remote_site', 'local_site']:
                return ""

            return attr.asdict(item)[self._header[index.column()]]

        if role == EDIT_ICON_ROLE:
            if self.can_edit and header_value in self.EDITABLE_COLUMNS:
                return self.edit_icon

        if role == PATH_ROLE:
            return item.path

        if role == ERROR_ROLE:
            return item.error

        if role == TRIES_ROLE:
            return item.tries

        if role == STATUS_ROLE:
            return item.status

        if role == Qt.UserRole:
            return item._id

    @property
    def can_edit(self):
        """Returns true if some site is user local site, eg. could edit"""

@@ -456,55 +531,6 @@ class SyncRepresentationSummaryModel(_SyncRepresentationModel):
        self.timer.timeout.connect(self.tick)
        self.timer.start(self.REFRESH_SEC)

    def data(self, index, role):
        item = self._data[index.row()]

        if role == lib.FullItemRole:
            return item

        header_value = self._header[index.column()]
        if role == lib.ProviderRole:
            if header_value == 'local_site':
                return item.local_provider
            if header_value == 'remote_site':
                return item.remote_provider

        if role == lib.ProgressRole:
            if header_value == 'local_site':
                return item.local_progress
            if header_value == 'remote_site':
                return item.remote_progress

        if role == lib.DateRole:
            if header_value == 'local_site':
                if item.created_dt:
                    return pretty_timestamp(item.created_dt)
            if header_value == 'remote_site':
                if item.sync_dt:
                    return pretty_timestamp(item.sync_dt)

        if role == lib.FailedRole:
            if header_value == 'local_site':
                return item.status == lib.STATUS[2] and \
                    item.local_progress < 1
            if header_value == 'remote_site':
                return item.status == lib.STATUS[2] and \
                    item.remote_progress < 1

        if role in (Qt.DisplayRole, Qt.EditRole):
            # because of ImageDelegate
            if header_value in ['remote_site', 'local_site']:
                return ""

            return attr.asdict(item)[self._header[index.column()]]

        if role == lib.EditIconRole:
            if self.can_edit and header_value in self.EDITABLE_COLUMNS:
                return self.edit_icon

        if role == Qt.UserRole:
            return item._id

    def add_page_records(self, local_site, remote_site, representations):
        """
            Process all records from 'representation' and add them to storage.

@@ -985,55 +1011,6 @@ class SyncRepresentationDetailModel(_SyncRepresentationModel):
        self.timer.timeout.connect(self.tick)
        self.timer.start(SyncRepresentationSummaryModel.REFRESH_SEC)

    def data(self, index, role):
        item = self._data[index.row()]

        if role == lib.FullItemRole:
            return item

        header_value = self._header[index.column()]
        if role == lib.ProviderRole:
            if header_value == 'local_site':
                return item.local_provider
            if header_value == 'remote_site':
                return item.remote_provider

        if role == lib.ProgressRole:
            if header_value == 'local_site':
                return item.local_progress
            if header_value == 'remote_site':
                return item.remote_progress

        if role == lib.DateRole:
            if header_value == 'local_site':
                if item.created_dt:
                    return pretty_timestamp(item.created_dt)
            if header_value == 'remote_site':
                if item.sync_dt:
                    return pretty_timestamp(item.sync_dt)

        if role == lib.FailedRole:
            if header_value == 'local_site':
                return item.status == lib.STATUS[2] and \
                    item.local_progress < 1
            if header_value == 'remote_site':
                return item.status == lib.STATUS[2] and \
                    item.remote_progress < 1

        if role in (Qt.DisplayRole, Qt.EditRole):
            # because of ImageDelegate
            if header_value in ['remote_site', 'local_site']:
                return ""

            return attr.asdict(item)[self._header[index.column()]]

        if role == lib.EditIconRole:
            if self.can_edit and header_value in self.EDITABLE_COLUMNS:
                return self.edit_icon

        if role == Qt.UserRole:
            return item._id

    def add_page_records(self, local_site, remote_site, representations):
        """
            Process all records from 'representation' and add them to storage.

@@ -22,6 +22,20 @@ from .models import (
from . import lib
from . import delegates

from openpype.tools.utils.constants import (
    LOCAL_PROGRESS_ROLE,
    REMOTE_PROGRESS_ROLE,
    HEADER_NAME_ROLE,
    STATUS_ROLE,
    PATH_ROLE,
    LOCAL_SITE_NAME_ROLE,
    REMOTE_SITE_NAME_ROLE,
    LOCAL_DATE_ROLE,
    REMOTE_DATE_ROLE,
    ERROR_ROLE,
    TRIES_ROLE
)

log = PypeLogger().get_logger("SyncServer")


@@ -140,7 +154,7 @@ class SyncProjectListWidget(QtWidgets.QWidget):
                selected_index.isValid() and \
                not self._selection_changed:
            mode = QtCore.QItemSelectionModel.Select | \
                   QtCore.QItemSelectionModel.Rows
                QtCore.QItemSelectionModel.Rows
            self.project_list.selectionModel().select(selected_index, mode)

        if self.current_project:

@@ -289,14 +303,19 @@ class _SyncRepresentationWidget(QtWidgets.QWidget):

        if is_multi:
            index = self.model.get_index(list(self._selected_ids)[0])
            item = self.model.data(index, lib.FullItemRole)
            local_progress = self.model.data(index, LOCAL_PROGRESS_ROLE)
            remote_progress = self.model.data(index, REMOTE_PROGRESS_ROLE)
            status = self.model.data(index, STATUS_ROLE)
        else:
            item = self.model.data(point_index, lib.FullItemRole)
            local_progress = self.model.data(point_index, LOCAL_PROGRESS_ROLE)
            remote_progress = self.model.data(point_index,
                                              REMOTE_PROGRESS_ROLE)
            status = self.model.data(point_index, STATUS_ROLE)

        can_edit = self.model.can_edit
        action_kwarg_map, actions_mapping, menu = self._prepare_menu(item,
                                                                     is_multi,
                                                                     can_edit)
        action_kwarg_map, actions_mapping, menu = self._prepare_menu(
            local_progress, remote_progress, is_multi, can_edit, status)

        result = menu.exec_(QtGui.QCursor.pos())
        if result:

@@ -307,7 +326,8 @@ class _SyncRepresentationWidget(QtWidgets.QWidget):

        self.model.refresh()

    def _prepare_menu(self, item, is_multi, can_edit):
    def _prepare_menu(self, local_progress, remote_progress,
                      is_multi, can_edit, status=None):
        menu = QtWidgets.QMenu(self)

        actions_mapping = {}

@@ -316,11 +336,6 @@ class _SyncRepresentationWidget(QtWidgets.QWidget):
        active_site = self.model.active_site
        remote_site = self.model.remote_site

        local_progress = item.local_progress
        remote_progress = item.remote_progress

        project = self.model.project

        for site, progress in {active_site: local_progress,
                               remote_site: remote_progress}.items():
            provider = self.sync_server.get_provider_for_site(site=site)

@@ -360,12 +375,6 @@ class _SyncRepresentationWidget(QtWidgets.QWidget):
            actions_mapping[action] = self._change_priority
            menu.addAction(action)

        # # temp for testing only !!!
        # action = QtWidgets.QAction("Download")
        # action_kwarg_map[action] = self._get_action_kwargs(active_site)
        # actions_mapping[action] = self._add_site
        # menu.addAction(action)

        if not actions_mapping:
            action = QtWidgets.QAction("< No action >")
            actions_mapping[action] = None

@@ -376,11 +385,15 @@ class _SyncRepresentationWidget(QtWidgets.QWidget):
    def _pause(self, selected_ids=None):
        log.debug("Pause {}".format(selected_ids))
        for representation_id in selected_ids:
            item = lib.get_item_by_id(self.model, representation_id)
            if item.status not in [lib.STATUS[0], lib.STATUS[1]]:
            status = lib.get_value_from_id_by_role(self.model,
                                                   representation_id,
                                                   STATUS_ROLE)
            if status not in [lib.STATUS[0], lib.STATUS[1]]:
                continue
            for site_name in [self.model.active_site, self.model.remote_site]:
                check_progress = self._get_progress(item, site_name)
                check_progress = self._get_progress(self.model,
                                                    representation_id,
                                                    site_name)
                if check_progress < 1:
                    self.sync_server.pause_representation(self.model.project,
                                                          representation_id,

@@ -391,11 +404,15 @@ class _SyncRepresentationWidget(QtWidgets.QWidget):
    def _unpause(self, selected_ids=None):
        log.debug("UnPause {}".format(selected_ids))
        for representation_id in selected_ids:
            item = lib.get_item_by_id(self.model, representation_id)
            if item.status not in lib.STATUS[3]:
            status = lib.get_value_from_id_by_role(self.model,
                                                   representation_id,
                                                   STATUS_ROLE)
            if status not in lib.STATUS[3]:
                continue
            for site_name in [self.model.active_site, self.model.remote_site]:
                check_progress = self._get_progress(item, site_name)
                check_progress = self._get_progress(self.model,
                                                    representation_id,
                                                    site_name)
                if check_progress < 1:
                    self.sync_server.unpause_representation(
                        self.model.project,

@@ -408,8 +425,11 @@ class _SyncRepresentationWidget(QtWidgets.QWidget):
    def _add_site(self, selected_ids=None, site_name=None):
        log.debug("Add site {}:{}".format(selected_ids, site_name))
        for representation_id in selected_ids:
            item = lib.get_item_by_id(self.model, representation_id)
            if item.local_site == site_name or item.remote_site == site_name:
            item_local_site = lib.get_value_from_id_by_role(
                self.model, representation_id, LOCAL_SITE_NAME_ROLE)
            item_remote_site = lib.get_value_from_id_by_role(
                self.model, representation_id, REMOTE_SITE_NAME_ROLE)
            if site_name in [item_local_site, item_remote_site]:
                # site already exists skip
                continue

@@ -460,8 +480,8 @@ class _SyncRepresentationWidget(QtWidgets.QWidget):
        """
        log.debug("Reset site {}:{}".format(selected_ids, site_name))
        for representation_id in selected_ids:
            item = lib.get_item_by_id(self.model, representation_id)
            check_progress = self._get_progress(item, site_name, True)
            check_progress = self._get_progress(self.model, representation_id,
                                                site_name, True)

            # do not reset if opposite side is not fully there
            if check_progress != 1:

@@ -469,7 +489,7 @@ class _SyncRepresentationWidget(QtWidgets.QWidget):
                          format(check_progress))
                continue

            self.sync_server.reset_provider_for_file(
            self.sync_server.reset_site_on_representation(
                self.model.project,
                representation_id,
                site_name=site_name,

@@ -482,11 +502,8 @@ class _SyncRepresentationWidget(QtWidgets.QWidget):
    def _open_in_explorer(self, selected_ids=None, site_name=None):
        log.debug("Open in Explorer {}:{}".format(selected_ids, site_name))
        for selected_id in selected_ids:
            item = lib.get_item_by_id(self.model, selected_id)
            if not item:
                return

            fpath = item.path
            fpath = lib.get_value_from_id_by_role(self.model, selected_id,
                                                  PATH_ROLE)
            project = self.model.project
            fpath = self.sync_server.get_local_file_path(project,
                                                         site_name,

@@ -514,10 +531,17 @@ class _SyncRepresentationWidget(QtWidgets.QWidget):
            self.model.is_editing = True
            self.table_view.openPersistentEditor(real_index)

    def _get_progress(self, item, site_name, opposite=False):
    def _get_progress(self, model, representation_id,
                      site_name, opposite=False):
        """Returns progress value according to site (side)"""
        progress = {'local': item.local_progress,
                    'remote': item.remote_progress}
        local_progress = lib.get_value_from_id_by_role(model,
                                                       representation_id,
                                                       LOCAL_PROGRESS_ROLE)
        remote_progress = lib.get_value_from_id_by_role(model,
                                                        representation_id,
                                                        REMOTE_PROGRESS_ROLE)
        progress = {'local': local_progress,
                    'remote': remote_progress}
        side = 'remote'
        if site_name == self.model.active_site:
            side = 'local'

@@ -591,11 +615,11 @@ class SyncRepresentationSummaryWidget(_SyncRepresentationWidget):
        table_view.viewport().setAttribute(QtCore.Qt.WA_Hover, True)

        column = table_view.model().get_header_index("local_site")
        delegate = delegates.ImageDelegate(self)
        delegate = delegates.ImageDelegate(self, side="local")
        table_view.setItemDelegateForColumn(column, delegate)

        column = table_view.model().get_header_index("remote_site")
        delegate = delegates.ImageDelegate(self)
        delegate = delegates.ImageDelegate(self, side="remote")
        table_view.setItemDelegateForColumn(column, delegate)

        column = table_view.model().get_header_index("priority")

@@ -631,19 +655,21 @@ class SyncRepresentationSummaryWidget(_SyncRepresentationWidget):
        self.selection_model = self.table_view.selectionModel()
        self.selection_model.selectionChanged.connect(self._selection_changed)

    def _prepare_menu(self, item, is_multi, can_edit):
    def _prepare_menu(self, local_progress, remote_progress,
                      is_multi, can_edit, status=None):
        action_kwarg_map, actions_mapping, menu = \
            super()._prepare_menu(item, is_multi, can_edit)
            super()._prepare_menu(local_progress, remote_progress,
                                  is_multi, can_edit)

        if can_edit and (
                item.status in [lib.STATUS[0], lib.STATUS[1]] or is_multi):
                status in [lib.STATUS[0], lib.STATUS[1]] or is_multi):
            action = QtWidgets.QAction("Pause in queue")
            actions_mapping[action] = self._pause
            # pause handles which site_name it will pause itself
            action_kwarg_map[action] = {"selected_ids": self._selected_ids}
            menu.addAction(action)

        if can_edit and (item.status == lib.STATUS[3] or is_multi):
        if can_edit and (status == lib.STATUS[3] or is_multi):
            action = QtWidgets.QAction("Unpause in queue")
            actions_mapping[action] = self._unpause
            action_kwarg_map[action] = {"selected_ids": self._selected_ids}

@@ -753,11 +779,11 @@ class SyncRepresentationDetailWidget(_SyncRepresentationWidget):
        table_view.verticalHeader().hide()

        column = model.get_header_index("local_site")
        delegate = delegates.ImageDelegate(self)
        delegate = delegates.ImageDelegate(self, side="local")
        table_view.setItemDelegateForColumn(column, delegate)

        column = model.get_header_index("remote_site")
        delegate = delegates.ImageDelegate(self)
        delegate = delegates.ImageDelegate(self, side="remote")
        table_view.setItemDelegateForColumn(column, delegate)

        if model.can_edit:

@@ -815,12 +841,14 @@ class SyncRepresentationDetailWidget(_SyncRepresentationWidget):

        detail_window.exec()

    def _prepare_menu(self, item, is_multi, can_edit):
    def _prepare_menu(self, local_progress, remote_progress,
                      is_multi, can_edit, status=None):
        """Adds view (and model) dependent actions to default ones"""
        action_kwarg_map, actions_mapping, menu = \
            super()._prepare_menu(item, is_multi, can_edit)
            super()._prepare_menu(local_progress, remote_progress,
                                  is_multi, can_edit, status)

        if item.status == lib.STATUS[2] or is_multi:
        if status == lib.STATUS[2] or is_multi:
            action = QtWidgets.QAction("Open error detail")
            actions_mapping[action] = self._show_detail
            action_kwarg_map[action] = {"selected_ids": self._selected_ids}

@@ -835,8 +863,8 @@ class SyncRepresentationDetailWidget(_SyncRepresentationWidget):
            redo of upload/download
        """
        for file_id in selected_ids:
            item = lib.get_item_by_id(self.model, file_id)
            check_progress = self._get_progress(item, site_name, True)
            check_progress = self._get_progress(self.model, file_id,
                                                site_name, True)

            # do not reset if opposite side is not fully there
            if check_progress != 1:

@@ -844,7 +872,7 @@ class SyncRepresentationDetailWidget(_SyncRepresentationWidget):
                          format(check_progress))
                continue

            self.sync_server.reset_provider_for_file(
            self.sync_server.reset_site_on_representation(
                self.model.project,
                self.representation_id,
                site_name=site_name,

@@ -895,20 +923,28 @@ class SyncRepresentationErrorWidget(QtWidgets.QWidget):

        no_errors = True
        for file_id in selected_ids:
            item = lib.get_item_by_id(model, file_id)
            if not item.created_dt or not item.sync_dt or not item.error:
            created_dt = lib.get_value_from_id_by_role(model, file_id,
                                                       LOCAL_DATE_ROLE)
            sync_dt = lib.get_value_from_id_by_role(model, file_id,
                                                    REMOTE_DATE_ROLE)
            errors = lib.get_value_from_id_by_role(model, file_id,
                                                   ERROR_ROLE)
            if not created_dt or not sync_dt or not errors:
                continue

            tries = lib.get_value_from_id_by_role(model, file_id,
                                                  TRIES_ROLE)

            no_errors = False
            dt = max(item.created_dt, item.sync_dt)
            dt = max(created_dt, sync_dt)

            txts = []
            txts.append("{}: {}<br>".format("<b>Last update date</b>",
                                            pretty_timestamp(dt)))
            txts.append("{}: {}<br>".format("<b>Retries</b>",
                                            str(item.tries)))
                                            str(tries)))
            txts.append("{}: {}<br>".format("<b>Error message</b>",
                                            item.error))
                                            errors))

            text_area = QtWidgets.QTextEdit("\n\n".join(txts))
            text_area.setReadOnly(True)

@@ -1162,7 +1198,7 @@ class HorizontalHeader(QtWidgets.QHeaderView):

            column_name = self.model.headerData(column_idx,
                                                QtCore.Qt.Horizontal,
                                                lib.HeaderNameRole)
                                                HEADER_NAME_ROLE)
            button = self.filter_buttons.get(column_name)
            if not button:
                continue

@@ -1,10 +1,7 @@
import os
import platform
from openpype.modules import OpenPypeModule
from openpype_interfaces import (
    ITimersManager,
    ITrayService
)
from openpype_interfaces import ITrayService
from avalon.api import AvalonMongoDB

@@ -8,14 +8,15 @@ in global space here until are required or used.
"""

import os
import click

from openpype.modules import (
    JsonFilesSettingsDef,
    OpenPypeAddOn
    OpenPypeAddOn,
    ModulesManager
)
# Import interface defined by this addon to be able find other addons using it
from openpype_interfaces import (
    IExampleInterface,
    IPluginPaths,
    ITrayAction
)

@@ -75,19 +76,6 @@ class ExampleAddon(OpenPypeAddOn, IPluginPaths, ITrayAction):

        self._create_dialog()

    def connect_with_modules(self, enabled_modules):
        """Method where you should find connected modules.

        It is triggered by OpenPype modules manager at the best possible time.
        Some addons and modules may required to connect with other modules
        before their main logic is executed so changes would require to restart
        whole process.
        """
        self._connected_modules = []
        for module in enabled_modules:
            if isinstance(module, IExampleInterface):
                self._connected_modules.append(module)

    def _create_dialog(self):
        # Don't recreate dialog if already exists
        if self._dialog is not None:

@@ -106,8 +94,6 @@ class ExampleAddon(OpenPypeAddOn, IPluginPaths, ITrayAction):
        """
        # Make sure dialog is created
        self._create_dialog()
        # Change value of dialog by current state
        self._dialog.set_connected_modules(self.get_connected_modules())
        # Show dialog
        self._dialog.open()

@@ -130,3 +116,32 @@ class ExampleAddon(OpenPypeAddOn, IPluginPaths, ITrayAction):
        return {
            "publish": [os.path.join(current_dir, "plugins", "publish")]
        }

    def cli(self, click_group):
        click_group.add_command(cli_main)


@click.group(ExampleAddon.name, help="Example addon dynamic cli commands.")
def cli_main():
    pass


@cli_main.command()
def nothing():
    """Does nothing but print a message."""
    print("You've triggered \"nothing\" command.")


@cli_main.command()
def show_dialog():
    """Show ExampleAddon dialog.

    We don't have access to addon directly through cli so we have to create
    it again.
    """
    from openpype.tools.utils.lib import qt_app_context

    manager = ModulesManager()
    example_addon = manager.modules_by_name[ExampleAddon.name]
    with qt_app_context():
        example_addon.show_dialog()

@@ -1,28 +0,0 @@
""" Using interfaces is one way of connecting multiple OpenPype Addons/Modules.

Interfaces must be in `interfaces.py` file (or folder). Interfaces should not
import module logic or other module in global namespace. That is because
all of them must be imported before all OpenPype AddOns and Modules.

Ideally they should just define abstract and helper methods. If interface
require any logic or connection it should be defined in module.

Keep in mind that attributes and methods will be added to other addon
attributes and methods so they should be unique and ideally contain
addon name in it's name.
"""

from abc import abstractmethod
from openpype.modules import OpenPypeInterface


class IExampleInterface(OpenPypeInterface):
    """Example interface of addon."""
    _example_module = None

    def get_example_module(self):
        return self._example_module

    @abstractmethod
    def example_method_of_example_interface(self):
        pass

@@ -9,7 +9,8 @@ class MyExampleDialog(QtWidgets.QDialog):

        self.setWindowTitle("Connected modules")

        label_widget = QtWidgets.QLabel(self)
        msg = "This is example dialog of example addon."
        label_widget = QtWidgets.QLabel(msg, self)

        ok_btn = QtWidgets.QPushButton("OK", self)
        btns_layout = QtWidgets.QHBoxLayout()

@@ -28,12 +29,3 @@ class MyExampleDialog(QtWidgets.QDialog):

    def _on_ok_clicked(self):
        self.done(1)

    def set_connected_modules(self, connected_modules):
        if connected_modules:
            message = "\n".join(connected_modules)
        else:
            message = (
                "Other enabled modules/addons are not using my interface."
            )
        self._label_widget.setText(message)

@@ -263,3 +263,31 @@ class ITrayService(ITrayModule):
        """Change icon of an QAction to orange circle."""
        if self.menu_action:
            self.menu_action.setIcon(self.get_icon_idle())


class ISettingsChangeListener(OpenPypeInterface):
    """Module has plugin paths to return.

    Expected result is dictionary with keys "publish", "create", "load" or
    "actions" and values as list or string.
    {
        "publish": ["path/to/publish_plugins"]
    }
    """
    @abstractmethod
    def on_system_settings_save(
        self, old_value, new_value, changes, new_value_metadata
    ):
        pass

    @abstractmethod
    def on_project_settings_save(
        self, old_value, new_value, changes, project_name, new_value_metadata
    ):
        pass

    @abstractmethod
    def on_project_anatomy_save(
        self, old_value, new_value, changes, project_name, new_value_metadata
    ):
        pass

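`ISettingsChangeListener` defines three abstract hooks that fire when system settings, project settings, or project anatomy are saved. A hedged sketch of a module implementing it (the class body, `name`, and use of `self.log` are illustrative assumptions; only the method signatures come from the interface above):

from openpype.modules import OpenPypeModule
from openpype_interfaces import ISettingsChangeListener


class AuditModule(OpenPypeModule, ISettingsChangeListener):
    """Illustrative module that logs every settings save."""
    name = "settings_audit"

    def initialize(self, module_settings):
        self.enabled = True

    def on_system_settings_save(
        self, old_value, new_value, changes, new_value_metadata
    ):
        self.log.info("System settings changed: {}".format(changes))

    def on_project_settings_save(
        self, old_value, new_value, changes, project_name, new_value_metadata
    ):
        self.log.info("Project '{}' settings changed".format(project_name))

    def on_project_anatomy_save(
        self, old_value, new_value, changes, project_name, new_value_metadata
    ):
        self.log.info("Project '{}' anatomy changed".format(project_name))
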
@@ -38,6 +38,8 @@ class CollectAnatomyInstanceData(pyblish.api.ContextPlugin):
    order = pyblish.api.CollectorOrder + 0.49
    label = "Collect Anatomy Instance data"

    follow_workfile_version = False

    def process(self, context):
        self.log.info("Collecting anatomy data for all instances.")

@@ -213,7 +215,10 @@ class CollectAnatomyInstanceData(pyblish.api.ContextPlugin):
        context_asset_doc = context.data["assetEntity"]

        for instance in context:
            version_number = instance.data.get("version")
            if self.follow_workfile_version:
                version_number = context.data('version')
            else:
                version_number = instance.data.get("version")
            # If version is not specified for instance or context
            if version_number is None:
                # TODO we should be able to change default version by studio

@@ -649,6 +649,8 @@ class ExtractReview(pyblish.api.InstancePlugin):
            AssertionError: if more then one collection is obtained.

        """
        start_frame = int(start_frame)
        end_frame = int(end_frame)
        collections = clique.assemble(files)[0]
        assert len(collections) == 1, "Multiple collections found."
        col = collections[0]

@@ -1029,29 +1029,8 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin):
        """
        local_site = 'studio'  # default
        remote_site = None
        sync_server_presets = None

        if (instance.context.data["system_settings"]
                                 ["modules"]
                                 ["sync_server"]
                                 ["enabled"]):
            sync_server_presets = (instance.context.data["project_settings"]
                                                        ["global"]
                                                        ["sync_server"])

            local_site_id = openpype.api.get_local_site_id()
            if sync_server_presets["enabled"]:
                local_site = sync_server_presets["config"].\
                    get("active_site", "studio").strip()
                if local_site == 'local':
                    local_site = local_site_id

                remote_site = sync_server_presets["config"].get("remote_site")
                if remote_site == local_site:
                    remote_site = None

                if remote_site == 'local':
                    remote_site = local_site_id
        always_accesible = []
        sync_project_presets = None

        rec = {
            "_id": io.ObjectId(),

@@ -1066,12 +1045,93 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin):
        if sites:
            rec["sites"] = sites
        else:
            system_sync_server_presets = (
                instance.context.data["system_settings"]
                                     ["modules"]
                                     ["sync_server"])
            log.debug("system_sett:: {}".format(system_sync_server_presets))

            if system_sync_server_presets["enabled"]:
                sync_project_presets = (
                    instance.context.data["project_settings"]
                                         ["global"]
                                         ["sync_server"])

            if sync_project_presets and sync_project_presets["enabled"]:
                local_site, remote_site = self._get_sites(sync_project_presets)

                always_accesible = sync_project_presets["config"]. \
                    get("always_accessible_on", [])

            already_attached_sites = {}
            meta = {"name": local_site, "created_dt": datetime.now()}
            rec["sites"] = [meta]
            already_attached_sites[meta["name"]] = meta["created_dt"]

            if remote_site:
            if sync_project_presets and sync_project_presets["enabled"]:
                # add remote
                meta = {"name": remote_site.strip()}
                rec["sites"].append(meta)
                already_attached_sites[meta["name"]] = None

                # add skeleton for site where it should be always synced to
                for always_on_site in always_accesible:
                    if always_on_site not in already_attached_sites.keys():
                        meta = {"name": always_on_site.strip()}
                        rec["sites"].append(meta)
                        already_attached_sites[meta["name"]] = None

                # add alternative sites
                rec = self._add_alternative_sites(system_sync_server_presets,
                                                  already_attached_sites,
                                                  rec)

            log.debug("final sites:: {}".format(rec["sites"]))

        return rec

    def _get_sites(self, sync_project_presets):
        """Returns tuple (local_site, remote_site)"""
        local_site_id = openpype.api.get_local_site_id()
        local_site = sync_project_presets["config"]. \
            get("active_site", "studio").strip()

        if local_site == 'local':
            local_site = local_site_id

        remote_site = sync_project_presets["config"].get("remote_site")
        if remote_site == local_site:
            remote_site = None

        if remote_site == 'local':
            remote_site = local_site_id

        return local_site, remote_site

    def _add_alternative_sites(self,
                               system_sync_server_presets,
                               already_attached_sites,
                               rec):
        """Loop through all configured sites and add alternatives.

        See SyncServerModule.handle_alternate_site
        """
        conf_sites = system_sync_server_presets.get("sites", {})

        for site_name, site_info in conf_sites.items():
            alt_sites = set(site_info.get("alternative_sites", []))
            already_attached_keys = list(already_attached_sites.keys())
            for added_site in already_attached_keys:
                if added_site in alt_sites:
                    if site_name in already_attached_keys:
                        continue
                    meta = {"name": site_name}
                    real_created = already_attached_sites[added_site]
                    # alt site inherits state of 'created_dt'
                    if real_created:
                        meta["created_dt"] = real_created
                    rec["sites"].append(meta)
                    already_attached_sites[meta["name"]] = real_created

        return rec

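`_get_sites` maps the configured `active_site`/`remote_site` values to concrete site names, expanding the special value `'local'` to this machine's site id. A standalone sketch of that mapping (the `local_site_id` argument stands in for `openpype.api.get_local_site_id()`; the function name is illustrative):

def resolve_sites(sync_config, local_site_id):
    """Return (local_site, remote_site) from a sync server 'config' dict."""
    local_site = sync_config.get("active_site", "studio").strip()
    if local_site == "local":
        local_site = local_site_id

    remote_site = sync_config.get("remote_site")
    if remote_site == local_site:
        remote_site = None  # nothing to sync to
    elif remote_site == "local":
        remote_site = local_site_id

    return local_site, remote_site


print(resolve_sites({"active_site": "local", "remote_site": "studio"},
                    "john-workstation"))  # ('john-workstation', 'studio')
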
@@ -21,8 +21,9 @@ class ValidateVersion(pyblish.api.InstancePlugin):

        if latest_version is not None:
            msg = (
                "Version `{0}` that you are trying to publish, already exists"
                " in the database. Version in database: `{1}`. Please version "
                "up your workfile to a higher version number than: `{1}`."
            ).format(version, latest_version)
                "Version `{0}` from instance `{1}` that you are trying to"
                " publish, already exists in the database. Version in"
                " database: `{2}`. Please version up your workfile to a higher"
                " version number than: `{2}`."
            ).format(version, instance.data["name"], latest_version)
            assert (int(version) > int(latest_version)), msg

@@ -13,7 +13,8 @@ from openpype.lib.remote_publish import (
    start_webpublish_log,
    publish_and_log,
    fail_batch,
    find_variant_key
    find_variant_key,
    get_task_data
)


@@ -41,6 +42,25 @@ class PypeCommands:
        user_role = "manager"
        settings.main(user_role)

    @staticmethod
    def add_modules(click_func):
        """Modules/Addons can add their cli commands dynamically."""
        from openpype.modules import ModulesManager

        manager = ModulesManager()
        log = PypeLogger.get_logger("AddModulesCLI")
        for module in manager.modules:
            try:
                module.cli(click_func)

            except Exception:
                log.warning(
                    "Failed to add cli command for module \"{}\"".format(
                        module.name
                    )
                )
        return click_func

    @staticmethod
    def launch_eventservercli(*args):
        from openpype_modules.ftrack.ftrack_server.event_server_cli import (

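`add_modules` lets every module contribute commands to the main click group through an optional `cli(click_group)` hook (the example addon earlier in this diff implements one). A hedged sketch of that pattern on the module side and of how a top-level group collects it (group, command, and class names are illustrative):

import click


class MyModule:
    """Illustrative module exposing its own click sub-commands."""
    name = "my_module"

    def cli(self, click_group):
        click_group.add_command(cli_main)


@click.group("my_module", help="My module dynamic cli commands.")
def cli_main():
    pass


@cli_main.command()
def status():
    """Print a dummy status line."""
    click.echo("my_module is alive")


@click.group()
def module_group():
    """Top-level group that modules attach themselves to."""
    pass


# what PypeCommands.add_modules effectively does for each enabled module
MyModule().cli(module_group)
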
@@ -60,7 +80,7 @@ class PypeCommands:
        standalonepublish.main()

    @staticmethod
    def publish(paths, targets=None):
    def publish(paths, targets=None, gui=False):
        """Start headless publishing.

        Publish use json from passed paths argument.

@@ -69,20 +89,35 @@ class PypeCommands:
            paths (list): Paths to jsons.
            targets (string): What module should be targeted
                (to choose validator for example)
            gui (bool): Show publish UI.

        Raises:
            RuntimeError: When there is no path to process.
        """
        if not any(paths):
            raise RuntimeError("No publish paths specified")

        from openpype.modules import ModulesManager
        from openpype import install, uninstall
        from openpype.api import Logger
        from openpype.tools.utils.host_tools import show_publish
        from openpype.tools.utils.lib import qt_app_context

        # Register target and host
        import pyblish.api
        import pyblish.util

        log = Logger.get_logger()

        install()

        manager = ModulesManager()

        publish_paths = manager.collect_plugin_paths()["publish"]

        for path in publish_paths:
            pyblish.api.register_plugin_path(path)

        if not any(paths):
            raise RuntimeError("No publish paths specified")

        env = get_app_environments_for_context(
            os.environ["AVALON_PROJECT"],
            os.environ["AVALON_ASSET"],

@@ -91,35 +126,43 @@ class PypeCommands:
        )
        os.environ.update(env)

        log = Logger.get_logger()

        install()

        pyblish.api.register_target("filesequence")
        pyblish.api.register_host("shell")

        if targets:
            for target in targets:
                print(f"setting target: {target}")
                pyblish.api.register_target(target)
        else:
            pyblish.api.register_target("filesequence")

        os.environ["OPENPYPE_PUBLISH_DATA"] = os.pathsep.join(paths)

        log.info("Running publish ...")

        # Error exit as soon as any error occurs.
        error_format = "Failed {plugin.__name__}: {error} -- {error.traceback}"
        plugins = pyblish.api.discover()
        print("Using plugins:")
        for plugin in plugins:
            print(plugin)

        for result in pyblish.util.publish_iter():
            if result["error"]:
                log.error(error_format.format(**result))
                uninstall()
                sys.exit(1)
        if gui:
            with qt_app_context():
                show_publish()
        else:
            # Error exit as soon as any error occurs.
            error_format = ("Failed {plugin.__name__}: "
                            "{error} -- {error.traceback}")

            for result in pyblish.util.publish_iter():
                if result["error"]:
                    log.error(error_format.format(**result))
                    # uninstall()
                    sys.exit(1)

        log.info("Publish finished.")
        uninstall()

    @staticmethod
    def remotepublishfromapp(project, batch_dir, host, user, targets=None):
    def remotepublishfromapp(project, batch_dir, host_name,
                             user, targets=None):
        """Opens installed variant of 'host' and run remote publish there.

        Currently implemented and tested for Photoshop where customer

@@ -134,67 +177,37 @@ class PypeCommands:

        Runs publish process as user would, in automatic fashion.
        """
        SLEEP = 5  # seconds for another loop check for concurrently runs
        WAIT_FOR = 300  # seconds to wait for conc. runs

        from openpype import install, uninstall
        import pyblish.api
        from openpype.api import Logger
        from openpype.lib import ApplicationManager

        log = Logger.get_logger()

        log.info("remotepublishphotoshop command")

        install()

        from openpype.lib import ApplicationManager
        application_manager = ApplicationManager()

        found_variant_key = find_variant_key(application_manager, host)

        app_name = "{}/{}".format(host, found_variant_key)

        batch_data = None
        if batch_dir and os.path.exists(batch_dir):
            batch_data = parse_json(os.path.join(batch_dir, "manifest.json"))

        if not batch_data:
            raise ValueError(
                "Cannot parse batch meta in {} folder".format(batch_dir))

        asset, task_name, _task_type = get_batch_asset_task_info(
            batch_data["context"])

        # processing from app expects JUST ONE task in batch and 1 workfile
        task_dir_name = batch_data["tasks"][0]
        task_data = parse_json(os.path.join(batch_dir, task_dir_name,
                                            "manifest.json"))
        task_data = get_task_data(batch_dir)

        workfile_path = os.path.join(batch_dir,
                                     task_dir_name,
                                     task_data["task"],
                                     task_data["files"][0])

        print("workfile_path {}".format(workfile_path))

        _, batch_id = os.path.split(batch_dir)
        batch_id = task_data["batch"]
        dbcon = get_webpublish_conn()
        # safer to start logging here, launch might be broken altogether
        _id = start_webpublish_log(dbcon, batch_id, user)

        in_progress = True
        slept_times = 0
        while in_progress:
            batches_in_progress = list(dbcon.find({
                "status": "in_progress"
            }))
            if len(batches_in_progress) > 1:
                if slept_times * SLEEP >= WAIT_FOR:
                    fail_batch(_id, batches_in_progress, dbcon)
        batches_in_progress = list(dbcon.find({"status": "in_progress"}))
        if len(batches_in_progress) > 1:
            fail_batch(_id, batches_in_progress, dbcon)
            print("Another batch running, probably stuck, ask admin for help")

                print("Another batch running, sleeping for a bit")
                time.sleep(SLEEP)
                slept_times += 1
            else:
                in_progress = False
        asset, task_name, _ = get_batch_asset_task_info(task_data["context"])

        application_manager = ApplicationManager()
        found_variant_key = find_variant_key(application_manager, host_name)
        app_name = "{}/{}".format(host_name, found_variant_key)

        # must have for proper launch of app
        env = get_app_environments_for_context(

@@ -206,9 +219,21 @@ class PypeCommands:
        os.environ.update(env)

        os.environ["OPENPYPE_PUBLISH_DATA"] = batch_dir
        os.environ["IS_HEADLESS"] = "true"
        # must pass identifier to update log lines for a batch
        os.environ["BATCH_LOG_ID"] = str(_id)
        os.environ["HEADLESS_PUBLISH"] = 'true'  # to use in app lib

        pyblish.api.register_host(host_name)
        if targets:
            if isinstance(targets, str):
                targets = [targets]
            current_targets = os.environ.get("PYBLISH_TARGETS", "").split(
                os.pathsep)
            for target in targets:
                current_targets.append(target)

            os.environ["PYBLISH_TARGETS"] = os.pathsep.join(
                set(current_targets))

        data = {
            "last_workfile_path": workfile_path,

@@ -220,10 +245,8 @@ class PypeCommands:
        while launched_app.poll() is None:
            time.sleep(0.5)

        uninstall()

    @staticmethod
    def remotepublish(project, batch_path, host, user, targets=None):
    def remotepublish(project, batch_path, user, targets=None):
        """Start headless publishing.

        Used to publish rendered assets, workfiles etc.

@@ -235,10 +258,9 @@ class PypeCommands:
                per call of remotepublish
            batch_path (str): Path batch folder. Contains subfolders with
                resources (workfile, another subfolder 'renders' etc.)
            targets (string): What module should be targeted
                (to choose validator for example)
            host (string)
            user (string): email address for webpublisher
            targets (list): Pyblish targets
                (to choose validator for example)

        Raises:
            RuntimeError: When there is no path to process.

@@ -246,21 +268,22 @@ class PypeCommands:
        if not batch_path:
            raise RuntimeError("No publish paths specified")

        from openpype import install, uninstall
        from openpype.api import Logger

        # Register target and host
        import pyblish.api
        import pyblish.util
        import avalon.api
        from openpype.hosts.webpublisher import api as webpublisher

        log = Logger.get_logger()
        log = PypeLogger.get_logger()

        log.info("remotepublish command")

        install()
        host_name = "webpublisher"
        os.environ["OPENPYPE_PUBLISH_DATA"] = batch_path
        os.environ["AVALON_PROJECT"] = project
        os.environ["AVALON_APP"] = host_name

        if host:
            pyblish.api.register_host(host)
        pyblish.api.register_host(host_name)

        if targets:
            if isinstance(targets, str):

@@ -268,13 +291,6 @@ class PypeCommands:
            for target in targets:
                pyblish.api.register_target(target)

        os.environ["OPENPYPE_PUBLISH_DATA"] = batch_path
        os.environ["AVALON_PROJECT"] = project
        os.environ["AVALON_APP"] = host

        import avalon.api
        from openpype.hosts.webpublisher import api as webpublisher

        avalon.api.install(webpublisher)

        log.info("Running publish ...")

@@ -286,7 +302,6 @@ class PypeCommands:
        publish_and_log(dbcon, _id, log)

        log.info("Publish finished.")
        uninstall()

    @staticmethod
    def extractenvironments(output_json_path, project, asset, task, app):

@@ -352,3 +367,28 @@ class PypeCommands:
        cmd = "pytest {} {} {}".format(folder, mark_str, pyargs_str)
        print("Running {}".format(cmd))
        subprocess.run(cmd)

    def syncserver(self, active_site):
        """Start running sync_server in background."""
        import signal
        os.environ["OPENPYPE_LOCAL_ID"] = active_site

        def signal_handler(sig, frame):
            print("You pressed Ctrl+C. Process ended.")
            sync_server_module.server_exit()
            sys.exit(0)

        signal.signal(signal.SIGINT, signal_handler)
        signal.signal(signal.SIGTERM, signal_handler)

        from openpype.modules import ModulesManager

        manager = ModulesManager()
        sync_server_module = manager.modules_by_name["sync_server"]

        sync_server_module.server_init()
        sync_server_module.server_start()

        import time
        while True:
            time.sleep(1.0)

@ -37,7 +37,7 @@ TIMECODE_KEY = "{timecode}"
|
|||
SOURCE_TIMECODE_KEY = "{source_timecode}"
|
||||
|
||||
|
||||
def _streams(source):
|
||||
def _get_ffprobe_data(source):
|
||||
"""Reimplemented from otio burnins to be able use full path to ffprobe
|
||||
:param str source: source media file
|
||||
:rtype: [{}, ...]
|
||||
|
|
@ -47,7 +47,7 @@ def _streams(source):
|
|||
out = proc.communicate()[0]
|
||||
if proc.returncode != 0:
|
||||
raise RuntimeError("Failed to run: %s" % command)
|
||||
return json.loads(out)['streams']
|
||||
return json.loads(out)
|
||||
|
||||
|
||||
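Renaming _streams to _get_ffprobe_data and returning the whole parsed document (not only the "streams" list) is what later makes the container-level "format" section available, which mxf files use for their timecode. A hedged sketch of such a full ffprobe query follows, assuming a plain "ffprobe" binary on PATH rather than the path OpenPype resolves from its settings.

import json
import subprocess


def get_ffprobe_data(source):
    """Return the whole ffprobe document ("streams" and "format")."""
    command = [
        "ffprobe",
        "-hide_banner",
        "-loglevel", "error",
        "-print_format", "json",
        "-show_format",
        "-show_streams",
        source,
    ]
    proc = subprocess.Popen(command, stdout=subprocess.PIPE)
    out = proc.communicate()[0]
    if proc.returncode != 0:
        raise RuntimeError("Failed to run: {}".format(" ".join(command)))
    return json.loads(out)


# data["streams"][0] feeds the codec helpers below,
# data["format"] carries container metadata (e.g. the mxf timecode).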
def get_fps(str_value):
|
||||
|
|
@ -69,10 +69,10 @@ def get_fps(str_value):
|
|||
return str(fps)
|
||||
|
||||
|
||||
def _prores_codec_args(ffprobe_data, source_ffmpeg_cmd):
|
||||
def _prores_codec_args(stream_data, source_ffmpeg_cmd):
|
||||
output = []
|
||||
|
||||
tags = ffprobe_data.get("tags") or {}
|
||||
tags = stream_data.get("tags") or {}
|
||||
encoder = tags.get("encoder") or ""
|
||||
if encoder.endswith("prores_ks"):
|
||||
codec_name = "prores_ks"
|
||||
|
|
@ -85,7 +85,7 @@ def _prores_codec_args(ffprobe_data, source_ffmpeg_cmd):
|
|||
|
||||
output.extend(["-codec:v", codec_name])
|
||||
|
||||
pix_fmt = ffprobe_data.get("pix_fmt")
|
||||
pix_fmt = stream_data.get("pix_fmt")
|
||||
if pix_fmt:
|
||||
output.extend(["-pix_fmt", pix_fmt])
|
||||
|
||||
|
|
@ -99,7 +99,7 @@ def _prores_codec_args(ffprobe_data, source_ffmpeg_cmd):
|
|||
"ap4h": "4444",
|
||||
"ap4x": "4444xq"
|
||||
}
|
||||
codec_tag_str = ffprobe_data.get("codec_tag_string")
|
||||
codec_tag_str = stream_data.get("codec_tag_string")
|
||||
if codec_tag_str:
|
||||
profile = codec_tag_to_profile_map.get(codec_tag_str)
|
||||
if profile:
|
||||
|
|
@ -108,7 +108,7 @@ def _prores_codec_args(ffprobe_data, source_ffmpeg_cmd):
|
|||
return output
|
||||
|
||||
|
||||
def _h264_codec_args(ffprobe_data, source_ffmpeg_cmd):
|
||||
def _h264_codec_args(stream_data, source_ffmpeg_cmd):
|
||||
output = ["-codec:v", "h264"]
|
||||
|
||||
# Use arguments from the source command if they are available
|
||||
|
|
@ -125,7 +125,7 @@ def _h264_codec_args(ffprobe_data, source_ffmpeg_cmd):
|
|||
if arg in copy_args:
|
||||
output.extend([arg, args[idx + 1]])
|
||||
|
||||
pix_fmt = ffprobe_data.get("pix_fmt")
|
||||
pix_fmt = stream_data.get("pix_fmt")
|
||||
if pix_fmt:
|
||||
output.extend(["-pix_fmt", pix_fmt])
|
||||
|
||||
|
|
@ -135,11 +135,11 @@ def _h264_codec_args(ffprobe_data, source_ffmpeg_cmd):
|
|||
return output
|
||||
|
||||
|
||||
def _dnxhd_codec_args(ffprobe_data, source_ffmpeg_cmd):
|
||||
def _dnxhd_codec_args(stream_data, source_ffmpeg_cmd):
|
||||
output = ["-codec:v", "dnxhd"]
|
||||
|
||||
# Use source profile (profiles in metadata are not usable in args directly)
|
||||
profile = ffprobe_data.get("profile") or ""
|
||||
profile = stream_data.get("profile") or ""
|
||||
# Lower profile and replace space with underscore
|
||||
cleaned_profile = profile.lower().replace(" ", "_")
|
||||
dnx_profiles = {
|
||||
|
|
@ -153,7 +153,7 @@ def _dnxhd_codec_args(ffprobe_data, source_ffmpeg_cmd):
|
|||
if cleaned_profile in dnx_profiles:
|
||||
output.extend(["-profile:v", cleaned_profile])
|
||||
|
||||
pix_fmt = ffprobe_data.get("pix_fmt")
|
||||
pix_fmt = stream_data.get("pix_fmt")
|
||||
if pix_fmt:
|
||||
output.extend(["-pix_fmt", pix_fmt])
|
||||
|
||||
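A quick illustration of the profile normalisation above; the input string is only an example of what ffprobe may report for DNxHR media, not a guaranteed value.

profile = "DNXHR HQ"  # illustrative ffprobe "profile" value
cleaned_profile = profile.lower().replace(" ", "_")
print(cleaned_profile)  # -> "dnxhr_hq"
# When cleaned_profile is one of the known dnx_profiles, the burnin command
# gets "-profile:v dnxhr_hq" appended, as in the hunk above.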
|
|
@ -162,28 +162,29 @@ def _dnxhd_codec_args(ffprobe_data, source_ffmpeg_cmd):
|
|||
|
||||
|
||||
def get_codec_args(ffprobe_data, source_ffmpeg_cmd):
|
||||
codec_name = ffprobe_data.get("codec_name")
|
||||
stream_data = ffprobe_data["streams"][0]
|
||||
codec_name = stream_data.get("codec_name")
|
||||
# Codec "prores"
|
||||
if codec_name == "prores":
|
||||
return _prores_codec_args(ffprobe_data, source_ffmpeg_cmd)
|
||||
return _prores_codec_args(stream_data, source_ffmpeg_cmd)
|
||||
|
||||
# Codec "h264"
|
||||
if codec_name == "h264":
|
||||
return _h264_codec_args(ffprobe_data, source_ffmpeg_cmd)
|
||||
return _h264_codec_args(stream_data, source_ffmpeg_cmd)
|
||||
|
||||
# Codec "dnxhd"
|
||||
if codec_name == "dnxhd":
|
||||
return _dnxhd_codec_args(ffprobe_data, source_ffmpeg_cmd)
|
||||
return _dnxhd_codec_args(stream_data, source_ffmpeg_cmd)
|
||||
|
||||
output = []
|
||||
if codec_name:
|
||||
output.extend(["-codec:v", codec_name])
|
||||
|
||||
bit_rate = ffprobe_data.get("bit_rate")
|
||||
bit_rate = stream_data.get("bit_rate")
|
||||
if bit_rate:
|
||||
output.extend(["-b:v", bit_rate])
|
||||
|
||||
pix_fmt = ffprobe_data.get("pix_fmt")
|
||||
pix_fmt = stream_data.get("pix_fmt")
|
||||
if pix_fmt:
|
||||
output.extend(["-pix_fmt", pix_fmt])
|
||||
|
||||
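For codecs other than prores, h264 and dnxhd the function falls back to copying the basic attributes of the first video stream. A small self-check of that branch, with made-up stream values (source_ffmpeg_cmd is unused on this path, so None is passed):

stream_data = {
    "codec_name": "mjpeg",      # invented values for illustration
    "bit_rate": "20000000",
    "pix_fmt": "yuvj420p",
}
ffprobe_data = {"streams": [stream_data], "format": {}}
print(get_codec_args(ffprobe_data, source_ffmpeg_cmd=None))
# -> ['-codec:v', 'mjpeg', '-b:v', '20000000', '-pix_fmt', 'yuvj420p']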
|
|
@ -244,15 +245,16 @@ class ModifiedBurnins(ffmpeg_burnins.Burnins):
|
|||
}
|
||||
|
||||
def __init__(
|
||||
self, source, streams=None, options_init=None, first_frame=None
|
||||
self, source, ffprobe_data=None, options_init=None, first_frame=None
|
||||
):
|
||||
if not streams:
|
||||
streams = _streams(source)
|
||||
if not ffprobe_data:
|
||||
ffprobe_data = _get_ffprobe_data(source)
|
||||
|
||||
self.ffprobe_data = ffprobe_data
|
||||
self.first_frame = first_frame
|
||||
self.input_args = []
|
||||
|
||||
super().__init__(source, streams)
|
||||
super().__init__(source, ffprobe_data["streams"])
|
||||
|
||||
if options_init:
|
||||
self.options_init.update(options_init)
|
||||
|
|
@ -492,8 +494,6 @@ def example(input_path, output_path):
|
|||
'bg_opacity': 0.5,
|
||||
'font_size': 52
|
||||
}
|
||||
# First frame in burnin
|
||||
start_frame = 2000
|
||||
# Options init sets burnin look
|
||||
burnin = ModifiedBurnins(input_path, options_init=options_init)
|
||||
# Static text
|
||||
|
|
@ -564,11 +564,11 @@ def burnins_from_data(
|
|||
"shot": "sh0010"
|
||||
}
|
||||
"""
|
||||
streams = None
|
||||
ffprobe_data = None
|
||||
if full_input_path:
|
||||
streams = _streams(full_input_path)
|
||||
ffprobe_data = _get_ffprobe_data(full_input_path)
|
||||
|
||||
burnin = ModifiedBurnins(input_path, streams, options, first_frame)
|
||||
burnin = ModifiedBurnins(input_path, ffprobe_data, options, first_frame)
|
||||
|
||||
frame_start = data.get("frame_start")
|
||||
frame_end = data.get("frame_end")
|
||||
|
|
@ -595,6 +595,14 @@ def burnins_from_data(
|
|||
if source_timecode is None:
|
||||
source_timecode = stream.get("tags", {}).get("timecode")
|
||||
|
||||
if source_timecode is None:
|
||||
# Use "format" key from ffprobe data
|
||||
# - this is used e.g. in mxf extension
|
||||
input_format = burnin.ffprobe_data.get("format") or {}
|
||||
source_timecode = input_format.get("timecode")
|
||||
if source_timecode is None:
|
||||
source_timecode = input_format.get("tags", {}).get("timecode")
|
||||
|
||||
if source_timecode is not None:
|
||||
data[SOURCE_TIMECODE_KEY[1:-1]] = SOURCE_TIMECODE_KEY
|
||||
|
||||
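The fallback added here means the source timecode is looked for in several places: on the stream, in the stream tags, and finally in the container-level "format" section and its tags, which is where mxf files keep it. A standalone sketch of that lookup chain, assuming the stream-level check before the shown lines reads stream["timecode"] first:

def find_source_timecode(ffprobe_data):
    """Return the first timecode found in streams, then in "format"."""
    for stream in ffprobe_data.get("streams") or []:
        timecode = stream.get("timecode")
        if timecode is None:
            timecode = stream.get("tags", {}).get("timecode")
        if timecode is not None:
            return timecode

    # Container level metadata - e.g. mxf stores the timecode here
    input_format = ffprobe_data.get("format") or {}
    timecode = input_format.get("timecode")
    if timecode is None:
        timecode = input_format.get("tags", {}).get("timecode")
    return timecode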
|
|
@ -684,8 +692,9 @@ def burnins_from_data(
|
|||
ffmpeg_args.append("-g 1")
|
||||
|
||||
else:
|
||||
ffprobe_data = burnin._streams[0]
|
||||
ffmpeg_args.extend(get_codec_args(ffprobe_data, source_ffmpeg_cmd))
|
||||
ffmpeg_args.extend(
|
||||
get_codec_args(burnin.ffprobe_data, source_ffmpeg_cmd)
|
||||
)
|
||||
|
||||
# Use group one (same as `-intra` argument, which is deprecated)
|
||||
ffmpeg_args_str = " ".join(ffmpeg_args)
|
||||
|
|
|
|||
|
|
@ -1,5 +1,8 @@
|
|||
{
|
||||
"publish": {
|
||||
"CollectAnatomyInstanceData": {
|
||||
"follow_workfile_version": false
|
||||
},
|
||||
"ValidateEditorialAssetName": {
|
||||
"enabled": true,
|
||||
"optional": false
|
||||
|
|
@ -315,10 +318,11 @@
|
|||
},
|
||||
"project_folder_structure": "{\"__project_root__\": {\"prod\": {}, \"resources\": {\"footage\": {\"plates\": {}, \"offline\": {}}, \"audio\": {}, \"art_dept\": {}}, \"editorial\": {}, \"assets[ftrack.Library]\": {\"characters[ftrack]\": {}, \"locations[ftrack]\": {}}, \"shots[ftrack.Sequence]\": {\"scripts\": {}, \"editorial[ftrack.Folder]\": {}}}}",
|
||||
"sync_server": {
|
||||
"enabled": true,
|
||||
"enabled": false,
|
||||
"config": {
|
||||
"retry_cnt": "3",
|
||||
"loop_delay": "60",
|
||||
"always_accessible_on": [],
|
||||
"active_site": "studio",
|
||||
"remote_site": "studio"
|
||||
},
|
||||
|
|
|
|||
|
|
@ -8,16 +8,10 @@
|
|||
"yetiRig": "ma"
|
||||
},
|
||||
"maya-dirmap": {
|
||||
"enabled": true,
|
||||
"enabled": false,
|
||||
"paths": {
|
||||
"source-path": [
|
||||
"foo1",
|
||||
"foo2"
|
||||
],
|
||||
"destination-path": [
|
||||
"bar1",
|
||||
"bar2"
|
||||
]
|
||||
"source-path": [],
|
||||
"destination-path": []
|
||||
}
|
||||
},
|
||||
"scriptsmenu": {
|
||||
|
|
|
|||
|
|
@ -8,6 +8,13 @@
|
|||
"build_workfile": "ctrl+alt+b"
|
||||
}
|
||||
},
|
||||
"nuke-dirmap": {
|
||||
"enabled": false,
|
||||
"paths": {
|
||||
"source-path": [],
|
||||
"destination-path": []
|
||||
}
|
||||
},
|
||||
"create": {
|
||||
"CreateWriteRender": {
|
||||
"fpath_template": "{work}/renders/nuke/{subset}/{subset}.{frame}.{ext}",
|
||||
|
|
@ -130,8 +137,7 @@
|
|||
},
|
||||
"LoadClip": {
|
||||
"enabled": true,
|
||||
"_representations": [
|
||||
],
|
||||
"_representations": [],
|
||||
"node_name_template": "{class_name}_{ext}"
|
||||
}
|
||||
},
|
||||
|
|
|
|||
|
|
@ -30,8 +30,7 @@
|
|||
},
|
||||
"ExtractReview": {
|
||||
"jpg_options": {
|
||||
"tags": [
|
||||
]
|
||||
"tags": []
|
||||
},
|
||||
"mov_options": {
|
||||
"tags": [
|
||||
|
|
|
|||
|
|
@ -167,6 +167,16 @@
|
|||
"ffmpeg": 48
|
||||
}
|
||||
},
|
||||
"royalrender": {
|
||||
"enabled": false,
|
||||
"rr_paths": {
|
||||
"default": {
|
||||
"windows": "",
|
||||
"darwin": "",
|
||||
"linux": ""
|
||||
}
|
||||
}
|
||||
},
|
||||
"log_viewer": {
|
||||
"enabled": true
|
||||
},
|
||||
|
|
|
|||
|
|
@ -762,6 +762,17 @@ class SyncServerProviders(DictConditionalEntity):
|
|||
|
||||
enum_children = []
|
||||
for provider_code, configurables in system_settings_schema.items():
|
||||
# any site could be exposed or vendorized by different site
|
||||
# eg studio site content could be mapped on sftp site, single file
|
||||
# accessible via 2 different protocols (sites)
|
||||
configurables.append(
|
||||
{
|
||||
"type": "list",
|
||||
"key": "alternative_sites",
|
||||
"label": "Alternative sites",
|
||||
"object_type": "text"
|
||||
}
|
||||
)
|
||||
label = provider_code_to_label.get(provider_code) or provider_code
|
||||
|
||||
enum_children.append({
|
||||
|
|
|
|||
|
|
@ -46,6 +46,39 @@
|
|||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"type": "dict",
|
||||
"collapsible": true,
|
||||
"checkbox_key": "enabled",
|
||||
"key": "nuke-dirmap",
|
||||
"label": "Nuke Directory Mapping",
|
||||
"is_group": true,
|
||||
"children": [
|
||||
{
|
||||
"type": "boolean",
|
||||
"key": "enabled",
|
||||
"label": "Enabled"
|
||||
},
|
||||
{
|
||||
"type": "dict",
|
||||
"key": "paths",
|
||||
"children": [
|
||||
{
|
||||
"type": "list",
|
||||
"object_type": "text",
|
||||
"key": "source-path",
|
||||
"label": "Source Path"
|
||||
},
|
||||
{
|
||||
"type": "list",
|
||||
"object_type": "text",
|
||||
"key": "destination-path",
|
||||
"label": "Destination Path"
|
||||
}
|
||||
]
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"type": "dict",
|
||||
"collapsible": true,
|
||||
|
|
|
|||
|
|
@ -67,7 +67,17 @@
|
|||
"type": "list",
|
||||
"key": "color_code",
|
||||
"label": "Color codes for layers",
|
||||
"object_type": "text"
|
||||
"type": "enum",
|
||||
"multiselection": true,
|
||||
"enum_items": [
|
||||
{ "red": "red" },
|
||||
{ "orange": "orange" },
|
||||
{ "yellowColor": "yellow" },
|
||||
{ "grain": "green" },
|
||||
{ "blue": "blue" },
|
||||
{ "violet": "violet" },
|
||||
{ "gray": "gray" }
|
||||
]
|
||||
},
|
||||
{
|
||||
"type": "list",
|
||||
|
|
|
|||
|
|
@ -26,113 +26,33 @@
|
|||
"key": "loop_delay",
|
||||
"label": "Loop Delay"
|
||||
},
|
||||
{
|
||||
"type": "list",
|
||||
"key": "always_accessible_on",
|
||||
"label": "Always accessible on sites",
|
||||
"object_type": "text"
|
||||
},
|
||||
{
|
||||
"type": "splitter"
|
||||
},
|
||||
{
|
||||
"type": "text",
|
||||
"key": "active_site",
|
||||
"label": "Active Site"
|
||||
"label": "User Default Active Site"
|
||||
},
|
||||
{
|
||||
"type": "text",
|
||||
"key": "remote_site",
|
||||
"label": "Remote Site"
|
||||
"label": "User Default Remote Site"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"type": "dict-modifiable",
|
||||
"type": "sync-server-sites",
|
||||
"collapsible": true,
|
||||
"key": "sites",
|
||||
"label": "Sites",
|
||||
"collapsible_key": false,
|
||||
"object_type": {
|
||||
"type": "dict",
|
||||
"children": [
|
||||
{
|
||||
"type": "dict",
|
||||
"key": "gdrive",
|
||||
"label": "Google Drive",
|
||||
"collapsible": true,
|
||||
"children": [
|
||||
{
|
||||
"type": "path",
|
||||
"key": "credentials_url",
|
||||
"label": "Credentials url",
|
||||
"multiplatform": true
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"type": "dict",
|
||||
"key": "dropbox",
|
||||
"label": "Dropbox",
|
||||
"collapsible": true,
|
||||
"children": [
|
||||
{
|
||||
"type": "text",
|
||||
"key": "token",
|
||||
"label": "Access Token"
|
||||
},
|
||||
{
|
||||
"type": "text",
|
||||
"key": "team_folder_name",
|
||||
"label": "Team Folder Name"
|
||||
},
|
||||
{
|
||||
"type": "text",
|
||||
"key": "acting_as_member",
|
||||
"label": "Acting As Member"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"type": "dict",
|
||||
"key": "sftp",
|
||||
"label": "SFTP",
|
||||
"collapsible": true,
|
||||
"children": [
|
||||
{
|
||||
"type": "text",
|
||||
"key": "sftp_host",
|
||||
"label": "SFTP host"
|
||||
},
|
||||
{
|
||||
"type": "number",
|
||||
"key": "sftp_port",
|
||||
"label": "SFTP port"
|
||||
},
|
||||
{
|
||||
"type": "text",
|
||||
"key": "sftp_user",
|
||||
"label": "SFTP user"
|
||||
},
|
||||
{
|
||||
"type": "text",
|
||||
"key": "sftp_pass",
|
||||
"label": "SFTP pass"
|
||||
},
|
||||
{
|
||||
"type": "path",
|
||||
"key": "sftp_key",
|
||||
"label": "SFTP user ssh key",
|
||||
"multiplatform": true
|
||||
},
|
||||
{
|
||||
"type": "text",
|
||||
"key": "sftp_key_pass",
|
||||
"label": "SFTP user ssh key password"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"type": "dict-modifiable",
|
||||
"key": "root",
|
||||
"label": "Roots",
|
||||
"collapsable": false,
|
||||
"collapsable_key": false,
|
||||
"object_type": "text"
|
||||
}
|
||||
]
|
||||
}
|
||||
"collapsible_key": false
|
||||
}
|
||||
]
|
||||
}
|
||||
|
|
|
|||
|
|
@ -4,6 +4,20 @@
|
|||
"key": "publish",
|
||||
"label": "Publish plugins",
|
||||
"children": [
|
||||
{
|
||||
"type": "dict",
|
||||
"collapsible": true,
|
||||
"key": "CollectAnatomyInstanceData",
|
||||
"label": "Collect Anatomy Instance Data",
|
||||
"is_group": true,
|
||||
"children": [
|
||||
{
|
||||
"type": "boolean",
|
||||
"key": "follow_workfile_version",
|
||||
"label": "Follow workfile version"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"type": "dict",
|
||||
"collapsible": true,
|
||||
|
|
|
|||
|
|
@ -180,6 +180,31 @@
|
|||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"type": "dict",
|
||||
"key": "royalrender",
|
||||
"label": "Royal Render",
|
||||
"require_restart": true,
|
||||
"collapsible": true,
|
||||
"checkbox_key": "enabled",
|
||||
"children": [
|
||||
{
|
||||
"type": "boolean",
|
||||
"key": "enabled",
|
||||
"label": "Enabled"
|
||||
},
|
||||
{
|
||||
"type": "dict-modifiable",
|
||||
"object_type": {
|
||||
"type": "path",
|
||||
"multiplatform": true
|
||||
},
|
||||
"key": "rr_paths",
|
||||
"required_keys": ["default"],
|
||||
"label": "Royal Render Root Paths"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"type": "dict",
|
||||
"key": "log_viewer",
|
||||
|
|
|
|||
Binary file not shown.
Before Width: | Height: | Size: 69 B  After Width: | Height: | Size: 1.4 KiB
File diff suppressed because it is too large
File diff suppressed because it is too large
|
|
@ -765,6 +765,11 @@ QScrollBar::add-page:vertical, QScrollBar::sub-page:vertical {
|
|||
border: 1px solid {color:border};
|
||||
border-radius: 0.1em;
|
||||
}
|
||||
/* Subset Manager */
|
||||
#SubsetManagerDetailsText {}
|
||||
#SubsetManagerDetailsText[state="invalid"] {
|
||||
border: 1px solid #ff0000;
|
||||
}
|
||||
|
||||
/* Python console interpreter */
|
||||
#PythonInterpreterOutput, #PythonCodeEditor {
|
||||
|
|
|
|||
|
|
@ -104,8 +104,8 @@ class TestPerformance():
|
|||
"name": "mb",
|
||||
"parent": {"oid": '{}'.format(id)},
|
||||
"data": {
|
||||
"path": "C:\\projects\\test_performance\\Assets\\Cylinder\\publish\\workfile\\workfileLookdev\\{}\\{}".format(version_str, file_name), # noqa
|
||||
"template": "{root[work]}\\{project[name]}\\{hierarchy}\\{asset}\\publish\\{family}\\{subset}\\v{version:0>3}\\{project[code]}_{asset}_{subset}_v{version:0>3}<_{output}><.{frame:0>4}>.{representation}" # noqa
|
||||
"path": "C:\\projects\\test_performance\\Assets\\Cylinder\\publish\\workfile\\workfileLookdev\\{}\\{}".format(version_str, file_name), # noqa: E501
|
||||
"template": "{root[work]}\\{project[name]}\\{hierarchy}\\{asset}\\publish\\{family}\\{subset}\\v{version:0>3}\\{project[code]}_{asset}_{subset}_v{version:0>3}<_{output}><.{frame:0>4}>.{representation}" # noqa: E501
|
||||
},
|
||||
"type": "representation",
|
||||
"schema": "openpype:representation-2.0"
|
||||
|
|
@ -188,21 +188,21 @@ class TestPerformance():
|
|||
create_files=False):
|
||||
ret = [
|
||||
{
|
||||
"path": "{root[work]}" + "{root[work]}/test_performance/Assets/Cylinder/publish/workfile/workfileLookdev/v{:03d}/test_Cylinder_A_workfileLookdev_v{:03d}.dat".format(i, i), #noqa
|
||||
"path": "{root[work]}" + "{root[work]}/test_performance/Assets/Cylinder/publish/workfile/workfileLookdev/v{:03d}/test_Cylinder_A_workfileLookdev_v{:03d}.dat".format(i, i), # noqa: E501
|
||||
"_id": '{}'.format(file_id),
|
||||
"hash": "temphash",
|
||||
"sites": self.get_sites(self.MAX_NUMBER_OF_SITES),
|
||||
"size": random.randint(0, self.MAX_FILE_SIZE_B)
|
||||
},
|
||||
{
|
||||
"path": "{root[work]}" + "/test_performance/Assets/Cylinder/publish/workfile/workfileLookdev/v{:03d}/test_Cylinder_B_workfileLookdev_v{:03d}.dat".format(i, i), #noqa
|
||||
"path": "{root[work]}" + "/test_performance/Assets/Cylinder/publish/workfile/workfileLookdev/v{:03d}/test_Cylinder_B_workfileLookdev_v{:03d}.dat".format(i, i), # noqa: E501
|
||||
"_id": '{}'.format(file_id2),
|
||||
"hash": "temphash",
|
||||
"sites": self.get_sites(self.MAX_NUMBER_OF_SITES),
|
||||
"size": random.randint(0, self.MAX_FILE_SIZE_B)
|
||||
},
|
||||
{
|
||||
"path": "{root[work]}" + "/test_performance/Assets/Cylinder/publish/workfile/workfileLookdev/v{:03d}/test_Cylinder_C_workfileLookdev_v{:03d}.dat".format(i, i), #noqa
|
||||
"path": "{root[work]}" + "/test_performance/Assets/Cylinder/publish/workfile/workfileLookdev/v{:03d}/test_Cylinder_C_workfileLookdev_v{:03d}.dat".format(i, i), # noqa: E501
|
||||
"_id": '{}'.format(file_id3),
|
||||
"hash": "temphash",
|
||||
"sites": self.get_sites(self.MAX_NUMBER_OF_SITES),
|
||||
|
|
@ -223,8 +223,8 @@ class TestPerformance():
|
|||
ret = {}
|
||||
ret['{}'.format(file_id)] = {
|
||||
"path": "{root[work]}" +
|
||||
"/test_performance/Assets/Cylinder/publish/workfile/workfileLookdev/" #noqa
|
||||
"v{:03d}/test_CylinderA_workfileLookdev_v{:03d}.mb".format(i, i), # noqa
|
||||
"/test_performance/Assets/Cylinder/publish/workfile/workfileLookdev/" # noqa: E501
|
||||
"v{:03d}/test_CylinderA_workfileLookdev_v{:03d}.mb".format(i, i), # noqa: E501
|
||||
"hash": "temphash",
|
||||
"sites": ["studio"],
|
||||
"size": 87236
|
||||
|
|
@ -232,16 +232,16 @@ class TestPerformance():
|
|||
|
||||
ret['{}'.format(file_id2)] = {
|
||||
"path": "{root[work]}" +
|
||||
"/test_performance/Assets/Cylinder/publish/workfile/workfileLookdev/" #noqa
|
||||
"v{:03d}/test_CylinderB_workfileLookdev_v{:03d}.mb".format(i, i), # noqa
|
||||
"/test_performance/Assets/Cylinder/publish/workfile/workfileLookdev/" # noqa: E501
|
||||
"v{:03d}/test_CylinderB_workfileLookdev_v{:03d}.mb".format(i, i), # noqa: E501
|
||||
"hash": "temphash",
|
||||
"sites": ["studio"],
|
||||
"size": 87236
|
||||
}
|
||||
ret['{}'.format(file_id3)] = {
|
||||
"path": "{root[work]}" +
|
||||
"/test_performance/Assets/Cylinder/publish/workfile/workfileLookdev/" #noqa
|
||||
"v{:03d}/test_CylinderC_workfileLookdev_v{:03d}.mb".format(i, i), # noqa
|
||||
"/test_performance/Assets/Cylinder/publish/workfile/workfileLookdev/" # noqa: E501
|
||||
"v{:03d}/test_CylinderC_workfileLookdev_v{:03d}.mb".format(i, i), # noqa: E501
|
||||
"hash": "temphash",
|
||||
"sites": ["studio"],
|
||||
"size": 87236
|
||||
|
|
|
|||
|
|
@ -8,14 +8,12 @@ from openpype import style
|
|||
from openpype.tools.utils.lib import center_window
|
||||
from openpype.tools.utils.widgets import AssetWidget
|
||||
from openpype.tools.utils.constants import (
|
||||
TASK_NAME_ROLE,
|
||||
PROJECT_NAME_ROLE
|
||||
)
|
||||
from openpype.tools.utils.tasks_widget import TasksWidget
|
||||
from openpype.tools.utils.models import (
|
||||
ProjectModel,
|
||||
ProjectSortFilterProxy,
|
||||
TasksModel,
|
||||
TasksProxyModel
|
||||
ProjectSortFilterProxy
|
||||
)
|
||||
|
||||
|
||||
|
|
@ -77,15 +75,11 @@ class ContextDialog(QtWidgets.QDialog):
|
|||
left_side_layout.addWidget(assets_widget)
|
||||
|
||||
# Right side of window contains only tasks
|
||||
task_view = QtWidgets.QListView(main_splitter)
|
||||
task_model = TasksModel(dbcon)
|
||||
task_proxy = TasksProxyModel()
|
||||
task_proxy.setSourceModel(task_model)
|
||||
task_view.setModel(task_proxy)
|
||||
tasks_widget = TasksWidget(dbcon, main_splitter)
|
||||
|
||||
# Add widgets to main splitter
|
||||
main_splitter.addWidget(left_side_widget)
|
||||
main_splitter.addWidget(task_view)
|
||||
main_splitter.addWidget(tasks_widget)
|
||||
|
||||
# Set stretch of both sides
|
||||
main_splitter.setStretchFactor(0, 7)
|
||||
|
|
@ -119,7 +113,7 @@ class ContextDialog(QtWidgets.QDialog):
|
|||
assets_widget.selection_changed.connect(self._on_asset_change)
|
||||
assets_widget.refresh_triggered.connect(self._on_asset_refresh_trigger)
|
||||
assets_widget.refreshed.connect(self._on_asset_widget_refresh_finished)
|
||||
task_view.selectionModel().selectionChanged.connect(
|
||||
tasks_widget.task_changed.selectionChanged.connect(
|
||||
self._on_task_change
|
||||
)
|
||||
ok_btn.clicked.connect(self._on_ok_click)
|
||||
|
|
@ -133,9 +127,7 @@ class ContextDialog(QtWidgets.QDialog):
|
|||
|
||||
self._assets_widget = assets_widget
|
||||
|
||||
self._task_view = task_view
|
||||
self._task_model = task_model
|
||||
self._task_proxy = task_proxy
|
||||
self._tasks_widget = tasks_widget
|
||||
|
||||
self._ok_btn = ok_btn
|
||||
|
||||
|
|
@ -279,15 +271,13 @@ class ContextDialog(QtWidgets.QDialog):
|
|||
self._dbcon.Session["AVALON_ASSET"] = self._set_context_asset
|
||||
self._assets_widget.setEnabled(False)
|
||||
self._assets_widget.select_assets(self._set_context_asset)
|
||||
self._set_asset_to_task_model()
|
||||
self._set_asset_to_tasks_widget()
|
||||
else:
|
||||
self._assets_widget.setEnabled(True)
|
||||
self._assets_widget.set_current_asset_btn_visibility(False)
|
||||
|
||||
# Refresh tasks
|
||||
self._task_model.refresh()
|
||||
# Sort tasks
|
||||
self._task_proxy.sort(0, QtCore.Qt.AscendingOrder)
|
||||
self._tasks_widget.refresh()
|
||||
|
||||
self._ignore_value_changes = False
|
||||
|
||||
|
|
@ -314,20 +304,19 @@ class ContextDialog(QtWidgets.QDialog):
|
|||
"""Selected assets have changed"""
|
||||
if self._ignore_value_changes:
|
||||
return
|
||||
self._set_asset_to_task_model()
|
||||
self._set_asset_to_tasks_widget()
|
||||
|
||||
def _on_task_change(self):
|
||||
self._validate_strict()
|
||||
|
||||
def _set_asset_to_task_model(self):
|
||||
def _set_asset_to_tasks_widget(self):
|
||||
# Filter out None docs (they are silos)
|
||||
asset_docs = self._assets_widget.get_selected_assets()
|
||||
asset_ids = [asset_doc["_id"] for asset_doc in asset_docs]
|
||||
asset_id = None
|
||||
if asset_ids:
|
||||
asset_id = asset_ids[0]
|
||||
self._task_model.set_asset_id(asset_id)
|
||||
self._task_proxy.sort(0, QtCore.Qt.AscendingOrder)
|
||||
self._tasks_widget.set_asset_id(asset_id)
|
||||
|
||||
def _confirm_values(self):
|
||||
"""Store values to output."""
|
||||
|
|
@ -355,11 +344,7 @@ class ContextDialog(QtWidgets.QDialog):
|
|||
|
||||
def get_selected_task(self):
|
||||
"""Currently selected task."""
|
||||
task_name = None
|
||||
index = self._task_view.selectionModel().currentIndex()
|
||||
if index.isValid():
|
||||
task_name = index.data(TASK_NAME_ROLE)
|
||||
return task_name
|
||||
return self._tasks_widget.get_selected_task_name()
|
||||
|
||||
def _validate_strict(self):
|
||||
if not self._strict:
|
||||
|
|
|
|||
|
|
@ -19,102 +19,6 @@ from openpype.lib import ApplicationManager
|
|||
log = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class TaskModel(QtGui.QStandardItemModel):
|
||||
"""A model listing the tasks combined for a list of assets"""
|
||||
|
||||
def __init__(self, dbcon, parent=None):
|
||||
super(TaskModel, self).__init__(parent=parent)
|
||||
self.dbcon = dbcon
|
||||
|
||||
self._num_assets = 0
|
||||
|
||||
self.default_icon = qtawesome.icon(
|
||||
"fa.male", color=style.colors.default
|
||||
)
|
||||
self.no_task_icon = qtawesome.icon(
|
||||
"fa.exclamation-circle", color=style.colors.mid
|
||||
)
|
||||
|
||||
self._icons = {}
|
||||
|
||||
self._get_task_icons()
|
||||
|
||||
def _get_task_icons(self):
|
||||
if not self.dbcon.Session.get("AVALON_PROJECT"):
|
||||
return
|
||||
|
||||
# Get the project configured icons from database
|
||||
project = self.dbcon.find_one({"type": "project"})
|
||||
for task in project["config"].get("tasks") or []:
|
||||
icon_name = task.get("icon")
|
||||
if icon_name:
|
||||
self._icons[task["name"]] = qtawesome.icon(
|
||||
"fa.{}".format(icon_name), color=style.colors.default
|
||||
)
|
||||
|
||||
def set_assets(self, asset_ids=None, asset_docs=None):
|
||||
"""Set assets to track by their database id
|
||||
|
||||
Arguments:
|
||||
asset_ids (list): List of asset ids.
|
||||
asset_docs (list): List of asset entities from MongoDB.
|
||||
|
||||
"""
|
||||
|
||||
if asset_docs is None and asset_ids is not None:
|
||||
# find assets in db by query
|
||||
asset_docs = list(self.dbcon.find({
|
||||
"type": "asset",
|
||||
"_id": {"$in": asset_ids}
|
||||
}))
|
||||
db_assets_ids = tuple(asset_doc["_id"] for asset_doc in asset_docs)
|
||||
|
||||
# check if all assets were found
|
||||
not_found = tuple(
|
||||
str(asset_id)
|
||||
for asset_id in asset_ids
|
||||
if asset_id not in db_assets_ids
|
||||
)
|
||||
|
||||
assert not not_found, "Assets not found by id: {0}".format(
|
||||
", ".join(not_found)
|
||||
)
|
||||
|
||||
self.clear()
|
||||
|
||||
if not asset_docs:
|
||||
return
|
||||
|
||||
task_names = set()
|
||||
for asset_doc in asset_docs:
|
||||
asset_tasks = asset_doc.get("data", {}).get("tasks") or set()
|
||||
task_names.update(asset_tasks)
|
||||
|
||||
self.beginResetModel()
|
||||
|
||||
if not task_names:
|
||||
item = QtGui.QStandardItem(self.no_task_icon, "No task")
|
||||
item.setEnabled(False)
|
||||
self.appendRow(item)
|
||||
|
||||
else:
|
||||
for task_name in sorted(task_names):
|
||||
icon = self._icons.get(task_name, self.default_icon)
|
||||
item = QtGui.QStandardItem(icon, task_name)
|
||||
self.appendRow(item)
|
||||
|
||||
self.endResetModel()
|
||||
|
||||
def headerData(self, section, orientation, role):
|
||||
if (
|
||||
role == QtCore.Qt.DisplayRole
|
||||
and orientation == QtCore.Qt.Horizontal
|
||||
and section == 0
|
||||
):
|
||||
return "Tasks"
|
||||
return super(TaskModel, self).headerData(section, orientation, role)
|
||||
|
||||
|
||||
class ActionModel(QtGui.QStandardItemModel):
|
||||
def __init__(self, dbcon, parent=None):
|
||||
super(ActionModel, self).__init__(parent=parent)
|
||||
|
|
|
|||
|
|
@ -6,7 +6,7 @@ from avalon.vendor import qtawesome
|
|||
|
||||
from .delegates import ActionDelegate
|
||||
from . import lib
|
||||
from .models import TaskModel, ActionModel
|
||||
from .models import ActionModel
|
||||
from openpype.tools.flickcharm import FlickCharm
|
||||
from .constants import (
|
||||
ACTION_ROLE,
|
||||
|
|
@ -90,9 +90,6 @@ class ActionBar(QtWidgets.QWidget):
|
|||
self.project_handler = project_handler
|
||||
self.dbcon = dbcon
|
||||
|
||||
layout = QtWidgets.QHBoxLayout(self)
|
||||
layout.setContentsMargins(8, 0, 8, 0)
|
||||
|
||||
view = QtWidgets.QListView(self)
|
||||
view.setProperty("mode", "icon")
|
||||
view.setObjectName("IconView")
|
||||
|
|
@ -116,6 +113,8 @@ class ActionBar(QtWidgets.QWidget):
|
|||
)
|
||||
view.setItemDelegate(delegate)
|
||||
|
||||
layout = QtWidgets.QHBoxLayout(self)
|
||||
layout.setContentsMargins(0, 0, 0, 0)
|
||||
layout.addWidget(view)
|
||||
|
||||
self.model = model
|
||||
|
|
@ -261,92 +260,6 @@ class ActionBar(QtWidgets.QWidget):
|
|||
self.action_clicked.emit(action)
|
||||
|
||||
|
||||
class TasksWidget(QtWidgets.QWidget):
|
||||
"""Widget showing active Tasks"""
|
||||
|
||||
task_changed = QtCore.Signal()
|
||||
selection_mode = (
|
||||
QtCore.QItemSelectionModel.Select | QtCore.QItemSelectionModel.Rows
|
||||
)
|
||||
|
||||
def __init__(self, dbcon, parent=None):
|
||||
super(TasksWidget, self).__init__(parent)
|
||||
|
||||
self.dbcon = dbcon
|
||||
|
||||
view = QtWidgets.QTreeView(self)
|
||||
view.setIndentation(0)
|
||||
view.setEditTriggers(QtWidgets.QTreeView.NoEditTriggers)
|
||||
model = TaskModel(self.dbcon)
|
||||
view.setModel(model)
|
||||
|
||||
layout = QtWidgets.QVBoxLayout(self)
|
||||
layout.addWidget(view)
|
||||
|
||||
view.selectionModel().selectionChanged.connect(self.task_changed)
|
||||
|
||||
self.model = model
|
||||
self.view = view
|
||||
|
||||
self._last_selected_task = None
|
||||
|
||||
def set_asset(self, asset_id):
|
||||
if asset_id is None:
|
||||
# Asset deselected
|
||||
self.model.set_assets()
|
||||
return
|
||||
|
||||
# Try and preserve the last selected task and reselect it
|
||||
# after switching assets. If there's no currently selected
|
||||
# asset keep whatever the "last selected" was prior to it.
|
||||
current = self.get_current_task()
|
||||
if current:
|
||||
self._last_selected_task = current
|
||||
|
||||
self.model.set_assets([asset_id])
|
||||
|
||||
if self._last_selected_task:
|
||||
self.select_task(self._last_selected_task)
|
||||
|
||||
# Force a task changed emit.
|
||||
self.task_changed.emit()
|
||||
|
||||
def select_task(self, task_name):
|
||||
"""Select a task by name.
|
||||
|
||||
If the task does not exist in the current model then selection is only
|
||||
cleared.
|
||||
|
||||
Args:
|
||||
task (str): Name of the task to select.
|
||||
|
||||
"""
|
||||
|
||||
# Clear selection
|
||||
self.view.selectionModel().clearSelection()
|
||||
|
||||
# Select the task
|
||||
for row in range(self.model.rowCount()):
|
||||
index = self.model.index(row, 0)
|
||||
_task_name = index.data(QtCore.Qt.DisplayRole)
|
||||
if _task_name == task_name:
|
||||
self.view.selectionModel().select(index, self.selection_mode)
|
||||
# Set the currently active index
|
||||
self.view.setCurrentIndex(index)
|
||||
break
|
||||
|
||||
def get_current_task(self):
|
||||
"""Return name of task at current index (selected)
|
||||
|
||||
Returns:
|
||||
str: Name of the current task.
|
||||
|
||||
"""
|
||||
index = self.view.currentIndex()
|
||||
if self.view.selectionModel().isSelected(index):
|
||||
return index.data(QtCore.Qt.DisplayRole)
|
||||
|
||||
|
||||
class ActionHistory(QtWidgets.QPushButton):
|
||||
trigger_history = QtCore.Signal(tuple)
|
||||
|
||||
|
|
|
|||
|
|
@ -9,13 +9,14 @@ from openpype import style
|
|||
from openpype.api import resources
|
||||
|
||||
from openpype.tools.utils.widgets import AssetWidget
|
||||
from openpype.tools.utils.tasks_widget import TasksWidget
|
||||
|
||||
from avalon.vendor import qtawesome
|
||||
from .models import ProjectModel
|
||||
from .lib import get_action_label, ProjectHandler
|
||||
from .widgets import (
|
||||
ProjectBar,
|
||||
ActionBar,
|
||||
TasksWidget,
|
||||
ActionHistory,
|
||||
SlidePageWidget
|
||||
)
|
||||
|
|
@ -91,8 +92,6 @@ class ProjectsPanel(QtWidgets.QWidget):
|
|||
def __init__(self, project_handler, parent=None):
|
||||
super(ProjectsPanel, self).__init__(parent=parent)
|
||||
|
||||
layout = QtWidgets.QVBoxLayout(self)
|
||||
|
||||
view = ProjectIconView(parent=self)
|
||||
view.setSelectionMode(QtWidgets.QListView.NoSelection)
|
||||
flick = FlickCharm(parent=self)
|
||||
|
|
@ -100,6 +99,8 @@ class ProjectsPanel(QtWidgets.QWidget):
|
|||
|
||||
view.setModel(project_handler.model)
|
||||
|
||||
layout = QtWidgets.QVBoxLayout(self)
|
||||
layout.setContentsMargins(0, 0, 0, 0)
|
||||
layout.addWidget(view)
|
||||
|
||||
view.clicked.connect(self.on_clicked)
|
||||
|
|
@ -123,28 +124,21 @@ class AssetsPanel(QtWidgets.QWidget):
|
|||
|
||||
self.dbcon = dbcon
|
||||
|
||||
# project bar
|
||||
project_bar_widget = QtWidgets.QWidget(self)
|
||||
|
||||
layout = QtWidgets.QHBoxLayout(project_bar_widget)
|
||||
layout.setSpacing(4)
|
||||
|
||||
# Project bar
|
||||
btn_back_icon = qtawesome.icon("fa.angle-left", color="white")
|
||||
btn_back = QtWidgets.QPushButton(project_bar_widget)
|
||||
btn_back = QtWidgets.QPushButton(self)
|
||||
btn_back.setIcon(btn_back_icon)
|
||||
|
||||
project_bar = ProjectBar(project_handler, project_bar_widget)
|
||||
project_bar = ProjectBar(project_handler, self)
|
||||
|
||||
layout.addWidget(btn_back)
|
||||
layout.addWidget(project_bar)
|
||||
project_bar_layout = QtWidgets.QHBoxLayout()
|
||||
project_bar_layout.setContentsMargins(0, 0, 0, 0)
|
||||
project_bar_layout.setSpacing(4)
|
||||
project_bar_layout.addWidget(btn_back)
|
||||
project_bar_layout.addWidget(project_bar)
|
||||
|
||||
# assets
|
||||
assets_proxy_widgets = QtWidgets.QWidget(self)
|
||||
assets_proxy_widgets.setContentsMargins(0, 0, 0, 0)
|
||||
assets_layout = QtWidgets.QVBoxLayout(assets_proxy_widgets)
|
||||
assets_widget = AssetWidget(
|
||||
dbcon=self.dbcon, parent=assets_proxy_widgets
|
||||
)
|
||||
# Assets widget
|
||||
assets_widget = AssetWidget(dbcon=self.dbcon, parent=self)
|
||||
|
||||
# Make assets view flickable
|
||||
flick = FlickCharm(parent=self)
|
||||
|
|
@ -152,18 +146,19 @@ class AssetsPanel(QtWidgets.QWidget):
|
|||
assets_widget.view.setVerticalScrollMode(
|
||||
assets_widget.view.ScrollPerPixel
|
||||
)
|
||||
assets_layout.addWidget(assets_widget)
|
||||
|
||||
# tasks
|
||||
# Tasks widget
|
||||
tasks_widget = TasksWidget(self.dbcon, self)
|
||||
body = QtWidgets.QSplitter()
|
||||
|
||||
# Body
|
||||
body = QtWidgets.QSplitter(self)
|
||||
body.setContentsMargins(0, 0, 0, 0)
|
||||
body.setSizePolicy(
|
||||
QtWidgets.QSizePolicy.Expanding,
|
||||
QtWidgets.QSizePolicy.Expanding
|
||||
)
|
||||
body.setOrientation(QtCore.Qt.Horizontal)
|
||||
body.addWidget(assets_proxy_widgets)
|
||||
body.addWidget(assets_widget)
|
||||
body.addWidget(tasks_widget)
|
||||
body.setStretchFactor(0, 100)
|
||||
body.setStretchFactor(1, 65)
|
||||
|
|
@ -171,22 +166,21 @@ class AssetsPanel(QtWidgets.QWidget):
|
|||
# main layout
|
||||
layout = QtWidgets.QVBoxLayout(self)
|
||||
layout.setContentsMargins(0, 0, 0, 0)
|
||||
layout.setSpacing(0)
|
||||
layout.addWidget(project_bar_widget)
|
||||
layout.addLayout(project_bar_layout)
|
||||
layout.addWidget(body)
|
||||
|
||||
# signals
|
||||
project_handler.project_changed.connect(self.on_project_changed)
|
||||
assets_widget.selection_changed.connect(self.on_asset_changed)
|
||||
assets_widget.refreshed.connect(self.on_asset_changed)
|
||||
tasks_widget.task_changed.connect(self.on_task_change)
|
||||
project_handler.project_changed.connect(self._on_project_changed)
|
||||
assets_widget.selection_changed.connect(self._on_asset_changed)
|
||||
assets_widget.refreshed.connect(self._on_asset_changed)
|
||||
tasks_widget.task_changed.connect(self._on_task_change)
|
||||
|
||||
btn_back.clicked.connect(self.back_clicked)
|
||||
|
||||
self.project_handler = project_handler
|
||||
self.project_bar = project_bar
|
||||
self.assets_widget = assets_widget
|
||||
self.tasks_widget = tasks_widget
|
||||
self._tasks_widget = tasks_widget
|
||||
self._btn_back = btn_back
|
||||
|
||||
def showEvent(self, event):
|
||||
|
|
@ -197,12 +191,16 @@ class AssetsPanel(QtWidgets.QWidget):
|
|||
btn_size = self.project_bar.height()
|
||||
self._btn_back.setFixedSize(QtCore.QSize(btn_size, btn_size))
|
||||
|
||||
def on_project_changed(self):
|
||||
def select_task_name(self, task_name):
|
||||
self._on_asset_changed()
|
||||
self._tasks_widget.select_task_name(task_name)
|
||||
|
||||
def _on_project_changed(self):
|
||||
self.session_changed.emit()
|
||||
|
||||
self.assets_widget.refresh()
|
||||
|
||||
def on_asset_changed(self):
|
||||
def _on_asset_changed(self):
|
||||
"""Callback on asset selection changed
|
||||
|
||||
This updates the task view.
|
||||
|
|
@ -237,16 +235,17 @@ class AssetsPanel(QtWidgets.QWidget):
|
|||
asset_id = None
|
||||
if asset_doc:
|
||||
asset_id = asset_doc["_id"]
|
||||
self.tasks_widget.set_asset(asset_id)
|
||||
self._tasks_widget.set_asset_id(asset_id)
|
||||
|
||||
def on_task_change(self):
|
||||
task_name = self.tasks_widget.get_current_task()
|
||||
def _on_task_change(self):
|
||||
task_name = self._tasks_widget.get_selected_task_name()
|
||||
self.dbcon.Session["AVALON_TASK"] = task_name
|
||||
self.session_changed.emit()
|
||||
|
||||
|
||||
class LauncherWindow(QtWidgets.QDialog):
|
||||
"""Launcher interface"""
|
||||
message_timeout = 5000
|
||||
|
||||
def __init__(self, parent=None):
|
||||
super(LauncherWindow, self).__init__(parent)
|
||||
|
|
@ -283,20 +282,17 @@ class LauncherWindow(QtWidgets.QDialog):
|
|||
actions_bar = ActionBar(project_handler, self.dbcon, self)
|
||||
|
||||
# statusbar
|
||||
statusbar = QtWidgets.QWidget()
|
||||
layout = QtWidgets.QHBoxLayout(statusbar)
|
||||
message_label = QtWidgets.QLabel(self)
|
||||
|
||||
message_label = QtWidgets.QLabel()
|
||||
message_label.setFixedHeight(15)
|
||||
|
||||
action_history = ActionHistory()
|
||||
action_history = ActionHistory(self)
|
||||
action_history.setStatusTip("Show Action History")
|
||||
|
||||
layout.addWidget(message_label)
|
||||
layout.addWidget(action_history)
|
||||
status_layout = QtWidgets.QHBoxLayout()
|
||||
status_layout.addWidget(message_label, 1)
|
||||
status_layout.addWidget(action_history, 0)
|
||||
|
||||
# Vertically split Pages and Actions
|
||||
body = QtWidgets.QSplitter()
|
||||
body = QtWidgets.QSplitter(self)
|
||||
body.setContentsMargins(0, 0, 0, 0)
|
||||
body.setSizePolicy(
|
||||
QtWidgets.QSizePolicy.Expanding,
|
||||
|
|
@ -314,19 +310,13 @@ class LauncherWindow(QtWidgets.QDialog):
|
|||
|
||||
layout = QtWidgets.QVBoxLayout(self)
|
||||
layout.addWidget(body)
|
||||
layout.addWidget(statusbar)
|
||||
layout.setSpacing(0)
|
||||
layout.setContentsMargins(0, 0, 0, 0)
|
||||
layout.addLayout(status_layout)
|
||||
|
||||
self.project_handler = project_handler
|
||||
message_timer = QtCore.QTimer()
|
||||
message_timer.setInterval(self.message_timeout)
|
||||
message_timer.setSingleShot(True)
|
||||
|
||||
self.message_label = message_label
|
||||
self.project_panel = project_panel
|
||||
self.asset_panel = asset_panel
|
||||
self.actions_bar = actions_bar
|
||||
self.action_history = action_history
|
||||
self.page_slider = page_slider
|
||||
self._page = 0
|
||||
message_timer.timeout.connect(self._on_message_timeout)
|
||||
|
||||
# signals
|
||||
actions_bar.action_clicked.connect(self.on_action_clicked)
|
||||
|
|
@ -338,6 +328,19 @@ class LauncherWindow(QtWidgets.QDialog):
|
|||
|
||||
self.resize(520, 740)
|
||||
|
||||
self._page = 0
|
||||
|
||||
self._message_timer = message_timer
|
||||
|
||||
self.project_handler = project_handler
|
||||
|
||||
self._message_label = message_label
|
||||
self.project_panel = project_panel
|
||||
self.asset_panel = asset_panel
|
||||
self.actions_bar = actions_bar
|
||||
self.action_history = action_history
|
||||
self.page_slider = page_slider
|
||||
|
||||
def showEvent(self, event):
|
||||
self.project_handler.set_active(True)
|
||||
self.project_handler.start_timer(True)
|
||||
|
|
@ -363,9 +366,12 @@ class LauncherWindow(QtWidgets.QDialog):
|
|||
self._page = page
|
||||
self.page_slider.slide_view(page, direction=direction)
|
||||
|
||||
def _on_message_timeout(self):
|
||||
self._message_label.setText("")
|
||||
|
||||
def echo(self, message):
|
||||
self.message_label.setText(str(message))
|
||||
QtCore.QTimer.singleShot(5000, lambda: self.message_label.setText(""))
|
||||
self._message_label.setText(str(message))
|
||||
self._message_timer.start()
|
||||
self.log.debug(message)
|
||||
|
||||
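The one-shot lambda timer used before scheduled an independent 5 second clear per echo() call, so an earlier timer could wipe a newer message too early; the persistent single-shot timer simply restarts instead. A minimal standalone sketch of the same pattern (Qt.py import style as used elsewhere in the repository):

from Qt import QtWidgets, QtCore


class StatusLabel(QtWidgets.QLabel):
    message_timeout = 5000

    def __init__(self, parent=None):
        super(StatusLabel, self).__init__(parent)

        timer = QtCore.QTimer(self)
        timer.setInterval(self.message_timeout)
        timer.setSingleShot(True)
        timer.timeout.connect(self._on_timeout)

        self._timer = timer

    def echo(self, message):
        self.setText(str(message))
        # restarting the timer pushes the clear back by another interval
        self._timer.start()

    def _on_timeout(self):
        self.setText("")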
def on_session_changed(self):
|
||||
|
|
@ -448,5 +454,4 @@ class LauncherWindow(QtWidgets.QDialog):
|
|||
|
||||
if task_name:
|
||||
# requires a forced refresh first
|
||||
self.asset_panel.on_asset_changed()
|
||||
self.asset_panel.tasks_widget.select_task(task_name)
|
||||
self.asset_panel.select_task_name(task_name)
|
||||
|
|
|
|||
|
|
@ -407,7 +407,8 @@ class LibraryLoaderWindow(QtWidgets.QDialog):
|
|||
self.data["state"]["assetIds"] = asset_ids
|
||||
|
||||
# reset repre list
|
||||
self._repres_widget.set_version_ids([])
|
||||
if self._repres_widget:
|
||||
self._repres_widget.set_version_ids([])
|
||||
|
||||
def _subsetschanged(self):
|
||||
asset_ids = self.data["state"]["assetIds"]
|
||||
|
|
@ -497,7 +498,8 @@ class LibraryLoaderWindow(QtWidgets.QDialog):
|
|||
self._thumbnail_widget.set_thumbnail(thumbnail_docs)
|
||||
|
||||
version_ids = [doc["_id"] for doc in version_docs or []]
|
||||
self._repres_widget.set_version_ids(version_ids)
|
||||
if self._repres_widget:
|
||||
self._repres_widget.set_version_ids(version_ids)
|
||||
|
||||
def _set_context(self, context, refresh=True):
|
||||
"""Set the selection in the interface using a context.
|
||||
|
|
|
|||
|
|
@ -15,6 +15,12 @@ from openpype.tools.utils.models import TreeModel, Item
|
|||
from openpype.tools.utils import lib
|
||||
|
||||
from openpype.modules import ModulesManager
|
||||
from openpype.tools.utils.constants import (
|
||||
LOCAL_PROVIDER_ROLE,
|
||||
REMOTE_PROVIDER_ROLE,
|
||||
LOCAL_AVAILABILITY_ROLE,
|
||||
REMOTE_AVAILABILITY_ROLE
|
||||
)
|
||||
|
||||
|
||||
def is_filtering_recursible():
|
||||
|
|
@ -237,9 +243,9 @@ class SubsetsModel(TreeModel, BaseRepresentationModel):
|
|||
|
||||
# update availability on active site when version changes
|
||||
if self.sync_server.enabled and version:
|
||||
site = self.active_site
|
||||
query = self._repre_per_version_pipeline([version["_id"]],
|
||||
site)
|
||||
self.active_site,
|
||||
self.remote_site)
|
||||
docs = list(self.dbcon.aggregate(query))
|
||||
if docs:
|
||||
repre = docs.pop()
|
||||
|
|
@ -333,7 +339,6 @@ class SubsetsModel(TreeModel, BaseRepresentationModel):
|
|||
repre_info = version_data.get("repre_info")
|
||||
if repre_info:
|
||||
item["repre_info"] = repre_info
|
||||
item["repre_icon"] = version_data.get("repre_icon")
|
||||
|
||||
def _fetch(self):
|
||||
asset_docs = self.dbcon.find(
|
||||
|
|
@ -445,14 +450,16 @@ class SubsetsModel(TreeModel, BaseRepresentationModel):
|
|||
for _subset_id, doc in last_versions_by_subset_id.items():
|
||||
version_ids.add(doc["_id"])
|
||||
|
||||
site = self.active_site
|
||||
query = self._repre_per_version_pipeline(list(version_ids), site)
|
||||
query = self._repre_per_version_pipeline(list(version_ids),
|
||||
self.active_site,
|
||||
self.remote_site)
|
||||
|
||||
repre_info = {}
|
||||
for doc in self.dbcon.aggregate(query):
|
||||
if self._doc_fetching_stop:
|
||||
return
|
||||
doc["provider"] = self.active_provider
|
||||
doc["active_provider"] = self.active_provider
|
||||
doc["remote_provider"] = self.remote_provider
|
||||
repre_info[doc["_id"]] = doc
|
||||
|
||||
self._doc_payload["repre_info_by_version_id"] = repre_info
|
||||
|
|
@ -666,8 +673,8 @@ class SubsetsModel(TreeModel, BaseRepresentationModel):
|
|||
if not index.isValid():
|
||||
return
|
||||
|
||||
item = index.internalPointer()
|
||||
if role == self.SortDescendingRole:
|
||||
item = index.internalPointer()
|
||||
if item.get("isGroup"):
|
||||
# Ensure groups be on top when sorting by descending order
|
||||
prefix = "2"
|
||||
|
|
@ -683,7 +690,6 @@ class SubsetsModel(TreeModel, BaseRepresentationModel):
|
|||
return prefix + order
|
||||
|
||||
if role == self.SortAscendingRole:
|
||||
item = index.internalPointer()
|
||||
if item.get("isGroup"):
|
||||
# Ensure groups be on top when sorting by ascending order
|
||||
prefix = "0"
|
||||
|
|
@ -701,14 +707,12 @@ class SubsetsModel(TreeModel, BaseRepresentationModel):
|
|||
if role == QtCore.Qt.DisplayRole:
|
||||
if index.column() == self.columns_index["family"]:
|
||||
# Show familyLabel instead of family
|
||||
item = index.internalPointer()
|
||||
return item.get("familyLabel", None)
|
||||
|
||||
elif role == QtCore.Qt.DecorationRole:
|
||||
|
||||
# Add icon to subset column
|
||||
if index.column() == self.columns_index["subset"]:
|
||||
item = index.internalPointer()
|
||||
if item.get("isGroup") or item.get("isMerged"):
|
||||
return item["icon"]
|
||||
else:
|
||||
|
|
@ -716,20 +720,32 @@ class SubsetsModel(TreeModel, BaseRepresentationModel):
|
|||
|
||||
# Add icon to family column
|
||||
if index.column() == self.columns_index["family"]:
|
||||
item = index.internalPointer()
|
||||
return item.get("familyIcon", None)
|
||||
|
||||
if index.column() == self.columns_index.get("repre_info"):
|
||||
item = index.internalPointer()
|
||||
return item.get("repre_icon", None)
|
||||
|
||||
elif role == QtCore.Qt.ForegroundRole:
|
||||
item = index.internalPointer()
|
||||
version_doc = item.get("version_document")
|
||||
if version_doc and version_doc.get("type") == "hero_version":
|
||||
if not version_doc["is_from_latest"]:
|
||||
return self.not_last_hero_brush
|
||||
|
||||
elif role == LOCAL_AVAILABILITY_ROLE:
|
||||
if not item.get("isGroup"):
|
||||
return item.get("repre_info_local")
|
||||
else:
|
||||
return None
|
||||
|
||||
elif role == REMOTE_AVAILABILITY_ROLE:
|
||||
if not item.get("isGroup"):
|
||||
return item.get("repre_info_remote")
|
||||
else:
|
||||
return None
|
||||
|
||||
elif role == LOCAL_PROVIDER_ROLE:
|
||||
return self.active_provider
|
||||
|
||||
elif role == REMOTE_PROVIDER_ROLE:
|
||||
return self.remote_provider
|
||||
|
||||
return super(SubsetsModel, self).data(index, role)
|
||||
|
||||
def flags(self, index):
|
||||
|
|
@ -759,19 +775,25 @@ class SubsetsModel(TreeModel, BaseRepresentationModel):
|
|||
return data
|
||||
|
||||
def _get_repre_dict(self, repre_info):
|
||||
"""Returns icon and str representation of availability"""
|
||||
"""Returns str representation of availability"""
|
||||
data = {}
|
||||
if repre_info:
|
||||
repres_str = "{}/{}".format(
|
||||
int(math.floor(float(repre_info['avail_repre']))),
|
||||
int(math.floor(float(repre_info['avail_repre_local']))),
|
||||
int(math.floor(float(repre_info['repre_count']))))
|
||||
|
||||
data["repre_info"] = repres_str
|
||||
data["repre_icon"] = self.repre_icons.get(self.active_provider)
|
||||
data["repre_info_local"] = repres_str
|
||||
|
||||
repres_str = "{}/{}".format(
|
||||
int(math.floor(float(repre_info['avail_repre_remote']))),
|
||||
int(math.floor(float(repre_info['repre_count']))))
|
||||
|
||||
data["repre_info_remote"] = repres_str
|
||||
|
||||
return data
|
||||
|
||||
def _repre_per_version_pipeline(self, version_ids, site):
|
||||
def _repre_per_version_pipeline(self, version_ids,
|
||||
active_site, remote_site):
|
||||
query = [
|
||||
{"$match": {"parent": {"$in": version_ids},
|
||||
"type": "representation",
|
||||
|
|
@ -779,35 +801,70 @@ class SubsetsModel(TreeModel, BaseRepresentationModel):
|
|||
{"$unwind": "$files"},
|
||||
{'$addFields': {
|
||||
'order_local': {
|
||||
'$filter': {'input': '$files.sites', 'as': 'p',
|
||||
'cond': {'$eq': ['$$p.name', site]}
|
||||
}}
|
||||
'$filter': {
|
||||
'input': '$files.sites', 'as': 'p',
|
||||
'cond': {'$eq': ['$$p.name', active_site]}
|
||||
}
|
||||
}
|
||||
}},
|
||||
{'$addFields': {
|
||||
'order_remote': {
|
||||
'$filter': {
|
||||
'input': '$files.sites', 'as': 'p',
|
||||
'cond': {'$eq': ['$$p.name', remote_site]}
|
||||
}
|
||||
}
|
||||
}},
|
||||
{'$addFields': {
|
||||
'progress_local': {"$arrayElemAt": [{
|
||||
'$cond': [{'$size': "$order_local.progress"},
|
||||
"$order_local.progress",
|
||||
# if exists created_dt count is as available
|
||||
{'$cond': [
|
||||
{'$size': "$order_local.created_dt"},
|
||||
[1],
|
||||
[0]
|
||||
]}
|
||||
]}, 0]}
|
||||
'$cond': [
|
||||
{'$size': "$order_local.progress"},
|
||||
"$order_local.progress",
|
||||
# if exists created_dt count is as available
|
||||
{'$cond': [
|
||||
{'$size': "$order_local.created_dt"},
|
||||
[1],
|
||||
[0]
|
||||
]}
|
||||
]},
|
||||
0
|
||||
]}
|
||||
}},
|
||||
{'$addFields': {
|
||||
'progress_remote': {"$arrayElemAt": [{
|
||||
'$cond': [
|
||||
{'$size': "$order_remote.progress"},
|
||||
"$order_remote.progress",
|
||||
# if exists created_dt count is as available
|
||||
{'$cond': [
|
||||
{'$size': "$order_remote.created_dt"},
|
||||
[1],
|
||||
[0]
|
||||
]}
|
||||
]},
|
||||
0
|
||||
]}
|
||||
}},
|
||||
{'$group': { # first group by repre
|
||||
'_id': '$_id',
|
||||
'parent': {'$first': '$parent'},
|
||||
'files_count': {'$sum': 1},
|
||||
'files_avail': {'$sum': "$progress_local"},
|
||||
'avail_ratio': {'$first': {
|
||||
'$divide': [{'$sum': "$progress_local"}, {'$sum': 1}]}}
|
||||
'avail_ratio_local': {
|
||||
'$first': {
|
||||
'$divide': [{'$sum': "$progress_local"}, {'$sum': 1}]
|
||||
}
|
||||
},
|
||||
'avail_ratio_remote': {
|
||||
'$first': {
|
||||
'$divide': [{'$sum': "$progress_remote"}, {'$sum': 1}]
|
||||
}
|
||||
}
|
||||
}},
|
||||
{'$group': { # second group by parent, eg version_id
|
||||
'_id': '$parent',
|
||||
'repre_count': {'$sum': 1}, # total representations
|
||||
# fully available representation for site
|
||||
'avail_repre': {'$sum': "$avail_ratio"}
|
||||
'avail_repre_local': {'$sum': "$avail_ratio_local"},
|
||||
'avail_repre_remote': {'$sum': "$avail_ratio_remote"},
|
||||
}},
|
||||
]
|
||||
return query
|
||||
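For reference, a hedged sketch of how the extended pipeline is consumed; "model" stands for a SubsetsModel instance, the numbers in the comment are invented, and the document shape follows the final $group stage above.

def print_availability(model, version_id):
    query = model._repre_per_version_pipeline(
        [version_id], model.active_site, model.remote_site)

    for doc in model.dbcon.aggregate(query):
        # Shape produced by the final $group stage, e.g.:
        # {"_id": version_id, "repre_count": 3,
        #  "avail_repre_local": 2.0, "avail_repre_remote": 1.5}
        print("{}/{} local, {}/{} remote".format(
            int(doc["avail_repre_local"]), int(doc["repre_count"]),
            int(doc["avail_repre_remote"]), int(doc["repre_count"])))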
|
|
|
|||
|
|
@ -31,6 +31,13 @@ from .model import (
|
|||
)
|
||||
from . import lib
|
||||
|
||||
from openpype.tools.utils.constants import (
|
||||
LOCAL_PROVIDER_ROLE,
|
||||
REMOTE_PROVIDER_ROLE,
|
||||
LOCAL_AVAILABILITY_ROLE,
|
||||
REMOTE_AVAILABILITY_ROLE
|
||||
)
|
||||
|
||||
|
||||
class OverlayFrame(QtWidgets.QFrame):
|
||||
def __init__(self, label, parent):
|
||||
|
|
@ -197,6 +204,10 @@ class SubsetWidget(QtWidgets.QWidget):
|
|||
column = model.Columns.index("time")
|
||||
view.setItemDelegateForColumn(column, time_delegate)
|
||||
|
||||
avail_delegate = AvailabilityDelegate(self.dbcon, view)
|
||||
column = model.Columns.index("repre_info")
|
||||
view.setItemDelegateForColumn(column, avail_delegate)
|
||||
|
||||
layout = QtWidgets.QVBoxLayout(self)
|
||||
layout.setContentsMargins(0, 0, 0, 0)
|
||||
layout.addLayout(top_bar_layout)
|
||||
|
|
@ -1578,3 +1589,54 @@ def _load_subsets_by_loader(loader, subset_contexts, options,
|
|||
))
|
||||
|
||||
return error_info
|
||||
|
||||
|
||||
class AvailabilityDelegate(QtWidgets.QStyledItemDelegate):
|
||||
"""
|
||||
Paints icons and the downloaded representation ratio for both sites.
|
||||
"""
|
||||
|
||||
def __init__(self, dbcon, parent=None):
|
||||
super(AvailabilityDelegate, self).__init__(parent)
|
||||
self.icons = tools_lib.get_repre_icons()
|
||||
|
||||
def paint(self, painter, option, index):
|
||||
super(AvailabilityDelegate, self).paint(painter, option, index)
|
||||
option = QtWidgets.QStyleOptionViewItem(option)
|
||||
option.showDecorationSelected = True
|
||||
|
||||
provider_active = index.data(LOCAL_PROVIDER_ROLE)
|
||||
provider_remote = index.data(REMOTE_PROVIDER_ROLE)
|
||||
|
||||
availability_active = index.data(LOCAL_AVAILABILITY_ROLE)
|
||||
availability_remote = index.data(REMOTE_AVAILABILITY_ROLE)
|
||||
|
||||
if not availability_active or not availability_remote: # group lines
|
||||
return
|
||||
|
||||
idx = 0
|
||||
height = width = 24
|
||||
for value, provider in [(availability_active, provider_active),
|
||||
(availability_remote, provider_remote)]:
|
||||
icon = self.icons.get(provider)
|
||||
if not icon:
|
||||
continue
|
||||
|
||||
pixmap = icon.pixmap(icon.actualSize(QtCore.QSize(height, width)))
|
||||
padding = 10 + (70 * idx)
|
||||
point = QtCore.QPoint(option.rect.x() + padding,
|
||||
option.rect.y() +
|
||||
(option.rect.height() - pixmap.height()) / 2)
|
||||
painter.drawPixmap(point, pixmap)
|
||||
|
||||
text_rect = option.rect.translated(padding + width + 10, 0)
|
||||
painter.drawText(
|
||||
text_rect,
|
||||
option.displayAlignment,
|
||||
value
|
||||
)
|
||||
|
||||
idx += 1
|
||||
|
||||
def displayText(self, value, locale):
|
||||
pass
|
||||
|
|
|
|||
|
|
@ -190,7 +190,9 @@ class Controller(QtCore.QObject):
|
|||
|
||||
plugins = pyblish.api.discover()
|
||||
|
||||
targets = pyblish.logic.registered_targets() or ["default"]
|
||||
targets = set(pyblish.logic.registered_targets())
|
||||
targets.add("default")
|
||||
targets = list(targets)
|
||||
plugins_by_targets = pyblish.logic.plugins_by_targets(plugins, targets)
|
||||
|
||||
_plugins = []
|
||||
|
|
|
|||
9
openpype/tools/sceneinventory/__init__.py
Normal file
|
|
@ -0,0 +1,9 @@
|
|||
from .window import (
|
||||
show,
|
||||
SceneInventoryWindow
|
||||
)
|
||||
|
||||
__all__ = (
|
||||
"show",
|
||||
"SceneInventoryWindow"
|
||||
)
|
||||
82
openpype/tools/sceneinventory/lib.py
Normal file
|
|
@ -0,0 +1,82 @@
|
|||
import os
from openpype_modules import sync_server

from Qt import QtGui


def walk_hierarchy(node):
    """Recursively yield group node."""
    for child in node.children():
        if child.get("isGroupNode"):
            yield child

        for _child in walk_hierarchy(child):
            yield _child


def get_site_icons():
    resource_path = os.path.join(
        os.path.dirname(sync_server.sync_server_module.__file__),
        "providers",
        "resources"
    )
    icons = {}
    # TODO get from sync module
    for provider in ["studio", "local_drive", "gdrive"]:
        pix_url = "{}/{}.png".format(resource_path, provider)
        icons[provider] = QtGui.QIcon(pix_url)

    return icons


def get_progress_for_repre(repre_doc, active_site, remote_site):
    """Calculate average progress for a representation.

    If a site has "created_dt" the file is fully available there and counts
    as progress == 1.

    Could be moved into an aggregation if this turns out to be too slow.

    Args:
        repre_doc (dict): representation document
        active_site (str): name of the active (local) site
        remote_site (str): name of the remote site
    Returns:
        (dict) with active and remote sites progress
        {'studio': 1.0, 'gdrive': -1} - gdrive site is not present,
            -1 is used to highlight that the site should be added
        {'studio': 1.0, 'gdrive': 0.0} - gdrive site is present, not
            uploaded yet
    """
    progress = {active_site: -1, remote_site: -1}
    if not repre_doc:
        return progress

    files = {active_site: 0, remote_site: 0}
    doc_files = repre_doc.get("files") or []
    for doc_file in doc_files:
        if not isinstance(doc_file, dict):
            continue

        sites = doc_file.get("sites") or []
        for site in sites:
            if (
                # Pype 2 compatibility
                not isinstance(site, dict)
                # Check if site name is one of progress sites
                or site["name"] not in progress
            ):
                continue

            files[site["name"]] += 1
            norm_progress = max(progress[site["name"]], 0)
            if site.get("created_dt"):
                progress[site["name"]] = norm_progress + 1
            elif site.get("progress"):
                progress[site["name"]] = norm_progress + site["progress"]
            else:  # site exists, might be failed, do not add again
                progress[site["name"]] = 0

    # for example 13 fully avail. files out of 26 >> 13/26 = 0.5
    avg_progress = {
        active_site: progress[active_site] / max(files[active_site], 1),
        remote_site: progress[remote_site] / max(files[remote_site], 1)
    }
    return avg_progress
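A small worked example of the calculation above, calling the function just defined (illustration only, not part of the new file): a two-file representation where every file is fully available on the studio site and half uploaded to a hypothetical gdrive remote.

```python
# Hypothetical representation document with two files.
repre_doc = {
    "files": [
        {"sites": [{"name": "studio", "created_dt": "2021-11-15"},
                   {"name": "gdrive", "progress": 0.5}]},
        {"sites": [{"name": "studio", "created_dt": "2021-11-15"},
                   {"name": "gdrive", "progress": 0.5}]},
    ]
}

# studio: (1 + 1) / 2 files -> 1.0, gdrive: (0.5 + 0.5) / 2 files -> 0.5
print(get_progress_for_repre(repre_doc, "studio", "gdrive"))
# {'studio': 1.0, 'gdrive': 0.5}
```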
576
openpype/tools/sceneinventory/model.py
Normal file
@ -0,0 +1,576 @@
import re
|
||||
import logging
|
||||
|
||||
from collections import defaultdict
|
||||
|
||||
from Qt import QtCore, QtGui
|
||||
from avalon import api, io, style, schema
|
||||
from avalon.vendor import qtawesome
|
||||
|
||||
from avalon.lib import HeroVersionType
|
||||
from avalon.tools.models import TreeModel, Item
|
||||
|
||||
from .lib import (
|
||||
get_site_icons,
|
||||
walk_hierarchy,
|
||||
get_progress_for_repre
|
||||
)
|
||||
|
||||
from openpype.modules import ModulesManager
|
||||
|
||||
|
||||
class InventoryModel(TreeModel):
|
||||
"""The model for the inventory"""
|
||||
|
||||
Columns = ["Name", "version", "count", "family", "loader", "objectName"]
|
||||
|
||||
OUTDATED_COLOR = QtGui.QColor(235, 30, 30)
|
||||
CHILD_OUTDATED_COLOR = QtGui.QColor(200, 160, 30)
|
||||
GRAYOUT_COLOR = QtGui.QColor(160, 160, 160)
|
||||
|
||||
UniqueRole = QtCore.Qt.UserRole + 2 # unique label role
|
||||
|
||||
def __init__(self, family_config_cache, parent=None):
|
||||
super(InventoryModel, self).__init__(parent)
|
||||
self.log = logging.getLogger(self.__class__.__name__)
|
||||
|
||||
self.family_config_cache = family_config_cache
|
||||
|
||||
self._hierarchy_view = False
|
||||
|
||||
manager = ModulesManager()
|
||||
sync_server = manager.modules_by_name["sync_server"]
|
||||
self.sync_enabled = sync_server.enabled
|
||||
self._site_icons = {}
|
||||
self.active_site = self.remote_site = None
|
||||
self.active_provider = self.remote_provider = None
|
||||
|
||||
if not self.sync_enabled:
|
||||
return
|
||||
|
||||
project_name = io.Session["AVALON_PROJECT"]
|
||||
active_site = sync_server.get_active_site(project_name)
|
||||
remote_site = sync_server.get_remote_site(project_name)
|
||||
|
||||
active_provider = "studio"
|
||||
remote_provider = "studio"
|
||||
if active_site != "studio":
|
||||
# sanitized for icon
|
||||
active_provider = sync_server.get_provider_for_site(
|
||||
project_name, active_site
|
||||
)
|
||||
|
||||
if remote_site != "studio":
|
||||
remote_provider = sync_server.get_provider_for_site(
|
||||
project_name, remote_site
|
||||
)
|
||||
|
||||
# self.sync_server = sync_server
|
||||
self.active_site = active_site
|
||||
self.active_provider = active_provider
|
||||
self.remote_site = remote_site
|
||||
self.remote_provider = remote_provider
|
||||
self._site_icons = get_site_icons()
|
||||
if "active_site" not in self.Columns:
|
||||
self.Columns.append("active_site")
|
||||
if "remote_site" not in self.Columns:
|
||||
self.Columns.append("remote_site")
|
||||
|
||||
def outdated(self, item):
|
||||
value = item.get("version")
|
||||
if isinstance(value, HeroVersionType):
|
||||
return False
|
||||
|
||||
if item.get("version") == item.get("highest_version"):
|
||||
return False
|
||||
return True
|
||||
|
||||
def data(self, index, role):
|
||||
if not index.isValid():
|
||||
return
|
||||
|
||||
item = index.internalPointer()
|
||||
|
||||
if role == QtCore.Qt.FontRole:
|
||||
# Make top-level entries bold
|
||||
if item.get("isGroupNode") or item.get("isNotSet"): # group-item
|
||||
font = QtGui.QFont()
|
||||
font.setBold(True)
|
||||
return font
|
||||
|
||||
if role == QtCore.Qt.ForegroundRole:
|
||||
# Set the text color to the OUTDATED_COLOR when the
|
||||
# collected version is not the same as the highest version
|
||||
key = self.Columns[index.column()]
|
||||
if key == "version": # version
|
||||
if item.get("isGroupNode"): # group-item
|
||||
if self.outdated(item):
|
||||
return self.OUTDATED_COLOR
|
||||
|
||||
if self._hierarchy_view:
|
||||
# If current group is not outdated, check if any
|
||||
# outdated children.
|
||||
for _node in walk_hierarchy(item):
|
||||
if self.outdated(_node):
|
||||
return self.CHILD_OUTDATED_COLOR
|
||||
else:
|
||||
|
||||
if self._hierarchy_view:
|
||||
# Although this is not a group item, we still need
|
||||
# to distinguish which one contain outdated child.
|
||||
for _node in walk_hierarchy(item):
|
||||
if self.outdated(_node):
|
||||
return self.CHILD_OUTDATED_COLOR.darker(150)
|
||||
|
||||
return self.GRAYOUT_COLOR
|
||||
|
||||
if key == "Name" and not item.get("isGroupNode"):
|
||||
return self.GRAYOUT_COLOR
|
||||
|
||||
# Add icons
|
||||
if role == QtCore.Qt.DecorationRole:
|
||||
if index.column() == 0:
|
||||
# Override color
|
||||
color = item.get("color", style.colors.default)
|
||||
if item.get("isGroupNode"): # group-item
|
||||
return qtawesome.icon("fa.folder", color=color)
|
||||
if item.get("isNotSet"):
|
||||
return qtawesome.icon("fa.exclamation-circle", color=color)
|
||||
|
||||
return qtawesome.icon("fa.file-o", color=color)
|
||||
|
||||
if index.column() == 3:
|
||||
# Family icon
|
||||
return item.get("familyIcon", None)
|
||||
|
||||
if item.get("isGroupNode"):
|
||||
column_name = self.Columns[index.column()]
|
||||
if column_name == "active_site":
|
||||
provider = item.get("active_site_provider")
|
||||
return self._site_icons.get(provider)
|
||||
|
||||
if column_name == "remote_site":
|
||||
provider = item.get("remote_site_provider")
|
||||
return self._site_icons.get(provider)
|
||||
|
||||
if role == QtCore.Qt.DisplayRole and item.get("isGroupNode"):
|
||||
column_name = self.Columns[index.column()]
|
||||
progress = None
|
||||
if column_name == 'active_site':
|
||||
progress = item.get("active_site_progress", 0)
|
||||
elif column_name == 'remote_site':
|
||||
progress = item.get("remote_site_progress", 0)
|
||||
if progress is not None:
|
||||
return "{}%".format(max(progress, 0) * 100)
|
||||
|
||||
if role == self.UniqueRole:
|
||||
return item["representation"] + item.get("objectName", "<none>")
|
||||
|
||||
return super(InventoryModel, self).data(index, role)
|
||||
|
||||
def set_hierarchy_view(self, state):
|
||||
"""Set whether to display subsets in hierarchy view."""
|
||||
state = bool(state)
|
||||
|
||||
if state != self._hierarchy_view:
|
||||
self._hierarchy_view = state
|
||||
|
||||
def refresh(self, selected=None, items=None):
|
||||
"""Refresh the model"""
|
||||
|
||||
host = api.registered_host()
|
||||
if not items: # for debugging or testing, injecting items from outside
|
||||
items = host.ls()
|
||||
|
||||
self.clear()
|
||||
|
||||
if self._hierarchy_view and selected:
|
||||
|
||||
if not hasattr(host.pipeline, "update_hierarchy"):
|
||||
# If host doesn't support hierarchical containers, then
|
||||
# cherry-pick only.
|
||||
self.add_items((item for item in items
|
||||
if item["objectName"] in selected))
|
||||
|
||||
# Update hierarchy info for all containers
|
||||
items_by_name = {item["objectName"]: item
|
||||
for item in host.pipeline.update_hierarchy(items)}
|
||||
|
||||
selected_items = set()
|
||||
|
||||
def walk_children(names):
|
||||
"""Select containers and extend to chlid containers"""
|
||||
for name in [n for n in names if n not in selected_items]:
|
||||
selected_items.add(name)
|
||||
item = items_by_name[name]
|
||||
yield item
|
||||
|
||||
for child in walk_children(item["children"]):
|
||||
yield child
|
||||
|
||||
items = list(walk_children(selected)) # Cherry-picked and extended
|
||||
|
||||
# Cut unselected upstream containers
|
||||
for item in items:
|
||||
if not item.get("parent") in selected_items:
|
||||
# Parent not in selection, this is root item.
|
||||
item["parent"] = None
|
||||
|
||||
parents = [self._root_item]
|
||||
|
||||
# The length of `items` array is the maximum depth that a
|
||||
# hierarchy could be.
|
||||
# Take this as the easiest way to prevent looping forever.
|
||||
maximum_loop = len(items)
|
||||
count = 0
|
||||
while items:
|
||||
if count > maximum_loop:
|
||||
self.log.warning("Maximum loop count reached, possible "
|
||||
"missing parent node.")
|
||||
break
|
||||
|
||||
_parents = list()
|
||||
for parent in parents:
|
||||
_unparented = list()
|
||||
|
||||
def _children():
|
||||
"""Child item provider"""
|
||||
for item in items:
|
||||
if item.get("parent") == parent.get("objectName"):
|
||||
# (NOTE)
|
||||
# Since `self._root_node` has no "objectName"
|
||||
# entry, it will be paired with root item if
|
||||
# the value of key "parent" is None, or not
|
||||
# having the key.
|
||||
yield item
|
||||
else:
|
||||
# Not current parent's child, try next
|
||||
_unparented.append(item)
|
||||
|
||||
self.add_items(_children(), parent)
|
||||
|
||||
items[:] = _unparented
|
||||
|
||||
# Parents of next level
|
||||
for group_node in parent.children():
|
||||
_parents += group_node.children()
|
||||
|
||||
parents[:] = _parents
|
||||
count += 1
|
||||
|
||||
else:
|
||||
self.add_items(items)
|
||||
|
||||
def add_items(self, items, parent=None):
|
||||
"""Add the items to the model.
|
||||
|
||||
The items should be formatted similar to `api.ls()` returns, an item
|
||||
is then represented as:
|
||||
{"filename_v001.ma": [full/filename/of/loaded/filename_v001.ma,
|
||||
full/filename/of/loaded/filename_v001.ma],
|
||||
"nodetype" : "reference",
|
||||
"node": "referenceNode1"}
|
||||
|
||||
Note: When performing an additional call to `add_items` it will *not*
|
||||
group the new items with previously existing item groups of the
|
||||
same type.
|
||||
|
||||
Args:
|
||||
items (generator): the items to be processed as returned by `ls()`
|
||||
parent (Item, optional): Set this item as parent for the added
|
||||
items when provided. Defaults to the root of the model.
|
||||
|
||||
Returns:
|
||||
node.Item: root node which has children added based on the data
|
||||
"""
|
||||
|
||||
self.beginResetModel()
|
||||
|
||||
# Group by representation
|
||||
grouped = defaultdict(lambda: {"items": list()})
|
||||
for item in items:
|
||||
grouped[item["representation"]]["items"].append(item)
|
||||
|
||||
# Add to model
|
||||
not_found = defaultdict(list)
|
||||
not_found_ids = []
|
||||
for repre_id, group_dict in sorted(grouped.items()):
|
||||
group_items = group_dict["items"]
|
||||
# Get parenthood per group
|
||||
representation = io.find_one({"_id": io.ObjectId(repre_id)})
|
||||
if not representation:
|
||||
not_found["representation"].append(group_items)
|
||||
not_found_ids.append(repre_id)
|
||||
continue
|
||||
|
||||
version = io.find_one({"_id": representation["parent"]})
|
||||
if not version:
|
||||
not_found["version"].append(group_items)
|
||||
not_found_ids.append(repre_id)
|
||||
continue
|
||||
|
||||
elif version["type"] == "hero_version":
|
||||
_version = io.find_one({
|
||||
"_id": version["version_id"]
|
||||
})
|
||||
version["name"] = HeroVersionType(_version["name"])
|
||||
version["data"] = _version["data"]
|
||||
|
||||
subset = io.find_one({"_id": version["parent"]})
|
||||
if not subset:
|
||||
not_found["subset"].append(group_items)
|
||||
not_found_ids.append(repre_id)
|
||||
continue
|
||||
|
||||
asset = io.find_one({"_id": subset["parent"]})
|
||||
if not asset:
|
||||
not_found["asset"].append(group_items)
|
||||
not_found_ids.append(repre_id)
|
||||
continue
|
||||
|
||||
grouped[repre_id].update({
|
||||
"representation": representation,
|
||||
"version": version,
|
||||
"subset": subset,
|
||||
"asset": asset
|
||||
})
|
||||
|
||||
for id in not_found_ids:
|
||||
grouped.pop(id)
|
||||
|
||||
for where, group_items in not_found.items():
|
||||
# create the group header
|
||||
group_node = Item()
|
||||
name = "< NOT FOUND - {} >".format(where)
|
||||
group_node["Name"] = name
|
||||
group_node["representation"] = name
|
||||
group_node["count"] = len(group_items)
|
||||
group_node["isGroupNode"] = False
|
||||
group_node["isNotSet"] = True
|
||||
|
||||
self.add_child(group_node, parent=parent)
|
||||
|
||||
for _group_items in group_items:
|
||||
item_node = Item()
|
||||
item_node["Name"] = ", ".join(
|
||||
[item["objectName"] for item in _group_items]
|
||||
)
|
||||
self.add_child(item_node, parent=group_node)
|
||||
|
||||
for repre_id, group_dict in sorted(grouped.items()):
|
||||
group_items = group_dict["items"]
|
||||
representation = grouped[repre_id]["representation"]
|
||||
version = grouped[repre_id]["version"]
|
||||
subset = grouped[repre_id]["subset"]
|
||||
asset = grouped[repre_id]["asset"]
|
||||
|
||||
# Get the primary family
|
||||
no_family = ""
|
||||
maj_version, _ = schema.get_schema_version(subset["schema"])
|
||||
if maj_version < 3:
|
||||
prim_family = version["data"].get("family")
|
||||
if not prim_family:
|
||||
families = version["data"].get("families")
|
||||
prim_family = families[0] if families else no_family
|
||||
else:
|
||||
families = subset["data"].get("families") or []
|
||||
prim_family = families[0] if families else no_family
|
||||
|
||||
# Get the label and icon for the family if in configuration
|
||||
family_config = self.family_config_cache.family_config(prim_family)
|
||||
family = family_config.get("label", prim_family)
|
||||
family_icon = family_config.get("icon", None)
|
||||
|
||||
# Store the highest available version so the model can know
|
||||
# whether current version is currently up-to-date.
|
||||
highest_version = io.find_one({
|
||||
"type": "version",
|
||||
"parent": version["parent"]
|
||||
}, sort=[("name", -1)])
|
||||
|
||||
# create the group header
|
||||
group_node = Item()
|
||||
group_node["Name"] = "%s_%s: (%s)" % (asset["name"],
|
||||
subset["name"],
|
||||
representation["name"])
|
||||
group_node["representation"] = repre_id
|
||||
group_node["version"] = version["name"]
|
||||
group_node["highest_version"] = highest_version["name"]
|
||||
group_node["family"] = family
|
||||
group_node["familyIcon"] = family_icon
|
||||
group_node["count"] = len(group_items)
|
||||
group_node["isGroupNode"] = True
|
||||
|
||||
if self.sync_enabled:
|
||||
progress = get_progress_for_repre(
|
||||
representation, self.active_site, self.remote_site
|
||||
)
|
||||
group_node["active_site"] = self.active_site
|
||||
group_node["active_site_provider"] = self.active_provider
|
||||
group_node["remote_site"] = self.remote_site
|
||||
group_node["remote_site_provider"] = self.remote_provider
|
||||
group_node["active_site_progress"] = progress[self.active_site]
|
||||
group_node["remote_site_progress"] = progress[self.remote_site]
|
||||
|
||||
self.add_child(group_node, parent=parent)
|
||||
|
||||
for item in group_items:
|
||||
item_node = Item()
|
||||
item_node.update(item)
|
||||
|
||||
# store the current version on the item
|
||||
item_node["version"] = version["name"]
|
||||
|
||||
# Remapping namespace to item name.
|
||||
# Noted that the name key is capital "N", by doing this, we
|
||||
# can view namespace in GUI without changing container data.
|
||||
item_node["Name"] = item["namespace"]
|
||||
|
||||
self.add_child(item_node, parent=group_node)
|
||||
|
||||
self.endResetModel()
|
||||
|
||||
return self._root_item
|
||||
|
||||
|
||||
class FilterProxyModel(QtCore.QSortFilterProxyModel):
|
||||
"""Filter model to where key column's value is in the filtered tags"""
|
||||
|
||||
def __init__(self, *args, **kwargs):
|
||||
super(FilterProxyModel, self).__init__(*args, **kwargs)
|
||||
self._filter_outdated = False
|
||||
self._hierarchy_view = False
|
||||
|
||||
def filterAcceptsRow(self, row, parent):
|
||||
model = self.sourceModel()
|
||||
source_index = model.index(row, self.filterKeyColumn(), parent)
|
||||
|
||||
# Always allow bottom entries (individual containers), since their
|
||||
# parent group is already hidden when it does not pass validation.
|
||||
rows = model.rowCount(source_index)
|
||||
if not rows:
|
||||
return True
|
||||
|
||||
# Filter by regex
|
||||
if not self.filterRegExp().isEmpty():
|
||||
pattern = re.escape(self.filterRegExp().pattern())
|
||||
|
||||
if not self._matches(row, parent, pattern):
|
||||
return False
|
||||
|
||||
if self._filter_outdated:
|
||||
# When filtering to outdated we filter the up to date entries
|
||||
# thus we "allow" them when they are outdated
|
||||
if not self._is_outdated(row, parent):
|
||||
return False
|
||||
|
||||
return True
|
||||
|
||||
def set_filter_outdated(self, state):
|
||||
"""Set whether to show the outdated entries only."""
|
||||
state = bool(state)
|
||||
|
||||
if state != self._filter_outdated:
|
||||
self._filter_outdated = bool(state)
|
||||
self.invalidateFilter()
|
||||
|
||||
def set_hierarchy_view(self, state):
|
||||
state = bool(state)
|
||||
|
||||
if state != self._hierarchy_view:
|
||||
self._hierarchy_view = state
|
||||
|
||||
def _is_outdated(self, row, parent):
|
||||
"""Return whether row is outdated.
|
||||
|
||||
A row is considered outdated if it has "version" and "highest_version"
|
||||
data and in the internal data structure, and they are not of an
|
||||
equal value.
|
||||
|
||||
"""
|
||||
def outdated(node):
|
||||
version = node.get("version", None)
|
||||
highest = node.get("highest_version", None)
|
||||
|
||||
# Always allow indices that have no version data at all
|
||||
if version is None and highest is None:
|
||||
return True
|
||||
|
||||
# If either a version or highest is present but not the other
|
||||
# consider the item invalid.
|
||||
if not self._hierarchy_view:
|
||||
# Skip this check if in hierarchy view, or the child item
|
||||
# node will be hidden even it's actually outdated.
|
||||
if version is None or highest is None:
|
||||
return False
|
||||
return version != highest
|
||||
|
||||
index = self.sourceModel().index(row, self.filterKeyColumn(), parent)
|
||||
|
||||
# The scene contents are grouped by "representation", e.g. the same
|
||||
# "representation" loaded twice is grouped under the same header.
|
||||
# Since the version check filters these parent groups we skip that
|
||||
# check for the individual children.
|
||||
has_parent = index.parent().isValid()
|
||||
if has_parent and not self._hierarchy_view:
|
||||
return True
|
||||
|
||||
# Filter to those that have the different version numbers
|
||||
node = index.internalPointer()
|
||||
if outdated(node):
|
||||
return True
|
||||
|
||||
if self._hierarchy_view:
|
||||
for _node in walk_hierarchy(node):
|
||||
if outdated(_node):
|
||||
return True
|
||||
|
||||
return False
|
||||
|
||||
def _matches(self, row, parent, pattern):
|
||||
"""Return whether row matches regex pattern.
|
||||
|
||||
Args:
|
||||
row (int): row number in model
|
||||
parent (QtCore.QModelIndex): parent index
|
||||
pattern (regex.pattern): pattern to check for in key
|
||||
|
||||
Returns:
|
||||
bool
|
||||
|
||||
"""
|
||||
model = self.sourceModel()
|
||||
column = self.filterKeyColumn()
|
||||
role = self.filterRole()
|
||||
|
||||
def matches(row, parent, pattern):
|
||||
index = model.index(row, column, parent)
|
||||
key = model.data(index, role)
|
||||
if re.search(pattern, key, re.IGNORECASE):
|
||||
return True
|
||||
|
||||
if matches(row, parent, pattern):
|
||||
return True
|
||||
|
||||
# Also allow if any of the children matches
|
||||
source_index = model.index(row, column, parent)
|
||||
rows = model.rowCount(source_index)
|
||||
|
||||
if any(
|
||||
matches(idx, source_index, pattern)
|
||||
for idx in range(rows)
|
||||
):
|
||||
return True
|
||||
|
||||
if not self._hierarchy_view:
|
||||
return False
|
||||
|
||||
for idx in range(rows):
|
||||
child_index = model.index(idx, column, source_index)
|
||||
child_rows = model.rowCount(child_index)
|
||||
return any(
|
||||
self._matches(child_idx, child_index, pattern)
|
||||
for child_idx in range(child_rows)
|
||||
)
|
||||
|
||||
return True
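For orientation (assumed usage, not taken from the repository): the proxy is meant to sit between `InventoryModel` and the tree view, so the name filter and the outdated filter compose like this. `family_config_cache` stands in for whatever cache the caller already has, and the snippet assumes it runs inside an existing Qt application.

```python
from Qt import QtWidgets

model = InventoryModel(family_config_cache)  # family_config_cache: assumed

proxy = FilterProxyModel()
proxy.setSourceModel(model)
proxy.set_filter_outdated(True)   # keep only groups with a newer version
proxy.setFilterRegExp("model")    # optional name filter on the key column

view = QtWidgets.QTreeView()
view.setModel(proxy)
view.setSortingEnabled(True)
```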
993
openpype/tools/sceneinventory/switch_dialog.py
Normal file
@ -0,0 +1,993 @@
import collections
|
||||
import logging
|
||||
from Qt import QtWidgets, QtCore
|
||||
|
||||
from avalon import io, api
|
||||
from avalon.vendor import qtawesome
|
||||
|
||||
from .widgets import SearchComboBox
|
||||
|
||||
log = logging.getLogger("SwitchAssetDialog")
|
||||
|
||||
|
||||
class ValidationState:
|
||||
def __init__(self):
|
||||
self.asset_ok = True
|
||||
self.subset_ok = True
|
||||
self.repre_ok = True
|
||||
|
||||
@property
|
||||
def all_ok(self):
|
||||
return (
|
||||
self.asset_ok
|
||||
and self.subset_ok
|
||||
and self.repre_ok
|
||||
)
|
||||
|
||||
|
||||
class SwitchAssetDialog(QtWidgets.QDialog):
|
||||
"""Widget to support asset switching"""
|
||||
|
||||
MIN_WIDTH = 550
|
||||
|
||||
switched = QtCore.Signal()
|
||||
|
||||
def __init__(self, parent=None, items=None):
|
||||
super(SwitchAssetDialog, self).__init__(parent)
|
||||
|
||||
self.setWindowTitle("Switch selected items ...")
|
||||
|
||||
# Force and keep focus dialog
|
||||
self.setModal(True)
|
||||
|
||||
assets_combox = SearchComboBox(self)
|
||||
subsets_combox = SearchComboBox(self)
|
||||
repres_combobox = SearchComboBox(self)
|
||||
|
||||
assets_combox.set_placeholder("<asset>")
|
||||
subsets_combox.set_placeholder("<subset>")
|
||||
repres_combobox.set_placeholder("<representation>")
|
||||
|
||||
asset_label = QtWidgets.QLabel(self)
|
||||
subset_label = QtWidgets.QLabel(self)
|
||||
repre_label = QtWidgets.QLabel(self)
|
||||
|
||||
current_asset_btn = QtWidgets.QPushButton("Use current asset")
|
||||
|
||||
accept_icon = qtawesome.icon("fa.check", color="white")
|
||||
accept_btn = QtWidgets.QPushButton(self)
|
||||
accept_btn.setIcon(accept_icon)
|
||||
|
||||
main_layout = QtWidgets.QGridLayout(self)
|
||||
# Asset column
|
||||
main_layout.addWidget(current_asset_btn, 0, 0)
|
||||
main_layout.addWidget(assets_combox, 1, 0)
|
||||
main_layout.addWidget(asset_label, 2, 0)
|
||||
# Subset column
|
||||
main_layout.addWidget(subsets_combox, 1, 1)
|
||||
main_layout.addWidget(subset_label, 2, 1)
|
||||
# Representation column
|
||||
main_layout.addWidget(repres_combobox, 1, 2)
|
||||
main_layout.addWidget(repre_label, 2, 2)
|
||||
# Btn column
|
||||
main_layout.addWidget(accept_btn, 1, 3)
|
||||
main_layout.setColumnStretch(0, 1)
|
||||
main_layout.setColumnStretch(1, 1)
|
||||
main_layout.setColumnStretch(2, 1)
|
||||
main_layout.setColumnStretch(3, 0)
|
||||
|
||||
assets_combox.currentIndexChanged.connect(
|
||||
self._combobox_value_changed
|
||||
)
|
||||
subsets_combox.currentIndexChanged.connect(
|
||||
self._combobox_value_changed
|
||||
)
|
||||
repres_combobox.currentIndexChanged.connect(
|
||||
self._combobox_value_changed
|
||||
)
|
||||
accept_btn.clicked.connect(self._on_accept)
|
||||
current_asset_btn.clicked.connect(self._on_current_asset)
|
||||
|
||||
self._current_asset_btn = current_asset_btn
|
||||
|
||||
self._assets_box = assets_combox
|
||||
self._subsets_box = subsets_combox
|
||||
self._representations_box = repres_combobox
|
||||
|
||||
self._asset_label = asset_label
|
||||
self._subset_label = subset_label
|
||||
self._repre_label = repre_label
|
||||
|
||||
self._accept_btn = accept_btn
|
||||
|
||||
self._init_asset_name = None
|
||||
self._init_subset_name = None
|
||||
self._init_repre_name = None
|
||||
|
||||
self._fill_check = False
|
||||
|
||||
self._items = items
|
||||
self._prepare_content_data()
|
||||
self.refresh(True)
|
||||
|
||||
self.setMinimumWidth(self.MIN_WIDTH)
|
||||
|
||||
# Set default focus to accept button so you don't directly type in
|
||||
# first asset field, this also allows to see the placeholder value.
|
||||
accept_btn.setFocus()
|
||||
|
||||
def _prepare_content_data(self):
|
||||
repre_ids = [
|
||||
io.ObjectId(item["representation"])
|
||||
for item in self._items
|
||||
]
|
||||
repres = list(io.find({
|
||||
"type": {"$in": ["representation", "archived_representation"]},
|
||||
"_id": {"$in": repre_ids}
|
||||
}))
|
||||
repres_by_id = {repre["_id"]: repre for repre in repres}
|
||||
|
||||
# stash context values, works only for single representation
|
||||
if len(repres) == 1:
|
||||
self._init_asset_name = repres[0]["context"]["asset"]
|
||||
self._init_subset_name = repres[0]["context"]["subset"]
|
||||
self._init_repre_name = repres[0]["context"]["representation"]
|
||||
|
||||
content_repres = {}
|
||||
archived_repres = []
|
||||
missing_repres = []
|
||||
version_ids = []
|
||||
for repre_id in repre_ids:
|
||||
if repre_id not in repres_by_id:
|
||||
missing_repres.append(repre_id)
|
||||
elif repres_by_id[repre_id]["type"] == "archived_representation":
|
||||
repre = repres_by_id[repre_id]
|
||||
archived_repres.append(repre)
|
||||
version_ids.append(repre["parent"])
|
||||
else:
|
||||
repre = repres_by_id[repre_id]
|
||||
content_repres[repre_id] = repres_by_id[repre_id]
|
||||
version_ids.append(repre["parent"])
|
||||
|
||||
versions = io.find({
|
||||
"type": {"$in": ["version", "hero_version"]},
|
||||
"_id": {"$in": list(set(version_ids))}
|
||||
})
|
||||
content_versions = {}
|
||||
hero_version_ids = set()
|
||||
for version in versions:
|
||||
content_versions[version["_id"]] = version
|
||||
if version["type"] == "hero_version":
|
||||
hero_version_ids.add(version["_id"])
|
||||
|
||||
missing_versions = []
|
||||
subset_ids = []
|
||||
for version_id in version_ids:
|
||||
if version_id not in content_versions:
|
||||
missing_versions.append(version_id)
|
||||
else:
|
||||
subset_ids.append(content_versions[version_id]["parent"])
|
||||
|
||||
subsets = io.find({
|
||||
"type": {"$in": ["subset", "archived_subset"]},
|
||||
"_id": {"$in": subset_ids}
|
||||
})
|
||||
subsets_by_id = {sub["_id"]: sub for sub in subsets}
|
||||
|
||||
asset_ids = []
|
||||
archived_subsets = []
|
||||
missing_subsets = []
|
||||
content_subsets = {}
|
||||
for subset_id in subset_ids:
|
||||
if subset_id not in subsets_by_id:
|
||||
missing_subsets.append(subset_id)
|
||||
elif subsets_by_id[subset_id]["type"] == "archived_subset":
|
||||
subset = subsets_by_id[subset_id]
|
||||
asset_ids.append(subset["parent"])
|
||||
archived_subsets.append(subset)
|
||||
else:
|
||||
subset = subsets_by_id[subset_id]
|
||||
asset_ids.append(subset["parent"])
|
||||
content_subsets[subset_id] = subset
|
||||
|
||||
assets = io.find({
|
||||
"type": {"$in": ["asset", "archived_asset"]},
|
||||
"_id": {"$in": list(asset_ids)}
|
||||
})
|
||||
assets_by_id = {asset["_id"]: asset for asset in assets}
|
||||
|
||||
missing_assets = []
|
||||
archived_assets = []
|
||||
content_assets = {}
|
||||
for asset_id in asset_ids:
|
||||
if asset_id not in assets_by_id:
|
||||
missing_assets.append(asset_id)
|
||||
elif assets_by_id[asset_id]["type"] == "archived_asset":
|
||||
archived_assets.append(assets_by_id[asset_id])
|
||||
else:
|
||||
content_assets[asset_id] = assets_by_id[asset_id]
|
||||
|
||||
self.content_assets = content_assets
|
||||
self.content_subsets = content_subsets
|
||||
self.content_versions = content_versions
|
||||
self.content_repres = content_repres
|
||||
|
||||
self.hero_version_ids = hero_version_ids
|
||||
|
||||
self.missing_assets = missing_assets
|
||||
self.missing_versions = missing_versions
|
||||
self.missing_subsets = missing_subsets
|
||||
self.missing_repres = missing_repres
|
||||
self.missing_docs = (
|
||||
bool(missing_assets)
|
||||
or bool(missing_versions)
|
||||
or bool(missing_subsets)
|
||||
or bool(missing_repres)
|
||||
)
|
||||
|
||||
self.archived_assets = archived_assets
|
||||
self.archived_subsets = archived_subsets
|
||||
self.archived_repres = archived_repres
|
||||
|
||||
def _combobox_value_changed(self, *args, **kwargs):
|
||||
self.refresh()
|
||||
|
||||
def refresh(self, init_refresh=False):
|
||||
"""Build the need comboboxes with content"""
|
||||
if not self._fill_check and not init_refresh:
|
||||
return
|
||||
|
||||
self._fill_check = False
|
||||
|
||||
if init_refresh:
|
||||
asset_values = self._get_asset_box_values()
|
||||
self._fill_combobox(asset_values, "asset")
|
||||
|
||||
validation_state = ValidationState()
|
||||
|
||||
# Set other comboboxes to empty if any document is missing or any asset
|
||||
# of loaded representations is archived.
|
||||
self._is_asset_ok(validation_state)
|
||||
if validation_state.asset_ok:
|
||||
subset_values = self._get_subset_box_values()
|
||||
self._fill_combobox(subset_values, "subset")
|
||||
self._is_subset_ok(validation_state)
|
||||
|
||||
if validation_state.asset_ok and validation_state.subset_ok:
|
||||
repre_values = sorted(self._representations_box_values())
|
||||
self._fill_combobox(repre_values, "repre")
|
||||
self._is_repre_ok(validation_state)
|
||||
|
||||
# Fill comboboxes with values
|
||||
self.set_labels()
|
||||
self.apply_validations(validation_state)
|
||||
|
||||
if init_refresh: # pre select context if possible
|
||||
self._assets_box.set_valid_value(self._init_asset_name)
|
||||
self._subsets_box.set_valid_value(self._init_subset_name)
|
||||
self._representations_box.set_valid_value(self._init_repre_name)
|
||||
|
||||
self._fill_check = True
|
||||
|
||||
def _get_loaders(self, representations):
|
||||
if not representations:
|
||||
return list()
|
||||
|
||||
available_loaders = filter(
|
||||
lambda l: not (hasattr(l, "is_utility") and l.is_utility),
|
||||
api.discover(api.Loader)
|
||||
)
|
||||
|
||||
loaders = set()
|
||||
|
||||
for representation in representations:
|
||||
for loader in api.loaders_from_representation(
|
||||
available_loaders,
|
||||
representation
|
||||
):
|
||||
loaders.add(loader)
|
||||
|
||||
return loaders
|
||||
|
||||
def _fill_combobox(self, values, combobox_type):
|
||||
if combobox_type == "asset":
|
||||
combobox_widget = self._assets_box
|
||||
elif combobox_type == "subset":
|
||||
combobox_widget = self._subsets_box
|
||||
elif combobox_type == "repre":
|
||||
combobox_widget = self._representations_box
|
||||
else:
|
||||
return
|
||||
selected_value = combobox_widget.get_valid_value()
|
||||
|
||||
# Fill combobox
|
||||
if values is not None:
|
||||
combobox_widget.populate(list(sorted(values)))
|
||||
if selected_value and selected_value in values:
|
||||
index = None
|
||||
for idx in range(combobox_widget.count()):
|
||||
if selected_value == str(combobox_widget.itemText(idx)):
|
||||
index = idx
|
||||
break
|
||||
if index is not None:
|
||||
combobox_widget.setCurrentIndex(index)
|
||||
|
||||
def set_labels(self):
|
||||
asset_label = self._assets_box.get_valid_value()
|
||||
subset_label = self._subsets_box.get_valid_value()
|
||||
repre_label = self._representations_box.get_valid_value()
|
||||
|
||||
default = "*No changes"
|
||||
self._asset_label.setText(asset_label or default)
|
||||
self._subset_label.setText(subset_label or default)
|
||||
self._repre_label.setText(repre_label or default)
|
||||
|
||||
def apply_validations(self, validation_state):
|
||||
error_msg = "*Please select"
|
||||
error_sheet = "border: 1px solid red;"
|
||||
success_sheet = "border: 1px solid green;"
|
||||
|
||||
asset_sheet = None
|
||||
subset_sheet = None
|
||||
repre_sheet = None
|
||||
accept_sheet = None
|
||||
if validation_state.asset_ok is False:
|
||||
asset_sheet = error_sheet
|
||||
self._asset_label.setText(error_msg)
|
||||
elif validation_state.subset_ok is False:
|
||||
subset_sheet = error_sheet
|
||||
self._subset_label.setText(error_msg)
|
||||
elif validation_state.repre_ok is False:
|
||||
repre_sheet = error_sheet
|
||||
self._repre_label.setText(error_msg)
|
||||
|
||||
if validation_state.all_ok:
|
||||
accept_sheet = success_sheet
|
||||
|
||||
self._assets_box.setStyleSheet(asset_sheet or "")
|
||||
self._subsets_box.setStyleSheet(subset_sheet or "")
|
||||
self._representations_box.setStyleSheet(repre_sheet or "")
|
||||
|
||||
self._accept_btn.setEnabled(validation_state.all_ok)
|
||||
self._accept_btn.setStyleSheet(accept_sheet or "")
|
||||
|
||||
def _get_asset_box_values(self):
|
||||
asset_docs = io.find(
|
||||
{"type": "asset"},
|
||||
{"_id": 1, "name": 1}
|
||||
)
|
||||
asset_names_by_id = {
|
||||
asset_doc["_id"]: asset_doc["name"]
|
||||
for asset_doc in asset_docs
|
||||
}
|
||||
subsets = io.find(
|
||||
{
|
||||
"type": "subset",
|
||||
"parent": {"$in": list(asset_names_by_id.keys())}
|
||||
},
|
||||
{
|
||||
"parent": 1
|
||||
}
|
||||
)
|
||||
|
||||
filtered_assets = []
|
||||
for subset in subsets:
|
||||
asset_name = asset_names_by_id[subset["parent"]]
|
||||
if asset_name not in filtered_assets:
|
||||
filtered_assets.append(asset_name)
|
||||
return sorted(filtered_assets)
|
||||
|
||||
def _get_subset_box_values(self):
|
||||
selected_asset = self._assets_box.get_valid_value()
|
||||
if selected_asset:
|
||||
asset_doc = io.find_one({"type": "asset", "name": selected_asset})
|
||||
asset_ids = [asset_doc["_id"]]
|
||||
else:
|
||||
asset_ids = list(self.content_assets.keys())
|
||||
|
||||
subsets = io.find(
|
||||
{
|
||||
"type": "subset",
|
||||
"parent": {"$in": asset_ids}
|
||||
},
|
||||
{
|
||||
"parent": 1,
|
||||
"name": 1
|
||||
}
|
||||
)
|
||||
|
||||
subset_names_by_parent_id = collections.defaultdict(set)
|
||||
for subset in subsets:
|
||||
subset_names_by_parent_id[subset["parent"]].add(subset["name"])
|
||||
|
||||
possible_subsets = None
|
||||
for subset_names in subset_names_by_parent_id.values():
|
||||
if possible_subsets is None:
|
||||
possible_subsets = subset_names
|
||||
else:
|
||||
possible_subsets = (possible_subsets & subset_names)
|
||||
|
||||
if not possible_subsets:
|
||||
break
|
||||
|
||||
return list(possible_subsets or list())
|
||||
|
||||
def _representations_box_values(self):
|
||||
# NOTE hero versions are not used because it is expected that
|
||||
# a hero version has the same representations as the latest version
|
||||
selected_asset = self._assets_box.currentText()
|
||||
selected_subset = self._subsets_box.currentText()
|
||||
|
||||
# If nothing is selected
|
||||
# [ ] [ ] [?]
|
||||
if not selected_asset and not selected_subset:
|
||||
# Find all representations of selection's subsets
|
||||
possible_repres = list(io.find(
|
||||
{
|
||||
"type": "representation",
|
||||
"parent": {"$in": list(self.content_versions.keys())}
|
||||
},
|
||||
{
|
||||
"parent": 1,
|
||||
"name": 1
|
||||
}
|
||||
))
|
||||
|
||||
possible_repres_by_parent = collections.defaultdict(set)
|
||||
for repre in possible_repres:
|
||||
possible_repres_by_parent[repre["parent"]].add(repre["name"])
|
||||
|
||||
output_repres = None
|
||||
for repre_names in possible_repres_by_parent.values():
|
||||
if output_repres is None:
|
||||
output_repres = repre_names
|
||||
else:
|
||||
output_repres = (output_repres & repre_names)
|
||||
|
||||
if not output_repres:
|
||||
break
|
||||
|
||||
return list(output_repres or list())
|
||||
|
||||
# [x] [x] [?]
|
||||
if selected_asset and selected_subset:
|
||||
asset_doc = io.find_one(
|
||||
{"type": "asset", "name": selected_asset},
|
||||
{"_id": 1}
|
||||
)
|
||||
subset_doc = io.find_one(
|
||||
{
|
||||
"type": "subset",
|
||||
"name": selected_subset,
|
||||
"parent": asset_doc["_id"]
|
||||
},
|
||||
{"_id": 1}
|
||||
)
|
||||
subset_id = subset_doc["_id"]
|
||||
last_versions_by_subset_id = self.find_last_versions([subset_id])
|
||||
version_doc = last_versions_by_subset_id.get(subset_id)
|
||||
repre_docs = io.find(
|
||||
{
|
||||
"type": "representation",
|
||||
"parent": version_doc["_id"]
|
||||
},
|
||||
{
|
||||
"name": 1
|
||||
}
|
||||
)
|
||||
return [
|
||||
repre_doc["name"]
|
||||
for repre_doc in repre_docs
|
||||
]
|
||||
|
||||
# [x] [ ] [?]
|
||||
# If asset only is selected
|
||||
if selected_asset:
|
||||
asset_doc = io.find_one(
|
||||
{"type": "asset", "name": selected_asset},
|
||||
{"_id": 1}
|
||||
)
|
||||
if not asset_doc:
|
||||
return list()
|
||||
|
||||
# Filter subsets by subset names from content
|
||||
subset_names = set()
|
||||
for subset_doc in self.content_subsets.values():
|
||||
subset_names.add(subset_doc["name"])
|
||||
subset_docs = io.find(
|
||||
{
|
||||
"type": "subset",
|
||||
"parent": asset_doc["_id"],
|
||||
"name": {"$in": list(subset_names)}
|
||||
},
|
||||
{"_id": 1}
|
||||
)
|
||||
subset_ids = [
|
||||
subset_doc["_id"]
|
||||
for subset_doc in subset_docs
|
||||
]
|
||||
if not subset_ids:
|
||||
return list()
|
||||
|
||||
last_versions_by_subset_id = self.find_last_versions(subset_ids)
|
||||
subset_id_by_version_id = {}
|
||||
for subset_id, last_version in last_versions_by_subset_id.items():
|
||||
version_id = last_version["_id"]
|
||||
subset_id_by_version_id[version_id] = subset_id
|
||||
|
||||
if not subset_id_by_version_id:
|
||||
return list()
|
||||
|
||||
repre_docs = list(io.find(
|
||||
{
|
||||
"type": "representation",
|
||||
"parent": {"$in": list(subset_id_by_version_id.keys())}
|
||||
},
|
||||
{
|
||||
"name": 1,
|
||||
"parent": 1
|
||||
}
|
||||
))
|
||||
if not repre_docs:
|
||||
return list()
|
||||
|
||||
repre_names_by_parent = collections.defaultdict(set)
|
||||
for repre_doc in repre_docs:
|
||||
repre_names_by_parent[repre_doc["parent"]].add(
|
||||
repre_doc["name"]
|
||||
)
|
||||
|
||||
available_repres = None
|
||||
for repre_names in repre_names_by_parent.values():
|
||||
if available_repres is None:
|
||||
available_repres = repre_names
|
||||
continue
|
||||
|
||||
available_repres = available_repres.intersection(repre_names)
|
||||
|
||||
return list(available_repres)
|
||||
|
||||
# [ ] [x] [?]
|
||||
subset_docs = list(io.find(
|
||||
{
|
||||
"type": "subset",
|
||||
"parent": {"$in": list(self.content_assets.keys())},
|
||||
"name": selected_subset
|
||||
},
|
||||
{"_id": 1, "parent": 1}
|
||||
))
|
||||
if not subset_docs:
|
||||
return list()
|
||||
|
||||
subset_docs_by_id = {
|
||||
subset_doc["_id"]: subset_doc
|
||||
for subset_doc in subset_docs
|
||||
}
|
||||
last_versions_by_subset_id = self.find_last_versions(
|
||||
subset_docs_by_id.keys()
|
||||
)
|
||||
|
||||
subset_id_by_version_id = {}
|
||||
for subset_id, last_version in last_versions_by_subset_id.items():
|
||||
version_id = last_version["_id"]
|
||||
subset_id_by_version_id[version_id] = subset_id
|
||||
|
||||
if not subset_id_by_version_id:
|
||||
return list()
|
||||
|
||||
repre_docs = list(io.find(
|
||||
{
|
||||
"type": "representation",
|
||||
"parent": {"$in": list(subset_id_by_version_id.keys())}
|
||||
},
|
||||
{
|
||||
"name": 1,
|
||||
"parent": 1
|
||||
}
|
||||
))
|
||||
if not repre_docs:
|
||||
return list()
|
||||
|
||||
repre_names_by_asset_id = {}
|
||||
for repre_doc in repre_docs:
|
||||
subset_id = subset_id_by_version_id[repre_doc["parent"]]
|
||||
asset_id = subset_docs_by_id[subset_id]["parent"]
|
||||
if asset_id not in repre_names_by_asset_id:
|
||||
repre_names_by_asset_id[asset_id] = set()
|
||||
repre_names_by_asset_id[asset_id].add(repre_doc["name"])
|
||||
|
||||
available_repres = None
|
||||
for repre_names in repre_names_by_asset_id.values():
|
||||
if available_repres is None:
|
||||
available_repres = repre_names
|
||||
continue
|
||||
|
||||
available_repres = available_repres.intersection(repre_names)
|
||||
|
||||
return list(available_repres)
|
||||
|
||||
def _is_asset_ok(self, validation_state):
|
||||
selected_asset = self._assets_box.get_valid_value()
|
||||
if (
|
||||
selected_asset is None
|
||||
and (self.missing_docs or self.archived_assets)
|
||||
):
|
||||
validation_state.asset_ok = False
|
||||
|
||||
def _is_subset_ok(self, validation_state):
|
||||
selected_asset = self._assets_box.get_valid_value()
|
||||
selected_subset = self._subsets_box.get_valid_value()
|
||||
|
||||
# [?] [x] [?]
|
||||
# If subset is selected then it must be ok
|
||||
if selected_subset is not None:
|
||||
return
|
||||
|
||||
# [ ] [ ] [?]
|
||||
if selected_asset is None:
|
||||
# If there were archived subsets and asset is not selected
|
||||
if self.archived_subsets:
|
||||
validation_state.subset_ok = False
|
||||
return
|
||||
|
||||
# [x] [ ] [?]
|
||||
asset_doc = io.find_one(
|
||||
{"type": "asset", "name": selected_asset},
|
||||
{"_id": 1}
|
||||
)
|
||||
subset_docs = io.find(
|
||||
{"type": "subset", "parent": asset_doc["_id"]},
|
||||
{"name": 1}
|
||||
)
|
||||
subset_names = set(
|
||||
subset_doc["name"]
|
||||
for subset_doc in subset_docs
|
||||
)
|
||||
|
||||
for subset_doc in self.content_subsets.values():
|
||||
if subset_doc["name"] not in subset_names:
|
||||
validation_state.subset_ok = False
|
||||
break
|
||||
|
||||
def find_last_versions(self, subset_ids):
|
||||
_pipeline = [
|
||||
# Find all versions of those subsets
|
||||
{"$match": {
|
||||
"type": "version",
|
||||
"parent": {"$in": list(subset_ids)}
|
||||
}},
|
||||
# Sorting versions all together
|
||||
{"$sort": {"name": 1}},
|
||||
# Group them by "parent", but only take the last
|
||||
{"$group": {
|
||||
"_id": "$parent",
|
||||
"_version_id": {"$last": "$_id"},
|
||||
"type": {"$last": "$type"}
|
||||
}}
|
||||
]
|
||||
last_versions_by_subset_id = dict()
|
||||
for doc in io.aggregate(_pipeline):
|
||||
doc["parent"] = doc["_id"]
|
||||
doc["_id"] = doc.pop("_version_id")
|
||||
last_versions_by_subset_id[doc["parent"]] = doc
|
||||
return last_versions_by_subset_id
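To make the aggregation above easier to follow, here is a pure-Python equivalent of the `$sort` plus `$group`/`$last` combination. It is only an illustration with made-up version documents, not code from the dialog.

```python
# Made-up version documents (one subset has two versions).
versions = [
    {"_id": "v001", "type": "version", "name": 1, "parent": "subsetA"},
    {"_id": "v002", "type": "version", "name": 2, "parent": "subsetA"},
    {"_id": "v010", "type": "version", "name": 1, "parent": "subsetB"},
]

last_versions_by_subset_id = {}
for doc in sorted(versions, key=lambda doc: doc["name"]):
    # Later (higher "name") documents overwrite earlier ones per parent,
    # which is what {"$sort": ...} followed by {"$last": ...} selects.
    last_versions_by_subset_id[doc["parent"]] = {
        "_id": doc["_id"],
        "parent": doc["parent"],
        "type": doc["type"],
    }

print(last_versions_by_subset_id["subsetA"]["_id"])  # v002
```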
|
||||
|
||||
def _is_repre_ok(self, validation_state):
|
||||
selected_asset = self._assets_box.get_valid_value()
|
||||
selected_subset = self._subsets_box.get_valid_value()
|
||||
selected_repre = self._representations_box.get_valid_value()
|
||||
|
||||
# [?] [?] [x]
|
||||
# If representation is selected then it must be ok
|
||||
if selected_repre is not None:
|
||||
return
|
||||
|
||||
# [ ] [ ] [ ]
|
||||
if selected_asset is None and selected_subset is None:
|
||||
if (
|
||||
self.archived_repres
|
||||
or self.missing_versions
|
||||
or self.missing_repres
|
||||
):
|
||||
validation_state.repre_ok = False
|
||||
return
|
||||
|
||||
# [x] [x] [ ]
|
||||
if selected_asset is not None and selected_subset is not None:
|
||||
asset_doc = io.find_one(
|
||||
{"type": "asset", "name": selected_asset},
|
||||
{"_id": 1}
|
||||
)
|
||||
subset_doc = io.find_one(
|
||||
{
|
||||
"type": "subset",
|
||||
"parent": asset_doc["_id"],
|
||||
"name": selected_subset
|
||||
},
|
||||
{"_id": 1}
|
||||
)
|
||||
last_versions_by_subset_id = self.find_last_versions(
|
||||
[subset_doc["_id"]]
|
||||
)
|
||||
last_version = last_versions_by_subset_id.get(subset_doc["_id"])
|
||||
if not last_version:
|
||||
validation_state.repre_ok = False
|
||||
return
|
||||
|
||||
repre_docs = io.find(
|
||||
{
|
||||
"type": "representation",
|
||||
"parent": last_version["_id"]
|
||||
},
|
||||
{"name": 1}
|
||||
)
|
||||
|
||||
repre_names = set(
|
||||
repre_doc["name"]
|
||||
for repre_doc in repre_docs
|
||||
)
|
||||
for repre_doc in self.content_repres.values():
|
||||
if repre_doc["name"] not in repre_names:
|
||||
validation_state.repre_ok = False
|
||||
break
|
||||
return
|
||||
|
||||
# [x] [ ] [ ]
|
||||
if selected_asset is not None:
|
||||
asset_doc = io.find_one(
|
||||
{"type": "asset", "name": selected_asset},
|
||||
{"_id": 1}
|
||||
)
|
||||
subset_docs = list(io.find(
|
||||
{
|
||||
"type": "subset",
|
||||
"parent": asset_doc["_id"]
|
||||
},
|
||||
{"_id": 1, "name": 1}
|
||||
))
|
||||
|
||||
subset_name_by_id = {}
|
||||
subset_ids = set()
|
||||
for subset_doc in subset_docs:
|
||||
subset_id = subset_doc["_id"]
|
||||
subset_ids.add(subset_id)
|
||||
subset_name_by_id[subset_id] = subset_doc["name"]
|
||||
|
||||
last_versions_by_subset_id = self.find_last_versions(subset_ids)
|
||||
|
||||
subset_id_by_version_id = {}
|
||||
for subset_id, last_version in last_versions_by_subset_id.items():
|
||||
version_id = last_version["_id"]
|
||||
subset_id_by_version_id[version_id] = subset_id
|
||||
|
||||
repre_docs = io.find(
|
||||
{
|
||||
"type": "representation",
|
||||
"parent": {"$in": list(subset_id_by_version_id.keys())}
|
||||
},
|
||||
{
|
||||
"name": 1,
|
||||
"parent": 1
|
||||
}
|
||||
)
|
||||
repres_by_subset_name = {}
|
||||
for repre_doc in repre_docs:
|
||||
subset_id = subset_id_by_version_id[repre_doc["parent"]]
|
||||
subset_name = subset_name_by_id[subset_id]
|
||||
if subset_name not in repres_by_subset_name:
|
||||
repres_by_subset_name[subset_name] = set()
|
||||
repres_by_subset_name[subset_name].add(repre_doc["name"])
|
||||
|
||||
for repre_doc in self.content_repres.values():
|
||||
version_doc = self.content_versions[repre_doc["parent"]]
|
||||
subset_doc = self.content_subsets[version_doc["parent"]]
|
||||
repre_names = (
|
||||
repres_by_subset_name.get(subset_doc["name"]) or []
|
||||
)
|
||||
if repre_doc["name"] not in repre_names:
|
||||
validation_state.repre_ok = False
|
||||
break
|
||||
return
|
||||
|
||||
# [ ] [x] [ ]
|
||||
# Subset documents
|
||||
subset_docs = io.find(
|
||||
{
|
||||
"type": "subset",
|
||||
"parent": {"$in": list(self.content_assets.keys())},
|
||||
"name": selected_subset
|
||||
},
|
||||
{"_id": 1, "name": 1, "parent": 1}
|
||||
)
|
||||
|
||||
subset_docs_by_id = {}
|
||||
for subset_doc in subset_docs:
|
||||
subset_docs_by_id[subset_doc["_id"]] = subset_doc
|
||||
|
||||
last_versions_by_subset_id = self.find_last_versions(
|
||||
subset_docs_by_id.keys()
|
||||
)
|
||||
subset_id_by_version_id = {}
|
||||
for subset_id, last_version in last_versions_by_subset_id.items():
|
||||
version_id = last_version["_id"]
|
||||
subset_id_by_version_id[version_id] = subset_id
|
||||
|
||||
repre_docs = io.find(
|
||||
{
|
||||
"type": "representation",
|
||||
"parent": {"$in": list(subset_id_by_version_id.keys())}
|
||||
},
|
||||
{
|
||||
"name": 1,
|
||||
"parent": 1
|
||||
}
|
||||
)
|
||||
repres_by_asset_id = {}
|
||||
for repre_doc in repre_docs:
|
||||
subset_id = subset_id_by_version_id[repre_doc["parent"]]
|
||||
asset_id = subset_docs_by_id[subset_id]["parent"]
|
||||
if asset_id not in repres_by_asset_id:
|
||||
repres_by_asset_id[asset_id] = set()
|
||||
repres_by_asset_id[asset_id].add(repre_doc["name"])
|
||||
|
||||
for repre_doc in self.content_repres.values():
|
||||
version_doc = self.content_versions[repre_doc["parent"]]
|
||||
subset_doc = self.content_subsets[version_doc["parent"]]
|
||||
asset_id = subset_doc["parent"]
|
||||
repre_names = (
|
||||
repres_by_asset_id.get(asset_id) or []
|
||||
)
|
||||
if repre_doc["name"] not in repre_names:
|
||||
validation_state.repre_ok = False
|
||||
break
|
||||
|
||||
def _on_current_asset(self):
|
||||
# Set initial asset as current.
|
||||
asset_name = io.Session["AVALON_ASSET"]
|
||||
index = self._assets_box.findText(
|
||||
asset_name, QtCore.Qt.MatchFixedString
|
||||
)
|
||||
if index >= 0:
|
||||
print("Setting asset to {}".format(asset_name))
|
||||
self._assets_box.setCurrentIndex(index)
|
||||
|
||||
def _on_accept(self):
|
||||
# Use None when not a valid value or when placeholder value
|
||||
selected_asset = self._assets_box.get_valid_value()
|
||||
selected_subset = self._subsets_box.get_valid_value()
|
||||
selected_representation = self._representations_box.get_valid_value()
|
||||
|
||||
if selected_asset:
|
||||
asset_doc = io.find_one({"type": "asset", "name": selected_asset})
|
||||
asset_docs_by_id = {asset_doc["_id"]: asset_doc}
|
||||
else:
|
||||
asset_docs_by_id = self.content_assets
|
||||
|
||||
asset_docs_by_name = {
|
||||
asset_doc["name"]: asset_doc
|
||||
for asset_doc in asset_docs_by_id.values()
|
||||
}
|
||||
|
||||
asset_ids = list(asset_docs_by_id.keys())
|
||||
|
||||
subset_query = {
|
||||
"type": "subset",
|
||||
"parent": {"$in": asset_ids}
|
||||
}
|
||||
if selected_subset:
|
||||
subset_query["name"] = selected_subset
|
||||
|
||||
subset_docs = list(io.find(subset_query))
|
||||
subset_ids = []
|
||||
subset_docs_by_parent_and_name = collections.defaultdict(dict)
|
||||
for subset in subset_docs:
|
||||
subset_ids.append(subset["_id"])
|
||||
parent_id = subset["parent"]
|
||||
name = subset["name"]
|
||||
subset_docs_by_parent_and_name[parent_id][name] = subset
|
||||
|
||||
# versions
|
||||
version_docs = list(io.find({
|
||||
"type": "version",
|
||||
"parent": {"$in": subset_ids}
|
||||
}, sort=[("name", -1)]))
|
||||
|
||||
hero_version_docs = list(io.find({
|
||||
"type": "hero_version",
|
||||
"parent": {"$in": subset_ids}
|
||||
}))
|
||||
|
||||
version_ids = list()
|
||||
|
||||
version_docs_by_parent_id = {}
|
||||
for version_doc in version_docs:
|
||||
parent_id = version_doc["parent"]
|
||||
if parent_id not in version_docs_by_parent_id:
|
||||
version_ids.append(version_doc["_id"])
|
||||
version_docs_by_parent_id[parent_id] = version_doc
|
||||
|
||||
hero_version_docs_by_parent_id = {}
|
||||
for hero_version_doc in hero_version_docs:
|
||||
version_ids.append(hero_version_doc["_id"])
|
||||
parent_id = hero_version_doc["parent"]
|
||||
hero_version_docs_by_parent_id[parent_id] = hero_version_doc
|
||||
|
||||
repre_docs = io.find({
|
||||
"type": "representation",
|
||||
"parent": {"$in": version_ids}
|
||||
})
|
||||
repre_docs_by_parent_id_by_name = collections.defaultdict(dict)
|
||||
for repre_doc in repre_docs:
|
||||
parent_id = repre_doc["parent"]
|
||||
name = repre_doc["name"]
|
||||
repre_docs_by_parent_id_by_name[parent_id][name] = repre_doc
|
||||
|
||||
for container in self._items:
|
||||
container_repre_id = io.ObjectId(container["representation"])
|
||||
container_repre = self.content_repres[container_repre_id]
|
||||
container_repre_name = container_repre["name"]
|
||||
|
||||
container_version_id = container_repre["parent"]
|
||||
container_version = self.content_versions[container_version_id]
|
||||
|
||||
container_subset_id = container_version["parent"]
|
||||
container_subset = self.content_subsets[container_subset_id]
|
||||
container_subset_name = container_subset["name"]
|
||||
|
||||
container_asset_id = container_subset["parent"]
|
||||
container_asset = self.content_assets[container_asset_id]
|
||||
container_asset_name = container_asset["name"]
|
||||
|
||||
if selected_asset:
|
||||
asset_doc = asset_docs_by_name[selected_asset]
|
||||
else:
|
||||
asset_doc = asset_docs_by_name[container_asset_name]
|
||||
|
||||
subsets_by_name = subset_docs_by_parent_and_name[asset_doc["_id"]]
|
||||
if selected_subset:
|
||||
subset_doc = subsets_by_name[selected_subset]
|
||||
else:
|
||||
subset_doc = subsets_by_name[container_subset_name]
|
||||
|
||||
repre_doc = None
|
||||
subset_id = subset_doc["_id"]
|
||||
if container_version["type"] == "hero_version":
|
||||
hero_version = hero_version_docs_by_parent_id.get(
|
||||
subset_id
|
||||
)
|
||||
if hero_version:
|
||||
_repres = repre_docs_by_parent_id_by_name.get(
|
||||
hero_version["_id"]
|
||||
)
|
||||
if selected_representation:
|
||||
repre_doc = _repres.get(selected_representation)
|
||||
else:
|
||||
repre_doc = _repres.get(container_repre_name)
|
||||
|
||||
if not repre_doc:
|
||||
version_doc = version_docs_by_parent_id[subset_id]
|
||||
version_id = version_doc["_id"]
|
||||
repres_by_name = repre_docs_by_parent_id_by_name[version_id]
|
||||
if selected_representation:
|
||||
repre_doc = repres_by_name[selected_representation]
|
||||
else:
|
||||
repre_doc = repres_by_name[container_repre_name]
|
||||
|
||||
try:
|
||||
api.switch(container, repre_doc)
|
||||
except Exception:
|
||||
msg = (
|
||||
"Couldn't switch asset."
|
||||
"See traceback for more information."
|
||||
)
|
||||
log.warning(msg, exc_info=True)
|
||||
dialog = QtWidgets.QMessageBox(self)
|
||||
dialog.setWindowTitle("Switch asset failed")
|
||||
dialog.setText(
|
||||
"Switch asset failed. Search console log for more details"
|
||||
)
|
||||
dialog.exec_()
|
||||
|
||||
self.switched.emit()
|
||||
|
||||
self.close()
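Assumed usage of the dialog (not part of this file): the scene inventory view constructs it with the selected container items and refreshes itself when the `switched` signal fires. The names `main_window`, `selected_containers` and `refresh_inventory` are placeholders.

```python
# Placeholders: a parent window, container dicts from host.ls(), and a
# callback that reloads the inventory model.
dialog = SwitchAssetDialog(parent=main_window, items=selected_containers)
dialog.switched.connect(refresh_inventory)
dialog.show()
```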
794
openpype/tools/sceneinventory/view.py
Normal file
@ -0,0 +1,794 @@
import collections
|
||||
import logging
|
||||
from functools import partial
|
||||
|
||||
from Qt import QtWidgets, QtCore
|
||||
|
||||
from avalon import io, api, style
|
||||
from avalon.vendor import qtawesome
|
||||
from avalon.lib import HeroVersionType
|
||||
from avalon.tools import lib as tools_lib
|
||||
|
||||
from openpype.modules import ModulesManager
|
||||
|
||||
from .switch_dialog import SwitchAssetDialog
|
||||
from .model import InventoryModel
|
||||
|
||||
|
||||
DEFAULT_COLOR = "#fb9c15"
|
||||
|
||||
log = logging.getLogger("SceneInventory")
|
||||
|
||||
|
||||
class SceneInvetoryView(QtWidgets.QTreeView):
|
||||
data_changed = QtCore.Signal()
|
||||
hierarchy_view_changed = QtCore.Signal(bool)
|
||||
|
||||
def __init__(self, parent=None):
|
||||
super(SceneInvetoryView, self).__init__(parent=parent)
|
||||
|
||||
# view settings
|
||||
self.setIndentation(12)
|
||||
self.setAlternatingRowColors(True)
|
||||
self.setSortingEnabled(True)
|
||||
self.setSelectionMode(self.ExtendedSelection)
|
||||
self.setContextMenuPolicy(QtCore.Qt.CustomContextMenu)
|
||||
self.customContextMenuRequested.connect(self._show_right_mouse_menu)
|
||||
self._hierarchy_view = False
|
||||
self._selected = None
|
||||
|
||||
manager = ModulesManager()
|
||||
self.sync_server = manager.modules_by_name["sync_server"]
|
||||
self.sync_enabled = self.sync_server.enabled
|
||||
|
||||
def _set_hierarchy_view(self, enabled):
|
||||
if enabled == self._hierarchy_view:
|
||||
return
|
||||
self._hierarchy_view = enabled
|
||||
self.hierarchy_view_changed.emit(enabled)
|
||||
|
||||
def _enter_hierarchy(self, items):
|
||||
self._selected = set(i["objectName"] for i in items)
|
||||
self._set_hierarchy_view(True)
|
||||
self.data_changed.emit()
|
||||
self.expandToDepth(1)
|
||||
self.setStyleSheet("""
|
||||
QTreeView {
|
||||
border-color: #fb9c15;
|
||||
}
|
||||
""")
|
||||
|
||||
def _leave_hierarchy(self):
|
||||
self._set_hierarchy_view(False)
|
||||
self.data_changed.emit()
|
||||
self.setStyleSheet("QTreeView {}")
|
||||
|
||||
def _build_item_menu_for_selection(self, items, menu):
|
||||
if not items:
|
||||
return
|
||||
|
||||
repre_ids = []
|
||||
for item in items:
|
||||
item_id = io.ObjectId(item["representation"])
|
||||
if item_id not in repre_ids:
|
||||
repre_ids.append(item_id)
|
||||
|
||||
repre_docs = io.find(
|
||||
{
|
||||
"type": "representation",
|
||||
"_id": {"$in": repre_ids}
|
||||
},
|
||||
{"parent": 1}
|
||||
)
|
||||
|
||||
version_ids = []
|
||||
for repre_doc in repre_docs:
|
||||
version_id = repre_doc["parent"]
|
||||
if version_id not in version_ids:
|
||||
version_ids.append(version_id)
|
||||
|
||||
loaded_versions = io.find({
|
||||
"_id": {"$in": version_ids},
|
||||
"type": {"$in": ["version", "hero_version"]}
|
||||
})
|
||||
|
||||
loaded_hero_versions = []
|
||||
versions_by_parent_id = collections.defaultdict(list)
|
||||
version_parents = []
|
||||
for version in loaded_versions:
|
||||
if version["type"] == "hero_version":
|
||||
loaded_hero_versions.append(version)
|
||||
else:
|
||||
parent_id = version["parent"]
|
||||
versions_by_parent_id[parent_id].append(version)
|
||||
if parent_id not in version_parents:
|
||||
version_parents.append(parent_id)
|
||||
|
||||
all_versions = io.find({
|
||||
"type": {"$in": ["hero_version", "version"]},
|
||||
"parent": {"$in": version_parents}
|
||||
})
|
||||
hero_versions = []
|
||||
versions = []
|
||||
for version in all_versions:
|
||||
if version["type"] == "hero_version":
|
||||
hero_versions.append(version)
|
||||
else:
|
||||
versions.append(version)
|
||||
|
||||
has_loaded_hero_versions = len(loaded_hero_versions) > 0
|
||||
has_available_hero_version = len(hero_versions) > 0
|
||||
has_outdated = False
|
||||
|
||||
for version in versions:
|
||||
parent_id = version["parent"]
|
||||
current_versions = versions_by_parent_id[parent_id]
|
||||
for current_version in current_versions:
|
||||
if current_version["name"] < version["name"]:
|
||||
has_outdated = True
|
||||
break
|
||||
|
||||
if has_outdated:
|
||||
break
|
||||
|
||||
switch_to_versioned = None
|
||||
if has_loaded_hero_versions:
|
||||
def _on_switch_to_versioned(items):
|
||||
repre_ids = []
|
||||
for item in items:
|
||||
item_id = io.ObjectId(item["representation"])
|
||||
if item_id not in repre_ids:
|
||||
repre_ids.append(item_id)
|
||||
|
||||
repre_docs = io.find(
|
||||
{
|
||||
"type": "representation",
|
||||
"_id": {"$in": repre_ids}
|
||||
},
|
||||
{"parent": 1}
|
||||
)
|
||||
|
||||
version_ids = []
|
||||
version_id_by_repre_id = {}
|
||||
for repre_doc in repre_docs:
|
||||
version_id = repre_doc["parent"]
|
||||
version_id_by_repre_id[repre_doc["_id"]] = version_id
|
||||
if version_id not in version_ids:
|
||||
version_ids.append(version_id)
|
||||
hero_versions = io.find(
|
||||
{
|
||||
"_id": {"$in": version_ids},
|
||||
"type": "hero_version"
|
||||
},
|
||||
{"version_id": 1}
|
||||
)
|
||||
version_ids = set()
|
||||
for hero_version in hero_versions:
|
||||
version_id = hero_version["version_id"]
|
||||
version_ids.add(version_id)
|
||||
hero_version_id = hero_version["_id"]
|
||||
for _repre_id, current_version_id in (
|
||||
version_id_by_repre_id.items()
|
||||
):
|
||||
if current_version_id == hero_version_id:
|
||||
version_id_by_repre_id[_repre_id] = version_id
|
||||
|
||||
version_docs = io.find(
|
||||
{
|
||||
"_id": {"$in": list(version_ids)},
|
||||
"type": "version"
|
||||
},
|
||||
{"name": 1}
|
||||
)
|
||||
version_name_by_id = {}
|
||||
for version_doc in version_docs:
|
||||
version_name_by_id[version_doc["_id"]] = \
|
||||
version_doc["name"]
|
||||
|
||||
for item in items:
|
||||
repre_id = io.ObjectId(item["representation"])
|
||||
version_id = version_id_by_repre_id.get(repre_id)
|
||||
version_name = version_name_by_id.get(version_id)
|
||||
if version_name is not None:
|
||||
try:
|
||||
api.update(item, version_name)
|
||||
except AssertionError:
|
||||
self._show_version_error_dialog(
|
||||
version_name, [item]
|
||||
)
|
||||
log.warning("Update failed", exc_info=True)
|
||||
|
||||
self.data_changed.emit()
|
||||
|
||||
update_icon = qtawesome.icon(
|
||||
"fa.asterisk",
|
||||
color=DEFAULT_COLOR
|
||||
)
|
||||
switch_to_versioned = QtWidgets.QAction(
|
||||
update_icon,
|
||||
"Switch to versioned",
|
||||
menu
|
||||
)
|
||||
switch_to_versioned.triggered.connect(
|
||||
lambda: _on_switch_to_versioned(items)
|
||||
)
|
||||
|
||||
update_to_latest_action = None
|
||||
if has_outdated or has_loaded_hero_versions:
|
||||
# update to latest version
|
||||
def _on_update_to_latest(items):
|
||||
for item in items:
|
||||
try:
|
||||
api.update(item, -1)
|
||||
except AssertionError:
|
||||
self._show_version_error_dialog(None, [item])
|
||||
log.warning("Update failed", exc_info=True)
|
||||
self.data_changed.emit()
|
||||
|
||||
update_icon = qtawesome.icon(
|
||||
"fa.angle-double-up",
|
||||
color=DEFAULT_COLOR
|
||||
)
|
||||
update_to_latest_action = QtWidgets.QAction(
|
||||
update_icon,
|
||||
"Update to latest",
|
||||
menu
|
||||
)
|
||||
update_to_latest_action.triggered.connect(
|
||||
lambda: _on_update_to_latest(items)
|
||||
)
|
||||
|
||||
change_to_hero = None
|
||||
if has_available_hero_version:
|
||||
# change to hero version
|
||||
def _on_update_to_hero(items):
|
||||
for item in items:
|
||||
try:
|
||||
api.update(item, HeroVersionType(-1))
|
||||
except AssertionError:
|
||||
self._show_version_error_dialog('hero', [item])
|
||||
log.warning("Update failed", exc_info=True)
|
||||
self.data_changed.emit()
|
||||
|
||||
# TODO change icon
|
||||
change_icon = qtawesome.icon(
|
||||
"fa.asterisk",
|
||||
color="#00b359"
|
||||
)
|
||||
change_to_hero = QtWidgets.QAction(
|
||||
change_icon,
|
||||
"Change to hero",
|
||||
menu
|
||||
)
|
||||
change_to_hero.triggered.connect(
|
||||
lambda: _on_update_to_hero(items)
|
||||
)
|
||||
|
||||
# set version
|
||||
set_version_icon = qtawesome.icon("fa.hashtag", color=DEFAULT_COLOR)
|
||||
set_version_action = QtWidgets.QAction(
|
||||
set_version_icon,
|
||||
"Set version",
|
||||
menu
|
||||
)
|
||||
set_version_action.triggered.connect(
|
||||
lambda: self._show_version_dialog(items))
|
||||
|
||||
# switch asset
|
||||
switch_asset_icon = qtawesome.icon("fa.sitemap", color=DEFAULT_COLOR)
|
||||
switch_asset_action = QtWidgets.QAction(
|
||||
switch_asset_icon,
|
||||
"Switch Asset",
|
||||
menu
|
||||
)
|
||||
switch_asset_action.triggered.connect(
|
||||
lambda: self._show_switch_dialog(items))
|
||||
|
||||
# remove
|
||||
remove_icon = qtawesome.icon("fa.remove", color=DEFAULT_COLOR)
|
||||
remove_action = QtWidgets.QAction(remove_icon, "Remove items", menu)
|
||||
remove_action.triggered.connect(
|
||||
lambda: self._show_remove_warning_dialog(items))
|
||||
|
||||
# add the actions
|
||||
if switch_to_versioned:
|
||||
menu.addAction(switch_to_versioned)
|
||||
|
||||
if update_to_latest_action:
|
||||
menu.addAction(update_to_latest_action)
|
||||
|
||||
if change_to_hero:
|
||||
menu.addAction(change_to_hero)
|
||||
|
||||
menu.addAction(set_version_action)
|
||||
menu.addAction(switch_asset_action)
|
||||
|
||||
menu.addSeparator()
|
||||
|
||||
menu.addAction(remove_action)
|
||||
|
||||
self._handle_sync_server(menu, repre_ids)
|
||||
|
||||
def _handle_sync_server(self, menu, repre_ids):
|
||||
"""
|
||||
Adds actions for download/upload when SyncServer is enabled
|
||||
|
||||
Args:
|
||||
menu (OptionMenu)
|
||||
repre_ids (list) of object_ids
|
||||
Returns:
|
||||
(OptionMenu)
|
||||
"""
|
||||
if not self.sync_enabled:
|
||||
return
|
||||
|
||||
menu.addSeparator()
|
||||
|
||||
download_icon = qtawesome.icon("fa.download", color=DEFAULT_COLOR)
|
||||
download_active_action = QtWidgets.QAction(
|
||||
download_icon,
|
||||
"Download",
|
||||
menu
|
||||
)
|
||||
download_active_action.triggered.connect(
|
||||
lambda: self._add_sites(repre_ids, 'active_site'))
|
||||
|
||||
upload_icon = qtawesome.icon("fa.upload", color=DEFAULT_COLOR)
|
||||
upload_remote_action = QtWidgets.QAction(
|
||||
upload_icon,
|
||||
"Upload",
|
||||
menu
|
||||
)
|
||||
upload_remote_action.triggered.connect(
|
||||
lambda: self._add_sites(repre_ids, 'remote_site'))
|
||||
|
||||
menu.addAction(download_active_action)
|
||||
menu.addAction(upload_remote_action)
|
||||
|
||||
def _add_sites(self, repre_ids, side):
|
||||
"""
|
||||
(Re)sync all 'repre_ids' to specific site.
|
||||
|
||||
It checks if opposite site has fully available content to limit
|
||||
accidents. (ReSync active when no remote >> losing active content)
|
||||
|
||||
Args:
|
||||
repre_ids (list)
|
||||
side (str): 'active_site'|'remote_site'
|
||||
"""
|
||||
project_name = io.Session["AVALON_PROJECT"]
|
||||
active_site = self.sync_server.get_active_site(project_name)
|
||||
remote_site = self.sync_server.get_remote_site(project_name)
|
||||
|
||||
repre_docs = io.find({
|
||||
"type": "representation",
|
||||
"_id": {"$in": repre_ids}
|
||||
})
|
||||
repre_docs_by_id = {
|
||||
repre_doc["_id"]: repre_doc
|
||||
for repre_doc in repre_docs
|
||||
}
|
||||
for repre_id in repre_ids:
|
||||
repre_doc = repre_docs_by_id.get(repre_id)
|
||||
if not repre_doc:
|
||||
continue
|
||||
|
||||
progress = tools_lib.get_progress_for_repre(
|
||||
repre_doc,
|
||||
active_site,
|
||||
remote_site
|
||||
)
|
||||
if side == "active_site":
|
||||
# check opposite from added site, must be 1 or unable to sync
|
||||
check_progress = progress[remote_site]
|
||||
site = active_site
|
||||
else:
|
||||
check_progress = progress[active_site]
|
||||
site = remote_site
|
||||
|
||||
if check_progress == 1:
|
||||
self.sync_server.add_site(
|
||||
project_name, repre_id, site, force=True
|
||||
)
|
||||
|
||||
self.data_changed.emit()
|
||||
|
||||
def _build_item_menu(self, items=None):
|
||||
"""Create menu for the selected items"""
|
||||
|
||||
if not items:
|
||||
items = []
|
||||
|
||||
menu = QtWidgets.QMenu(self)
|
||||
|
||||
# add the actions
|
||||
self._build_item_menu_for_selection(items, menu)
|
||||
|
||||
# These two actions should be able to work without selection
|
||||
# expand all items
|
||||
expandall_action = QtWidgets.QAction(menu, text="Expand all items")
|
||||
expandall_action.triggered.connect(self.expandAll)
|
||||
|
||||
# collapse all items
|
||||
collapse_action = QtWidgets.QAction(menu, text="Collapse all items")
|
||||
collapse_action.triggered.connect(self.collapseAll)
|
||||
|
||||
menu.addAction(expandall_action)
|
||||
menu.addAction(collapse_action)
|
||||
|
||||
custom_actions = self._get_custom_actions(containers=items)
|
||||
if custom_actions:
|
||||
submenu = QtWidgets.QMenu("Actions", self)
|
||||
for action in custom_actions:
|
||||
color = action.color or DEFAULT_COLOR
|
||||
icon = qtawesome.icon("fa.%s" % action.icon, color=color)
|
||||
action_item = QtWidgets.QAction(icon, action.label, submenu)
|
||||
action_item.triggered.connect(
|
||||
partial(self._process_custom_action, action, items))
|
||||
|
||||
submenu.addAction(action_item)
|
||||
|
||||
menu.addMenu(submenu)
|
||||
|
||||
# go back to flat view
|
||||
if self._hierarchy_view:
|
||||
back_to_flat_icon = qtawesome.icon("fa.list", color=DEFAULT_COLOR)
|
||||
back_to_flat_action = QtWidgets.QAction(
|
||||
back_to_flat_icon,
|
||||
"Back to Full-View",
|
||||
menu
|
||||
)
|
||||
back_to_flat_action.triggered.connect(self._leave_hierarchy)
|
||||
|
||||
# send items to hierarchy view
|
||||
enter_hierarchy_icon = qtawesome.icon("fa.indent", color="#d8d8d8")
|
||||
enter_hierarchy_action = QtWidgets.QAction(
|
||||
enter_hierarchy_icon,
|
||||
"Cherry-Pick (Hierarchy)",
|
||||
menu
|
||||
)
|
||||
enter_hierarchy_action.triggered.connect(
|
||||
lambda: self._enter_hierarchy(items))
|
||||
|
||||
if items:
|
||||
menu.addAction(enter_hierarchy_action)
|
||||
|
||||
if self._hierarchy_view:
|
||||
menu.addAction(back_to_flat_action)
|
||||
|
||||
return menu
|
||||
|
||||
def _get_custom_actions(self, containers):
|
||||
"""Get the registered Inventory Actions
|
||||
|
||||
Args:
|
||||
containers(list): collection of containers
|
||||
|
||||
Returns:
|
||||
list: collection of filter and initialized actions
|
||||
"""
|
||||
|
||||
def sorter(Plugin):
|
||||
"""Sort based on order attribute of the plugin"""
|
||||
return Plugin.order
|
||||
|
||||
# Feed an empty dict if no selection, this will ensure the compat
# lookup always works, so plugins can interact with Scene Inventory
# in reverse.
|
||||
containers = containers or [dict()]
|
||||
|
||||
# Check which action will be available in the menu
|
||||
Plugins = api.discover(api.InventoryAction)
|
||||
compatible = [p() for p in Plugins if
|
||||
any(p.is_compatible(c) for c in containers)]
|
||||
|
||||
return sorted(compatible, key=sorter)
|
||||
|
||||
def _process_custom_action(self, action, containers):
|
||||
"""Run action and if results are returned positive update the view
|
||||
|
||||
If the result is list or dict, will select view items by the result.
|
||||
|
||||
Args:
|
||||
action (InventoryAction): Inventory Action instance
|
||||
containers (list): Data of currently selected items
|
||||
|
||||
Returns:
|
||||
None
|
||||
"""
|
||||
|
||||
result = action.process(containers)
|
||||
if result:
|
||||
self.data_changed.emit()
|
||||
|
||||
if isinstance(result, (list, set)):
|
||||
self._select_items_by_action(result)
|
||||
|
||||
if isinstance(result, dict):
|
||||
self._select_items_by_action(
|
||||
result["objectNames"], result["options"]
|
||||
)
|
||||
|
||||
def _select_items_by_action(self, object_names, options=None):
|
||||
"""Select view items by the result of action
|
||||
|
||||
Args:
|
||||
object_names (list or set): A list/set of container object name
|
||||
options (dict): GUI operation options.
|
||||
|
||||
Returns:
|
||||
None
|
||||
|
||||
"""
|
||||
options = options or dict()
|
||||
|
||||
if options.get("clear", True):
|
||||
self.clearSelection()
|
||||
|
||||
object_names = set(object_names)
|
||||
if (
|
||||
self._hierarchy_view
|
||||
and not self._selected.issuperset(object_names)
|
||||
):
|
||||
# If any container not in current cherry-picked view, update
|
||||
# view before selecting them.
|
||||
self._selected.update(object_names)
|
||||
self.data_changed.emit()
|
||||
|
||||
model = self.model()
|
||||
selection_model = self.selectionModel()
|
||||
|
||||
select_mode = {
|
||||
"select": selection_model.Select,
|
||||
"deselect": selection_model.Deselect,
|
||||
"toggle": selection_model.Toggle,
|
||||
}[options.get("mode", "select")]
|
||||
|
||||
for item in tools_lib.iter_model_rows(model, 0):
|
||||
item = item.data(InventoryModel.ItemRole)
|
||||
if item.get("isGroupNode"):
|
||||
continue
|
||||
|
||||
name = item.get("objectName")
|
||||
if name in object_names:
|
||||
self.scrollTo(item) # Ensure item is visible
|
||||
flags = select_mode | selection_model.Rows
|
||||
selection_model.select(item, flags)
|
||||
|
||||
object_names.remove(name)
|
||||
|
||||
if len(object_names) == 0:
|
||||
break
|
||||
|
||||
def _show_right_mouse_menu(self, pos):
|
||||
"""Display the menu when at the position of the item clicked"""
|
||||
|
||||
globalpos = self.viewport().mapToGlobal(pos)
|
||||
|
||||
if not self.selectionModel().hasSelection():
|
||||
print("No selection")
|
||||
# Build menu without selection, feed an empty list
|
||||
menu = self._build_item_menu()
|
||||
menu.exec_(globalpos)
|
||||
return
|
||||
|
||||
active = self.currentIndex() # index under mouse
|
||||
active = active.sibling(active.row(), 0) # get first column
|
||||
|
||||
# move index under mouse
|
||||
indices = self.get_indices()
|
||||
if active in indices:
|
||||
indices.remove(active)
|
||||
|
||||
indices.append(active)
|
||||
|
||||
# Extend to the sub-items
|
||||
all_indices = self._extend_to_children(indices)
|
||||
items = [dict(i.data(InventoryModel.ItemRole)) for i in all_indices
|
||||
if i.parent().isValid()]
|
||||
|
||||
if self._hierarchy_view:
|
||||
# Ensure no group item
|
||||
items = [n for n in items if not n.get("isGroupNode")]
|
||||
|
||||
menu = self._build_item_menu(items)
|
||||
menu.exec_(globalpos)
|
||||
|
||||
def get_indices(self):
|
||||
"""Get the selected rows"""
|
||||
selection_model = self.selectionModel()
|
||||
return selection_model.selectedRows()
|
||||
|
||||
def _extend_to_children(self, indices):
|
||||
"""Extend the indices to the children indices.
|
||||
|
||||
Top-level indices are extended to its children indices. Sub-items
|
||||
are kept as is.
|
||||
|
||||
Args:
|
||||
indices (list): The indices to extend.
|
||||
|
||||
Returns:
|
||||
list: The children indices
|
||||
|
||||
"""
|
||||
def get_children(i):
|
||||
model = i.model()
|
||||
rows = model.rowCount(parent=i)
|
||||
for row in range(rows):
|
||||
child = model.index(row, 0, parent=i)
|
||||
yield child
|
||||
|
||||
subitems = set()
|
||||
for i in indices:
|
||||
valid_parent = i.parent().isValid()
|
||||
if valid_parent and i not in subitems:
|
||||
subitems.add(i)
|
||||
|
||||
if self._hierarchy_view:
|
||||
# Assume this is a group item
|
||||
for child in get_children(i):
|
||||
subitems.add(child)
|
||||
else:
|
||||
# is top level item
|
||||
for child in get_children(i):
|
||||
subitems.add(child)
|
||||
|
||||
return list(subitems)
|
||||
|
||||
def _show_version_dialog(self, items):
|
||||
"""Create a dialog with the available versions for the selected file
|
||||
|
||||
Args:
|
||||
items (list): list of items to run the "set_version" for
|
||||
|
||||
Returns:
|
||||
None
|
||||
"""
|
||||
|
||||
active = items[-1]
|
||||
|
||||
# Get available versions for active representation
|
||||
representation_id = io.ObjectId(active["representation"])
|
||||
representation = io.find_one({"_id": representation_id})
|
||||
version = io.find_one({
|
||||
"_id": representation["parent"]
|
||||
})
|
||||
|
||||
versions = list(io.find(
|
||||
{
|
||||
"parent": version["parent"],
|
||||
"type": "version"
|
||||
},
|
||||
sort=[("name", 1)]
|
||||
))
|
||||
|
||||
hero_version = io.find_one({
|
||||
"parent": version["parent"],
|
||||
"type": "hero_version"
|
||||
})
|
||||
if hero_version:
|
||||
_version_id = hero_version["version_id"]
|
||||
for _version in versions:
|
||||
if _version["_id"] != _version_id:
|
||||
continue
|
||||
|
||||
hero_version["name"] = HeroVersionType(
|
||||
_version["name"]
|
||||
)
|
||||
hero_version["data"] = _version["data"]
|
||||
break
|
||||
|
||||
# Get index among the listed versions
|
||||
current_item = None
|
||||
current_version = active["version"]
|
||||
if isinstance(current_version, HeroVersionType):
|
||||
current_item = hero_version
|
||||
else:
|
||||
for version in versions:
|
||||
if version["name"] == current_version:
|
||||
current_item = version
|
||||
break
|
||||
|
||||
all_versions = []
|
||||
if hero_version:
|
||||
all_versions.append(hero_version)
|
||||
all_versions.extend(reversed(versions))
|
||||
|
||||
if current_item:
|
||||
index = all_versions.index(current_item)
|
||||
else:
|
||||
index = 0
|
||||
|
||||
versions_by_label = dict()
|
||||
labels = []
|
||||
for version in all_versions:
|
||||
is_hero = version["type"] == "hero_version"
|
||||
label = tools_lib.format_version(version["name"], is_hero)
|
||||
labels.append(label)
|
||||
versions_by_label[label] = version["name"]
|
||||
|
||||
label, state = QtWidgets.QInputDialog.getItem(
|
||||
self,
|
||||
"Set version..",
|
||||
"Set version number to",
|
||||
labels,
|
||||
current=index,
|
||||
editable=False
|
||||
)
|
||||
if not state:
|
||||
return
|
||||
|
||||
if label:
|
||||
version = versions_by_label[label]
|
||||
for item in items:
|
||||
try:
|
||||
api.update(item, version)
|
||||
except AssertionError:
|
||||
self._show_version_error_dialog(version, [item])
|
||||
log.warning("Update failed", exc_info=True)
|
||||
# refresh model when done
|
||||
self.data_changed.emit()
|
||||
|
||||
def _show_switch_dialog(self, items):
|
||||
"""Display Switch dialog"""
|
||||
dialog = SwitchAssetDialog(self, items)
|
||||
dialog.switched.connect(self.data_changed.emit)
|
||||
dialog.show()
|
||||
|
||||
def _show_remove_warning_dialog(self, items):
|
||||
"""Prompt a dialog to inform the user the action will remove items"""
|
||||
|
||||
accept = QtWidgets.QMessageBox.Ok
|
||||
buttons = accept | QtWidgets.QMessageBox.Cancel
|
||||
|
||||
state = QtWidgets.QMessageBox.question(
|
||||
self,
|
||||
"Are you sure?",
|
||||
"Are you sure you want to remove {} item(s)".format(len(items)),
|
||||
buttons=buttons,
|
||||
defaultButton=accept
|
||||
)
|
||||
|
||||
if state != accept:
|
||||
return
|
||||
|
||||
for item in items:
|
||||
api.remove(item)
|
||||
self.data_changed.emit()
|
||||
|
||||
def _show_version_error_dialog(self, version, items):
|
||||
"""Shows QMessageBox when version switch doesn't work
|
||||
|
||||
Args:
|
||||
version: str or int or None
|
||||
"""
|
||||
if not version:
|
||||
version_str = "latest"
|
||||
elif version == "hero":
|
||||
version_str = "hero"
|
||||
elif isinstance(version, int):
|
||||
version_str = "v{:03d}".format(version)
|
||||
else:
|
||||
version_str = version
|
||||
|
||||
dialog = QtWidgets.QMessageBox()
|
||||
dialog.setIcon(QtWidgets.QMessageBox.Warning)
|
||||
dialog.setStyleSheet(style.load_stylesheet())
|
||||
dialog.setWindowTitle("Update failed")
|
||||
|
||||
switch_btn = dialog.addButton(
|
||||
"Switch Asset",
|
||||
QtWidgets.QMessageBox.ActionRole
|
||||
)
|
||||
switch_btn.clicked.connect(lambda: self._show_switch_dialog(items))
|
||||
|
||||
dialog.addButton(QtWidgets.QMessageBox.Cancel)
|
||||
|
||||
msg = (
|
||||
"Version update to '{}' failed as representation doesn't exist."
|
||||
"\n\nPlease update to version with a valid representation"
|
||||
" OR \n use 'Switch Asset' button to change asset."
|
||||
).format(version_str)
|
||||
dialog.setText(msg)
|
||||
dialog.exec_()
openpype/tools/sceneinventory/widgets.py (new file, 51 lines)
@@ -0,0 +1,51 @@
from Qt import QtWidgets, QtCore


class SearchComboBox(QtWidgets.QComboBox):
    """Searchable ComboBox with empty placeholder value as first value"""

    def __init__(self, parent=None):
        super(SearchComboBox, self).__init__(parent)

        self.setEditable(True)
        self.setInsertPolicy(self.NoInsert)

        # Apply completer settings
        completer = self.completer()
        completer.setCompletionMode(completer.PopupCompletion)
        completer.setCaseSensitivity(QtCore.Qt.CaseInsensitive)

        # Force style sheet on popup menu
        # It won't take the parent stylesheet for some reason
        # todo: better fix for completer popup stylesheet
        # if module.window:
        #     popup = completer.popup()
        #     popup.setStyleSheet(module.window.styleSheet())

    def set_placeholder(self, placeholder):
        self.lineEdit().setPlaceholderText(placeholder)

    def populate(self, items):
        self.clear()
        self.addItems([""])  # ensure first item is placeholder
        self.addItems(items)

    def get_valid_value(self):
        """Return the current text if it's a valid value else None

        Note: The empty placeholder value is valid and returns as ""

        """
        text = self.currentText()
        lookup = set(self.itemText(i) for i in range(self.count()))
        if text not in lookup:
            return None

        return text or None

    def set_valid_value(self, value):
        """Try to locate 'value' and pre-select it in dropdown."""
        index = self.findText(value)
        if index > -1:
            self.setCurrentIndex(index)
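A small usage sketch of the widget above; it assumes a Qt binding is available, a `QApplication` can be created, and the asset names are purely illustrative:

```
# Hypothetical standalone usage of SearchComboBox; asset names are made up
# and a QApplication must exist before any widget is created.
from Qt import QtWidgets

app = QtWidgets.QApplication.instance() or QtWidgets.QApplication([])

combo = SearchComboBox()
combo.set_placeholder("Select asset..")
combo.populate(["characterA", "characterB", "propTable"])
combo.set_valid_value("characterB")

# Returns "characterB" while the text matches a listed item, otherwise None
print(combo.get_valid_value())
```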
openpype/tools/sceneinventory/window.py (new file, 203 lines)
@@ -0,0 +1,203 @@
import os
|
||||
import sys
|
||||
|
||||
from Qt import QtWidgets, QtCore
|
||||
from avalon.vendor import qtawesome
|
||||
from avalon import io, api
|
||||
|
||||
from openpype import style
|
||||
from openpype.tools.utils.delegates import VersionDelegate
|
||||
from openpype.tools.utils.lib import (
|
||||
qt_app_context,
|
||||
preserve_expanded_rows,
|
||||
preserve_selection,
|
||||
FamilyConfigCache
|
||||
)
|
||||
|
||||
from .model import (
|
||||
InventoryModel,
|
||||
FilterProxyModel
|
||||
)
|
||||
from .view import SceneInvetoryView
|
||||
|
||||
|
||||
module = sys.modules[__name__]
|
||||
module.window = None
|
||||
|
||||
|
||||
class SceneInventoryWindow(QtWidgets.QDialog):
|
||||
"""Scene Inventory window"""
|
||||
|
||||
def __init__(self, parent=None):
|
||||
super(SceneInventoryWindow, self).__init__(parent)
|
||||
|
||||
if not parent:
|
||||
self.setWindowFlags(
|
||||
self.windowFlags() | QtCore.Qt.WindowStaysOnTopHint
|
||||
)
|
||||
|
||||
project_name = os.getenv("AVALON_PROJECT") or "<Project not set>"
|
||||
self.setWindowTitle("Scene Inventory 1.0 - {}".format(project_name))
|
||||
self.setObjectName("SceneInventory")
|
||||
# Maya only property
|
||||
self.setProperty("saveWindowPref", True)
|
||||
|
||||
self.resize(1100, 480)
|
||||
|
||||
# region control
|
||||
filter_label = QtWidgets.QLabel("Search", self)
|
||||
text_filter = QtWidgets.QLineEdit(self)
|
||||
|
||||
outdated_only_checkbox = QtWidgets.QCheckBox(
|
||||
"Filter to outdated", self
|
||||
)
|
||||
outdated_only_checkbox.setToolTip("Show outdated files only")
|
||||
outdated_only_checkbox.setChecked(False)
|
||||
|
||||
icon = qtawesome.icon("fa.refresh", color="white")
|
||||
refresh_button = QtWidgets.QPushButton(self)
|
||||
refresh_button.setIcon(icon)
|
||||
|
||||
control_layout = QtWidgets.QHBoxLayout()
|
||||
control_layout.addWidget(filter_label)
|
||||
control_layout.addWidget(text_filter)
|
||||
control_layout.addWidget(outdated_only_checkbox)
|
||||
control_layout.addWidget(refresh_button)
|
||||
|
||||
# endregion control
|
||||
family_config_cache = FamilyConfigCache(io)
|
||||
|
||||
model = InventoryModel(family_config_cache)
|
||||
proxy = FilterProxyModel()
|
||||
proxy.setSourceModel(model)
|
||||
proxy.setDynamicSortFilter(True)
|
||||
proxy.setFilterCaseSensitivity(QtCore.Qt.CaseInsensitive)
|
||||
|
||||
view = SceneInvetoryView(self)
|
||||
view.setModel(proxy)
|
||||
|
||||
# set some nice default widths for the view
|
||||
view.setColumnWidth(0, 250) # name
|
||||
view.setColumnWidth(1, 55) # version
|
||||
view.setColumnWidth(2, 55) # count
|
||||
view.setColumnWidth(3, 150) # family
|
||||
view.setColumnWidth(4, 100) # namespace
|
||||
|
||||
# apply delegates
|
||||
version_delegate = VersionDelegate(io, self)
|
||||
column = model.Columns.index("version")
|
||||
view.setItemDelegateForColumn(column, version_delegate)
|
||||
|
||||
layout = QtWidgets.QVBoxLayout(self)
|
||||
layout.addLayout(control_layout)
|
||||
layout.addWidget(view)
|
||||
|
||||
# signals
|
||||
text_filter.textChanged.connect(self._on_text_filter_change)
|
||||
outdated_only_checkbox.stateChanged.connect(
|
||||
self._on_outdated_state_change
|
||||
)
|
||||
view.hierarchy_view_changed.connect(
|
||||
self._on_hiearchy_view_change
|
||||
)
|
||||
view.data_changed.connect(self.refresh)
|
||||
refresh_button.clicked.connect(self.refresh)
|
||||
|
||||
self._outdated_only_checkbox = outdated_only_checkbox
|
||||
self._view = view
|
||||
self._model = model
|
||||
self._proxy = proxy
|
||||
self._version_delegate = version_delegate
|
||||
self._family_config_cache = family_config_cache
|
||||
|
||||
self._first_show = True
|
||||
|
||||
family_config_cache.refresh()
|
||||
|
||||
def showEvent(self, event):
|
||||
super(SceneInventoryWindow, self).showEvent(event)
|
||||
if self._first_show:
|
||||
self._first_show = False
|
||||
self.setStyleSheet(style.load_stylesheet())
|
||||
|
||||
def keyPressEvent(self, event):
|
||||
"""Custom keyPressEvent.
|
||||
|
||||
Override keyPressEvent to do nothing so that Maya's panels won't
|
||||
take focus when pressing "SHIFT" whilst mouse is over viewport or
|
||||
outliner. This way users don't accidentally perform Maya commands
|
||||
whilst trying to name an instance.
|
||||
|
||||
"""
|
||||
|
||||
def refresh(self, items=None):
|
||||
with preserve_expanded_rows(
|
||||
tree_view=self._view,
|
||||
role=self._model.UniqueRole
|
||||
):
|
||||
with preserve_selection(
|
||||
tree_view=self._view,
|
||||
role=self._model.UniqueRole,
|
||||
current_index=False
|
||||
):
|
||||
kwargs = {"items": items}
|
||||
# TODO do not touch view's inner attribute
|
||||
if self._view._hierarchy_view:
|
||||
kwargs["selected"] = self._view._selected
|
||||
self._model.refresh(**kwargs)
|
||||
|
||||
def _on_hiearchy_view_change(self, enabled):
|
||||
self._proxy.set_hierarchy_view(enabled)
|
||||
self._model.set_hierarchy_view(enabled)
|
||||
|
||||
def _on_text_filter_change(self, text_filter):
|
||||
self._proxy.setFilterRegExp(text_filter)
|
||||
|
||||
def _on_outdated_state_change(self):
|
||||
self._proxy.set_filter_outdated(
|
||||
self._outdated_only_checkbox.isChecked()
|
||||
)
|
||||
|
||||
|
||||
def show(root=None, debug=False, parent=None, items=None):
|
||||
"""Display Scene Inventory GUI
|
||||
|
||||
Arguments:
|
||||
debug (bool, optional): Run in debug-mode,
|
||||
defaults to False
|
||||
parent (QtCore.QObject, optional): When provided parent the interface
|
||||
to this QObject.
|
||||
items (list) of dictionaries - for injection of items for standalone
|
||||
testing
|
||||
|
||||
"""
|
||||
|
||||
try:
|
||||
module.window.close()
|
||||
del module.window
|
||||
except (RuntimeError, AttributeError):
|
||||
pass
|
||||
|
||||
if debug is True:
|
||||
io.install()
|
||||
|
||||
if not os.environ.get("AVALON_PROJECT"):
|
||||
any_project = next(
|
||||
project for project in io.projects()
|
||||
if project.get("active", True) is not False
|
||||
)
|
||||
|
||||
api.Session["AVALON_PROJECT"] = any_project["name"]
|
||||
else:
|
||||
api.Session["AVALON_PROJECT"] = os.environ.get("AVALON_PROJECT")
|
||||
|
||||
with qt_app_context():
|
||||
window = SceneInventoryWindow(parent)
|
||||
window.show()
|
||||
window.refresh(items=items)
|
||||
|
||||
module.window = window
|
||||
|
||||
# Pull window to the front.
|
||||
module.window.raise_()
|
||||
module.window.activateWindow()
openpype/tools/subsetmanager/README.md (new file, 19 lines)
@@ -0,0 +1,19 @@
Subset manager
--------------

Simple UI showing the list of created subsets that will be published via Pyblish.
Useful for applications (Photoshop, AfterEffects, TVPaint, Harmony) which store
instance metadata hidden from the user.

This UI lists all created subsets and allows removing them if needed (for
example when the user no longer wants to publish them, is using the workfile as
a starting point for a different task, or the instances should be completely
different).

The host is expected to implement:
- `list_instances` - returns a list of dictionaries (instances); each must
  contain a unique `uuid` field, for example:
  ```[{"uuid":"15","active":true,"subset":"imageBG","family":"image","id":"pyblish.avalon.instance","asset":"Town"}]```
- `remove_instance(instance)` - removes the instance from the file's metadata;
  `instance` is a dictionary with a `uuid` field
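A minimal, hypothetical sketch of a host-side implementation of this interface; `_read_metadata` and `_write_metadata` are stand-ins for whatever storage mechanism the host application actually uses (hidden layer, document attribute, sidecar file):

```
# Hypothetical host-side sketch of the interface described above.
import uuid


_FAKE_STORE = []  # stand-in for the host document's hidden metadata


def _read_metadata():
    return list(_FAKE_STORE)


def _write_metadata(instances):
    _FAKE_STORE[:] = instances


def list_instances():
    """Return all stored instances, each with a unique "uuid" field."""
    instances = _read_metadata()
    for instance in instances:
        # Make sure every instance can be addressed by the Subset Manager
        instance.setdefault("uuid", str(uuid.uuid4()))
    return instances


def remove_instance(instance):
    """Remove a single instance (matched by "uuid") from the metadata."""
    remaining = [
        existing
        for existing in _read_metadata()
        if existing.get("uuid") != instance.get("uuid")
    ]
    _write_metadata(remaining)
```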
openpype/tools/subsetmanager/__init__.py (new file, 9 lines)
@@ -0,0 +1,9 @@
from .window import (
    show,
    SubsetManagerWindow
)

__all__ = (
    "show",
    "SubsetManagerWindow"
)
openpype/tools/subsetmanager/model.py (new file, 52 lines)
@@ -0,0 +1,52 @@
import uuid
|
||||
|
||||
from Qt import QtCore, QtGui
|
||||
|
||||
from avalon import api
|
||||
|
||||
ITEM_ID_ROLE = QtCore.Qt.UserRole + 1
|
||||
|
||||
|
||||
class InstanceModel(QtGui.QStandardItemModel):
|
||||
def __init__(self, *args, **kwargs):
|
||||
super(InstanceModel, self).__init__(*args, **kwargs)
|
||||
self._instances_by_item_id = {}
|
||||
|
||||
def get_instance_by_id(self, item_id):
|
||||
return self._instances_by_item_id.get(item_id)
|
||||
|
||||
def refresh(self):
|
||||
self.clear()
|
||||
|
||||
self._instances_by_item_id = {}
|
||||
|
||||
instances = None
|
||||
host = api.registered_host()
|
||||
list_instances = getattr(host, "list_instances", None)
|
||||
if list_instances:
|
||||
instances = list_instances()
|
||||
|
||||
if not instances:
|
||||
return
|
||||
|
||||
items = []
|
||||
for instance_data in instances:
|
||||
item_id = str(uuid.uuid4())
|
||||
label = instance_data.get("label") or instance_data["subset"]
|
||||
item = QtGui.QStandardItem(label)
|
||||
item.setEnabled(True)
|
||||
item.setEditable(False)
|
||||
item.setData(item_id, ITEM_ID_ROLE)
|
||||
items.append(item)
|
||||
self._instances_by_item_id[item_id] = instance_data
|
||||
|
||||
if items:
|
||||
self.invisibleRootItem().appendRows(items)
|
||||
|
||||
def headerData(self, section, orientation, role):
|
||||
if role == QtCore.Qt.DisplayRole and section == 0:
|
||||
return "Instance"
|
||||
|
||||
return super(InstanceModel, self).headerData(
|
||||
section, orientation, role
|
||||
)
openpype/tools/subsetmanager/widgets.py (new file, 110 lines)
@@ -0,0 +1,110 @@
import json
|
||||
from Qt import QtWidgets, QtCore
|
||||
|
||||
|
||||
class InstanceDetail(QtWidgets.QWidget):
|
||||
save_triggered = QtCore.Signal()
|
||||
|
||||
def __init__(self, parent=None):
|
||||
super(InstanceDetail, self).__init__(parent)
|
||||
|
||||
details_widget = QtWidgets.QPlainTextEdit(self)
|
||||
details_widget.setObjectName("SubsetManagerDetailsText")
|
||||
|
||||
save_btn = QtWidgets.QPushButton("Save", self)
|
||||
|
||||
self._block_changes = False
|
||||
self._editable = False
|
||||
self._item_id = None
|
||||
|
||||
layout = QtWidgets.QVBoxLayout(self)
|
||||
layout.setContentsMargins(0, 0, 0, 0)
|
||||
layout.addWidget(details_widget, 1)
|
||||
layout.addWidget(save_btn, 0, QtCore.Qt.AlignRight)
|
||||
|
||||
save_btn.clicked.connect(self._on_save_clicked)
|
||||
details_widget.textChanged.connect(self._on_text_change)
|
||||
|
||||
self._details_widget = details_widget
|
||||
self._save_btn = save_btn
|
||||
|
||||
self.set_editable(False)
|
||||
|
||||
def _on_save_clicked(self):
|
||||
if self.is_valid():
|
||||
self.save_triggered.emit()
|
||||
|
||||
def set_editable(self, enabled=True):
|
||||
self._editable = enabled
|
||||
self.update_state()
|
||||
|
||||
def update_state(self, valid=None):
|
||||
editable = self._editable
|
||||
if not self._item_id:
|
||||
editable = False
|
||||
|
||||
self._save_btn.setVisible(editable)
|
||||
self._details_widget.setReadOnly(not editable)
|
||||
if valid is None:
|
||||
valid = self.is_valid()
|
||||
|
||||
self._save_btn.setEnabled(valid)
|
||||
self._set_invalid_detail(valid)
|
||||
|
||||
def _set_invalid_detail(self, valid):
|
||||
state = ""
|
||||
if not valid:
|
||||
state = "invalid"
|
||||
|
||||
current_state = self._details_widget.property("state")
|
||||
if current_state != state:
|
||||
self._details_widget.setProperty("state", state)
|
||||
self._details_widget.style().polish(self._details_widget)
|
||||
|
||||
def set_details(self, container, item_id):
|
||||
self._item_id = item_id
|
||||
|
||||
text = "Nothing selected"
|
||||
if item_id:
|
||||
try:
|
||||
text = json.dumps(container, indent=4)
|
||||
except Exception:
|
||||
text = str(container)
|
||||
|
||||
self._block_changes = True
|
||||
self._details_widget.setPlainText(text)
|
||||
self._block_changes = False
|
||||
|
||||
self.update_state()
|
||||
|
||||
def instance_data_from_text(self):
|
||||
try:
|
||||
jsoned = json.loads(self._details_widget.toPlainText())
|
||||
except Exception:
|
||||
jsoned = None
|
||||
return jsoned
|
||||
|
||||
def item_id(self):
|
||||
return self._item_id
|
||||
|
||||
def is_valid(self):
|
||||
if not self._item_id:
|
||||
return True
|
||||
|
||||
value = self._details_widget.toPlainText()
|
||||
valid = False
|
||||
try:
|
||||
jsoned = json.loads(value)
|
||||
if jsoned and isinstance(jsoned, dict):
|
||||
valid = True
|
||||
|
||||
except Exception:
|
||||
pass
|
||||
return valid
|
||||
|
||||
def _on_text_change(self):
|
||||
if self._block_changes or not self._item_id:
|
||||
return
|
||||
|
||||
valid = self.is_valid()
|
||||
self.update_state(valid)
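The validation applied by `is_valid` above boils down to "the text must parse as a non-empty JSON object"; a standalone mirror of that check, for illustration only:

```
import json


def _is_valid_instance_text(text):
    """Mirror of InstanceDetail.is_valid: accept only a non-empty JSON object."""
    try:
        data = json.loads(text)
    except Exception:
        return False
    return bool(data) and isinstance(data, dict)


print(_is_valid_instance_text('{"subset": "imageBG"}'))  # True
print(_is_valid_instance_text('[]'))                     # False (not a dict)
print(_is_valid_instance_text('not json'))               # False (parse error)
```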
openpype/tools/subsetmanager/window.py (new file, 218 lines)
@@ -0,0 +1,218 @@
import os
|
||||
import sys
|
||||
|
||||
from Qt import QtWidgets, QtCore
|
||||
|
||||
from avalon import api
|
||||
from avalon.vendor import qtawesome
|
||||
|
||||
from openpype import style
|
||||
from openpype.tools.utils.lib import (
|
||||
iter_model_rows,
|
||||
qt_app_context
|
||||
)
|
||||
from openpype.tools.utils.models import RecursiveSortFilterProxyModel
|
||||
from .model import (
|
||||
InstanceModel,
|
||||
ITEM_ID_ROLE
|
||||
)
|
||||
from .widgets import InstanceDetail
|
||||
|
||||
|
||||
module = sys.modules[__name__]
|
||||
module.window = None
|
||||
|
||||
|
||||
class SubsetManagerWindow(QtWidgets.QDialog):
|
||||
def __init__(self, parent=None):
|
||||
super(SubsetManagerWindow, self).__init__(parent=parent)
|
||||
self.setWindowTitle("Subset Manager 0.1")
|
||||
self.setObjectName("SubsetManager")
|
||||
if not parent:
|
||||
self.setWindowFlags(
|
||||
self.windowFlags() | QtCore.Qt.WindowStaysOnTopHint
|
||||
)
|
||||
|
||||
self.resize(780, 430)
|
||||
|
||||
# Trigger refresh on first called show
|
||||
self._first_show = True
|
||||
|
||||
left_side_widget = QtWidgets.QWidget(self)
|
||||
|
||||
# Header part
|
||||
header_widget = QtWidgets.QWidget(left_side_widget)
|
||||
|
||||
# Filter input
|
||||
filter_input = QtWidgets.QLineEdit(header_widget)
|
||||
filter_input.setPlaceholderText("Filter subsets..")
|
||||
|
||||
# Refresh button
|
||||
icon = qtawesome.icon("fa.refresh", color="white")
|
||||
refresh_btn = QtWidgets.QPushButton(header_widget)
|
||||
refresh_btn.setIcon(icon)
|
||||
|
||||
header_layout = QtWidgets.QHBoxLayout(header_widget)
|
||||
header_layout.setContentsMargins(0, 0, 0, 0)
|
||||
header_layout.addWidget(filter_input)
|
||||
header_layout.addWidget(refresh_btn)
|
||||
|
||||
# Instances view
|
||||
view = QtWidgets.QTreeView(left_side_widget)
|
||||
view.setIndentation(0)
|
||||
view.setContextMenuPolicy(QtCore.Qt.CustomContextMenu)
|
||||
|
||||
model = InstanceModel(view)
|
||||
proxy = RecursiveSortFilterProxyModel()
|
||||
proxy.setSourceModel(model)
|
||||
proxy.setFilterCaseSensitivity(QtCore.Qt.CaseInsensitive)
|
||||
|
||||
view.setModel(proxy)
|
||||
|
||||
left_side_layout = QtWidgets.QVBoxLayout(left_side_widget)
|
||||
left_side_layout.setContentsMargins(0, 0, 0, 0)
|
||||
left_side_layout.addWidget(header_widget)
|
||||
left_side_layout.addWidget(view)
|
||||
|
||||
details_widget = InstanceDetail(self)
|
||||
|
||||
layout = QtWidgets.QHBoxLayout(self)
|
||||
layout.addWidget(left_side_widget, 0)
|
||||
layout.addWidget(details_widget, 1)
|
||||
|
||||
filter_input.textChanged.connect(proxy.setFilterFixedString)
|
||||
refresh_btn.clicked.connect(self._on_refresh_clicked)
|
||||
view.clicked.connect(self._on_activated)
|
||||
view.customContextMenuRequested.connect(self.on_context_menu)
|
||||
details_widget.save_triggered.connect(self._on_save)
|
||||
|
||||
self._model = model
|
||||
self._proxy = proxy
|
||||
self._view = view
|
||||
self._details_widget = details_widget
|
||||
self._refresh_btn = refresh_btn
|
||||
|
||||
def _on_refresh_clicked(self):
|
||||
self.refresh()
|
||||
|
||||
def _on_activated(self, index):
|
||||
container = None
|
||||
item_id = None
|
||||
if index.isValid():
|
||||
item_id = index.data(ITEM_ID_ROLE)
|
||||
container = self._model.get_instance_by_id(item_id)
|
||||
|
||||
self._details_widget.set_details(container, item_id)
|
||||
|
||||
def _on_save(self):
|
||||
host = api.registered_host()
|
||||
if not hasattr(host, "save_instances"):
|
||||
print("BUG: Host does not have \"save_instances\" method")
|
||||
return
|
||||
|
||||
current_index = self._view.selectionModel().currentIndex()
|
||||
if not current_index.isValid():
|
||||
return
|
||||
|
||||
item_id = current_index.data(ITEM_ID_ROLE)
|
||||
if item_id != self._details_widget.item_id():
|
||||
return
|
||||
|
||||
item_data = self._details_widget.instance_data_from_text()
|
||||
new_instances = []
|
||||
for index in iter_model_rows(self._model, 0):
|
||||
_item_id = index.data(ITEM_ID_ROLE)
|
||||
if _item_id == item_id:
|
||||
instance_data = item_data
|
||||
else:
|
||||
instance_data = self._model.get_instance_by_id(_item_id)
|
||||
new_instances.append(instance_data)
|
||||
|
||||
host.save_instances(new_instances)
|
||||
|
||||
def on_context_menu(self, point):
|
||||
point_index = self._view.indexAt(point)
|
||||
item_id = point_index.data(ITEM_ID_ROLE)
|
||||
instance_data = self._model.get_instance_by_id(item_id)
|
||||
if instance_data is None:
|
||||
return
|
||||
|
||||
# Prepare menu
|
||||
menu = QtWidgets.QMenu(self)
|
||||
actions = []
|
||||
host = api.registered_host()
|
||||
if hasattr(host, "remove_instance"):
|
||||
action = QtWidgets.QAction("Remove instance", menu)
|
||||
action.setData(host.remove_instance)
|
||||
actions.append(action)
|
||||
|
||||
if hasattr(host, "select_instance"):
|
||||
action = QtWidgets.QAction("Select instance", menu)
|
||||
action.setData(host.select_instance)
|
||||
actions.append(action)
|
||||
|
||||
if not actions:
|
||||
actions.append(QtWidgets.QAction("* Nothing to do", menu))
|
||||
|
||||
for action in actions:
|
||||
menu.addAction(action)
|
||||
|
||||
# Show menu under mouse
|
||||
global_point = self._view.mapToGlobal(point)
|
||||
action = menu.exec_(global_point)
|
||||
if not action or not action.data():
|
||||
return
|
||||
|
||||
# Process action
|
||||
# TODO catch exceptions
|
||||
function = action.data()
|
||||
function(instance_data)
|
||||
|
||||
# Reset modified data
|
||||
self.refresh()
|
||||
|
||||
def refresh(self):
|
||||
self._details_widget.set_details(None, None)
|
||||
self._model.refresh()
|
||||
|
||||
host = api.registered_host()
|
||||
dev_mode = os.environ.get("AVALON_DEVELOP_MODE") or ""
|
||||
editable = False
|
||||
if dev_mode.lower() in ("1", "yes", "true", "on"):
|
||||
editable = hasattr(host, "save_instances")
|
||||
self._details_widget.set_editable(editable)
|
||||
|
||||
def showEvent(self, *args, **kwargs):
|
||||
super(SubsetManagerWindow, self).showEvent(*args, **kwargs)
|
||||
if self._first_show:
|
||||
self._first_show = False
|
||||
self.setStyleSheet(style.load_stylesheet())
|
||||
self.refresh()
|
||||
|
||||
|
||||
def show(root=None, debug=False, parent=None):
|
||||
"""Display Scene Inventory GUI
|
||||
|
||||
Arguments:
|
||||
debug (bool, optional): Run in debug-mode,
|
||||
defaults to False
|
||||
parent (QtCore.QObject, optional): When provided parent the interface
|
||||
to this QObject.
|
||||
|
||||
"""
|
||||
|
||||
try:
|
||||
module.window.close()
|
||||
del module.window
|
||||
except (RuntimeError, AttributeError):
|
||||
pass
|
||||
|
||||
with qt_app_context():
|
||||
window = SubsetManagerWindow(parent)
|
||||
window.show()
|
||||
|
||||
module.window = window
|
||||
|
||||
# Pull window to the front.
|
||||
module.window.raise_()
|
||||
module.window.activateWindow()
@@ -7,4 +7,23 @@ PROJECT_IS_ACTIVE_ROLE = QtCore.Qt.UserRole + 102
|
||||
TASK_NAME_ROLE = QtCore.Qt.UserRole + 301
|
||||
TASK_TYPE_ROLE = QtCore.Qt.UserRole + 302
|
||||
TASK_ORDER_ROLE = QtCore.Qt.UserRole + 403
|
||||
TASK_ORDER_ROLE = QtCore.Qt.UserRole + 303
|
||||
|
||||
LOCAL_PROVIDER_ROLE = QtCore.Qt.UserRole + 500 # provider of active site
|
||||
REMOTE_PROVIDER_ROLE = QtCore.Qt.UserRole + 501 # provider of remote site
|
||||
LOCAL_PROGRESS_ROLE = QtCore.Qt.UserRole + 502 # percentage downld on active
|
||||
REMOTE_PROGRESS_ROLE = QtCore.Qt.UserRole + 503 # percentage upload on remote
|
||||
LOCAL_AVAILABILITY_ROLE = QtCore.Qt.UserRole + 504 # ratio of presence active
|
||||
REMOTE_AVAILABILITY_ROLE = QtCore.Qt.UserRole + 505
|
||||
LOCAL_DATE_ROLE = QtCore.Qt.UserRole + 506 # created_dt on active site
|
||||
REMOTE_DATE_ROLE = QtCore.Qt.UserRole + 507
|
||||
LOCAL_FAILED_ROLE = QtCore.Qt.UserRole + 508
|
||||
REMOTE_FAILED_ROLE = QtCore.Qt.UserRole + 509
|
||||
HEADER_NAME_ROLE = QtCore.Qt.UserRole + 510
|
||||
EDIT_ICON_ROLE = QtCore.Qt.UserRole + 511
|
||||
STATUS_ROLE = QtCore.Qt.UserRole + 512
|
||||
PATH_ROLE = QtCore.Qt.UserRole + 513
|
||||
LOCAL_SITE_NAME_ROLE = QtCore.Qt.UserRole + 514
|
||||
REMOTE_SITE_NAME_ROLE = QtCore.Qt.UserRole + 515
|
||||
ERROR_ROLE = QtCore.Qt.UserRole + 516
|
||||
TRIES_ROLE = QtCore.Qt.UserRole + 517
@@ -133,22 +133,20 @@ class HostToolsHelper:
def get_subset_manager_tool(self, parent):
|
||||
"""Create, cache and return subset manager tool window."""
|
||||
if self._subset_manager_tool is None:
|
||||
from avalon.tools.subsetmanager import Window
|
||||
from openpype.tools.subsetmanager import SubsetManagerWindow
|
||||
|
||||
subset_manager_window = Window(parent=parent or self._parent)
|
||||
subset_manager_window = SubsetManagerWindow(
|
||||
parent=parent or self._parent
|
||||
)
|
||||
self._subset_manager_tool = subset_manager_window
|
||||
|
||||
return self._subset_manager_tool
|
||||
|
||||
def show_subset_manager(self, parent=None):
|
||||
"""Show tool display/remove existing created instances."""
|
||||
from avalon import style
|
||||
|
||||
subset_manager_tool = self.get_subset_manager_tool(parent)
|
||||
subset_manager_tool.show()
|
||||
|
||||
subset_manager_tool.setStyleSheet(style.load_stylesheet())
|
||||
|
||||
# Pull window to the front.
|
||||
subset_manager_tool.raise_()
|
||||
subset_manager_tool.activateWindow()
@@ -156,21 +154,20 @@ class HostToolsHelper:
def get_scene_inventory_tool(self, parent):
|
||||
"""Create, cache and return scene inventory tool window."""
|
||||
if self._scene_inventory_tool is None:
|
||||
from avalon.tools.sceneinventory.app import Window
|
||||
from openpype.tools.sceneinventory import SceneInventoryWindow
|
||||
|
||||
scene_inventory_window = Window(parent=parent or self._parent)
|
||||
scene_inventory_window = SceneInventoryWindow(
|
||||
parent=parent or self._parent
|
||||
)
|
||||
self._scene_inventory_tool = scene_inventory_window
|
||||
|
||||
return self._scene_inventory_tool
|
||||
|
||||
def show_scene_inventory(self, parent=None):
|
||||
"""Show tool maintain loaded containers."""
|
||||
from avalon import style
|
||||
|
||||
scene_inventory_tool = self.get_scene_inventory_tool(parent)
|
||||
scene_inventory_tool.show()
|
||||
scene_inventory_tool.refresh()
|
||||
scene_inventory_tool.setStyleSheet(style.load_stylesheet())
|
||||
|
||||
# Pull window to the front.
|
||||
scene_inventory_tool.raise_()
openpype/tools/utils/tasks_widget.py (new file, 299 lines)
@@ -0,0 +1,299 @@
from Qt import QtWidgets, QtCore, QtGui
|
||||
|
||||
from avalon import style
|
||||
from avalon.vendor import qtawesome
|
||||
|
||||
from .views import DeselectableTreeView
|
||||
from .constants import (
|
||||
TASK_ORDER_ROLE,
|
||||
TASK_TYPE_ROLE,
|
||||
TASK_NAME_ROLE
|
||||
)
|
||||
|
||||
|
||||
class TasksModel(QtGui.QStandardItemModel):
|
||||
"""A model listing the tasks combined for a list of assets"""
|
||||
def __init__(self, dbcon, parent=None):
|
||||
super(TasksModel, self).__init__(parent=parent)
|
||||
self.dbcon = dbcon
|
||||
self.setHeaderData(
|
||||
0, QtCore.Qt.Horizontal, "Tasks", QtCore.Qt.DisplayRole
|
||||
)
|
||||
self._default_icon = qtawesome.icon(
|
||||
"fa.male",
|
||||
color=style.colors.default
|
||||
)
|
||||
self._no_tasks_icon = qtawesome.icon(
|
||||
"fa.exclamation-circle",
|
||||
color=style.colors.mid
|
||||
)
|
||||
self._cached_icons = {}
|
||||
self._project_task_types = {}
|
||||
|
||||
self._empty_tasks_item = None
|
||||
self._last_asset_id = None
|
||||
self._loaded_project_name = None
|
||||
|
||||
def _context_is_valid(self):
|
||||
if self.dbcon.Session.get("AVALON_PROJECT"):
|
||||
return True
|
||||
return False
|
||||
|
||||
def refresh(self):
|
||||
self._refresh_task_types()
|
||||
self.set_asset_id(self._last_asset_id)
|
||||
|
||||
def _refresh_task_types(self):
|
||||
# Get the project configured icons from database
|
||||
task_types = {}
|
||||
if self._context_is_valid():
|
||||
project = self.dbcon.find_one(
|
||||
{"type": "project"},
|
||||
{"config.tasks"}
|
||||
)
|
||||
task_types = project["config"].get("tasks") or task_types
|
||||
self._project_task_types = task_types
|
||||
|
||||
def _try_get_awesome_icon(self, icon_name):
|
||||
icon = None
|
||||
if icon_name:
|
||||
try:
|
||||
icon = qtawesome.icon(
|
||||
"fa.{}".format(icon_name),
|
||||
color=style.colors.default
|
||||
)
|
||||
|
||||
except Exception:
|
||||
pass
|
||||
return icon
|
||||
|
||||
def headerData(self, section, orientation, role=None):
|
||||
if role is None:
|
||||
role = QtCore.Qt.EditRole
|
||||
# Show nice labels in the header
|
||||
if section == 0:
|
||||
if (
|
||||
role in (QtCore.Qt.DisplayRole, QtCore.Qt.EditRole)
|
||||
and orientation == QtCore.Qt.Horizontal
|
||||
):
|
||||
return "Tasks"
|
||||
|
||||
return super(TasksModel, self).headerData(section, orientation, role)
|
||||
|
||||
def _get_icon(self, task_icon, task_type_icon):
|
||||
if task_icon in self._cached_icons:
|
||||
return self._cached_icons[task_icon]
|
||||
|
||||
icon = self._try_get_awesome_icon(task_icon)
|
||||
if icon is not None:
|
||||
self._cached_icons[task_icon] = icon
|
||||
return icon
|
||||
|
||||
if task_type_icon in self._cached_icons:
|
||||
icon = self._cached_icons[task_type_icon]
|
||||
self._cached_icons[task_icon] = icon
|
||||
return icon
|
||||
|
||||
icon = self._try_get_awesome_icon(task_type_icon)
|
||||
if icon is None:
|
||||
icon = self._default_icon
|
||||
|
||||
self._cached_icons[task_icon] = icon
|
||||
self._cached_icons[task_type_icon] = icon
|
||||
|
||||
return icon
|
||||
|
||||
def set_asset_id(self, asset_id):
|
||||
asset_doc = None
|
||||
if self._context_is_valid():
|
||||
asset_doc = self.dbcon.find_one(
|
||||
{"_id": asset_id},
|
||||
{"data.tasks": True}
|
||||
)
|
||||
self._set_asset(asset_doc)
|
||||
|
||||
def _get_empty_task_item(self):
|
||||
if self._empty_tasks_item is None:
|
||||
item = QtGui.QStandardItem("No task")
|
||||
item.setData(self._no_tasks_icon, QtCore.Qt.DecorationRole)
|
||||
item.setFlags(QtCore.Qt.NoItemFlags)
|
||||
self._empty_tasks_item = item
|
||||
return self._empty_tasks_item
|
||||
|
||||
def _set_asset(self, asset_doc):
|
||||
"""Set assets to track by their database id
|
||||
|
||||
Arguments:
|
||||
asset_doc (dict): Asset document from MongoDB.
|
||||
"""
|
||||
asset_tasks = {}
|
||||
self._last_asset_id = None
|
||||
if asset_doc:
|
||||
asset_tasks = asset_doc.get("data", {}).get("tasks") or {}
|
||||
self._last_asset_id = asset_doc["_id"]
|
||||
|
||||
root_item = self.invisibleRootItem()
|
||||
root_item.removeRows(0, root_item.rowCount())
|
||||
|
||||
items = []
|
||||
for task_name, task_info in asset_tasks.items():
|
||||
task_icon = task_info.get("icon")
|
||||
task_type = task_info.get("type")
|
||||
task_order = task_info.get("order")
|
||||
task_type_info = self._project_task_types.get(task_type) or {}
|
||||
task_type_icon = task_type_info.get("icon")
|
||||
icon = self._get_icon(task_icon, task_type_icon)
|
||||
|
||||
label = "{} ({})".format(task_name, task_type or "type N/A")
|
||||
item = QtGui.QStandardItem(label)
|
||||
item.setData(task_name, TASK_NAME_ROLE)
|
||||
item.setData(task_type, TASK_TYPE_ROLE)
|
||||
item.setData(task_order, TASK_ORDER_ROLE)
|
||||
item.setData(icon, QtCore.Qt.DecorationRole)
|
||||
item.setFlags(QtCore.Qt.ItemIsEnabled | QtCore.Qt.ItemIsSelectable)
|
||||
items.append(item)
|
||||
|
||||
if not items:
|
||||
item = QtGui.QStandardItem("No task")
|
||||
item.setData(self._no_tasks_icon, QtCore.Qt.DecorationRole)
|
||||
item.setFlags(QtCore.Qt.NoItemFlags)
|
||||
items.append(item)
|
||||
|
||||
root_item.appendRows(items)
|
||||
|
||||
|
||||
class TasksProxyModel(QtCore.QSortFilterProxyModel):
|
||||
def lessThan(self, x_index, y_index):
|
||||
x_order = x_index.data(TASK_ORDER_ROLE)
|
||||
y_order = y_index.data(TASK_ORDER_ROLE)
|
||||
if x_order is not None and y_order is not None:
|
||||
if x_order < y_order:
|
||||
return True
|
||||
if x_order > y_order:
|
||||
return False
|
||||
|
||||
elif x_order is None and y_order is not None:
|
||||
return True
|
||||
|
||||
elif y_order is None and x_order is not None:
|
||||
return False
|
||||
|
||||
x_name = x_index.data(QtCore.Qt.DisplayRole)
|
||||
y_name = y_index.data(QtCore.Qt.DisplayRole)
|
||||
if x_name == y_name:
|
||||
return True
|
||||
|
||||
if x_name == tuple(sorted((x_name, y_name)))[0]:
|
||||
return True
|
||||
return False
|
||||
|
||||
|
||||
class TasksWidget(QtWidgets.QWidget):
|
||||
"""Widget showing active Tasks"""
|
||||
|
||||
task_changed = QtCore.Signal()
|
||||
|
||||
def __init__(self, dbcon, parent=None):
|
||||
super(TasksWidget, self).__init__(parent)
|
||||
|
||||
tasks_view = DeselectableTreeView(self)
|
||||
tasks_view.setIndentation(0)
|
||||
tasks_view.setSortingEnabled(True)
|
||||
tasks_view.setEditTriggers(QtWidgets.QTreeView.NoEditTriggers)
|
||||
|
||||
header_view = tasks_view.header()
|
||||
header_view.setSortIndicator(0, QtCore.Qt.AscendingOrder)
|
||||
|
||||
tasks_model = TasksModel(dbcon)
|
||||
tasks_proxy = TasksProxyModel()
|
||||
tasks_proxy.setSourceModel(tasks_model)
|
||||
tasks_view.setModel(tasks_proxy)
|
||||
|
||||
layout = QtWidgets.QVBoxLayout(self)
|
||||
layout.setContentsMargins(0, 0, 0, 0)
|
||||
layout.addWidget(tasks_view)
|
||||
|
||||
selection_model = tasks_view.selectionModel()
|
||||
selection_model.selectionChanged.connect(self._on_task_change)
|
||||
|
||||
self._tasks_model = tasks_model
|
||||
self._tasks_proxy = tasks_proxy
|
||||
self._tasks_view = tasks_view
|
||||
|
||||
self._last_selected_task_name = None
|
||||
|
||||
def refresh(self):
|
||||
self._tasks_model.refresh()
|
||||
|
||||
def set_asset_id(self, asset_id):
|
||||
# Asset deselected
|
||||
if asset_id is None:
|
||||
return
|
||||
|
||||
# Try and preserve the last selected task and reselect it
|
||||
# after switching assets. If there's no currently selected
|
||||
# asset keep whatever the "last selected" was prior to it.
|
||||
current = self.get_selected_task_name()
|
||||
if current:
|
||||
self._last_selected_task_name = current
|
||||
|
||||
self._tasks_model.set_asset_id(asset_id)
|
||||
|
||||
if self._last_selected_task_name:
|
||||
self.select_task_name(self._last_selected_task_name)
|
||||
|
||||
# Force a task changed emit.
|
||||
self.task_changed.emit()
|
||||
|
||||
def select_task_name(self, task_name):
|
||||
"""Select a task by name.
|
||||
|
||||
If the task does not exist in the current model then selection is only
|
||||
cleared.
|
||||
|
||||
Args:
|
||||
task (str): Name of the task to select.
|
||||
|
||||
"""
|
||||
task_view_model = self._tasks_view.model()
|
||||
if not task_view_model:
|
||||
return
|
||||
|
||||
# Clear selection
|
||||
selection_model = self._tasks_view.selectionModel()
|
||||
selection_model.clearSelection()
|
||||
|
||||
# Select the task
|
||||
mode = selection_model.Select | selection_model.Rows
|
||||
for row in range(task_view_model.rowCount()):
|
||||
index = task_view_model.index(row, 0)
|
||||
name = index.data(TASK_NAME_ROLE)
|
||||
if name == task_name:
|
||||
selection_model.select(index, mode)
|
||||
|
||||
# Set the currently active index
|
||||
self._tasks_view.setCurrentIndex(index)
|
||||
break
|
||||
|
||||
def get_selected_task_name(self):
|
||||
"""Return name of task at current index (selected)
|
||||
|
||||
Returns:
|
||||
str: Name of the current task.
|
||||
|
||||
"""
|
||||
index = self._tasks_view.currentIndex()
|
||||
selection_model = self._tasks_view.selectionModel()
|
||||
if index.isValid() and selection_model.isSelected(index):
|
||||
return index.data(TASK_NAME_ROLE)
|
||||
return None
|
||||
|
||||
def get_selected_task_type(self):
|
||||
index = self._tasks_view.currentIndex()
|
||||
selection_model = self._tasks_view.selectionModel()
|
||||
if index.isValid() and selection_model.isSelected(index):
|
||||
return index.data(TASK_TYPE_ROLE)
|
||||
return None
|
||||
|
||||
def _on_task_change(self):
|
||||
self.task_changed.emit()
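The ordering implemented by `TasksProxyModel.lessThan` above can be reproduced with a plain sort key: tasks without an `order` value sort first, ordered tasks follow by their order, and the display name breaks ties. A small sketch with made-up task data:

```
# Plain-Python illustration of the ordering TasksProxyModel implements;
# the task data below is invented purely for demonstration.
tasks = [
    {"name": "lighting", "order": 30},
    {"name": "animation", "order": 10},
    {"name": "edit", "order": None},
    {"name": "compositing", "order": None},
]


def sort_key(task):
    has_order = task["order"] is not None
    # False sorts before True, so tasks without an order come first,
    # matching lessThan() returning True when only x_order is None.
    return (has_order, task["order"] if has_order else 0, task["name"])


for task in sorted(tasks, key=sort_key):
    print(task["name"])  # compositing, edit, animation, lighting
```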
@@ -15,16 +15,9 @@ from openpype.tools.utils.lib import (
schedule, qt_app_context
|
||||
)
|
||||
from openpype.tools.utils.widgets import AssetWidget
|
||||
from openpype.tools.utils.tasks_widget import TasksWidget
|
||||
from openpype.tools.utils.delegates import PrettyTimeDelegate
|
||||
|
||||
from openpype.tools.utils.constants import (
|
||||
TASK_NAME_ROLE,
|
||||
TASK_TYPE_ROLE
|
||||
)
|
||||
from openpype.tools.utils.models import (
|
||||
TasksModel,
|
||||
TasksProxyModel
|
||||
)
|
||||
from .model import FilesModel
|
||||
from .view import FilesView
@@ -323,110 +316,6 @@ class NameWindow(QtWidgets.QDialog):
)
|
||||
|
||||
|
||||
class TasksWidget(QtWidgets.QWidget):
|
||||
"""Widget showing active Tasks"""
|
||||
|
||||
task_changed = QtCore.Signal()
|
||||
|
||||
def __init__(self, dbcon=None, parent=None):
|
||||
super(TasksWidget, self).__init__(parent)
|
||||
|
||||
tasks_view = QtWidgets.QTreeView(self)
|
||||
tasks_view.setIndentation(0)
|
||||
tasks_view.setSortingEnabled(True)
|
||||
if dbcon is None:
|
||||
dbcon = io
|
||||
|
||||
tasks_model = TasksModel(dbcon)
|
||||
tasks_proxy = TasksProxyModel()
|
||||
tasks_proxy.setSourceModel(tasks_model)
|
||||
tasks_view.setModel(tasks_proxy)
|
||||
|
||||
layout = QtWidgets.QVBoxLayout(self)
|
||||
layout.setContentsMargins(0, 0, 0, 0)
|
||||
layout.addWidget(tasks_view)
|
||||
|
||||
selection_model = tasks_view.selectionModel()
|
||||
selection_model.currentChanged.connect(self.task_changed)
|
||||
|
||||
self._tasks_model = tasks_model
|
||||
self._tasks_proxy = tasks_proxy
|
||||
self._tasks_view = tasks_view
|
||||
|
||||
self._last_selected_task = None
|
||||
|
||||
def set_asset(self, asset_doc):
|
||||
# Asset deselected
|
||||
if asset_doc is None:
|
||||
return
|
||||
|
||||
# Try and preserve the last selected task and reselect it
|
||||
# after switching assets. If there's no currently selected
|
||||
# asset keep whatever the "last selected" was prior to it.
|
||||
current = self.get_current_task_name()
|
||||
if current:
|
||||
self._last_selected_task = current
|
||||
|
||||
self._tasks_model.set_asset(asset_doc)
|
||||
self._tasks_proxy.sort(0, QtCore.Qt.AscendingOrder)
|
||||
|
||||
if self._last_selected_task:
|
||||
self.select_task(self._last_selected_task)
|
||||
|
||||
# Force a task changed emit.
|
||||
self.task_changed.emit()
|
||||
|
||||
def select_task(self, task_name):
|
||||
"""Select a task by name.
|
||||
|
||||
If the task does not exist in the current model then selection is only
|
||||
cleared.
|
||||
|
||||
Args:
|
||||
task (str): Name of the task to select.
|
||||
|
||||
"""
|
||||
task_view_model = self._tasks_view.model()
|
||||
if not task_view_model:
|
||||
return
|
||||
|
||||
# Clear selection
|
||||
selection_model = self._tasks_view.selectionModel()
|
||||
selection_model.clearSelection()
|
||||
|
||||
# Select the task
|
||||
mode = selection_model.Select | selection_model.Rows
|
||||
for row in range(task_view_model.rowCount()):
|
||||
index = task_view_model.index(row, 0)
|
||||
name = index.data(TASK_NAME_ROLE)
|
||||
if name == task_name:
|
||||
selection_model.select(index, mode)
|
||||
|
||||
# Set the currently active index
|
||||
self._tasks_view.setCurrentIndex(index)
|
||||
break
|
||||
|
||||
def get_current_task_name(self):
|
||||
"""Return name of task at current index (selected)
|
||||
|
||||
Returns:
|
||||
str: Name of the current task.
|
||||
|
||||
"""
|
||||
index = self._tasks_view.currentIndex()
|
||||
selection_model = self._tasks_view.selectionModel()
|
||||
if index.isValid() and selection_model.isSelected(index):
|
||||
return index.data(TASK_NAME_ROLE)
|
||||
return None
|
||||
|
||||
def get_current_task_type(self):
|
||||
index = self._tasks_view.currentIndex()
|
||||
selection_model = self._tasks_view.selectionModel()
|
||||
if index.isValid() and selection_model.isSelected(index):
|
||||
return index.data(TASK_TYPE_ROLE)
|
||||
return None
|
||||
|
||||
|
||||
class FilesWidget(QtWidgets.QWidget):
|
||||
"""A widget displaying files that allows to save and open files."""
|
||||
file_selected = QtCore.Signal(str)
|
||||
|
|
@@ -1052,7 +941,7 @@ class Window(QtWidgets.QMainWindow):
        if asset_docs:
            asset_doc = asset_docs[0]

        task_name = self.tasks_widget.get_current_task_name()
        task_name = self.tasks_widget.get_selected_task_name()

        workfile_doc = None
        if asset_doc and task_name and filepath:

@@ -1082,7 +971,7 @@ class Window(QtWidgets.QMainWindow):
    def _get_current_workfile_doc(self, filepath=None):
        if filepath is None:
            filepath = self.files_widget._get_selected_filepath()
        task_name = self.tasks_widget.get_current_task_name()
        task_name = self.tasks_widget.get_selected_task_name()
        asset_docs = self.assets_widget.get_selected_assets()
        if not task_name or not asset_docs or not filepath:
            return

@@ -1113,18 +1002,16 @@ class Window(QtWidgets.QMainWindow):
                "name": asset,
                "type": "asset"
            },
            {
                "data.tasks": 1
            }
        )
            {"_id": 1}
        ) or {}

        # Select the asset
        self.assets_widget.select_assets([asset], expand=True)

        self.tasks_widget.set_asset(asset_document)
        self.tasks_widget.set_asset_id(asset_document.get("_id"))

        if "task" in context:
            self.tasks_widget.select_task(context["task"])
            self.tasks_widget.select_task_name(context["task"])

    def refresh(self):
        # Refresh asset widget

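The query change above narrows the asset lookup to a projection of only the document id and adds an `or {}` fallback, so a missing asset no longer breaks the following `.get("_id")`. A small sketch of the same pattern; the names (dbcon, tasks_widget, asset_name) are illustrative stand-ins for the window's attributes:

def _select_asset_context(dbcon, tasks_widget, asset_name):
    # Sketch of the lookup pattern used above; names are illustrative.
    asset_document = dbcon.find_one(
        {"name": asset_name, "type": "asset"},
        {"_id": 1}   # projection: only the id is needed here
    ) or {}          # fall back to an empty dict when the asset is missing

    asset_id = asset_document.get("_id")  # None if the asset was not found
    tasks_widget.set_asset_id(asset_id)   # None is treated as "deselected"
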
@@ -1134,7 +1021,7 @@ class Window(QtWidgets.QMainWindow):

    def _on_asset_changed(self):
        asset = self.assets_widget.get_selected_assets() or None

        asset_id = None
        if not asset:
            # Force disable the other widgets if no
            # active selection

@@ -1142,16 +1029,17 @@ class Window(QtWidgets.QMainWindow):
            self.files_widget.setEnabled(False)
        else:
            asset = asset[0]
            asset_id = asset.get("_id")
            self.tasks_widget.setEnabled(True)

        self.tasks_widget.set_asset(asset)
        self.tasks_widget.set_asset_id(asset_id)

    def _on_task_changed(self):
        asset = self.assets_widget.get_selected_assets() or None
        if asset is not None:
            asset = asset[0]
        task_name = self.tasks_widget.get_current_task_name()
        task_type = self.tasks_widget.get_current_task_type()
        task_name = self.tasks_widget.get_selected_task_name()
        task_type = self.tasks_widget.get_selected_task_type()

        self.tasks_widget.setEnabled(bool(asset))

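The remaining workfiles hunks are mechanical renames that follow from swapping the local widget for the shared one. A rough mapping of old to new calls, as implied by the diff above (sketch, not an exhaustive list):

# Old workfiles-local TasksWidget       ->  shared openpype.tools.utils.tasks_widget.TasksWidget
# tasks_widget.set_asset(asset_doc)     ->  tasks_widget.set_asset_id(asset_doc.get("_id"))
# tasks_widget.select_task(name)        ->  tasks_widget.select_task_name(name)
# tasks_widget.get_current_task_name()  ->  tasks_widget.get_selected_task_name()
# tasks_widget.get_current_task_type()  ->  tasks_widget.get_selected_task_type()
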
2 openpype/vendor/python/common/capture.py (vendored)

@@ -116,6 +116,8 @@ def capture(camera=None,
    if not cmds.objExists(camera):
        raise RuntimeError("Camera does not exist: {0}".format(camera))

    if width and height:
        maintain_aspect_ratio = False
    width = width or cmds.getAttr("defaultResolution.width")
    height = height or cmds.getAttr("defaultResolution.height")
    if maintain_aspect_ratio:

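The two added lines mean that an explicitly requested width and height now win over aspect-ratio maintenance, so the capture uses exactly the resolution passed in. A minimal sketch of the resulting size logic; the body of the maintain_aspect_ratio branch is not shown in this hunk, so the ratio-based height and the default values are assumptions standing in for the cmds.getAttr("defaultResolution.*") lookups:

def _resolve_capture_size(width, height, maintain_aspect_ratio,
                          default_width=1920, default_height=1080,
                          default_ratio=1.777):
    # Sketch of the size resolution after this change (assumed defaults).
    if width and height:
        # Both dimensions given explicitly: use them as-is.
        maintain_aspect_ratio = False
    width = width or default_width
    height = height or default_height
    if maintain_aspect_ratio:
        # Only one dimension (or none) given: derive height from the ratio.
        height = round(width / default_ratio)
    return width, height


print(_resolve_capture_size(1280, 720, True))   # -> (1280, 720), ratio ignored
print(_resolve_capture_size(1280, None, True))  # -> (1280, 720) via the ratio
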
@@ -1,3 +1,3 @@
# -*- coding: utf-8 -*-
"""Package declaring Pype version."""
__version__ = "3.6.0-nightly.4"
__version__ = "3.6.2-nightly.1"

Some files were not shown because too many files have changed in this diff.