Mirror of https://github.com/ynput/ayon-core.git (synced 2025-12-24 21:04:40 +01:00)

Commit b1cc2b2da4: Merge branch 'develop' into settings_search

548 changed files with 27631 additions and 391079 deletions
.github/workflows/documentation.yml (vendored, 7 changed lines)
@@ -20,7 +20,8 @@ jobs:
- uses: actions/checkout@v1
- uses: actions/setup-node@v1
with:
node-version: '12.x'
node-version: 14.x
cache: yarn
- name: Test Build
run: |
cd website

@@ -41,8 +42,8 @@ jobs:
- uses: actions/setup-node@v1
with:
node-version: '12.x'
node-version: 14.x
cache: yarn
- name: 🔨 Build
run: |
cd website
CHANGELOG.md (166 changed lines)
@@ -1,15 +1,101 @@
# Changelog

## [3.8.2-nightly.1](https://github.com/pypeclub/OpenPype/tree/HEAD)
## [3.9.0-nightly.3](https://github.com/pypeclub/OpenPype/tree/HEAD)

[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.8.1...HEAD)
[Full Changelog](https://github.com/pypeclub/OpenPype/compare/3.8.2...HEAD)

**Deprecated:**

- Loader: Remove default family states for hosts from code [\#2706](https://github.com/pypeclub/OpenPype/pull/2706)

### 📖 Documentation

- Update docusaurus to latest version [\#2760](https://github.com/pypeclub/OpenPype/pull/2760)
- documentation: add example to `repack-version` command [\#2669](https://github.com/pypeclub/OpenPype/pull/2669)

**🆕 New features**

- Flame: loading clips to reels [\#2622](https://github.com/pypeclub/OpenPype/pull/2622)

**🚀 Enhancements**

- Pyblish Pype: Remove redundant new line in installed fonts printing [\#2758](https://github.com/pypeclub/OpenPype/pull/2758)
- Flame: adding validator source clip [\#2746](https://github.com/pypeclub/OpenPype/pull/2746)
- Work Files: Preserve subversion comment of current filename by default [\#2734](https://github.com/pypeclub/OpenPype/pull/2734)
- Ftrack: Disable ftrack module by default [\#2732](https://github.com/pypeclub/OpenPype/pull/2732)
- Project Manager: Disable add task, add asset and save button when not in a project [\#2727](https://github.com/pypeclub/OpenPype/pull/2727)
- dropbox handle big file [\#2718](https://github.com/pypeclub/OpenPype/pull/2718)
- Fusion Move PR: Minor tweaks to Fusion integration [\#2716](https://github.com/pypeclub/OpenPype/pull/2716)
- Nuke: prerender with review knob [\#2691](https://github.com/pypeclub/OpenPype/pull/2691)
- Maya configurable unit validator [\#2680](https://github.com/pypeclub/OpenPype/pull/2680)
- General: Add settings for CleanUpFarm and disable the plugin by default [\#2679](https://github.com/pypeclub/OpenPype/pull/2679)
- Project Manager: Only allow scroll wheel edits when spinbox is active [\#2678](https://github.com/pypeclub/OpenPype/pull/2678)
- Ftrack: Sync description to assets [\#2670](https://github.com/pypeclub/OpenPype/pull/2670)
- General: FFmpeg conversion also check attribute string length [\#2635](https://github.com/pypeclub/OpenPype/pull/2635)
- Global: adding studio name/code to anatomy template formatting data [\#2630](https://github.com/pypeclub/OpenPype/pull/2630)
- Houdini: Load Arnold .ass procedurals into Houdini [\#2606](https://github.com/pypeclub/OpenPype/pull/2606)
- Deadline: Simplify GlobalJobPreLoad logic [\#2605](https://github.com/pypeclub/OpenPype/pull/2605)
- Houdini: Implement Arnold .ass standin extraction from Houdini \(also support .ass.gz\) [\#2603](https://github.com/pypeclub/OpenPype/pull/2603)

**🐛 Bug fixes**

- Global: fix broken otio review extractor [\#2590](https://github.com/pypeclub/OpenPype/pull/2590)
- Loader UI: Fix right click in representation widget [\#2757](https://github.com/pypeclub/OpenPype/pull/2757)
- Aftereffects 2022 and Deadline [\#2748](https://github.com/pypeclub/OpenPype/pull/2748)
- Flame: bunch of bugs [\#2745](https://github.com/pypeclub/OpenPype/pull/2745)
- Maya: Save current scene on workfile publish [\#2744](https://github.com/pypeclub/OpenPype/pull/2744)
- Version Up: Preserve parts of filename after version number \(like subversion\) on version\_up [\#2741](https://github.com/pypeclub/OpenPype/pull/2741)
- Loader UI: Multiple asset selection and underline colors fixed [\#2731](https://github.com/pypeclub/OpenPype/pull/2731)
- General: Fix loading of unused chars in xml format [\#2729](https://github.com/pypeclub/OpenPype/pull/2729)
- TVPaint: Set objectName with members [\#2725](https://github.com/pypeclub/OpenPype/pull/2725)
- General: Don't use 'objectName' from loaded references [\#2715](https://github.com/pypeclub/OpenPype/pull/2715)
- Settings: Studio Project anatomy is queried using right keys [\#2711](https://github.com/pypeclub/OpenPype/pull/2711)
- Local Settings: Additional applications don't break UI [\#2710](https://github.com/pypeclub/OpenPype/pull/2710)
- Houdini: Fix refactor of Houdini host move for CreateArnoldAss [\#2704](https://github.com/pypeclub/OpenPype/pull/2704)
- LookAssigner: Fix imports after moving code to OpenPype repository [\#2701](https://github.com/pypeclub/OpenPype/pull/2701)
- Multiple hosts: unify menu style across hosts [\#2693](https://github.com/pypeclub/OpenPype/pull/2693)
- Maya Redshift fixes [\#2692](https://github.com/pypeclub/OpenPype/pull/2692)
- Maya: fix fps validation popup [\#2685](https://github.com/pypeclub/OpenPype/pull/2685)
- Houdini Explicitly collect correct frame name even in case of single frame render when `frameStart` is provided [\#2676](https://github.com/pypeclub/OpenPype/pull/2676)
- hiero: fix effect collector name and order [\#2673](https://github.com/pypeclub/OpenPype/pull/2673)
- Maya: Fix menu callbacks [\#2671](https://github.com/pypeclub/OpenPype/pull/2671)
- Launcher: Fix access to 'data' attribute on actions [\#2659](https://github.com/pypeclub/OpenPype/pull/2659)
- Houdini: fix usd family in loader and integrators [\#2631](https://github.com/pypeclub/OpenPype/pull/2631)

**Merged pull requests:**

- Harmony: Rendering in Deadline didn't work in other machines than submitter [\#2754](https://github.com/pypeclub/OpenPype/pull/2754)
- Maya: set Deadline job/batch name to original source workfile name instead of published workfile [\#2733](https://github.com/pypeclub/OpenPype/pull/2733)
- Fusion: Moved implementation into OpenPype [\#2713](https://github.com/pypeclub/OpenPype/pull/2713)
- TVPaint: Plugin build without dependencies [\#2705](https://github.com/pypeclub/OpenPype/pull/2705)
- Webpublisher: Photoshop create a beauty png [\#2689](https://github.com/pypeclub/OpenPype/pull/2689)
- Ftrack: Hierarchical attributes are queried properly [\#2682](https://github.com/pypeclub/OpenPype/pull/2682)
- Maya: Add Validate Frame Range settings [\#2661](https://github.com/pypeclub/OpenPype/pull/2661)
- Harmony: move to Openpype [\#2657](https://github.com/pypeclub/OpenPype/pull/2657)
- General: Show applications without integration in project [\#2656](https://github.com/pypeclub/OpenPype/pull/2656)
- Maya: cleanup duplicate rendersetup code [\#2642](https://github.com/pypeclub/OpenPype/pull/2642)

## [3.8.2](https://github.com/pypeclub/OpenPype/tree/3.8.2) (2022-02-07)

[Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.8.2-nightly.3...3.8.2)

### 📖 Documentation

- Cosmetics: Fix common typos in openpype/website [\#2617](https://github.com/pypeclub/OpenPype/pull/2617)

**🚀 Enhancements**

- General: Project backup tools [\#2629](https://github.com/pypeclub/OpenPype/pull/2629)
- nuke: adding clear button to write nodes [\#2627](https://github.com/pypeclub/OpenPype/pull/2627)
- Ftrack: Family to Asset type mapping is in settings [\#2602](https://github.com/pypeclub/OpenPype/pull/2602)

**🐛 Bug fixes**

- Fix pulling of cx\_freeze 6.10 [\#2628](https://github.com/pypeclub/OpenPype/pull/2628)

**Merged pull requests:**

- Docker: enhance dockerfiles with metadata, fix pyenv initialization [\#2647](https://github.com/pypeclub/OpenPype/pull/2647)
- WebPublisher: fix instance duplicates [\#2641](https://github.com/pypeclub/OpenPype/pull/2641)
- Fix - safer pulling of task name for webpublishing from PS [\#2613](https://github.com/pypeclub/OpenPype/pull/2613)

## [3.8.1](https://github.com/pypeclub/OpenPype/tree/3.8.1) (2022-02-01)

@@ -19,99 +105,25 @@
**🚀 Enhancements**

- Webpublisher: Thumbnail extractor [\#2600](https://github.com/pypeclub/OpenPype/pull/2600)
- Loader: Allow to toggle default family filters between "include" or "exclude" filtering [\#2541](https://github.com/pypeclub/OpenPype/pull/2541)
- Launcher: Added context menu to to skip opening last workfile [\#2536](https://github.com/pypeclub/OpenPype/pull/2536)

**🐛 Bug fixes**

- Release/3.8.0 [\#2619](https://github.com/pypeclub/OpenPype/pull/2619)
- hotfix: OIIO tool path - add extension on windows [\#2618](https://github.com/pypeclub/OpenPype/pull/2618)
- Settings: Enum does not store empty string if has single item to select [\#2615](https://github.com/pypeclub/OpenPype/pull/2615)
- switch distutils to sysconfig for `get\_platform\(\)` [\#2594](https://github.com/pypeclub/OpenPype/pull/2594)
- Fix poetry index and speedcopy update [\#2589](https://github.com/pypeclub/OpenPype/pull/2589)
- Webpublisher: Fix - subset names from processed .psd used wrong value for task [\#2586](https://github.com/pypeclub/OpenPype/pull/2586)
- `vrscene` creator Deadline webservice URL handling [\#2580](https://github.com/pypeclub/OpenPype/pull/2580)
- global: track name was failing if duplicated root word in name [\#2568](https://github.com/pypeclub/OpenPype/pull/2568)
- Validate Maya Rig produces no cycle errors [\#2484](https://github.com/pypeclub/OpenPype/pull/2484)

**Merged pull requests:**

- Bump pillow from 8.4.0 to 9.0.0 [\#2595](https://github.com/pypeclub/OpenPype/pull/2595)
- Webpublisher: Skip version collect [\#2591](https://github.com/pypeclub/OpenPype/pull/2591)
- build\(deps\): bump pillow from 8.4.0 to 9.0.0 [\#2523](https://github.com/pypeclub/OpenPype/pull/2523)

## [3.8.0](https://github.com/pypeclub/OpenPype/tree/3.8.0) (2022-01-24)

[Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.8.0-nightly.7...3.8.0)

### 📖 Documentation

- Variable in docs renamed to proper name [\#2546](https://github.com/pypeclub/OpenPype/pull/2546)

**🆕 New features**

- Flame: extracting segments with trans-coding [\#2547](https://github.com/pypeclub/OpenPype/pull/2547)
- Maya : V-Ray Proxy - load all ABC files via proxy [\#2544](https://github.com/pypeclub/OpenPype/pull/2544)
- Maya to Unreal: Extended static mesh workflow [\#2537](https://github.com/pypeclub/OpenPype/pull/2537)
- Flame: collecting publishable instances [\#2519](https://github.com/pypeclub/OpenPype/pull/2519)
- Flame: create publishable clips [\#2495](https://github.com/pypeclub/OpenPype/pull/2495)

**🚀 Enhancements**

- Webpublisher: Moved error at the beginning of the log [\#2559](https://github.com/pypeclub/OpenPype/pull/2559)
- Ftrack: Use ApplicationManager to get DJV path [\#2558](https://github.com/pypeclub/OpenPype/pull/2558)
- Webpublisher: Added endpoint to reprocess batch through UI [\#2555](https://github.com/pypeclub/OpenPype/pull/2555)
- Settings: PathInput strip passed string [\#2550](https://github.com/pypeclub/OpenPype/pull/2550)
- Global: Exctract Review anatomy fill data with output name [\#2548](https://github.com/pypeclub/OpenPype/pull/2548)
- Cosmetics: Clean up some cosmetics / typos [\#2542](https://github.com/pypeclub/OpenPype/pull/2542)
- General: Validate if current process OpenPype version is requested version [\#2529](https://github.com/pypeclub/OpenPype/pull/2529)
- General: Be able to use anatomy data in ffmpeg output arguments [\#2525](https://github.com/pypeclub/OpenPype/pull/2525)
- Expose toggle publish plug-in settings for Maya Look Shading Engine Naming [\#2521](https://github.com/pypeclub/OpenPype/pull/2521)
- Photoshop: Move implementation to OpenPype [\#2510](https://github.com/pypeclub/OpenPype/pull/2510)
- Slack: notifications are sent with Openpype logo and bot name [\#2499](https://github.com/pypeclub/OpenPype/pull/2499)
- Slack: Add review to notification message [\#2498](https://github.com/pypeclub/OpenPype/pull/2498)
- Maya: Collect 'fps' animation data only for "review" instances [\#2486](https://github.com/pypeclub/OpenPype/pull/2486)

**🐛 Bug fixes**

- AfterEffects: Fix - removed obsolete import [\#2577](https://github.com/pypeclub/OpenPype/pull/2577)
- General: OpenPype version updates [\#2575](https://github.com/pypeclub/OpenPype/pull/2575)
- Ftrack: Delete action revision [\#2563](https://github.com/pypeclub/OpenPype/pull/2563)
- Webpublisher: ftrack shows incorrect user names [\#2560](https://github.com/pypeclub/OpenPype/pull/2560)
- General: Do not validate version if build does not support it [\#2557](https://github.com/pypeclub/OpenPype/pull/2557)
- Webpublisher: Fixed progress reporting [\#2553](https://github.com/pypeclub/OpenPype/pull/2553)
- Fix Maya AssProxyLoader version switch [\#2551](https://github.com/pypeclub/OpenPype/pull/2551)
- General: Fix install thread in igniter [\#2549](https://github.com/pypeclub/OpenPype/pull/2549)
- Houdini: vdbcache family preserve frame numbers on publish integration + enable validate version for Houdini [\#2535](https://github.com/pypeclub/OpenPype/pull/2535)
- Maya: Fix Load VDB to V-Ray [\#2533](https://github.com/pypeclub/OpenPype/pull/2533)
- Maya: ReferenceLoader fix not unique group name error for attach to root [\#2532](https://github.com/pypeclub/OpenPype/pull/2532)
- Maya: namespaced context go back to original namespace when started from inside a namespace [\#2531](https://github.com/pypeclub/OpenPype/pull/2531)
- Fix create zip tool - path argument [\#2522](https://github.com/pypeclub/OpenPype/pull/2522)
- Maya: Fix Extract Look with space in names [\#2518](https://github.com/pypeclub/OpenPype/pull/2518)
- Fix published frame content for sequence starting with 0 [\#2513](https://github.com/pypeclub/OpenPype/pull/2513)
- Maya: reset empty string attributes correctly to "" instead of "None" [\#2506](https://github.com/pypeclub/OpenPype/pull/2506)
- Improve FusionPreLaunch hook errors [\#2505](https://github.com/pypeclub/OpenPype/pull/2505)

**Merged pull requests:**

- AfterEffects: Move implementation to OpenPype [\#2543](https://github.com/pypeclub/OpenPype/pull/2543)
- Maya: Remove Maya Look Assigner check on startup [\#2540](https://github.com/pypeclub/OpenPype/pull/2540)
- build\(deps\): bump shelljs from 0.8.4 to 0.8.5 in /website [\#2538](https://github.com/pypeclub/OpenPype/pull/2538)
- build\(deps\): bump follow-redirects from 1.14.4 to 1.14.7 in /website [\#2534](https://github.com/pypeclub/OpenPype/pull/2534)
- Nuke: Merge avalon's implementation into OpenPype [\#2514](https://github.com/pypeclub/OpenPype/pull/2514)

## [3.7.0](https://github.com/pypeclub/OpenPype/tree/3.7.0) (2022-01-04)

[Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.7.0-nightly.14...3.7.0)

**🚀 Enhancements**

- General: Workdir extra folders [\#2462](https://github.com/pypeclub/OpenPype/pull/2462)

**🐛 Bug fixes**

- TVPaint: Create render layer dialog is in front [\#2471](https://github.com/pypeclub/OpenPype/pull/2471)

## [3.6.4](https://github.com/pypeclub/OpenPype/tree/3.6.4) (2021-11-23)

[Full Changelog](https://github.com/pypeclub/OpenPype/compare/CI/3.7.0-nightly.1...3.6.4)
Dockerfile (35 changed lines)
@@ -1,13 +1,18 @@
# Build Pype docker image
FROM debian:bookworm-slim AS builder
FROM ubuntu:focal AS builder
ARG OPENPYPE_PYTHON_VERSION=3.7.12
ARG BUILD_DATE
ARG VERSION

LABEL maintainer="info@openpype.io"
LABEL description="Docker Image to build and run OpenPype"
LABEL description="Docker Image to build and run OpenPype under Ubuntu 20.04"
LABEL org.opencontainers.image.name="pypeclub/openpype"
LABEL org.opencontainers.image.title="OpenPype Docker Image"
LABEL org.opencontainers.image.url="https://openpype.io/"
LABEL org.opencontainers.image.source="https://github.com/pypeclub/pype"
LABEL org.opencontainers.image.source="https://github.com/pypeclub/OpenPype"
LABEL org.opencontainers.image.documentation="https://openpype.io/docs/system_introduction"
LABEL org.opencontainers.image.created=$BUILD_DATE
LABEL org.opencontainers.image.version=$VERSION

USER root

@@ -42,14 +47,19 @@ RUN apt-get update \

SHELL ["/bin/bash", "-c"]

RUN mkdir /opt/openpype

# download and install pyenv
RUN curl https://pyenv.run | bash \
&& echo 'export PATH="$HOME/.pyenv/bin:$PATH"'>> $HOME/.bashrc \
&& echo 'eval "$(pyenv init -)"' >> $HOME/.bashrc \
&& echo 'eval "$(pyenv virtualenv-init -)"' >> $HOME/.bashrc \
&& echo 'eval "$(pyenv init --path)"' >> $HOME/.bashrc \
&& source $HOME/.bashrc && pyenv install ${OPENPYPE_PYTHON_VERSION}
&& echo 'export PATH="$HOME/.pyenv/bin:$PATH"'>> $HOME/init_pyenv.sh \
&& echo 'eval "$(pyenv init -)"' >> $HOME/init_pyenv.sh \
&& echo 'eval "$(pyenv virtualenv-init -)"' >> $HOME/init_pyenv.sh \
&& echo 'eval "$(pyenv init --path)"' >> $HOME/init_pyenv.sh

# install python with pyenv
RUN source $HOME/init_pyenv.sh \
&& pyenv install ${OPENPYPE_PYTHON_VERSION}

COPY . /opt/openpype/

@@ -57,13 +67,16 @@ RUN chmod +x /opt/openpype/tools/create_env.sh && chmod +x /opt/openpype/tools/b

WORKDIR /opt/openpype

# set local python version
RUN cd /opt/openpype \
&& source $HOME/.bashrc \
&& source $HOME/init_pyenv.sh \
&& pyenv local ${OPENPYPE_PYTHON_VERSION}

RUN source $HOME/.bashrc \
# fetch third party tools/libraries
RUN source $HOME/init_pyenv.sh \
&& ./tools/create_env.sh \
&& ./tools/fetch_thirdparty_libs.sh

RUN source $HOME/.bashrc \
# build openpype
RUN source $HOME/init_pyenv.sh \
&& bash ./tools/build.sh
@@ -1,11 +1,15 @@
# Build Pype docker image
FROM centos:7 AS builder
ARG OPENPYPE_PYTHON_VERSION=3.7.10
ARG OPENPYPE_PYTHON_VERSION=3.7.12

LABEL org.opencontainers.image.name="pypeclub/openpype"
LABEL org.opencontainers.image.title="OpenPype Docker Image"
LABEL org.opencontainers.image.url="https://openpype.io/"
LABEL org.opencontainers.image.source="https://github.com/pypeclub/pype"
LABEL org.opencontainers.image.documentation="https://openpype.io/docs/system_introduction"
LABEL org.opencontainers.image.created=$BUILD_DATE
LABEL org.opencontainers.image.version=$VERSION

USER root

@@ -38,7 +42,8 @@ RUN yum -y install https://dl.fedoraproject.org/pub/epel/epel-release-latest-7.n
patchelf \
automake \
autoconf \
ncurses \
patch \
ncurses \
ncurses-devel \
qt5-qtbase-devel \
xcb-util-wm \
@@ -23,7 +23,8 @@ from .user_settings import (
OpenPypeSettingsRegistry
)
from .tools import (
get_openpype_path_from_db,
get_openpype_global_settings,
get_openpype_path_from_settings,
get_expected_studio_version_str
)

@@ -973,9 +974,19 @@ class BootstrapRepos:
for line in checksums_data.split("\n") if line
]

# get list of files in zip minus `checksums` file itself
# and turn in to set to compare against list of files
# from checksum file. If difference exists, something is
# wrong
files_in_zip = set(zip_file.namelist())
files_in_zip.remove("checksums")
files_in_checksum = {file[1] for file in checksums}
diff = files_in_zip.difference(files_in_checksum)
if diff:
return False, f"Missing files {diff}"

# calculate and compare checksums in the zip file
for file in checksums:
file_name = file[1]
for file_checksum, file_name in checksums:
if platform.system().lower() == "windows":
file_name = file_name.replace("/", "\\")
h = hashlib.sha256()

@@ -983,21 +994,9 @@ class BootstrapRepos:
h.update(zip_file.read(file_name))
except FileNotFoundError:
return False, f"Missing file [ {file_name} ]"
if h.hexdigest() != file[0]:
if h.hexdigest() != file_checksum:
return False, f"Invalid checksum on {file_name}"

# get list of files in zip minus `checksums` file itself
# and turn in to set to compare against list of files
# from checksum file. If difference exists, something is
# wrong
files_in_zip = zip_file.namelist()
files_in_zip.remove("checksums")
files_in_zip = set(files_in_zip)
files_in_checksum = {file[1] for file in checksums}
diff = files_in_zip.difference(files_in_checksum)
if diff:
return False, f"Missing files {diff}"

return True, "All ok"

@staticmethod

@@ -1011,16 +1010,22 @@ class BootstrapRepos:
tuple(line.split(":"))
for line in checksums_data.split("\n") if line
]
files_in_dir = [

# compare file list against list of files from checksum file.
# If difference exists, something is wrong and we invalidate directly
files_in_dir = set(
file.relative_to(path).as_posix()
for file in path.iterdir() if file.is_file()
]
)
files_in_dir.remove("checksums")
files_in_dir = set(files_in_dir)
files_in_checksum = {file[1] for file in checksums}

for file in checksums:
file_name = file[1]
diff = files_in_dir.difference(files_in_checksum)
if diff:
return False, f"Missing files {diff}"

# calculate and compare checksums
for file_checksum, file_name in checksums:
if platform.system().lower() == "windows":
file_name = file_name.replace("/", "\\")
try:

@@ -1028,11 +1033,8 @@ class BootstrapRepos:
except FileNotFoundError:
return False, f"Missing file [ {file_name} ]"

if file[0] != current:
if file_checksum != current:
return False, f"Invalid checksum on {file_name}"
diff = files_in_dir.difference(files_in_checksum)
if diff:
return False, f"Missing files {diff}"

return True, "All ok"
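For context, the checksum hunks above converge on one validation flow: read the `checksums` file from the archive, compare the archive's file list against it as a set, then hash each listed file. A minimal standalone sketch of that flow follows; the `validate_zip_checksums` helper name and the `<sha256>:<relative/path>` line format are assumptions inferred from the diff, not OpenPype's actual API.

```python
import hashlib
from zipfile import ZipFile


def validate_zip_checksums(zip_path):
    """Sketch of the validation flow shown in the hunks above (assumed format)."""
    with ZipFile(zip_path) as zip_file:
        checksums_data = zip_file.read("checksums").decode("utf-8")
        checksums = [
            tuple(line.split(":"))
            for line in checksums_data.split("\n") if line
        ]

        # compare the file list first; any difference invalidates the archive
        files_in_zip = set(zip_file.namelist())
        files_in_zip.remove("checksums")
        files_in_checksum = {file[1] for file in checksums}
        diff = files_in_zip.difference(files_in_checksum)
        if diff:
            return False, f"Missing files {diff}"

        # then hash every listed file and compare with its stored checksum;
        # zip member names always use forward slashes, so no per-platform
        # path rewriting is needed while reading from the archive itself
        for file_checksum, file_name in checksums:
            h = hashlib.sha256()
            try:
                h.update(zip_file.read(file_name))
            except KeyError:
                # ZipFile.read raises KeyError for unknown member names
                return False, f"Missing file [ {file_name} ]"
            if h.hexdigest() != file_checksum:
                return False, f"Invalid checksum on {file_name}"

    return True, "All ok"
```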
@@ -1262,7 +1264,8 @@ class BootstrapRepos:
openpype_path = None
# try to get OpenPype path from mongo.
if location.startswith("mongodb"):
openpype_path = get_openpype_path_from_db(location)
global_settings = get_openpype_global_settings(location)
openpype_path = get_openpype_path_from_settings(global_settings)
if not openpype_path:
self._print("cannot find OPENPYPE_PATH in settings.")
return None
@@ -12,7 +12,6 @@ from Qt.QtCore import QTimer # noqa
from .install_thread import InstallThread
from .tools import (
validate_mongo_connection,
get_openpype_path_from_db,
get_openpype_icon_path
)
@@ -161,18 +161,17 @@ def get_openpype_global_settings(url: str) -> dict:
return global_settings.get("data") or {}


def get_openpype_path_from_db(url: str) -> Union[str, None]:
def get_openpype_path_from_settings(settings: dict) -> Union[str, None]:
"""Get OpenPype path from global settings.

Args:
url (str): mongodb url.
settings (dict): mongodb url.

Returns:
path to OpenPype or None if not found
"""
global_settings = get_openpype_global_settings(url)
paths = (
global_settings
settings
.get("openpype_path", {})
.get(platform.system().lower())
) or []
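The refactor above splits the old database-bound helper in two: the global settings document is fetched once, and the path lookup becomes a pure function of that document. A short usage sketch, assuming both functions live in `igniter.tools` as the surrounding hunks suggest; the connection string below is a placeholder.

```python
from igniter.tools import (
    get_openpype_global_settings,
    get_openpype_path_from_settings,
)

# hypothetical MongoDB URL; in practice this would come from OPENPYPE_MONGO
mongo_url = "mongodb://localhost:27017"

# query the database once ...
global_settings = get_openpype_global_settings(mongo_url)

# ... then resolve the per-platform OpenPype path without touching the DB again
openpype_path = get_openpype_path_from_settings(global_settings)
if not openpype_path:
    print("cannot find OPENPYPE_PATH in settings.")
```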
@@ -412,3 +412,23 @@ def repack_version(directory):
directory name.
"""
PypeCommands().repack_version(directory)


@main.command()
@click.option("--project", help="Project name")
@click.option(
"--dirpath", help="Directory where package is stored", default=None
)
def pack_project(project, dirpath):
"""Create a package of project with all files and database dump."""
PypeCommands().pack_project(project, dirpath)


@main.command()
@click.option("--zipfile", help="Path to zip file")
@click.option(
"--root", help="Replace root which was stored in project", default=None
)
def unpack_project(zipfile, root):
"""Create a package of project with all files and database dump."""
PypeCommands().unpack_project(zipfile, root)
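The two new commands are plain click commands, so they can also be exercised without a shell via click's test runner. A minimal sketch, assuming the group decorated above is `main` in `openpype.cli`; the project name and paths are placeholders.

```python
from click.testing import CliRunner
from openpype.cli import main

runner = CliRunner()

# roughly: openpype pack_project --project MyProject --dirpath /tmp/project_backups
result = runner.invoke(
    main,
    ["pack_project", "--project", "MyProject", "--dirpath", "/tmp/project_backups"],
)
print(result.exit_code, result.output)

# and the reverse, remapping the stored root while unpacking elsewhere
result = runner.invoke(
    main,
    ["unpack_project", "--zipfile", "/tmp/project_backups/MyProject.zip",
     "--root", "/mnt/projects"],
)
print(result.exit_code, result.output)
```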
@@ -21,7 +21,7 @@ class AddLastWorkfileToLaunchArgs(PreLaunchHook):
"blender",
"photoshop",
"tvpaint",
"afftereffects"
"aftereffects"
]

def execute(self):
@@ -19,7 +19,7 @@ class CopyTemplateWorkfile(PreLaunchHook):

# Before `AddLastWorkfileToLaunchArgs`
order = 0
app_groups = ["blender", "photoshop", "tvpaint", "afftereffects"]
app_groups = ["blender", "photoshop", "tvpaint", "aftereffects"]

def execute(self):
"""Check if can copy template for context and do it if possible.

@@ -44,7 +44,7 @@ class CopyTemplateWorkfile(PreLaunchHook):
return

if os.path.exists(last_workfile):
self.log.debug("Last workfile exits. Skipping {} process.".format(
self.log.debug("Last workfile exists. Skipping {} process.".format(
self.__class__.__name__
))
return

@@ -120,7 +120,7 @@ class CopyTemplateWorkfile(PreLaunchHook):
f"Creating workfile from template: \"{template_path}\""
)

# Copy template workfile to new destinantion
# Copy template workfile to new destination
shutil.copy2(
os.path.normpath(template_path),
os.path.normpath(last_workfile)
@@ -1,10 +1,10 @@
import os
import subprocess

from openpype.lib import (
PreLaunchHook,
get_openpype_execute_args
)
from openpype.lib.applications import get_non_python_host_kwargs

from openpype import PACKAGE_DIR as OPENPYPE_DIR

@@ -51,3 +51,7 @@ class NonPythonHostHook(PreLaunchHook):

if remainders:
self.launch_context.launch_args.extend(remainders)

self.launch_context.kwargs = \
get_non_python_host_kwargs(self.launch_context.kwargs)
Binary file not shown.
|
|
@ -1,5 +1,5 @@
|
|||
<?xml version="1.0" encoding="UTF-8"?>
|
||||
<ExtensionManifest Version="8.0" ExtensionBundleId="com.openpype.AE.panel" ExtensionBundleVersion="1.0.21"
|
||||
<ExtensionManifest Version="8.0" ExtensionBundleId="com.openpype.AE.panel" ExtensionBundleVersion="1.0.22"
|
||||
ExtensionBundleName="openpype" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
|
||||
<ExtensionList>
|
||||
<Extension Id="com.openpype.AE.panel" Version="1.0" />
|
||||
|
|
|
|||
|
|
@ -301,6 +301,15 @@ function main(websocket_url){
|
|||
return get_extension_version();
|
||||
});
|
||||
|
||||
RPC.addRoute('AfterEffects.get_app_version', function (data) {
|
||||
log.warn('Server called client route "get_app_version":', data);
|
||||
return runEvalScript("getAppVersion()")
|
||||
.then(function(result){
|
||||
log.warn("get_app_version: " + result);
|
||||
return result;
|
||||
});
|
||||
});
|
||||
|
||||
RPC.addRoute('AfterEffects.close', function (data) {
|
||||
log.warn('Server called client route "close":', data);
|
||||
return runEvalScript("close()");
|
||||
|
|
|
|||
|
|
@ -438,7 +438,10 @@ function getAudioUrlForComp(comp_id){
|
|||
for (i = 1; i <= item.numLayers; ++i){
|
||||
var layer = item.layers[i];
|
||||
if (layer instanceof AVLayer){
|
||||
return layer.source.file.fsName.toString();
|
||||
if (layer.hasAudio){
|
||||
source_url = layer.source.file.fsName.toString()
|
||||
return _prepareSingleValue(source_url);
|
||||
}
|
||||
}
|
||||
|
||||
}
|
||||
|
|
@ -715,6 +718,10 @@ function close(){
|
|||
app.quit();
|
||||
}
|
||||
|
||||
function getAppVersion(){
|
||||
return _prepareSingleValue(app.version);
|
||||
}
|
||||
|
||||
function _prepareSingleValue(value){
|
||||
return JSON.stringify({"result": value})
|
||||
}
|
||||
|
|
|
|||
|
|
@ -35,7 +35,6 @@ def main(*subprocess_args):
|
|||
launcher.start()
|
||||
|
||||
if os.environ.get("HEADLESS_PUBLISH"):
|
||||
# reusing ConsoleTrayApp approach as it was already implemented
|
||||
launcher.execute_in_main_thread(lambda: headless_publish(
|
||||
log,
|
||||
"CloseAE",
|
||||
|
|
|
|||
|
|
@ -537,6 +537,13 @@ class AfterEffectsServerStub():
|
|||
|
||||
return self._handle_return(res)
|
||||
|
||||
def get_app_version(self):
|
||||
"""Returns version number of installed application (17.5...)."""
|
||||
res = self.websocketserver.call(self.client.call(
|
||||
'AfterEffects.get_app_version'))
|
||||
|
||||
return self._handle_return(res)
|
||||
|
||||
def close(self):
|
||||
res = self.websocketserver.call(self.client.call('AfterEffects.close'))
|
||||
|
||||
|
|
|
|||
|
|
@ -19,6 +19,7 @@ class CreateRender(openpype.api.Creator):
|
|||
name = "renderDefault"
|
||||
label = "Render on Farm"
|
||||
family = "render"
|
||||
defaults = ["Main"]
|
||||
|
||||
def process(self):
|
||||
stub = get_stub() # only after After Effects is up
|
||||
|
|
|
|||
|
|
@ -20,6 +20,7 @@ class AERenderInstance(RenderInstance):
|
|||
fps = attr.ib(default=None)
|
||||
projectEntity = attr.ib(default=None)
|
||||
stagingDir = attr.ib(default=None)
|
||||
app_version = attr.ib(default=None)
|
||||
|
||||
|
||||
class CollectAERender(abstract_collect_render.AbstractCollectRender):
|
||||
|
|
@ -41,6 +42,9 @@ class CollectAERender(abstract_collect_render.AbstractCollectRender):
|
|||
def get_instances(self, context):
|
||||
instances = []
|
||||
|
||||
app_version = self.stub.get_app_version()
|
||||
app_version = app_version[0:4]
|
||||
|
||||
current_file = context.data["currentFile"]
|
||||
version = context.data["version"]
|
||||
asset_entity = context.data["assetEntity"]
|
||||
|
|
@ -105,7 +109,8 @@ class CollectAERender(abstract_collect_render.AbstractCollectRender):
|
|||
frameEnd=frameEnd,
|
||||
frameStep=1,
|
||||
toBeRenderedOn='deadline',
|
||||
fps=fps
|
||||
fps=fps,
|
||||
app_version=app_version
|
||||
)
|
||||
|
||||
comp = compositions_by_id.get(int(item_id))
|
||||
|
|
@ -118,6 +123,7 @@ class CollectAERender(abstract_collect_render.AbstractCollectRender):
|
|||
instance.anatomyData = context.data["anatomyData"]
|
||||
|
||||
instance.outputDir = self._get_output_dir(instance)
|
||||
instance.context = context
|
||||
|
||||
settings = get_project_settings(os.getenv("AVALON_PROJECT"))
|
||||
reviewable_subset_filter = \
|
||||
|
|
@ -142,7 +148,6 @@ class CollectAERender(abstract_collect_render.AbstractCollectRender):
|
|||
break
|
||||
|
||||
self.log.info("New instance:: {}".format(instance))
|
||||
|
||||
instances.append(instance)
|
||||
|
||||
return instances
|
||||
|
|
|
|||
|
|
@ -306,6 +306,17 @@ class LaunchManager(LaunchQtApp):
|
|||
self._window.refresh()
|
||||
|
||||
|
||||
class LaunchLibrary(LaunchQtApp):
|
||||
"""Launch Library Loader."""
|
||||
|
||||
bl_idname = "wm.library_loader"
|
||||
bl_label = "Library..."
|
||||
_tool_name = "libraryloader"
|
||||
|
||||
def before_window_show(self):
|
||||
self._window.refresh()
|
||||
|
||||
|
||||
class LaunchWorkFiles(LaunchQtApp):
|
||||
"""Launch Avalon Work Files."""
|
||||
|
||||
|
|
@ -365,6 +376,7 @@ class TOPBAR_MT_avalon(bpy.types.Menu):
|
|||
icon_value=pyblish_menu_icon_id,
|
||||
)
|
||||
layout.operator(LaunchManager.bl_idname, text="Manage...")
|
||||
layout.operator(LaunchLibrary.bl_idname, text="Library...")
|
||||
layout.separator()
|
||||
layout.operator(LaunchWorkFiles.bl_idname, text="Work Files...")
|
||||
# TODO (jasper): maybe add 'Reload Pipeline', 'Reset Frame Range' and
|
||||
|
|
@ -382,6 +394,7 @@ classes = [
|
|||
LaunchLoader,
|
||||
LaunchPublisher,
|
||||
LaunchManager,
|
||||
LaunchLibrary,
|
||||
LaunchWorkFiles,
|
||||
TOPBAR_MT_avalon,
|
||||
]
|
||||
|
|
|
|||
|
|
@ -131,6 +131,8 @@ def deselect_all():
|
|||
|
||||
class Creator(PypeCreatorMixin, avalon.api.Creator):
|
||||
"""Base class for Creator plug-ins."""
|
||||
defaults = ['Main']
|
||||
|
||||
def process(self):
|
||||
collection = bpy.data.collections.new(name=self.data["subset"])
|
||||
bpy.context.scene.collection.children.link(collection)
|
||||
|
|
|
|||
|
|
@ -3,3 +3,20 @@ import os
|
|||
HOST_DIR = os.path.dirname(
|
||||
os.path.abspath(__file__)
|
||||
)
|
||||
|
||||
|
||||
def add_implementation_envs(env, _app):
|
||||
# Add requirements to DL_PYTHON_HOOK_PATH
|
||||
pype_root = os.environ["OPENPYPE_REPOS_ROOT"]
|
||||
|
||||
env["DL_PYTHON_HOOK_PATH"] = os.path.join(
|
||||
pype_root, "openpype", "hosts", "flame", "startup")
|
||||
env.pop("QT_AUTO_SCREEN_SCALE_FACTOR", None)
|
||||
|
||||
# Set default values if are not already set via settings
|
||||
defaults = {
|
||||
"LOGLEVEL": "DEBUG"
|
||||
}
|
||||
for key, value in defaults.items():
|
||||
if not env.get(key):
|
||||
env[key] = value
|
||||
|
|
|
|||
|
|
@ -28,7 +28,8 @@ from .lib import (
|
|||
get_reformated_filename,
|
||||
get_frame_from_filename,
|
||||
get_padding_from_filename,
|
||||
maintained_object_duplication
|
||||
maintained_object_duplication,
|
||||
get_clip_segment
|
||||
)
|
||||
from .utils import (
|
||||
setup,
|
||||
|
|
@ -52,7 +53,10 @@ from .menu import (
|
|||
)
|
||||
from .plugin import (
|
||||
Creator,
|
||||
PublishableClip
|
||||
PublishableClip,
|
||||
ClipLoader,
|
||||
OpenClipSolver
|
||||
|
||||
)
|
||||
from .workio import (
|
||||
open_file,
|
||||
|
|
@ -96,6 +100,7 @@ __all__ = [
|
|||
"get_frame_from_filename",
|
||||
"get_padding_from_filename",
|
||||
"maintained_object_duplication",
|
||||
"get_clip_segment",
|
||||
|
||||
# pipeline
|
||||
"install",
|
||||
|
|
@ -122,6 +127,8 @@ __all__ = [
|
|||
# plugin
|
||||
"Creator",
|
||||
"PublishableClip",
|
||||
"ClipLoader",
|
||||
"OpenClipSolver",
|
||||
|
||||
# workio
|
||||
"open_file",
|
||||
|
|
|
|||
|
|
@ -692,3 +692,18 @@ def maintained_object_duplication(item):
|
|||
finally:
|
||||
# delete the item at the end
|
||||
flame.delete(duplicate)
|
||||
|
||||
|
||||
def get_clip_segment(flame_clip):
|
||||
name = flame_clip.name.get_value()
|
||||
version = flame_clip.versions[0]
|
||||
track = version.tracks[0]
|
||||
segments = track.segments
|
||||
|
||||
if len(segments) < 1:
|
||||
raise ValueError("Clip `{}` has no segments!".format(name))
|
||||
|
||||
if len(segments) > 1:
|
||||
raise ValueError("Clip `{}` has too many segments!".format(name))
|
||||
|
||||
return segments[0]
|
||||
|
|
|
|||
|
|
@ -110,19 +110,19 @@ class FlameMenuProjectConnect(_FlameMenuApp):
|
|||
menu = deepcopy(self.menu)
|
||||
|
||||
menu['actions'].append({
|
||||
"name": "Workfiles ...",
|
||||
"name": "Workfiles...",
|
||||
"execute": lambda x: self.tools_helper.show_workfiles()
|
||||
})
|
||||
menu['actions'].append({
|
||||
"name": "Load ...",
|
||||
"name": "Load...",
|
||||
"execute": lambda x: self.tools_helper.show_loader()
|
||||
})
|
||||
menu['actions'].append({
|
||||
"name": "Manage ...",
|
||||
"name": "Manage...",
|
||||
"execute": lambda x: self.tools_helper.show_scene_inventory()
|
||||
})
|
||||
menu['actions'].append({
|
||||
"name": "Library ...",
|
||||
"name": "Library...",
|
||||
"execute": lambda x: self.tools_helper.show_library_loader()
|
||||
})
|
||||
return menu
|
||||
|
|
@ -164,24 +164,27 @@ class FlameMenuTimeline(_FlameMenuApp):
|
|||
menu = deepcopy(self.menu)
|
||||
|
||||
menu['actions'].append({
|
||||
"name": "Create ...",
|
||||
"name": "Create...",
|
||||
"execute": lambda x: callback_selection(
|
||||
x, self.tools_helper.show_creator)
|
||||
})
|
||||
menu['actions'].append({
|
||||
"name": "Publish ...",
|
||||
"name": "Publish...",
|
||||
"execute": lambda x: callback_selection(
|
||||
x, self.tools_helper.show_publish)
|
||||
})
|
||||
menu['actions'].append({
|
||||
"name": "Load ...",
|
||||
"name": "Load...",
|
||||
"execute": lambda x: self.tools_helper.show_loader()
|
||||
})
|
||||
menu['actions'].append({
|
||||
"name": "Manage ...",
|
||||
"name": "Manage...",
|
||||
"execute": lambda x: self.tools_helper.show_scene_inventory()
|
||||
})
|
||||
|
||||
menu['actions'].append({
|
||||
"name": "Library...",
|
||||
"execute": lambda x: self.tools_helper.show_library_loader()
|
||||
})
|
||||
return menu
|
||||
|
||||
def refresh(self, *args, **kwargs):
|
||||
|
|
|
|||
|
|
@ -4,6 +4,7 @@ Basic avalon integration
|
|||
import os
|
||||
import contextlib
|
||||
from avalon import api as avalon
|
||||
from avalon.pipeline import AVALON_CONTAINER_ID
|
||||
from pyblish import api as pyblish
|
||||
from openpype.api import Logger
|
||||
from .lib import (
|
||||
|
|
@ -28,18 +29,6 @@ log = Logger.get_logger(__name__)
|
|||
|
||||
|
||||
def install():
|
||||
# Disable all families except for the ones we explicitly want to see
|
||||
family_states = [
|
||||
"imagesequence",
|
||||
"render2d",
|
||||
"plate",
|
||||
"render",
|
||||
"mov",
|
||||
"clip"
|
||||
]
|
||||
avalon.data["familiesStateDefault"] = False
|
||||
avalon.data["familiesStateToggled"] = family_states
|
||||
|
||||
|
||||
pyblish.register_host("flame")
|
||||
pyblish.register_plugin_path(PUBLISH_PATH)
|
||||
|
|
@ -68,14 +57,31 @@ def uninstall():
|
|||
log.info("OpenPype Flame host uninstalled ...")
|
||||
|
||||
|
||||
def containerise(tl_segment,
|
||||
def containerise(flame_clip_segment,
|
||||
name,
|
||||
namespace,
|
||||
context,
|
||||
loader=None,
|
||||
data=None):
|
||||
# TODO: containerise
|
||||
pass
|
||||
|
||||
data_imprint = {
|
||||
"schema": "openpype:container-2.0",
|
||||
"id": AVALON_CONTAINER_ID,
|
||||
"name": str(name),
|
||||
"namespace": str(namespace),
|
||||
"loader": str(loader),
|
||||
"representation": str(context["representation"]["_id"]),
|
||||
}
|
||||
|
||||
if data:
|
||||
for k, v in data.items():
|
||||
data_imprint[k] = v
|
||||
|
||||
log.debug("_ data_imprint: {}".format(data_imprint))
|
||||
|
||||
set_segment_data_marker(flame_clip_segment, data_imprint)
|
||||
|
||||
return True
|
||||
|
||||
|
||||
def ls():
|
||||
|
|
|
|||
|
|
@ -1,7 +1,14 @@
|
|||
import os
|
||||
import re
|
||||
import shutil
|
||||
import sys
|
||||
from avalon.vendor import qargparse
|
||||
from xml.etree import ElementTree as ET
|
||||
import six
|
||||
from Qt import QtWidgets, QtCore
|
||||
import openpype.api as openpype
|
||||
from openpype import style
|
||||
import avalon.api as avalon
|
||||
from . import (
|
||||
lib as flib,
|
||||
pipeline as fpipeline,
|
||||
|
|
@ -644,3 +651,274 @@ class PublishableClip:
|
|||
|
||||
# Publishing plugin functions
|
||||
# Loader plugin functions
|
||||
|
||||
class ClipLoader(avalon.Loader):
|
||||
"""A basic clip loader for Flame
|
||||
|
||||
This will implement the basic behavior for a loader to inherit from that
|
||||
will containerize the reference and will implement the `remove` and
|
||||
`update` logic.
|
||||
|
||||
"""
|
||||
|
||||
options = [
|
||||
qargparse.Boolean(
|
||||
"handles",
|
||||
label="Set handles",
|
||||
default=0,
|
||||
help="Also set handles to clip as In/Out marks"
|
||||
)
|
||||
]
|
||||
|
||||
|
||||
class OpenClipSolver:
|
||||
media_script_path = "/opt/Autodesk/mio/current/dl_get_media_info"
|
||||
tmp_name = "_tmp.clip"
|
||||
tmp_file = None
|
||||
create_new_clip = False
|
||||
|
||||
out_feed_nb_ticks = None
|
||||
out_feed_fps = None
|
||||
out_feed_drop_mode = None
|
||||
|
||||
log = log
|
||||
|
||||
def __init__(self, openclip_file_path, feed_data):
|
||||
# test if media script paht exists
|
||||
self._validate_media_script_path()
|
||||
|
||||
# new feed variables:
|
||||
feed_path = feed_data["path"]
|
||||
self.feed_version_name = feed_data["version"]
|
||||
self.feed_colorspace = feed_data.get("colorspace")
|
||||
|
||||
if feed_data.get("logger"):
|
||||
self.log = feed_data["logger"]
|
||||
|
||||
# derivate other feed variables
|
||||
self.feed_basename = os.path.basename(feed_path)
|
||||
self.feed_dir = os.path.dirname(feed_path)
|
||||
self.feed_ext = os.path.splitext(self.feed_basename)[1][1:].lower()
|
||||
|
||||
if not os.path.isfile(openclip_file_path):
|
||||
# openclip does not exist yet and will be created
|
||||
self.tmp_file = self.out_file = openclip_file_path
|
||||
self.create_new_clip = True
|
||||
|
||||
else:
|
||||
# output a temp file
|
||||
self.out_file = openclip_file_path
|
||||
self.tmp_file = os.path.join(self.feed_dir, self.tmp_name)
|
||||
self._clear_tmp_file()
|
||||
|
||||
self.log.info("Temp File: {}".format(self.tmp_file))
|
||||
|
||||
def make(self):
|
||||
self._generate_media_info_file()
|
||||
|
||||
if self.create_new_clip:
|
||||
# New openClip
|
||||
self._create_new_open_clip()
|
||||
else:
|
||||
self._update_open_clip()
|
||||
|
||||
def _validate_media_script_path(self):
|
||||
if not os.path.isfile(self.media_script_path):
|
||||
raise IOError("Media Scirpt does not exist: `{}`".format(
|
||||
self.media_script_path))
|
||||
|
||||
def _generate_media_info_file(self):
|
||||
# Create cmd arguments for gettig xml file info file
|
||||
cmd_args = [
|
||||
self.media_script_path,
|
||||
"-e", self.feed_ext,
|
||||
"-o", self.tmp_file,
|
||||
self.feed_dir
|
||||
]
|
||||
|
||||
# execute creation of clip xml template data
|
||||
try:
|
||||
openpype.run_subprocess(cmd_args)
|
||||
except TypeError:
|
||||
self.log.error("Error creating self.tmp_file")
|
||||
six.reraise(*sys.exc_info())
|
||||
|
||||
def _clear_tmp_file(self):
|
||||
if os.path.isfile(self.tmp_file):
|
||||
os.remove(self.tmp_file)
|
||||
|
||||
def _clear_handler(self, xml_object):
|
||||
for handler in xml_object.findall("./handler"):
|
||||
self.log.debug("Handler found")
|
||||
xml_object.remove(handler)
|
||||
|
||||
def _create_new_open_clip(self):
|
||||
self.log.info("Building new openClip")
|
||||
|
||||
tmp_xml = ET.parse(self.tmp_file)
|
||||
|
||||
tmp_xml_feeds = tmp_xml.find('tracks/track/feeds')
|
||||
tmp_xml_feeds.set('currentVersion', self.feed_version_name)
|
||||
for tmp_feed in tmp_xml_feeds:
|
||||
tmp_feed.set('vuid', self.feed_version_name)
|
||||
|
||||
# add colorspace if any is set
|
||||
if self.feed_colorspace:
|
||||
self._add_colorspace(tmp_feed, self.feed_colorspace)
|
||||
|
||||
self._clear_handler(tmp_feed)
|
||||
|
||||
tmp_xml_versions_obj = tmp_xml.find('versions')
|
||||
tmp_xml_versions_obj.set('currentVersion', self.feed_version_name)
|
||||
for xml_new_version in tmp_xml_versions_obj:
|
||||
xml_new_version.set('uid', self.feed_version_name)
|
||||
xml_new_version.set('type', 'version')
|
||||
|
||||
xml_data = self._fix_xml_data(tmp_xml)
|
||||
self.log.info("Adding feed version: {}".format(self.feed_basename))
|
||||
|
||||
self._write_result_xml_to_file(xml_data)
|
||||
|
||||
self.log.info("openClip Updated: {}".format(self.tmp_file))
|
||||
|
||||
def _update_open_clip(self):
|
||||
self.log.info("Updating openClip ..")
|
||||
|
||||
out_xml = ET.parse(self.out_file)
|
||||
tmp_xml = ET.parse(self.tmp_file)
|
||||
|
||||
self.log.debug(">> out_xml: {}".format(out_xml))
|
||||
self.log.debug(">> tmp_xml: {}".format(tmp_xml))
|
||||
|
||||
# Get new feed from tmp file
|
||||
tmp_xml_feed = tmp_xml.find('tracks/track/feeds/feed')
|
||||
|
||||
self._clear_handler(tmp_xml_feed)
|
||||
self._get_time_info_from_origin(out_xml)
|
||||
|
||||
if self.out_feed_fps:
|
||||
tmp_feed_fps_obj = tmp_xml_feed.find(
|
||||
"startTimecode/rate")
|
||||
tmp_feed_fps_obj.text = self.out_feed_fps
|
||||
if self.out_feed_nb_ticks:
|
||||
tmp_feed_nb_ticks_obj = tmp_xml_feed.find(
|
||||
"startTimecode/nbTicks")
|
||||
tmp_feed_nb_ticks_obj.text = self.out_feed_nb_ticks
|
||||
if self.out_feed_drop_mode:
|
||||
tmp_feed_drop_mode_obj = tmp_xml_feed.find(
|
||||
"startTimecode/dropMode")
|
||||
tmp_feed_drop_mode_obj.text = self.out_feed_drop_mode
|
||||
|
||||
new_path_obj = tmp_xml_feed.find(
|
||||
"spans/span/path")
|
||||
new_path = new_path_obj.text
|
||||
|
||||
feed_added = False
|
||||
if not self._feed_exists(out_xml, new_path):
|
||||
tmp_xml_feed.set('vuid', self.feed_version_name)
|
||||
# Append new temp file feed to .clip source out xml
|
||||
out_track = out_xml.find("tracks/track")
|
||||
# add colorspace if any is set
|
||||
if self.feed_colorspace:
|
||||
self._add_colorspace(tmp_xml_feed, self.feed_colorspace)
|
||||
|
||||
out_feeds = out_track.find('feeds')
|
||||
out_feeds.set('currentVersion', self.feed_version_name)
|
||||
out_feeds.append(tmp_xml_feed)
|
||||
|
||||
self.log.info(
|
||||
"Appending new feed: {}".format(
|
||||
self.feed_version_name))
|
||||
feed_added = True
|
||||
|
||||
if feed_added:
|
||||
# Append vUID to versions
|
||||
out_xml_versions_obj = out_xml.find('versions')
|
||||
out_xml_versions_obj.set(
|
||||
'currentVersion', self.feed_version_name)
|
||||
new_version_obj = ET.Element(
|
||||
"version", {"type": "version", "uid": self.feed_version_name})
|
||||
out_xml_versions_obj.insert(0, new_version_obj)
|
||||
|
||||
xml_data = self._fix_xml_data(out_xml)
|
||||
|
||||
# fist create backup
|
||||
self._create_openclip_backup_file(self.out_file)
|
||||
|
||||
self.log.info("Adding feed version: {}".format(
|
||||
self.feed_version_name))
|
||||
|
||||
self._write_result_xml_to_file(xml_data)
|
||||
|
||||
self.log.info("openClip Updated: {}".format(self.out_file))
|
||||
|
||||
self._clear_tmp_file()
|
||||
|
||||
def _get_time_info_from_origin(self, xml_data):
|
||||
try:
|
||||
for out_track in xml_data.iter('track'):
|
||||
for out_feed in out_track.iter('feed'):
|
||||
out_feed_nb_ticks_obj = out_feed.find(
|
||||
'startTimecode/nbTicks')
|
||||
self.out_feed_nb_ticks = out_feed_nb_ticks_obj.text
|
||||
out_feed_fps_obj = out_feed.find(
|
||||
'startTimecode/rate')
|
||||
self.out_feed_fps = out_feed_fps_obj.text
|
||||
out_feed_drop_mode_obj = out_feed.find(
|
||||
'startTimecode/dropMode')
|
||||
self.out_feed_drop_mode = out_feed_drop_mode_obj.text
|
||||
break
|
||||
else:
|
||||
continue
|
||||
except Exception as msg:
|
||||
self.log.warning(msg)
|
||||
|
||||
def _feed_exists(self, xml_data, path):
|
||||
# loop all available feed paths and check if
|
||||
# the path is not already in file
|
||||
for src_path in xml_data.iter('path'):
|
||||
if path == src_path.text:
|
||||
self.log.warning(
|
||||
"Not appending file as it already is in .clip file")
|
||||
return True
|
||||
|
||||
def _fix_xml_data(self, xml_data):
|
||||
xml_root = xml_data.getroot()
|
||||
self._clear_handler(xml_root)
|
||||
return ET.tostring(xml_root).decode('utf-8')
|
||||
|
||||
def _write_result_xml_to_file(self, xml_data):
|
||||
with open(self.out_file, "w") as f:
|
||||
f.write(xml_data)
|
||||
|
||||
def _create_openclip_backup_file(self, file):
|
||||
bck_file = "{}.bak".format(file)
|
||||
# if backup does not exist
|
||||
if not os.path.isfile(bck_file):
|
||||
shutil.copy2(file, bck_file)
|
||||
else:
|
||||
# in case it exists and is already multiplied
|
||||
created = False
|
||||
for _i in range(1, 99):
|
||||
bck_file = "{name}.bak.{idx:0>2}".format(
|
||||
name=file,
|
||||
idx=_i)
|
||||
# create numbered backup file
|
||||
if not os.path.isfile(bck_file):
|
||||
shutil.copy2(file, bck_file)
|
||||
created = True
|
||||
break
|
||||
# in case numbered does not exists
|
||||
if not created:
|
||||
bck_file = "{}.bak.last".format(file)
|
||||
shutil.copy2(file, bck_file)
|
||||
|
||||
def _add_colorspace(self, feed_obj, profile_name):
|
||||
feed_storage_obj = feed_obj.find("storageFormat")
|
||||
feed_clr_obj = feed_storage_obj.find("colourSpace")
|
||||
if feed_clr_obj is not None:
|
||||
feed_clr_obj = ET.Element(
|
||||
"colourSpace", {"type": "string"})
|
||||
feed_storage_obj.append(feed_clr_obj)
|
||||
|
||||
feed_clr_obj.text = profile_name
|
||||
|
|
|
|||
|
|
@ -4,9 +4,13 @@ import tempfile
|
|||
import contextlib
|
||||
import socket
|
||||
from openpype.lib import (
|
||||
PreLaunchHook, get_openpype_username)
|
||||
PreLaunchHook,
|
||||
get_openpype_username
|
||||
)
|
||||
from openpype.lib.applications import (
|
||||
ApplicationLaunchFailed
|
||||
)
|
||||
from openpype.hosts import flame as opflame
|
||||
import openpype.hosts.flame.api as opfapi
|
||||
import openpype
|
||||
from pprint import pformat
|
||||
|
||||
|
|
@ -33,7 +37,25 @@ class FlamePrelaunch(PreLaunchHook):
|
|||
|
||||
"""Hook entry method."""
|
||||
project_doc = self.data["project_doc"]
|
||||
project_name = project_doc["name"]
|
||||
|
||||
# get image io
|
||||
project_anatomy = self.data["anatomy"]
|
||||
|
||||
# make sure anatomy settings are having flame key
|
||||
if not project_anatomy["imageio"].get("flame"):
|
||||
raise ApplicationLaunchFailed((
|
||||
"Anatomy project settings are missing `flame` key. "
|
||||
"Please make sure you remove project overides on "
|
||||
"Anatomy Image io")
|
||||
)
|
||||
|
||||
imageio_flame = project_anatomy["imageio"]["flame"]
|
||||
|
||||
# get user name and host name
|
||||
user_name = get_openpype_username()
|
||||
user_name = user_name.replace(".", "_")
|
||||
|
||||
hostname = socket.gethostname() # not returning wiretap host name
|
||||
|
||||
self.log.debug("Collected user \"{}\"".format(user_name))
|
||||
|
|
@ -41,7 +63,7 @@ class FlamePrelaunch(PreLaunchHook):
|
|||
_db_p_data = project_doc["data"]
|
||||
width = _db_p_data["resolutionWidth"]
|
||||
height = _db_p_data["resolutionHeight"]
|
||||
fps = int(_db_p_data["fps"])
|
||||
fps = float(_db_p_data["fps"])
|
||||
|
||||
project_data = {
|
||||
"Name": project_doc["name"],
|
||||
|
|
@ -52,8 +74,8 @@ class FlamePrelaunch(PreLaunchHook):
|
|||
"FrameHeight": int(height),
|
||||
"AspectRatio": float((width / height) * _db_p_data["pixelAspect"]),
|
||||
"FrameRate": "{} fps".format(fps),
|
||||
"FrameDepth": "16-bit fp",
|
||||
"FieldDominance": "PROGRESSIVE"
|
||||
"FrameDepth": str(imageio_flame["project"]["frameDepth"]),
|
||||
"FieldDominance": str(imageio_flame["project"]["fieldDominance"])
|
||||
}
|
||||
|
||||
data_to_script = {
|
||||
|
|
@ -61,10 +83,10 @@ class FlamePrelaunch(PreLaunchHook):
|
|||
"host_name": _env.get("FLAME_WIRETAP_HOSTNAME") or hostname,
|
||||
"volume_name": _env.get("FLAME_WIRETAP_VOLUME"),
|
||||
"group_name": _env.get("FLAME_WIRETAP_GROUP"),
|
||||
"color_policy": "ACES 1.1",
|
||||
"color_policy": str(imageio_flame["project"]["colourPolicy"]),
|
||||
|
||||
# from project
|
||||
"project_name": project_doc["name"],
|
||||
"project_name": project_name,
|
||||
"user_name": user_name,
|
||||
"project_data": project_data
|
||||
}
|
||||
|
|
@ -77,8 +99,6 @@ class FlamePrelaunch(PreLaunchHook):
|
|||
|
||||
app_arguments = self._get_launch_arguments(data_to_script)
|
||||
|
||||
opfapi.setup(self.launch_context.env)
|
||||
|
||||
self.launch_context.launch_args.extend(app_arguments)
|
||||
|
||||
def _add_pythonpath(self):
|
||||
|
|
|
|||
247
openpype/hosts/flame/plugins/load/load_clip.py
Normal file
247
openpype/hosts/flame/plugins/load/load_clip.py
Normal file
|
|
@ -0,0 +1,247 @@
|
|||
import os
|
||||
import flame
|
||||
from pprint import pformat
|
||||
import openpype.hosts.flame.api as opfapi
|
||||
|
||||
|
||||
class LoadClip(opfapi.ClipLoader):
|
||||
"""Load a subset to timeline as clip
|
||||
|
||||
Place clip to timeline on its asset origin timings collected
|
||||
during conforming to project
|
||||
"""
|
||||
|
||||
families = ["render2d", "source", "plate", "render", "review"]
|
||||
representations = ["exr", "dpx", "jpg", "jpeg", "png", "h264"]
|
||||
|
||||
label = "Load as clip"
|
||||
order = -10
|
||||
icon = "code-fork"
|
||||
color = "orange"
|
||||
|
||||
# settings
|
||||
reel_group_name = "OpenPype_Reels"
|
||||
reel_name = "Loaded"
|
||||
clip_name_template = "{asset}_{subset}_{representation}"
|
||||
|
||||
def load(self, context, name, namespace, options):
|
||||
|
||||
# get flame objects
|
||||
fproject = flame.project.current_project
|
||||
self.fpd = fproject.current_workspace.desktop
|
||||
|
||||
# load clip to timeline and get main variables
|
||||
namespace = namespace
|
||||
version = context['version']
|
||||
version_data = version.get("data", {})
|
||||
version_name = version.get("name", None)
|
||||
colorspace = version_data.get("colorspace", None)
|
||||
clip_name = self.clip_name_template.format(
|
||||
**context["representation"]["context"])
|
||||
|
||||
# todo: settings in imageio
|
||||
# convert colorspace with ocio to flame mapping
|
||||
# in imageio flame section
|
||||
colorspace = colorspace
|
||||
|
||||
# create workfile path
|
||||
workfile_dir = os.environ["AVALON_WORKDIR"]
|
||||
openclip_dir = os.path.join(
|
||||
workfile_dir, clip_name
|
||||
)
|
||||
openclip_path = os.path.join(
|
||||
openclip_dir, clip_name + ".clip"
|
||||
)
|
||||
if not os.path.exists(openclip_dir):
|
||||
os.makedirs(openclip_dir)
|
||||
|
||||
# prepare clip data from context ad send it to openClipLoader
|
||||
loading_context = {
|
||||
"path": self.fname.replace("\\", "/"),
|
||||
"colorspace": colorspace,
|
||||
"version": "v{:0>3}".format(version_name),
|
||||
"logger": self.log
|
||||
|
||||
}
|
||||
self.log.debug(pformat(
|
||||
loading_context
|
||||
))
|
||||
self.log.debug(openclip_path)
|
||||
|
||||
# make openpype clip file
|
||||
opfapi.OpenClipSolver(openclip_path, loading_context).make()
|
||||
|
||||
# prepare Reel group in actual desktop
|
||||
opc = self._get_clip(
|
||||
clip_name,
|
||||
openclip_path
|
||||
)
|
||||
|
||||
# add additional metadata from the version to imprint Avalon knob
|
||||
add_keys = [
|
||||
"frameStart", "frameEnd", "source", "author",
|
||||
"fps", "handleStart", "handleEnd"
|
||||
]
|
||||
|
||||
# move all version data keys to tag data
|
||||
data_imprint = {}
|
||||
for key in add_keys:
|
||||
data_imprint.update({
|
||||
key: version_data.get(key, str(None))
|
||||
})
|
||||
|
||||
# add variables related to version context
|
||||
data_imprint.update({
|
||||
"version": version_name,
|
||||
"colorspace": colorspace,
|
||||
"objectName": clip_name
|
||||
})
|
||||
|
||||
# TODO: finish the containerisation
|
||||
# opc_segment = opfapi.get_clip_segment(opc)
|
||||
|
||||
# return opfapi.containerise(
|
||||
# opc_segment,
|
||||
# name, namespace, context,
|
||||
# self.__class__.__name__,
|
||||
# data_imprint)
|
||||
|
||||
return opc
|
||||
|
||||
def _get_clip(self, name, clip_path):
|
||||
reel = self._get_reel()
|
||||
# with maintained openclip as opc
|
||||
matching_clip = [cl for cl in reel.clips
|
||||
if cl.name.get_value() == name]
|
||||
if matching_clip:
|
||||
return matching_clip.pop()
|
||||
else:
|
||||
created_clips = flame.import_clips(str(clip_path), reel)
|
||||
return created_clips.pop()
|
||||
|
||||
def _get_reel(self):
|
||||
|
||||
matching_rgroup = [
|
||||
rg for rg in self.fpd.reel_groups
|
||||
if rg.name.get_value() == self.reel_group_name
|
||||
]
|
||||
|
||||
if not matching_rgroup:
|
||||
reel_group = self.fpd.create_reel_group(str(self.reel_group_name))
|
||||
for _r in reel_group.reels:
|
||||
if "reel" not in _r.name.get_value().lower():
|
||||
continue
|
||||
self.log.debug("Removing: {}".format(_r.name))
|
||||
flame.delete(_r)
|
||||
else:
|
||||
reel_group = matching_rgroup.pop()
|
||||
|
||||
matching_reel = [
|
||||
re for re in reel_group.reels
|
||||
if re.name.get_value() == self.reel_name
|
||||
]
|
||||
|
||||
if not matching_reel:
|
||||
reel_group = reel_group.create_reel(str(self.reel_name))
|
||||
else:
|
||||
reel_group = matching_reel.pop()
|
||||
|
||||
return reel_group
|
||||
|
||||
def _get_segment_from_clip(self, clip):
|
||||
# unwrapping segment from input clip
|
||||
pass
|
||||
|
||||
# def switch(self, container, representation):
|
||||
# self.update(container, representation)
|
||||
|
||||
# def update(self, container, representation):
|
||||
# """ Updating previously loaded clips
|
||||
# """
|
||||
|
||||
# # load clip to timeline and get main variables
|
||||
# name = container['name']
|
||||
# namespace = container['namespace']
|
||||
# track_item = phiero.get_track_items(
|
||||
# track_item_name=namespace)
|
||||
# version = io.find_one({
|
||||
# "type": "version",
|
||||
# "_id": representation["parent"]
|
||||
# })
|
||||
# version_data = version.get("data", {})
|
||||
# version_name = version.get("name", None)
|
||||
# colorspace = version_data.get("colorspace", None)
|
||||
# object_name = "{}_{}".format(name, namespace)
|
||||
# file = api.get_representation_path(representation).replace("\\", "/")
|
||||
# clip = track_item.source()
|
||||
|
||||
# # reconnect media to new path
|
||||
# clip.reconnectMedia(file)
|
||||
|
||||
# # set colorspace
|
||||
# if colorspace:
|
||||
# clip.setSourceMediaColourTransform(colorspace)
|
||||
|
||||
# # add additional metadata from the version to imprint Avalon knob
|
||||
# add_keys = [
|
||||
# "frameStart", "frameEnd", "source", "author",
|
||||
# "fps", "handleStart", "handleEnd"
|
||||
# ]
|
||||
|
||||
# # move all version data keys to tag data
|
||||
# data_imprint = {}
|
||||
# for key in add_keys:
|
||||
# data_imprint.update({
|
||||
# key: version_data.get(key, str(None))
|
||||
# })
|
||||
|
||||
# # add variables related to version context
|
||||
# data_imprint.update({
|
||||
# "representation": str(representation["_id"]),
|
||||
# "version": version_name,
|
||||
# "colorspace": colorspace,
|
||||
# "objectName": object_name
|
||||
# })
|
||||
|
||||
# # update color of clip regarding the version order
|
||||
# self.set_item_color(track_item, version)
|
||||
|
||||
# return phiero.update_container(track_item, data_imprint)
|
||||
|
||||
# def remove(self, container):
|
||||
# """ Removing previously loaded clips
|
||||
# """
|
||||
# # load clip to timeline and get main variables
|
||||
# namespace = container['namespace']
|
||||
# track_item = phiero.get_track_items(
|
||||
# track_item_name=namespace)
|
||||
# track = track_item.parent()
|
||||
|
||||
# # remove track item from track
|
||||
# track.removeItem(track_item)
|
||||
|
||||
# @classmethod
|
||||
# def multiselection(cls, track_item):
|
||||
# if not cls.track:
|
||||
# cls.track = track_item.parent()
|
||||
# cls.sequence = cls.track.parent()
|
||||
|
||||
# @classmethod
|
||||
# def set_item_color(cls, track_item, version):
|
||||
|
||||
# clip = track_item.source()
|
||||
# # define version name
|
||||
# version_name = version.get("name", None)
|
||||
# # get all versions in list
|
||||
# versions = io.find({
|
||||
# "type": "version",
|
||||
# "parent": version["parent"]
|
||||
# }).distinct('name')
|
||||
|
||||
# max_version = max(versions)
|
||||
|
||||
# # set clip colour
|
||||
# if version_name == max_version:
|
||||
# clip.binItem().setColor(cls.clip_color_last)
|
||||
# else:
|
||||
# clip.binItem().setColor(cls.clip_color)
|
||||
|
|
@ -22,6 +22,7 @@ class ExtractSubsetResources(openpype.api.Extractor):
|
|||
"ext": "jpg",
|
||||
"xml_preset_file": "Jpeg (8-bit).xml",
|
||||
"xml_preset_dir": "",
|
||||
"colorspace_out": "Output - sRGB",
|
||||
"representation_add_range": False,
|
||||
"representation_tags": ["thumbnail"]
|
||||
},
|
||||
|
|
@ -29,6 +30,7 @@ class ExtractSubsetResources(openpype.api.Extractor):
|
|||
"ext": "mov",
|
||||
"xml_preset_file": "Apple iPad (1920x1080).xml",
|
||||
"xml_preset_dir": "",
|
||||
"colorspace_out": "Output - Rec.709",
|
||||
"representation_add_range": True,
|
||||
"representation_tags": [
|
||||
"review",
|
||||
|
|
@ -45,7 +47,6 @@ class ExtractSubsetResources(openpype.api.Extractor):
|
|||
export_presets_mapping = {}
|
||||
|
||||
def process(self, instance):
|
||||
|
||||
if (
|
||||
self.keep_original_representation
|
||||
and "representations" not in instance.data
|
||||
|
|
@ -84,6 +85,7 @@ class ExtractSubsetResources(openpype.api.Extractor):
|
|||
preset_file = preset_config["xml_preset_file"]
|
||||
preset_dir = preset_config["xml_preset_dir"]
|
||||
repre_tags = preset_config["representation_tags"]
|
||||
color_out = preset_config["colorspace_out"]
|
||||
|
||||
# validate xml preset file is filled
|
||||
if preset_file == "":
|
||||
|
|
@ -129,17 +131,31 @@ class ExtractSubsetResources(openpype.api.Extractor):
|
|||
opfapi.export_clip(
|
||||
export_dir_path, duplclip, preset_path, **kwargs)
|
||||
|
||||
extension = preset_config["ext"]
|
||||
# create representation data
|
||||
representation_data = {
|
||||
"name": unique_name,
|
||||
"outputName": unique_name,
|
||||
"ext": preset_config["ext"],
|
||||
"ext": extension,
|
||||
"stagingDir": export_dir_path,
|
||||
"tags": repre_tags
|
||||
"tags": repre_tags,
|
||||
"data": {
|
||||
"colorspace": color_out
|
||||
}
|
||||
}
|
||||
|
||||
# collect all available content of export dir
|
||||
files = os.listdir(export_dir_path)
|
||||
|
||||
# make sure no nested folders inside
|
||||
n_stage_dir, n_files = self._unfolds_nested_folders(
|
||||
export_dir_path, files, extension)
|
||||
|
||||
# fix representation in case of nested folders
|
||||
if n_stage_dir:
|
||||
representation_data["stagingDir"] = n_stage_dir
|
||||
files = n_files
|
||||
|
||||
# add files to representation but add
|
||||
# imagesequence as list
|
||||
if (
|
||||
|
|
@ -170,3 +186,63 @@ class ExtractSubsetResources(openpype.api.Extractor):
|
|||
|
||||
self.log.debug("All representations: {}".format(
|
||||
pformat(instance.data["representations"])))
|
||||
|
||||
def _unfolds_nested_folders(self, stage_dir, files_list, ext):
|
||||
"""Unfolds nested folders
|
||||
|
||||
Args:
|
||||
stage_dir (str): path string with directory
|
||||
files_list (list): list of file names
|
||||
ext (str): extension (jpg)[without dot]
|
||||
|
||||
Raises:
|
||||
AssertionError: in case no files were collected from any directory
|
||||
|
||||
Returns:
|
||||
str, list: new staging dir path, new list of file names
|
||||
or
|
||||
None, None: In case single file in `files_list`
|
||||
"""
|
||||
# exclude single files which have an extension
|
||||
# the same as input ext attr
|
||||
if (
|
||||
# only one file in list
|
||||
len(files_list) == 1
|
||||
# file has the same extension as the input
|
||||
and ext in os.path.splitext(files_list[0])[-1]
|
||||
):
|
||||
return None, None
|
||||
elif (
|
||||
# more than one file in list
|
||||
len(files_list) >= 1
|
||||
# extension is correct
|
||||
and ext in os.path.splitext(files_list[0])[-1]
|
||||
# test file exists
|
||||
and os.path.exists(
|
||||
os.path.join(stage_dir, files_list[0])
|
||||
)
|
||||
):
|
||||
return None, None
|
||||
|
||||
new_stage_dir = None
|
||||
new_files_list = []
|
||||
for file in files_list:
|
||||
search_path = os.path.join(stage_dir, file)
|
||||
if not os.path.isdir(search_path):
|
||||
continue
|
||||
for root, _dirs, files in os.walk(search_path):
|
||||
for _file in files:
|
||||
_fn, _ext = os.path.splitext(_file)
|
||||
if ext.lower() != _ext[1:].lower():
|
||||
continue
|
||||
new_files_list.append(_file)
|
||||
if not new_stage_dir:
|
||||
new_stage_dir = root
|
||||
|
||||
if not new_stage_dir:
|
||||
raise AssertionError(
|
||||
"Files in `{}` are not correct! Check `{}`".format(
|
||||
files_list, stage_dir)
|
||||
)
|
||||
|
||||
return new_stage_dir, new_files_list
|
||||
|
|
|
|||
24
openpype/hosts/flame/plugins/publish/validate_source_clip.py
Normal file
|
|
@ -0,0 +1,24 @@
|
|||
import pyblish
|
||||
|
||||
|
||||
@pyblish.api.log
|
||||
class ValidateSourceClip(pyblish.api.InstancePlugin):
|
||||
"""Validate instance is not having empty `flameSourceClip`"""
|
||||
|
||||
order = pyblish.api.ValidatorOrder
|
||||
label = "Validate Source Clip"
|
||||
hosts = ["flame"]
|
||||
families = ["clip"]
|
||||
|
||||
def process(self, instance):
|
||||
flame_source_clip = instance.data["flameSourceClip"]
|
||||
|
||||
self.log.debug("_ flame_source_clip: {}".format(flame_source_clip))
|
||||
|
||||
if flame_source_clip is None:
|
||||
raise AttributeError((
|
||||
"Timeline segment `{}` is not having "
|
||||
"relative clip in reels. Please make sure "
|
||||
"you push `Save Sources` button in Conform Tab").format(
|
||||
instance.data["asset"]
|
||||
))
|
||||
|
|
@ -3,6 +3,10 @@ from __future__ import print_function
|
|||
import os
|
||||
import sys
|
||||
|
||||
# only testing dependency for nested modules in package
|
||||
import six # noqa
|
||||
|
||||
|
||||
SCRIPT_DIR = os.path.dirname(__file__)
|
||||
PACKAGE_DIR = os.path.join(SCRIPT_DIR, "modules")
|
||||
sys.path.append(PACKAGE_DIR)
|
||||
|
|
@ -1,14 +1,27 @@
|
|||
from .pipeline import (
|
||||
install,
|
||||
uninstall
|
||||
uninstall,
|
||||
|
||||
ls,
|
||||
|
||||
imprint_container,
|
||||
parse_container,
|
||||
|
||||
get_current_comp,
|
||||
comp_lock_and_undo_chunk
|
||||
)
|
||||
|
||||
from .utils import (
|
||||
setup
|
||||
from .workio import (
|
||||
open_file,
|
||||
save_file,
|
||||
current_file,
|
||||
has_unsaved_changes,
|
||||
file_extensions,
|
||||
work_root
|
||||
)
|
||||
|
||||
|
||||
from .lib import (
|
||||
maintained_selection,
|
||||
get_additional_data,
|
||||
update_frame_range
|
||||
)
|
||||
|
|
@ -20,11 +33,24 @@ __all__ = [
|
|||
# pipeline
|
||||
"install",
|
||||
"uninstall",
|
||||
"ls",
|
||||
|
||||
# utils
|
||||
"setup",
|
||||
"imprint_container",
|
||||
"parse_container",
|
||||
|
||||
"get_current_comp",
|
||||
"comp_lock_and_undo_chunk",
|
||||
|
||||
# workio
|
||||
"open_file",
|
||||
"save_file",
|
||||
"current_file",
|
||||
"has_unsaved_changes",
|
||||
"file_extensions",
|
||||
"work_root",
|
||||
|
||||
# lib
|
||||
"maintained_selection",
|
||||
"get_additional_data",
|
||||
"update_frame_range",
|
||||
|
||||
|
|
|
|||
|
|
@ -1,8 +1,13 @@
|
|||
import os
|
||||
import sys
|
||||
import re
|
||||
import contextlib
|
||||
|
||||
from Qt import QtGui
|
||||
import avalon.fusion
|
||||
|
||||
import avalon.api
|
||||
from avalon import io
|
||||
from .pipeline import get_current_comp, comp_lock_and_undo_chunk
|
||||
|
||||
self = sys.modules[__name__]
|
||||
self._project = None
|
||||
|
|
@ -24,7 +29,7 @@ def update_frame_range(start, end, comp=None, set_render_range=True):
|
|||
"""
|
||||
|
||||
if not comp:
|
||||
comp = avalon.fusion.get_current_comp()
|
||||
comp = get_current_comp()
|
||||
|
||||
attrs = {
|
||||
"COMPN_GlobalStart": start,
|
||||
|
|
@ -37,7 +42,7 @@ def update_frame_range(start, end, comp=None, set_render_range=True):
|
|||
"COMPN_RenderEnd": end
|
||||
})
|
||||
|
||||
with avalon.fusion.comp_lock_and_undo_chunk(comp):
|
||||
with comp_lock_and_undo_chunk(comp):
|
||||
comp.SetAttrs(attrs)
|
||||
|
||||
|
||||
|
|
@ -140,3 +145,51 @@ def switch_item(container,
|
|||
avalon.api.switch(container, representation)
|
||||
|
||||
return representation
|
||||
|
||||
|
||||
@contextlib.contextmanager
|
||||
def maintained_selection():
|
||||
comp = get_current_comp()
|
||||
previous_selection = comp.GetToolList(True).values()
|
||||
try:
|
||||
yield
|
||||
finally:
|
||||
flow = comp.CurrentFrame.FlowView
|
||||
flow.Select() # No args equals clearing selection
|
||||
if previous_selection:
|
||||
for tool in previous_selection:
|
||||
flow.Select(tool, True)
|
||||
|
||||
|
||||
def get_frame_path(path):
|
||||
"""Get filename for the Fusion Saver with padded number as '#'
|
||||
|
||||
>>> get_frame_path("C:/test.exr")
|
||||
('C:/test', 4, '.exr')
|
||||
|
||||
>>> get_frame_path("filename.00.tif")
|
||||
('filename.', 2, '.tif')
|
||||
|
||||
>>> get_frame_path("foobar35.tif")
|
||||
('foobar', 2, '.tif')
|
||||
|
||||
Args:
|
||||
path (str): The path to render to.
|
||||
|
||||
Returns:
|
||||
tuple: head, padding, tail (extension)
|
||||
|
||||
"""
|
||||
filename, ext = os.path.splitext(path)
|
||||
|
||||
# Find a final number group
|
||||
match = re.match('.*?([0-9]+)$', filename)
|
||||
if match:
|
||||
padding = len(match.group(1))
|
||||
# remove number from end since fusion
|
||||
# will swap it with the frame number
|
||||
filename = filename[:-padding]
|
||||
else:
|
||||
padding = 4 # default Fusion padding
|
||||
|
||||
return filename, padding, ext
|
||||
|
|
|
|||
|
|
@ -3,6 +3,7 @@ import sys
|
|||
|
||||
from Qt import QtWidgets, QtCore
|
||||
|
||||
from openpype import style
|
||||
from openpype.tools.utils import host_tools
|
||||
|
||||
from openpype.hosts.fusion.scripts import (
|
||||
|
|
@ -54,13 +55,13 @@ class OpenPypeMenu(QtWidgets.QWidget):
|
|||
)
|
||||
self.render_mode_widget = None
|
||||
self.setWindowTitle("OpenPype")
|
||||
workfiles_btn = QtWidgets.QPushButton("Workfiles ...", self)
|
||||
create_btn = QtWidgets.QPushButton("Create ...", self)
|
||||
publish_btn = QtWidgets.QPushButton("Publish ...", self)
|
||||
load_btn = QtWidgets.QPushButton("Load ...", self)
|
||||
inventory_btn = QtWidgets.QPushButton("Inventory ...", self)
|
||||
libload_btn = QtWidgets.QPushButton("Library ...", self)
|
||||
rendermode_btn = QtWidgets.QPushButton("Set render mode ...", self)
|
||||
workfiles_btn = QtWidgets.QPushButton("Workfiles...", self)
|
||||
create_btn = QtWidgets.QPushButton("Create...", self)
|
||||
publish_btn = QtWidgets.QPushButton("Publish...", self)
|
||||
load_btn = QtWidgets.QPushButton("Load...", self)
|
||||
manager_btn = QtWidgets.QPushButton("Manage...", self)
|
||||
libload_btn = QtWidgets.QPushButton("Library...", self)
|
||||
rendermode_btn = QtWidgets.QPushButton("Set render mode...", self)
|
||||
duplicate_with_inputs_btn = QtWidgets.QPushButton(
|
||||
"Duplicate with input connections", self
|
||||
)
|
||||
|
|
@ -75,7 +76,7 @@ class OpenPypeMenu(QtWidgets.QWidget):
|
|||
layout.addWidget(create_btn)
|
||||
layout.addWidget(publish_btn)
|
||||
layout.addWidget(load_btn)
|
||||
layout.addWidget(inventory_btn)
|
||||
layout.addWidget(manager_btn)
|
||||
|
||||
layout.addWidget(Spacer(15, self))
|
||||
|
||||
|
|
@ -96,7 +97,7 @@ class OpenPypeMenu(QtWidgets.QWidget):
|
|||
create_btn.clicked.connect(self.on_create_clicked)
|
||||
publish_btn.clicked.connect(self.on_publish_clicked)
|
||||
load_btn.clicked.connect(self.on_load_clicked)
|
||||
inventory_btn.clicked.connect(self.on_inventory_clicked)
|
||||
manager_btn.clicked.connect(self.on_manager_clicked)
|
||||
libload_btn.clicked.connect(self.on_libload_clicked)
|
||||
rendermode_btn.clicked.connect(self.on_rendernode_clicked)
|
||||
duplicate_with_inputs_btn.clicked.connect(
|
||||
|
|
@ -119,8 +120,8 @@ class OpenPypeMenu(QtWidgets.QWidget):
|
|||
print("Clicked Load")
|
||||
host_tools.show_loader(use_context=True)
|
||||
|
||||
def on_inventory_clicked(self):
|
||||
print("Clicked Inventory")
|
||||
def on_manager_clicked(self):
|
||||
print("Clicked Manager")
|
||||
host_tools.show_scene_inventory()
|
||||
|
||||
def on_libload_clicked(self):
|
||||
|
|
@ -128,7 +129,6 @@ class OpenPypeMenu(QtWidgets.QWidget):
|
|||
host_tools.show_library_loader()
|
||||
|
||||
def on_rendernode_clicked(self):
|
||||
from avalon import style
|
||||
print("Clicked Set Render Mode")
|
||||
if self.render_mode_widget is None:
|
||||
window = set_rendermode.SetRenderMode()
|
||||
|
|
|
|||
|
|
@ -2,9 +2,14 @@
|
|||
Basic avalon integration
|
||||
"""
|
||||
import os
|
||||
import sys
|
||||
import logging
|
||||
import contextlib
|
||||
|
||||
import pyblish.api
|
||||
import avalon.api
|
||||
from avalon.pipeline import AVALON_CONTAINER_ID
|
||||
|
||||
from avalon import api as avalon
|
||||
from pyblish import api as pyblish
|
||||
from openpype.api import Logger
|
||||
import openpype.hosts.fusion
|
||||
|
||||
|
|
@ -19,6 +24,14 @@ CREATE_PATH = os.path.join(PLUGINS_DIR, "create")
|
|||
INVENTORY_PATH = os.path.join(PLUGINS_DIR, "inventory")
|
||||
|
||||
|
||||
class CompLogHandler(logging.Handler):
|
||||
def emit(self, record):
|
||||
entry = self.format(record)
|
||||
comp = get_current_comp()
|
||||
if comp:
|
||||
comp.Print(entry)
|
||||
|
||||
|
||||
def install():
|
||||
"""Install fusion-specific functionality of avalon-core.
|
||||
|
||||
|
|
@ -30,25 +43,32 @@ def install():
|
|||
See the Maya equivalent for inspiration on how to implement this.
|
||||
|
||||
"""
|
||||
# Remove all handlers associated with the root logger object, because
|
||||
# that one sometimes logs as "warnings" incorrectly.
|
||||
for handler in logging.root.handlers[:]:
|
||||
logging.root.removeHandler(handler)
|
||||
|
||||
# Disable all families except for the ones we explicitly want to see
|
||||
family_states = ["imagesequence",
|
||||
"camera",
|
||||
"pointcache"]
|
||||
avalon.data["familiesStateDefault"] = False
|
||||
avalon.data["familiesStateToggled"] = family_states
|
||||
# Attach default logging handler that prints to active comp
|
||||
logger = logging.getLogger()
|
||||
formatter = logging.Formatter(fmt="%(message)s\n")
|
||||
handler = CompLogHandler()
|
||||
handler.setFormatter(formatter)
|
||||
logger.addHandler(handler)
|
||||
logger.setLevel(logging.DEBUG)
|
||||
|
||||
log.info("openpype.hosts.fusion installed")
|
||||
|
||||
pyblish.register_host("fusion")
|
||||
pyblish.register_plugin_path(PUBLISH_PATH)
|
||||
pyblish.api.register_host("fusion")
|
||||
pyblish.api.register_plugin_path(PUBLISH_PATH)
|
||||
log.info("Registering Fusion plug-ins..")
|
||||
|
||||
avalon.register_plugin_path(avalon.Loader, LOAD_PATH)
|
||||
avalon.register_plugin_path(avalon.Creator, CREATE_PATH)
|
||||
avalon.register_plugin_path(avalon.InventoryAction, INVENTORY_PATH)
|
||||
avalon.api.register_plugin_path(avalon.api.Loader, LOAD_PATH)
|
||||
avalon.api.register_plugin_path(avalon.api.Creator, CREATE_PATH)
|
||||
avalon.api.register_plugin_path(avalon.api.InventoryAction, INVENTORY_PATH)
|
||||
|
||||
pyblish.register_callback("instanceToggled", on_pyblish_instance_toggled)
|
||||
pyblish.api.register_callback(
|
||||
"instanceToggled", on_pyblish_instance_toggled
|
||||
)
|
||||
|
||||
|
||||
def uninstall():
|
||||
|
|
@ -62,22 +82,23 @@ def uninstall():
|
|||
modifying the menu or registered families.
|
||||
|
||||
"""
|
||||
pyblish.deregister_host("fusion")
|
||||
pyblish.deregister_plugin_path(PUBLISH_PATH)
|
||||
pyblish.api.deregister_host("fusion")
|
||||
pyblish.api.deregister_plugin_path(PUBLISH_PATH)
|
||||
log.info("Deregistering Fusion plug-ins..")
|
||||
|
||||
avalon.deregister_plugin_path(avalon.Loader, LOAD_PATH)
|
||||
avalon.deregister_plugin_path(avalon.Creator, CREATE_PATH)
|
||||
avalon.deregister_plugin_path(avalon.InventoryAction, INVENTORY_PATH)
|
||||
avalon.api.deregister_plugin_path(avalon.api.Loader, LOAD_PATH)
|
||||
avalon.api.deregister_plugin_path(avalon.api.Creator, CREATE_PATH)
|
||||
avalon.api.deregister_plugin_path(
|
||||
avalon.api.InventoryAction, INVENTORY_PATH
|
||||
)
|
||||
|
||||
pyblish.deregister_callback("instanceToggled", on_pyblish_instance_toggled)
|
||||
pyblish.api.deregister_callback(
|
||||
"instanceToggled", on_pyblish_instance_toggled
|
||||
)
|
||||
|
||||
|
||||
def on_pyblish_instance_toggled(instance, new_value, old_value):
|
||||
"""Toggle saver tool passthrough states on instance toggles."""
|
||||
|
||||
from avalon.fusion import comp_lock_and_undo_chunk
|
||||
|
||||
comp = instance.context.data.get("currentComp")
|
||||
if not comp:
|
||||
return
|
||||
|
|
@ -97,3 +118,106 @@ def on_pyblish_instance_toggled(instance, new_value, old_value):
|
|||
current = attrs["TOOLB_PassThrough"]
|
||||
if current != passthrough:
|
||||
tool.SetAttrs({"TOOLB_PassThrough": passthrough})
|
||||
|
||||
|
||||
def ls():
|
||||
"""List containers from active Fusion scene
|
||||
|
||||
This is the host-equivalent of api.ls(), but instead of listing
|
||||
assets on disk, it lists assets already loaded in Fusion; once loaded
|
||||
they are called 'containers'
|
||||
|
||||
Yields:
|
||||
dict: container
|
||||
|
||||
"""
|
||||
|
||||
comp = get_current_comp()
|
||||
tools = comp.GetToolList(False, "Loader").values()
|
||||
|
||||
for tool in tools:
|
||||
container = parse_container(tool)
|
||||
if container:
|
||||
yield container
|
||||
|
||||
|
||||
def imprint_container(tool,
|
||||
name,
|
||||
namespace,
|
||||
context,
|
||||
loader=None):
|
||||
"""Imprint a Loader with metadata
|
||||
|
||||
Containerisation enables a tracking of version, author and origin
|
||||
for loaded assets.
|
||||
|
||||
Arguments:
|
||||
tool (object): The node in Fusion to imprint as container, usually a
|
||||
Loader.
|
||||
name (str): Name of resulting assembly
|
||||
namespace (str): Namespace under which to host container
|
||||
context (dict): Asset information
|
||||
loader (str, optional): Name of loader used to produce this container.
|
||||
|
||||
Returns:
|
||||
None
|
||||
|
||||
"""
|
||||
|
||||
data = [
|
||||
("schema", "openpype:container-2.0"),
|
||||
("id", AVALON_CONTAINER_ID),
|
||||
("name", str(name)),
|
||||
("namespace", str(namespace)),
|
||||
("loader", str(loader)),
|
||||
("representation", str(context["representation"]["_id"])),
|
||||
]
|
||||
|
||||
for key, value in data:
|
||||
tool.SetData("avalon.{}".format(key), value)
|
||||
|
||||
|
||||
def parse_container(tool):
|
||||
"""Returns imprinted container data of a tool
|
||||
|
||||
This reads the imprinted data from `imprint_container`.
|
||||
|
||||
"""
|
||||
|
||||
data = tool.GetData('avalon')
|
||||
if not isinstance(data, dict):
|
||||
return
|
||||
|
||||
# If not all required data return the empty container
|
||||
required = ['schema', 'id', 'name',
|
||||
'namespace', 'loader', 'representation']
|
||||
if not all(key in data for key in required):
|
||||
return
|
||||
|
||||
container = {key: data[key] for key in required}
|
||||
|
||||
# Store the tool's name
|
||||
container["objectName"] = tool.Name
|
||||
|
||||
# Store reference to the tool object
|
||||
container["_tool"] = tool
|
||||
|
||||
return container
|
||||
|
||||
|
||||
def get_current_comp():
|
||||
"""Hack to get current comp in this session"""
|
||||
fusion = getattr(sys.modules["__main__"], "fusion", None)
|
||||
return fusion.CurrentComp if fusion else None
|
||||
|
||||
|
||||
@contextlib.contextmanager
|
||||
def comp_lock_and_undo_chunk(comp, undo_queue_name="Script CMD"):
|
||||
"""Lock comp and open an undo chunk during the context"""
|
||||
try:
|
||||
comp.Lock()
|
||||
comp.StartUndo(undo_queue_name)
|
||||
yield
|
||||
finally:
|
||||
comp.Unlock()
|
||||
comp.EndUndo()
|
||||
|
|
|
|||
|
|
@ -1,86 +0,0 @@
|
|||
#! python3
|
||||
|
||||
"""
|
||||
Fusion tools for setting environment
|
||||
"""
|
||||
|
||||
import os
|
||||
import shutil
|
||||
|
||||
from openpype.api import Logger
|
||||
import openpype.hosts.fusion
|
||||
|
||||
log = Logger().get_logger(__name__)
|
||||
|
||||
|
||||
def _sync_utility_scripts(env=None):
|
||||
""" Synchronizing basic utlility scripts for resolve.
|
||||
|
||||
To be able to run scripts from inside `Fusion/Workspace/Scripts` menu
|
||||
all scripts has to be accessible from defined folder.
|
||||
"""
|
||||
if not env:
|
||||
env = os.environ
|
||||
|
||||
# initiate inputs
|
||||
scripts = {}
|
||||
us_env = env.get("FUSION_UTILITY_SCRIPTS_SOURCE_DIR")
|
||||
us_dir = env.get("FUSION_UTILITY_SCRIPTS_DIR", "")
|
||||
us_paths = [os.path.join(
|
||||
os.path.dirname(os.path.abspath(openpype.hosts.fusion.__file__)),
|
||||
"utility_scripts"
|
||||
)]
|
||||
|
||||
# collect script dirs
|
||||
if us_env:
|
||||
log.info(f"Utility Scripts Env: `{us_env}`")
|
||||
us_paths = us_env.split(
|
||||
os.pathsep) + us_paths
|
||||
|
||||
# collect scripts from dirs
|
||||
for path in us_paths:
|
||||
scripts.update({path: os.listdir(path)})
|
||||
|
||||
log.info(f"Utility Scripts Dir: `{us_paths}`")
|
||||
log.info(f"Utility Scripts: `{scripts}`")
|
||||
|
||||
# make sure no script file is in folder
|
||||
if next((s for s in os.listdir(us_dir)), None):
|
||||
for s in os.listdir(us_dir):
|
||||
path = os.path.normpath(
|
||||
os.path.join(us_dir, s))
|
||||
log.info(f"Removing `{path}`...")
|
||||
|
||||
# remove file or directory if not in our folders
|
||||
if not os.path.isdir(path):
|
||||
os.remove(path)
|
||||
else:
|
||||
shutil.rmtree(path)
|
||||
|
||||
# copy scripts into Resolve's utility scripts dir
|
||||
for d, sl in scripts.items():
|
||||
# directory and scripts list
|
||||
for s in sl:
|
||||
# script in script list
|
||||
src = os.path.normpath(os.path.join(d, s))
|
||||
dst = os.path.normpath(os.path.join(us_dir, s))
|
||||
|
||||
log.info(f"Copying `{src}` to `{dst}`...")
|
||||
|
||||
# copy file or directory from our folders to fusion's folder
|
||||
if not os.path.isdir(src):
|
||||
shutil.copy2(src, dst)
|
||||
else:
|
||||
shutil.copytree(src, dst)
|
||||
|
||||
|
||||
def setup(env=None):
|
||||
""" Wrapper installer started from pype.hooks.fusion.FusionPrelaunch()
|
||||
"""
|
||||
if not env:
|
||||
env = os.environ
|
||||
|
||||
# synchronize resolve utility scripts
|
||||
_sync_utility_scripts(env)
|
||||
|
||||
log.info("Fusion Pype wrapper has been installed")
|
||||
45
openpype/hosts/fusion/api/workio.py
Normal file
|
|
@ -0,0 +1,45 @@
|
|||
"""Host API required Work Files tool"""
|
||||
import sys
|
||||
import os
|
||||
from avalon import api
|
||||
from .pipeline import get_current_comp
|
||||
|
||||
|
||||
def file_extensions():
|
||||
return api.HOST_WORKFILE_EXTENSIONS["fusion"]
|
||||
|
||||
|
||||
def has_unsaved_changes():
|
||||
comp = get_current_comp()
|
||||
return comp.GetAttrs()["COMPB_Modified"]
|
||||
|
||||
|
||||
def save_file(filepath):
|
||||
comp = get_current_comp()
|
||||
comp.Save(filepath)
|
||||
|
||||
|
||||
def open_file(filepath):
|
||||
# Hack to get fusion, see
|
||||
# openpype.hosts.fusion.api.pipeline.get_current_comp()
|
||||
fusion = getattr(sys.modules["__main__"], "fusion", None)
|
||||
|
||||
return fusion.LoadComp(filepath)
|
||||
|
||||
|
||||
def current_file():
|
||||
comp = get_current_comp()
|
||||
current_filepath = comp.GetAttrs()["COMPS_FileName"]
|
||||
if not current_filepath:
|
||||
return None
|
||||
|
||||
return current_filepath
|
||||
|
||||
|
||||
def work_root(session):
|
||||
work_dir = session["AVALON_WORKDIR"]
|
||||
scene_dir = session.get("AVALON_SCENEDIR")
|
||||
if scene_dir:
|
||||
return os.path.join(work_dir, scene_dir)
|
||||
else:
|
||||
return work_dir
|
||||
|
|
@ -1,7 +1,8 @@
|
|||
import os
|
||||
import importlib
|
||||
import shutil
|
||||
|
||||
import openpype.hosts.fusion
|
||||
from openpype.lib import PreLaunchHook, ApplicationLaunchFailed
|
||||
from openpype.hosts.fusion.api import utils
|
||||
|
||||
|
||||
class FusionPrelaunch(PreLaunchHook):
|
||||
|
|
@ -13,40 +14,101 @@ class FusionPrelaunch(PreLaunchHook):
|
|||
|
||||
def execute(self):
|
||||
# making sure python 3.6 is installed at provided path
|
||||
py36_dir = os.path.normpath(self.launch_context.env.get("PYTHON36", ""))
|
||||
py36_dir = self.launch_context.env.get("PYTHON36")
|
||||
if not py36_dir:
|
||||
raise ApplicationLaunchFailed(
|
||||
"Required environment variable \"PYTHON36\" is not set."
|
||||
"\n\nFusion implementation requires to have"
|
||||
" installed Python 3.6"
|
||||
)
|
||||
|
||||
py36_dir = os.path.normpath(py36_dir)
|
||||
if not os.path.isdir(py36_dir):
|
||||
raise ApplicationLaunchFailed(
|
||||
"Python 3.6 is not installed at the provided path.\n"
|
||||
"Either make sure the 'environments/fusion.json' has "
|
||||
"'PYTHON36' set corectly or make sure Python 3.6 is installed "
|
||||
f"in the given path.\n\nPYTHON36: {py36_dir}"
|
||||
"Either make sure the environments in fusion settings has"
|
||||
" 'PYTHON36' set corectly or make sure Python 3.6 is installed"
|
||||
f" in the given path.\n\nPYTHON36: {py36_dir}"
|
||||
)
|
||||
self.log.info(f"Path to Fusion Python folder: '{py36_dir}'...")
|
||||
self.launch_context.env["PYTHON36"] = py36_dir
|
||||
|
||||
utility_dir = self.launch_context.env.get("FUSION_UTILITY_SCRIPTS_DIR")
|
||||
if not utility_dir:
|
||||
raise ApplicationLaunchFailed(
|
||||
"Required Fusion utility script dir environment variable"
|
||||
" \"FUSION_UTILITY_SCRIPTS_DIR\" is not set."
|
||||
)
|
||||
|
||||
# setting utility scripts dir for scripts syncing
|
||||
us_dir = os.path.normpath(
|
||||
self.launch_context.env.get("FUSION_UTILITY_SCRIPTS_DIR", "")
|
||||
)
|
||||
if not os.path.isdir(us_dir):
|
||||
utility_dir = os.path.normpath(utility_dir)
|
||||
if not os.path.isdir(utility_dir):
|
||||
raise ApplicationLaunchFailed(
|
||||
"Fusion utility script dir does not exist. Either make sure "
|
||||
"the 'environments/fusion.json' has "
|
||||
"'FUSION_UTILITY_SCRIPTS_DIR' set correctly or reinstall "
|
||||
f"Fusion.\n\nFUSION_UTILITY_SCRIPTS_DIR: '{us_dir}'"
|
||||
"the environments in fusion settings has"
|
||||
" 'FUSION_UTILITY_SCRIPTS_DIR' set correctly or reinstall "
|
||||
f"Fusion.\n\nFUSION_UTILITY_SCRIPTS_DIR: '{utility_dir}'"
|
||||
)
|
||||
|
||||
try:
|
||||
__import__("avalon.fusion")
|
||||
__import__("pyblish")
|
||||
self._sync_utility_scripts(self.launch_context.env)
|
||||
self.log.info("Fusion Pype wrapper has been installed")
|
||||
|
||||
except ImportError:
|
||||
self.log.warning(
|
||||
"pyblish: Could not load Fusion integration.",
|
||||
exc_info=True
|
||||
)
|
||||
def _sync_utility_scripts(self, env):
|
||||
""" Synchronizing basic utlility scripts for resolve.
|
||||
|
||||
else:
|
||||
# Resolve Setup integration
|
||||
importlib.reload(utils)
|
||||
utils.setup(self.launch_context.env)
|
||||
To be able to run scripts from inside the `Fusion/Workspace/Scripts` menu,
all scripts have to be accessible from the defined folder.
|
||||
"""
|
||||
if not env:
|
||||
env = {k: v for k, v in os.environ.items()}
|
||||
|
||||
# initiate inputs
|
||||
scripts = {}
|
||||
us_env = env.get("FUSION_UTILITY_SCRIPTS_SOURCE_DIR")
|
||||
us_dir = env.get("FUSION_UTILITY_SCRIPTS_DIR", "")
|
||||
us_paths = [os.path.join(
|
||||
os.path.dirname(os.path.abspath(openpype.hosts.fusion.__file__)),
|
||||
"utility_scripts"
|
||||
)]
|
||||
|
||||
# collect script dirs
|
||||
if us_env:
|
||||
self.log.info(f"Utility Scripts Env: `{us_env}`")
|
||||
us_paths = us_env.split(
|
||||
os.pathsep) + us_paths
|
||||
|
||||
# collect scripts from dirs
|
||||
for path in us_paths:
|
||||
scripts.update({path: os.listdir(path)})
|
||||
|
||||
self.log.info(f"Utility Scripts Dir: `{us_paths}`")
|
||||
self.log.info(f"Utility Scripts: `{scripts}`")
|
||||
|
||||
# make sure no script file is in folder
|
||||
if next((s for s in os.listdir(us_dir)), None):
|
||||
for s in os.listdir(us_dir):
|
||||
path = os.path.normpath(
|
||||
os.path.join(us_dir, s))
|
||||
self.log.info(f"Removing `{path}`...")
|
||||
|
||||
# remove file or directory if not in our folders
|
||||
if not os.path.isdir(path):
|
||||
os.remove(path)
|
||||
else:
|
||||
shutil.rmtree(path)
|
||||
|
||||
# copy scripts into Fusion's utility scripts dir
|
||||
for d, sl in scripts.items():
|
||||
# directory and scripts list
|
||||
for s in sl:
|
||||
# script in script list
|
||||
src = os.path.normpath(os.path.join(d, s))
|
||||
dst = os.path.normpath(os.path.join(us_dir, s))
|
||||
|
||||
self.log.info(f"Copying `{src}` to `{dst}`...")
|
||||
|
||||
# copy file or directory from our folders to fusion's folder
|
||||
if not os.path.isdir(src):
|
||||
shutil.copy2(src, dst)
|
||||
else:
|
||||
shutil.copytree(src, dst)
|
||||
|
|
|
|||
|
|
@ -1,7 +1,10 @@
|
|||
import os
|
||||
|
||||
import openpype.api
|
||||
from avalon import fusion
|
||||
from openpype.hosts.fusion.api import (
|
||||
get_current_comp,
|
||||
comp_lock_and_undo_chunk
|
||||
)
|
||||
|
||||
|
||||
class CreateOpenEXRSaver(openpype.api.Creator):
|
||||
|
|
@ -10,12 +13,13 @@ class CreateOpenEXRSaver(openpype.api.Creator):
|
|||
label = "Create OpenEXR Saver"
|
||||
hosts = ["fusion"]
|
||||
family = "render"
|
||||
defaults = ["Main"]
|
||||
|
||||
def process(self):
|
||||
|
||||
file_format = "OpenEXRFormat"
|
||||
|
||||
comp = fusion.get_current_comp()
|
||||
comp = get_current_comp()
|
||||
|
||||
# todo: improve method of getting current environment
|
||||
# todo: pref avalon.Session over os.environ
|
||||
|
|
@ -25,7 +29,7 @@ class CreateOpenEXRSaver(openpype.api.Creator):
|
|||
filename = "{}..tiff".format(self.name)
|
||||
filepath = os.path.join(workdir, "render", filename)
|
||||
|
||||
with fusion.comp_lock_and_undo_chunk(comp):
|
||||
with comp_lock_and_undo_chunk(comp):
|
||||
args = (-32768, -32768) # Magical position numbers
|
||||
saver = comp.AddTool("Saver", *args)
|
||||
saver.SetAttrs({"TOOLS_Name": self.name})
|
||||
|
|
|
|||
|
|
@ -8,15 +8,17 @@ class FusionSelectContainers(api.InventoryAction):
|
|||
color = "#d8d8d8"
|
||||
|
||||
def process(self, containers):
|
||||
|
||||
import avalon.fusion
|
||||
from openpype.hosts.fusion.api import (
|
||||
get_current_comp,
|
||||
comp_lock_and_undo_chunk
|
||||
)
|
||||
|
||||
tools = [i["_tool"] for i in containers]
|
||||
|
||||
comp = avalon.fusion.get_current_comp()
|
||||
comp = get_current_comp()
|
||||
flow = comp.CurrentFrame.FlowView
|
||||
|
||||
with avalon.fusion.comp_lock_and_undo_chunk(comp, self.label):
|
||||
with comp_lock_and_undo_chunk(comp, self.label):
|
||||
# Clear selection
|
||||
flow.Select()
|
||||
|
||||
|
|
|
|||
|
|
@ -1,7 +1,11 @@
|
|||
from avalon import api, style
|
||||
from avalon import api
|
||||
from Qt import QtGui, QtWidgets
|
||||
|
||||
import avalon.fusion
|
||||
from openpype import style
|
||||
from openpype.hosts.fusion.api import (
|
||||
get_current_comp,
|
||||
comp_lock_and_undo_chunk
|
||||
)
|
||||
|
||||
|
||||
class FusionSetToolColor(api.InventoryAction):
|
||||
|
|
@ -16,7 +20,7 @@ class FusionSetToolColor(api.InventoryAction):
|
|||
"""Color all selected tools the selected colors"""
|
||||
|
||||
result = []
|
||||
comp = avalon.fusion.get_current_comp()
|
||||
comp = get_current_comp()
|
||||
|
||||
# Get tool color
|
||||
first = containers[0]
|
||||
|
|
@ -33,7 +37,7 @@ class FusionSetToolColor(api.InventoryAction):
|
|||
if not picked_color:
|
||||
return
|
||||
|
||||
with avalon.fusion.comp_lock_and_undo_chunk(comp):
|
||||
with comp_lock_and_undo_chunk(comp):
|
||||
for container in containers:
|
||||
# Convert color to RGB 0-1 floats
|
||||
rgb_f = picked_color.getRgbF()
|
||||
|
|
|
|||
|
|
@ -12,7 +12,8 @@ class FusionSetFrameRangeLoader(api.Loader):
|
|||
"camera",
|
||||
"imagesequence",
|
||||
"yeticache",
|
||||
"pointcache"]
|
||||
"pointcache",
|
||||
"render"]
|
||||
representations = ["*"]
|
||||
|
||||
label = "Set frame range"
|
||||
|
|
@ -45,7 +46,8 @@ class FusionSetFrameRangeWithHandlesLoader(api.Loader):
|
|||
"camera",
|
||||
"imagesequence",
|
||||
"yeticache",
|
||||
"pointcache"]
|
||||
"pointcache",
|
||||
"render"]
|
||||
representations = ["*"]
|
||||
|
||||
label = "Set frame range (with handles)"
|
||||
|
|
|
|||
|
|
@ -1,12 +1,15 @@
|
|||
import os
|
||||
import contextlib
|
||||
|
||||
from avalon import api
|
||||
import avalon.io as io
|
||||
from avalon import api, io
|
||||
|
||||
from avalon import fusion
|
||||
from openpype.hosts.fusion.api import (
|
||||
imprint_container,
|
||||
get_current_comp,
|
||||
comp_lock_and_undo_chunk
|
||||
)
|
||||
|
||||
comp = fusion.get_current_comp()
|
||||
comp = get_current_comp()
|
||||
|
||||
|
||||
@contextlib.contextmanager
|
||||
|
|
@ -117,7 +120,7 @@ def loader_shift(loader, frame, relative=True):
|
|||
class FusionLoadSequence(api.Loader):
|
||||
"""Load image sequence into Fusion"""
|
||||
|
||||
families = ["imagesequence", "review"]
|
||||
families = ["imagesequence", "review", "render"]
|
||||
representations = ["*"]
|
||||
|
||||
label = "Load sequence"
|
||||
|
|
@ -126,13 +129,6 @@ class FusionLoadSequence(api.Loader):
|
|||
color = "orange"
|
||||
|
||||
def load(self, context, name, namespace, data):
|
||||
|
||||
from avalon.fusion import (
|
||||
imprint_container,
|
||||
get_current_comp,
|
||||
comp_lock_and_undo_chunk
|
||||
)
|
||||
|
||||
# Fallback to asset name when namespace is None
|
||||
if namespace is None:
|
||||
namespace = context['asset']['name']
|
||||
|
|
@ -204,13 +200,11 @@ class FusionLoadSequence(api.Loader):
|
|||
|
||||
"""
|
||||
|
||||
from avalon.fusion import comp_lock_and_undo_chunk
|
||||
|
||||
tool = container["_tool"]
|
||||
assert tool.ID == "Loader", "Must be Loader"
|
||||
comp = tool.Comp()
|
||||
|
||||
root = api.get_representation_path(representation)
|
||||
root = os.path.dirname(api.get_representation_path(representation))
|
||||
path = self._get_first_image(root)
|
||||
|
||||
# Get start frame from version data
|
||||
|
|
@ -247,9 +241,6 @@ class FusionLoadSequence(api.Loader):
|
|||
tool.SetData("avalon.representation", str(representation["_id"]))
|
||||
|
||||
def remove(self, container):
|
||||
|
||||
from avalon.fusion import comp_lock_and_undo_chunk
|
||||
|
||||
tool = container["_tool"]
|
||||
assert tool.ID == "Loader", "Must be Loader"
|
||||
comp = tool.Comp()
|
||||
|
|
|
|||
|
|
@ -2,7 +2,7 @@ import os
|
|||
|
||||
import pyblish.api
|
||||
|
||||
from avalon import fusion
|
||||
from openpype.hosts.fusion.api import get_current_comp
|
||||
|
||||
|
||||
class CollectCurrentCompFusion(pyblish.api.ContextPlugin):
|
||||
|
|
@ -15,7 +15,7 @@ class CollectCurrentCompFusion(pyblish.api.ContextPlugin):
|
|||
def process(self, context):
|
||||
"""Collect all image sequence tools"""
|
||||
|
||||
current_comp = fusion.get_current_comp()
|
||||
current_comp = get_current_comp()
|
||||
assert current_comp, "Must have active Fusion composition"
|
||||
context.data["currentComp"] = current_comp
|
||||
|
||||
|
|
|
|||
|
|
@ -34,7 +34,7 @@ class CollectInstances(pyblish.api.ContextPlugin):
|
|||
def process(self, context):
|
||||
"""Collect all image sequence tools"""
|
||||
|
||||
from avalon.fusion.lib import get_frame_path
|
||||
from openpype.hosts.fusion.api.lib import get_frame_path
|
||||
|
||||
comp = context.data["currentComp"]
|
||||
|
||||
|
|
|
|||
|
|
@ -1,9 +1,9 @@
|
|||
import os
|
||||
import pyblish.api
|
||||
|
||||
import avalon.fusion as fusion
|
||||
from pprint import pformat
|
||||
|
||||
import pyblish.api
|
||||
from openpype.hosts.fusion.api import comp_lock_and_undo_chunk
|
||||
|
||||
|
||||
class Fusionlocal(pyblish.api.InstancePlugin):
|
||||
"""Render the current Fusion composition locally.
|
||||
|
|
@ -39,7 +39,7 @@ class Fusionlocal(pyblish.api.InstancePlugin):
|
|||
self.log.info("Start frame: {}".format(frame_start))
|
||||
self.log.info("End frame: {}".format(frame_end))
|
||||
|
||||
with fusion.comp_lock_and_undo_chunk(current_comp):
|
||||
with comp_lock_and_undo_chunk(current_comp):
|
||||
result = current_comp.Render()
|
||||
|
||||
if "representations" not in instance.data:
|
||||
|
|
|
|||
|
|
@ -2,8 +2,9 @@ import os
|
|||
import json
|
||||
import getpass
|
||||
|
||||
import requests
|
||||
|
||||
from avalon import api
|
||||
from avalon.vendor import requests
|
||||
|
||||
import pyblish.api
|
||||
|
||||
|
|
@ -30,7 +31,7 @@ class FusionSubmitDeadline(pyblish.api.InstancePlugin):
|
|||
else:
|
||||
context.data[key] = True
|
||||
|
||||
from avalon.fusion.lib import get_frame_path
|
||||
from openpype.hosts.fusion.api.lib import get_frame_path
|
||||
|
||||
deadline_url = (
|
||||
context.data["system_settings"]
|
||||
|
|
|
|||
|
|
@ -1,4 +1,7 @@
|
|||
from avalon import fusion
|
||||
from openpype.hosts.fusion.api import (
|
||||
comp_lock_and_undo_chunk,
|
||||
get_current_comp
|
||||
)
|
||||
|
||||
|
||||
def is_connected(input):
|
||||
|
|
@ -9,12 +12,12 @@ def is_connected(input):
|
|||
def duplicate_with_input_connections():
|
||||
"""Duplicate selected tools with incoming connections."""
|
||||
|
||||
comp = fusion.get_current_comp()
|
||||
comp = get_current_comp()
|
||||
original_tools = comp.GetToolList(True).values()
|
||||
if not original_tools:
|
||||
return # nothing selected
|
||||
|
||||
with fusion.comp_lock_and_undo_chunk(
|
||||
with comp_lock_and_undo_chunk(
|
||||
comp, "Duplicate With Input Connections"):
|
||||
|
||||
# Generate duplicates
|
||||
|
|
|
|||
|
|
@ -4,12 +4,12 @@ import sys
|
|||
import logging
|
||||
|
||||
# Pipeline imports
|
||||
from avalon import api, io, pipeline
|
||||
import avalon.fusion
|
||||
import avalon.api
|
||||
from avalon import io, pipeline
|
||||
|
||||
# Config imports
|
||||
import openpype.lib as pype
|
||||
import openpype.hosts.fusion.api.lib as fusion_lib
|
||||
from openpype.lib import version_up
|
||||
from openpype.hosts.fusion import api
|
||||
from openpype.hosts.fusion.api import lib
|
||||
|
||||
log = logging.getLogger("Update Slap Comp")
|
||||
|
||||
|
|
@ -87,7 +87,7 @@ def _format_filepath(session):
|
|||
|
||||
# Create new unique filepath
|
||||
if os.path.exists(new_filepath):
|
||||
new_filepath = pype.version_up(new_filepath)
|
||||
new_filepath = version_up(new_filepath)
|
||||
|
||||
return new_filepath
|
||||
|
||||
|
|
@ -119,7 +119,7 @@ def _update_savers(comp, session):
|
|||
|
||||
comp.Print("New renders to: %s\n" % renders)
|
||||
|
||||
with avalon.fusion.comp_lock_and_undo_chunk(comp):
|
||||
with api.comp_lock_and_undo_chunk(comp):
|
||||
savers = comp.GetToolList(False, "Saver").values()
|
||||
for saver in savers:
|
||||
filepath = saver.GetAttrs("TOOLST_Clip_Name")[1.0]
|
||||
|
|
@ -185,7 +185,7 @@ def update_frame_range(comp, representations):
|
|||
start = min(v["data"]["frameStart"] for v in versions)
|
||||
end = max(v["data"]["frameEnd"] for v in versions)
|
||||
|
||||
fusion_lib.update_frame_range(start, end, comp=comp)
|
||||
lib.update_frame_range(start, end, comp=comp)
|
||||
|
||||
|
||||
def switch(asset_name, filepath=None, new=True):
|
||||
|
|
@ -215,11 +215,11 @@ def switch(asset_name, filepath=None, new=True):
|
|||
|
||||
# Get current project
|
||||
self._project = io.find_one({"type": "project",
|
||||
"name": api.Session["AVALON_PROJECT"]})
|
||||
"name": avalon.api.Session["AVALON_PROJECT"]})
|
||||
|
||||
# Go to comp
|
||||
if not filepath:
|
||||
current_comp = avalon.fusion.get_current_comp()
|
||||
current_comp = api.get_current_comp()
|
||||
assert current_comp is not None, "Could not find current comp"
|
||||
else:
|
||||
fusion = _get_fusion_instance()
|
||||
|
|
@ -227,14 +227,14 @@ def switch(asset_name, filepath=None, new=True):
|
|||
assert current_comp is not None, (
|
||||
"Fusion could not load '{}'").format(filepath)
|
||||
|
||||
host = api.registered_host()
|
||||
host = avalon.api.registered_host()
|
||||
containers = list(host.ls())
|
||||
assert containers, "Nothing to update"
|
||||
|
||||
representations = []
|
||||
for container in containers:
|
||||
try:
|
||||
representation = fusion_lib.switch_item(
|
||||
representation = lib.switch_item(
|
||||
container,
|
||||
asset_name=asset_name)
|
||||
representations.append(representation)
|
||||
|
|
@ -246,7 +246,7 @@ def switch(asset_name, filepath=None, new=True):
|
|||
current_comp.Print(message)
|
||||
|
||||
# Build the session to switch to
|
||||
switch_to_session = api.Session.copy()
|
||||
switch_to_session = avalon.api.Session.copy()
|
||||
switch_to_session["AVALON_ASSET"] = asset['name']
|
||||
|
||||
if new:
|
||||
|
|
@ -255,7 +255,7 @@ def switch(asset_name, filepath=None, new=True):
|
|||
# Update savers output based on new session
|
||||
_update_savers(current_comp, switch_to_session)
|
||||
else:
|
||||
comp_path = pype.version_up(filepath)
|
||||
comp_path = version_up(filepath)
|
||||
|
||||
current_comp.Print(comp_path)
|
||||
|
||||
|
|
@ -288,7 +288,7 @@ if __name__ == '__main__':
|
|||
|
||||
args, unknown = parser.parse_args()
|
||||
|
||||
api.install(avalon.fusion)
|
||||
avalon.api.install(api)
|
||||
switch(args.asset_name, args.file_path)
|
||||
|
||||
sys.exit(0)
|
||||
|
|
|
|||
|
|
@ -1,6 +1,6 @@
|
|||
from Qt import QtWidgets
|
||||
from avalon.vendor import qtawesome
|
||||
import avalon.fusion as avalon
|
||||
from openpype.hosts.fusion.api import get_current_comp
|
||||
|
||||
|
||||
_help = {"local": "Render the comp on your own machine and publish "
|
||||
|
|
@ -14,7 +14,7 @@ class SetRenderMode(QtWidgets.QWidget):
|
|||
def __init__(self, parent=None):
|
||||
QtWidgets.QWidget.__init__(self, parent)
|
||||
|
||||
self._comp = avalon.get_current_comp()
|
||||
self._comp = get_current_comp()
|
||||
self._comp_name = self._get_comp_name()
|
||||
|
||||
self.setWindowTitle("Set Render Mode")
|
||||
|
|
@ -79,7 +79,7 @@ class SetRenderMode(QtWidgets.QWidget):
|
|||
def update(self):
|
||||
"""Update all information in the UI"""
|
||||
|
||||
self._comp = avalon.get_current_comp()
|
||||
self._comp = get_current_comp()
|
||||
self._comp_name = self._get_comp_name()
|
||||
self.comp_information.setText(self._comp_name)
|
||||
|
||||
|
|
|
|||
|
|
@ -1,10 +1,11 @@
|
|||
from avalon.fusion import comp_lock_and_undo_chunk
|
||||
|
||||
from avalon import fusion
|
||||
comp = fusion.get_current_comp()
|
||||
from openpype.hosts.fusion.api import (
|
||||
comp_lock_and_undo_chunk,
|
||||
get_current_comp
|
||||
)
|
||||
|
||||
|
||||
def main():
|
||||
comp = get_current_comp()
|
||||
"""Set all selected backgrounds to 32 bit"""
|
||||
with comp_lock_and_undo_chunk(comp, 'Selected Backgrounds to 32bit'):
|
||||
tools = comp.GetToolList(True, "Background").values()
|
||||
|
|
|
|||
|
|
@ -1,9 +1,11 @@
|
|||
from avalon.fusion import comp_lock_and_undo_chunk
|
||||
from avalon import fusion
|
||||
comp = fusion.get_current_comp()
|
||||
from openpype.hosts.fusion.api import (
|
||||
comp_lock_and_undo_chunk,
|
||||
get_current_comp
|
||||
)
|
||||
|
||||
|
||||
def main():
|
||||
comp = get_current_comp()
|
||||
"""Set all backgrounds to 32 bit"""
|
||||
with comp_lock_and_undo_chunk(comp, 'Backgrounds to 32bit'):
|
||||
tools = comp.GetToolList(False, "Background").values()
|
||||
|
|
|
|||
|
|
@ -1,9 +1,11 @@
|
|||
from avalon.fusion import comp_lock_and_undo_chunk
|
||||
from avalon import fusion
|
||||
comp = fusion.get_current_comp()
|
||||
from openpype.hosts.fusion.api import (
|
||||
comp_lock_and_undo_chunk,
|
||||
get_current_comp
|
||||
)
|
||||
|
||||
|
||||
def main():
|
||||
comp = get_current_comp()
|
||||
"""Set all selected loaders to 32 bit"""
|
||||
with comp_lock_and_undo_chunk(comp, 'Selected Loaders to 32bit'):
|
||||
tools = comp.GetToolList(True, "Loader").values()
|
||||
|
|
|
|||
|
|
@ -1,9 +1,11 @@
|
|||
from avalon.fusion import comp_lock_and_undo_chunk
|
||||
from avalon import fusion
|
||||
comp = fusion.get_current_comp()
|
||||
from openpype.hosts.fusion.api import (
|
||||
comp_lock_and_undo_chunk,
|
||||
get_current_comp
|
||||
)
|
||||
|
||||
|
||||
def main():
|
||||
comp = get_current_comp()
|
||||
"""Set all loaders to 32 bit"""
|
||||
with comp_lock_and_undo_chunk(comp, 'Loaders to 32bit'):
|
||||
tools = comp.GetToolList(False, "Loader").values()
|
||||
|
|
|
|||
|
|
@ -8,13 +8,15 @@ log = Logger().get_logger(__name__)
|
|||
|
||||
|
||||
def main(env):
|
||||
import avalon.api
|
||||
from openpype.hosts.fusion import api
|
||||
from openpype.hosts.fusion.api import menu
|
||||
import avalon.fusion
|
||||
|
||||
# Registers pype's Global pyblish plugins
|
||||
openpype.install()
|
||||
|
||||
# activate resolve from pype
|
||||
avalon.api.install(avalon.fusion)
|
||||
avalon.api.install(api)
|
||||
|
||||
log.info(f"Avalon registered hosts: {avalon.api.registered_host()}")
|
||||
|
||||
|
|
|
|||
|
|
@ -4,13 +4,12 @@ import logging
|
|||
|
||||
from Qt import QtWidgets, QtCore
|
||||
|
||||
import avalon.io as io
|
||||
import avalon.api as api
|
||||
import avalon.pipeline as pipeline
|
||||
import avalon.fusion
|
||||
import avalon.style as style
|
||||
import avalon.api
|
||||
from avalon import io, pipeline
|
||||
from avalon.vendor import qtawesome as qta
|
||||
|
||||
from openpype import style
|
||||
from openpype.hosts.fusion import api
|
||||
|
||||
log = logging.getLogger("Fusion Switch Shot")
|
||||
|
||||
|
|
@ -150,7 +149,7 @@ class App(QtWidgets.QWidget):
|
|||
if not self._use_current.isChecked():
|
||||
file_name = self._comps.itemData(self._comps.currentIndex())
|
||||
else:
|
||||
comp = avalon.fusion.get_current_comp()
|
||||
comp = api.get_current_comp()
|
||||
file_name = comp.GetAttrs("COMPS_FileName")
|
||||
|
||||
asset = self._assets.currentText()
|
||||
|
|
@ -161,11 +160,11 @@ class App(QtWidgets.QWidget):
|
|||
def _get_context_directory(self):
|
||||
|
||||
project = io.find_one({"type": "project",
|
||||
"name": api.Session["AVALON_PROJECT"]},
|
||||
"name": avalon.api.Session["AVALON_PROJECT"]},
|
||||
projection={"config": True})
|
||||
|
||||
template = project["config"]["template"]["work"]
|
||||
dir = pipeline._format_work_template(template, api.Session)
|
||||
dir = pipeline._format_work_template(template, avalon.api.Session)
|
||||
|
||||
return dir
|
||||
|
||||
|
|
@ -174,7 +173,7 @@ class App(QtWidgets.QWidget):
|
|||
return items
|
||||
|
||||
def collect_assets(self):
|
||||
return list(io.find({"type": "asset", "silo": "film"}))
|
||||
return list(io.find({"type": "asset"}, {"name": True}))
|
||||
|
||||
def populate_comp_box(self, files):
|
||||
"""Ensure we display the filename only but the path is stored as well
|
||||
|
|
@ -193,7 +192,7 @@ class App(QtWidgets.QWidget):
|
|||
|
||||
if __name__ == '__main__':
|
||||
import sys
|
||||
api.install(avalon.fusion)
|
||||
avalon.api.install(api)
|
||||
|
||||
app = QtWidgets.QApplication(sys.argv)
|
||||
window = App()
|
||||
|
|
|
|||
|
|
@ -5,12 +5,15 @@ Warning:
|
|||
settings of the Loader. So use this at your own risk.
|
||||
|
||||
"""
|
||||
from avalon import fusion
|
||||
from openpype.hosts.fusion.api.pipeline import (
|
||||
get_current_comp,
|
||||
comp_lock_and_undo_chunk
|
||||
)
|
||||
|
||||
|
||||
def update_loader_ranges():
|
||||
comp = fusion.get_current_comp()
|
||||
with fusion.comp_lock_and_undo_chunk(comp, "Reload clip time ranges"):
|
||||
comp = get_current_comp()
|
||||
with comp_lock_and_undo_chunk(comp, "Reload clip time ranges"):
|
||||
tools = comp.GetToolList(True, "Loader").values()
|
||||
for tool in tools:
|
||||
|
||||
|
|
|
|||
|
|
@ -4,7 +4,8 @@ import os
|
|||
def add_implementation_envs(env, _app):
|
||||
"""Modify environments to contain all required for implementation."""
|
||||
openharmony_path = os.path.join(
|
||||
os.environ["OPENPYPE_REPOS_ROOT"], "pype", "vendor", "OpenHarmony"
|
||||
os.environ["OPENPYPE_REPOS_ROOT"], "openpype", "hosts",
|
||||
"harmony", "vendor", "OpenHarmony"
|
||||
)
|
||||
# TODO check if is already set? What to do if is already set?
|
||||
env["LIB_OPENHARMONY_PATH"] = openharmony_path
|
||||
|
|
|
|||
655
openpype/hosts/harmony/api/README.md
Normal file
|
|
@ -0,0 +1,655 @@
|
|||
# Harmony Integration
|
||||
|
||||
## Setup
|
||||
|
||||
The easiest way to set up Toon Boom Harmony is to use the built-in launch:
|
||||
|
||||
```
|
||||
python -c "import openpype.hosts.harmony.api as harmony;harmony.launch("path/to/harmony/executable")"
|
||||
```
|
||||
|
||||
Communication with Harmony happens with a server/client relationship where the server is in the Python process and the client is in the Harmony process. Messages between Python and Harmony are required to be dictionaries, which are serialized to strings:
|
||||
```
|
||||
+------------+
|
||||
| |
|
||||
| Python |
|
||||
| Process |
|
||||
| |
|
||||
| +--------+ |
|
||||
| | | |
|
||||
| | Main | |
|
||||
| | Thread | |
|
||||
| | | |
|
||||
| +----^---+ |
|
||||
| || |
|
||||
| || |
|
||||
| +---v----+ | +---------+
|
||||
| | | | | |
|
||||
| | Server +-------> Harmony |
|
||||
| | Thread <-------+ Process |
|
||||
| | | | | |
|
||||
| +--------+ | +---------+
|
||||
+------------+
|
||||
```
|
||||
|
||||
The server and client now use a stricter protocol to handle communication. This is necessary to have precise control over the data passed between server and client. Each message is prepended with 6 bytes:
|
||||
```
|
||||
| A | H | 0x00 | 0x00 | 0x00 | 0x00 | ...
|
||||
|
||||
```
|
||||
The first two bytes are *magic* bytes standing for **A**valon **H**armony. The next four bytes hold the length of the message `...` encoded as a 32-bit unsigned integer. This way we know how many bytes to read from the socket and whether we need more data or have to parse multiple messages.
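
As an illustration only (not the integration's actual server code), here is a minimal sketch of how such a framed message could be built and parsed in Python. The JSON serialization and big-endian byte order are assumptions made for this example:

```python
import json
import struct

MAGIC = b"AH"  # **A**valon **H**armony


def frame_message(payload):
    """Serialize a dict and prepend the 6-byte header described above."""
    body = json.dumps(payload).encode("utf-8")
    # 2 magic bytes + 4 bytes holding the body length as an unsigned 32-bit int
    return MAGIC + struct.pack(">I", len(body)) + body


def read_message(buffer):
    """Parse one framed message from a byte buffer.

    Returns (message dict or None, remaining bytes). None means more data
    is needed from the socket before a full message can be parsed.
    """
    if len(buffer) < 6:
        return None, buffer
    if buffer[:2] != MAGIC:
        raise ValueError("Invalid magic bytes")
    length = struct.unpack(">I", buffer[2:6])[0]
    if len(buffer) < 6 + length:
        return None, buffer
    body = buffer[6:6 + length]
    return json.loads(body.decode("utf-8")), buffer[6 + length:]
```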
|
||||
|
||||
|
||||
## Usage
|
||||
|
||||
The integration creates an `Openpype` menu entry where all related tools are located.
|
||||
|
||||
**NOTE: Menu creation can be temperamental. The best way is to launch Harmony and do nothing else until Harmony is fully launched.**
|
||||
|
||||
### Work files
|
||||
|
||||
Because Harmony projects are directories, this integration uses `.zip` as the work file extension. Internally the project directories are stored under `[User]/.avalon/harmony`. Whenever the user saves the `.xstage` file, the integration zips up the project directory and moves it to the Avalon project path. Zipping and moving happen in the background.
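
As a rough sketch of that zip-and-move idea (illustrative only, not the integration's actual implementation; the function name and paths are made up):

```python
import os
import shutil


def zip_and_move(project_dir, workfile_path):
    """Zip a Harmony project directory and move the archive to the work file path."""
    # make_archive expects the destination name without the ".zip" suffix
    temp_zip = shutil.make_archive(project_dir, "zip", project_dir)
    os.makedirs(os.path.dirname(workfile_path), exist_ok=True)
    shutil.move(temp_zip, workfile_path)
    return workfile_path
```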
|
||||
|
||||
### Show Workfiles on launch
|
||||
|
||||
You can show the Workfiles app when Harmony launches by setting the environment variable `AVALON_HARMONY_WORKFILES_ON_LAUNCH=1`.
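
For example, a minimal sketch of enabling it from Python before launching (the executable path is a placeholder):

```python
import os

import openpype.hosts.harmony.api as harmony

# Show the Workfiles app once Harmony is up
os.environ["AVALON_HARMONY_WORKFILES_ON_LAUNCH"] = "1"
harmony.launch("path/to/harmony/executable")
```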
|
||||
|
||||
## Developing
|
||||
|
||||
### Low level messaging
|
||||
To send from Python to Harmony you can use the exposed method:
|
||||
```python
|
||||
import openpype.hosts.harmony.api as harmony
|
||||
from uuid import uuid4
|
||||
|
||||
|
||||
func = """function %s_hello(person)
|
||||
{
|
||||
return ("Hello " + person + "!");
|
||||
}
|
||||
%s_hello
|
||||
""" % (uuid4(), uuid4())
|
||||
print(harmony.send({"function": func, "args": ["Python"]})["result"])
|
||||
```
|
||||
**NOTE:** It's important to declare the function at the end of the function string. You can have multiple functions within your function string, but the function declared at the end is what gets executed.
|
||||
|
||||
To send a function with multiple arguments, it's best to declare the arguments within the function:
|
||||
```python
|
||||
import openpype.hosts.harmony.api as harmony
|
||||
from uuid import uuid4
|
||||
|
||||
signature = str(uuid4()).replace("-", "_")
|
||||
func = """function %s_hello(args)
|
||||
{
|
||||
var greeting = args[0];
|
||||
var person = args[1];
|
||||
return (greeting + " " + person + "!");
|
||||
}
|
||||
%s_hello
|
||||
""" % (signature, signature)
|
||||
print(harmony.send({"function": func, "args": ["Hello", "Python"]})["result"])
|
||||
```
|
||||
|
||||
### Caution
|
||||
|
||||
When naming your functions, be aware that they are executed in global scope. They can potentially clash with Harmony's own function and object names.
|
||||
For example, `func` is an already existing Harmony object. If you name your own function `func`, it will overwrite the Harmony one in global scope, causing
erratic behavior of Harmony. OpenPype prefixes those function names with a [UUID4](https://docs.python.org/3/library/uuid.html), making the chance of such a clash minimal.
See the examples above for how that works. This will result in a function named `38dfcef0_a6d7_4064_8069_51fe99ab276e_hello()`.
You can find the list of Harmony objects and functions in the Harmony documentation.
|
||||
|
||||
### Higher level (recommended)
|
||||
|
||||
Instead of sending functions directly to Harmony, it is more efficient and safer to add your code to `js/PypeHarmony.js` or to use the `{"script": "..."}` method.
|
||||
|
||||
#### Extending PypeHarmony.js
|
||||
|
||||
Add your function to `PypeHarmony.js`. For example:
|
||||
|
||||
```javascript
|
||||
PypeHarmony.myAwesomeFunction = function() {
|
||||
someCoolStuff();
|
||||
};
|
||||
```
|
||||
Then you can call that JavaScript code from Python like this:
|
||||
|
||||
```Python
|
||||
import openpype.hosts.harmony.api as harmony
|
||||
|
||||
harmony.send({"function": "PypeHarmony.myAwesomeFunction"});
|
||||
|
||||
```
|
||||
|
||||
#### Using Script method
|
||||
|
||||
You can also pass whole scripts into Harmony and call their functions later as needed.
|
||||
|
||||
For example, say you have a bunch of JavaScript files:
|
||||
|
||||
```javascript
|
||||
/* Master.js */
|
||||
|
||||
var Master = {
    Foo: {},
    Boo: {}
};
|
||||
|
||||
/* FileA.js */
|
||||
var Foo = function() {};
|
||||
|
||||
Foo.prototype.A = function() {
|
||||
someAStuff();
|
||||
}
|
||||
|
||||
// This will construct object Foo and add it to Master namespace.
|
||||
Master.Foo = new Foo();
|
||||
|
||||
/* FileB.js */
|
||||
var Boo = function() {};
|
||||
|
||||
Boo.prototype.B = function() {
|
||||
someBStuff();
|
||||
}
|
||||
|
||||
// This will construct object Boo and add it to Master namespace.
|
||||
Master.Boo = new Boo();
|
||||
```
|
||||
|
||||
Now, in Python, just read all those files and send them to Harmony.
|
||||
|
||||
```python
|
||||
from pathlib import Path
|
||||
import openpype.hosts.harmony.api as harmony
|
||||
|
||||
path_to_js = Path('/path/to/my/js')
|
||||
script_to_send = ""
|
||||
|
||||
for file in path_to_js.iterdir():
|
||||
if file.suffix == ".js":
|
||||
script_to_send += file.read_text()
|
||||
|
||||
harmony.send({"script": script_to_send})
|
||||
|
||||
# and use your code in Harmony
|
||||
harmony.send({"function": "Master.Boo.B"})
|
||||
|
||||
```
|
||||
|
||||
### Scene Save
|
||||
Instead of sending a request to Harmony with `scene.saveAll` please use:
|
||||
```python
|
||||
import openpype.hosts.harmony.api as harmony
|
||||
harmony.save_scene()
|
||||
```
|
||||
|
||||
<details>
|
||||
<summary>Click to expand for details on scene save.</summary>
|
||||
|
||||
Because OpenPype tools do not deal well with folders representing a single entity like a Harmony scene, this integration uses zip files to encapsulate the Harmony scene folders. Saving the scene in Harmony via the menu or CTRL+S will not produce a zip file; only saving from Workfiles will. This is because the zipping process can take some time, during which we cannot block the user from saving again, and if the `.xstage` file changes while zipping is in progress, the result would be a corrupted zip archive.
|
||||
</details>
|
||||
|
||||
### Plugin Examples
|
||||
These plugins were made with the [polly config](https://github.com/mindbender-studio/config).
|
||||
|
||||
#### Creator Plugin
|
||||
```python
|
||||
import openpype.hosts.harmony.api as harmony
|
||||
from uuid import uuid4
|
||||
|
||||
|
||||
class CreateComposite(harmony.Creator):
|
||||
"""Composite node for publish."""
|
||||
|
||||
name = "compositeDefault"
|
||||
label = "Composite"
|
||||
family = "mindbender.template"
|
||||
|
||||
def __init__(self, *args, **kwargs):
|
||||
super(CreateComposite, self).__init__(*args, **kwargs)
|
||||
```
|
||||
|
||||
The creator plugin can be configured to use other node types. For example here is a write node creator:
|
||||
```python
|
||||
import openpype.hosts.harmony.api as harmony
from uuid import uuid4
|
||||
|
||||
|
||||
class CreateRender(harmony.Creator):
|
||||
"""Composite node for publishing renders."""
|
||||
|
||||
name = "writeDefault"
|
||||
label = "Write"
|
||||
family = "mindbender.imagesequence"
|
||||
node_type = "WRITE"
|
||||
|
||||
def __init__(self, *args, **kwargs):
|
||||
super(CreateRender, self).__init__(*args, **kwargs)
|
||||
|
||||
def setup_node(self, node):
|
||||
signature = str(uuid4()).replace("-", "_")
|
||||
func = """function %s_func(args)
|
||||
{
|
||||
node.setTextAttr(args[0], "DRAWING_TYPE", 1, "PNG4");
|
||||
}
|
||||
%s_func
|
||||
""" % (signature, signature)
|
||||
harmony.send(
|
||||
{"function": func, "args": [node]}
|
||||
)
|
||||
```
|
||||
|
||||
#### Collector Plugin
|
||||
```python
|
||||
import pyblish.api
|
||||
import openpype.hosts.harmony.api as harmony
|
||||
|
||||
|
||||
class CollectInstances(pyblish.api.ContextPlugin):
|
||||
"""Gather instances by nodes metadata.
|
||||
|
||||
This collector takes into account assets that are associated with
|
||||
a composite node and marked with a unique identifier;
|
||||
|
||||
Identifier:
|
||||
id (str): "pyblish.avalon.instance"
|
||||
"""
|
||||
|
||||
label = "Instances"
|
||||
order = pyblish.api.CollectorOrder
|
||||
hosts = ["harmony"]
|
||||
|
||||
def process(self, context):
|
||||
nodes = harmony.send(
|
||||
{"function": "node.getNodes", "args": [["COMPOSITE"]]}
|
||||
)["result"]
|
||||
|
||||
for node in nodes:
|
||||
data = harmony.read(node)
|
||||
|
||||
# Skip non-tagged nodes.
|
||||
if not data:
|
||||
continue
|
||||
|
||||
# Skip containers.
|
||||
if "container" in data["id"]:
|
||||
continue
|
||||
|
||||
instance = context.create_instance(node.split("/")[-1])
|
||||
instance.append(node)
|
||||
instance.data.update(data)
|
||||
|
||||
# Produce diagnostic message for any graphical
|
||||
# user interface interested in visualising it.
|
||||
self.log.info("Found: \"%s\" " % instance.data["name"])
|
||||
```
|
||||
|
||||
#### Extractor Plugin
|
||||
```python
|
||||
import os
from uuid import uuid4
|
||||
|
||||
import pyblish.api
|
||||
import openpype.hosts.harmony.api as harmony
|
||||
|
||||
import clique
|
||||
|
||||
|
||||
class ExtractImage(pyblish.api.InstancePlugin):
|
||||
"""Produce a flattened image file from instance.
|
||||
This plug-in only takes into account the nodes connected to the composite.
|
||||
"""
|
||||
label = "Extract Image Sequence"
|
||||
order = pyblish.api.ExtractorOrder
|
||||
hosts = ["harmony"]
|
||||
families = ["mindbender.imagesequence"]
|
||||
|
||||
def process(self, instance):
|
||||
project_path = harmony.send(
|
||||
{"function": "scene.currentProjectPath"}
|
||||
)["result"]
|
||||
|
||||
# Store reference for integration
|
||||
if "files" not in instance.data:
|
||||
instance.data["files"] = list()
|
||||
|
||||
# Store display source node for later.
|
||||
display_node = "Top/Display"
|
||||
signature = str(uuid4()).replace("-", "_")
|
||||
func = """function %s_func(display_node)
|
||||
{
|
||||
var source_node = null;
|
||||
if (node.isLinked(display_node, 0))
|
||||
{
|
||||
source_node = node.srcNode(display_node, 0);
|
||||
node.unlink(display_node, 0);
|
||||
}
|
||||
return source_node
|
||||
}
|
||||
%s_func
|
||||
""" % (signature, signature)
|
||||
display_source_node = harmony.send(
|
||||
{"function": func, "args": [display_node]}
|
||||
)["result"]
|
||||
|
||||
# Perform extraction
|
||||
path = os.path.join(
|
||||
os.path.normpath(
|
||||
project_path
|
||||
).replace("\\", "/"),
|
||||
instance.data["name"]
|
||||
)
|
||||
if not os.path.exists(path):
|
||||
os.makedirs(path)
|
||||
|
||||
render_func = """function frameReady(frame, celImage)
|
||||
{{
|
||||
var path = "{path}/{filename}" + frame + ".png";
|
||||
celImage.imageFileAs(path, "", "PNG4");
|
||||
}}
|
||||
function %s_func(composite_node)
|
||||
{{
|
||||
node.link(composite_node, 0, "{display_node}", 0);
|
||||
render.frameReady.connect(frameReady);
|
||||
render.setRenderDisplay("{display_node}");
|
||||
render.renderSceneAll();
|
||||
render.frameReady.disconnect(frameReady);
|
||||
}}
|
||||
%s_func
|
||||
""" % (signature, signature)
|
||||
restore_func = """function %s_func(args)
|
||||
{
|
||||
var display_node = args[0];
|
||||
var display_source_node = args[1];
|
||||
if (node.isLinked(display_node, 0))
|
||||
{
|
||||
node.unlink(display_node, 0);
|
||||
}
|
||||
node.link(display_source_node, 0, display_node, 0);
|
||||
}
|
||||
%s_func
|
||||
""" % (signature, signature)
|
||||
|
||||
with harmony.maintained_selection():
|
||||
self.log.info("Extracting %s" % str(list(instance)))
|
||||
|
||||
harmony.send(
|
||||
{
|
||||
"function": render_func.format(
|
||||
path=path.replace("\\", "/"),
|
||||
filename=os.path.basename(path),
|
||||
display_node=display_node
|
||||
),
|
||||
"args": [instance[0]]
|
||||
}
|
||||
)
|
||||
|
||||
# Restore display.
|
||||
if display_source_node:
|
||||
harmony.send(
|
||||
{
|
||||
"function": restore_func,
|
||||
"args": [display_node, display_source_node]
|
||||
}
|
||||
)
|
||||
|
||||
files = os.listdir(path)
|
||||
collections, remainder = clique.assemble(files, minimum_items=1)
|
||||
assert not remainder, (
|
||||
"There shouldn't have been a remainder for '%s': "
|
||||
"%s" % (instance[0], remainder)
|
||||
)
|
||||
assert len(collections) == 1, (
|
||||
"There should only be one image sequence in {}. Found: {}".format(
|
||||
path, len(collections)
|
||||
)
|
||||
)
|
||||
|
||||
data = {
|
||||
"subset": collections[0].head,
|
||||
"isSeries": True,
|
||||
"stagingDir": path,
|
||||
"files": list(collections[0]),
|
||||
}
|
||||
instance.data.update(data)
|
||||
|
||||
self.log.info("Extracted {instance} to {path}".format(**locals()))
|
||||
```
|
||||
|
||||
#### Loader Plugin
|
||||
```python
|
||||
import os
from uuid import uuid4
|
||||
|
||||
from avalon import api, io
|
||||
import openpype.hosts.harmony.api as harmony
|
||||
|
||||
signature = str(uuid4()).replace("-", "_")
|
||||
copy_files = """function copyFile(srcFilename, dstFilename)
|
||||
{
|
||||
var srcFile = new PermanentFile(srcFilename);
|
||||
var dstFile = new PermanentFile(dstFilename);
|
||||
srcFile.copy(dstFile);
|
||||
}
|
||||
"""
|
||||
|
||||
import_files = """function %s_import_files()
|
||||
{
|
||||
var PNGTransparencyMode = 0; // Premultiplied with Black
var TGATransparencyMode = 0; // Premultiplied with Black
var SGITransparencyMode = 0; // Premultiplied with Black
var LayeredPSDTransparencyMode = 1; // Straight
var FlatPSDTransparencyMode = 2; // Premultiplied with White
|
||||
|
||||
function getUniqueColumnName( column_prefix )
|
||||
{
|
||||
var suffix = 0;
|
||||
// finds if unique name for a column
|
||||
var column_name = column_prefix;
|
||||
while(suffix < 2000)
|
||||
{
|
||||
if(!column.type(column_name))
|
||||
break;
|
||||
|
||||
suffix = suffix + 1;
|
||||
column_name = column_prefix + "_" + suffix;
|
||||
}
|
||||
return column_name;
|
||||
}
|
||||
|
||||
function import_files(args)
|
||||
{
|
||||
var root = args[0];
|
||||
var files = args[1];
|
||||
var name = args[2];
|
||||
var start_frame = args[3];
|
||||
|
||||
var vectorFormat = null;
|
||||
var extension = null;
|
||||
var filename = files[0];
|
||||
|
||||
var pos = filename.lastIndexOf(".");
|
||||
if( pos < 0 )
|
||||
return null;
|
||||
|
||||
extension = filename.substr(pos+1).toLowerCase();
|
||||
|
||||
if(extension == "jpeg")
|
||||
extension = "jpg";
|
||||
if(extension == "tvg")
|
||||
{
|
||||
vectorFormat = "TVG"
|
||||
extension ="SCAN"; // element.add() will use this.
|
||||
}
|
||||
|
||||
var elemId = element.add(
|
||||
name,
|
||||
"BW",
|
||||
scene.numberOfUnitsZ(),
|
||||
extension.toUpperCase(),
|
||||
vectorFormat
|
||||
);
|
||||
if (elemId == -1)
|
||||
{
|
||||
// hum, unknown file type most likely -- let's skip it.
|
||||
return null; // no read to add.
|
||||
}
|
||||
|
||||
var uniqueColumnName = getUniqueColumnName(name);
|
||||
column.add(uniqueColumnName , "DRAWING");
|
||||
column.setElementIdOfDrawing(uniqueColumnName, elemId);
|
||||
|
||||
var read = node.add(root, name, "READ", 0, 0, 0);
|
||||
var transparencyAttr = node.getAttr(
|
||||
read, frame.current(), "READ_TRANSPARENCY"
|
||||
);
|
||||
var opacityAttr = node.getAttr(read, frame.current(), "OPACITY");
|
||||
transparencyAttr.setValue(true);
|
||||
opacityAttr.setValue(true);
|
||||
|
||||
var alignmentAttr = node.getAttr(read, frame.current(), "ALIGNMENT_RULE");
|
||||
alignmentAttr.setValue("ASIS");
|
||||
|
||||
var transparencyModeAttr = node.getAttr(
|
||||
read, frame.current(), "applyMatteToColor"
|
||||
);
|
||||
if (extension == "png")
|
||||
transparencyModeAttr.setValue(PNGTransparencyMode);
|
||||
if (extension == "tga")
|
||||
transparencyModeAttr.setValue(TGATransparencyMode);
|
||||
if (extension == "sgi")
|
||||
transparencyModeAttr.setValue(SGITransparencyMode);
|
||||
if (extension == "psd")
|
||||
transparencyModeAttr.setValue(FlatPSDTransparencyMode);
|
||||
|
||||
node.linkAttr(read, "DRAWING.ELEMENT", uniqueColumnName);
|
||||
|
||||
// Create a drawing for each file.
|
||||
for( var i =0; i <= files.length - 1; ++i)
|
||||
{
|
||||
timing = start_frame + i
|
||||
// Create a drawing; 'true' indicates that the file exists.
|
||||
Drawing.create(elemId, timing, true);
|
||||
// Get the actual path, in tmp folder.
|
||||
var drawingFilePath = Drawing.filename(elemId, timing.toString());
|
||||
copyFile( files[i], drawingFilePath );
|
||||
|
||||
column.setEntry(uniqueColumnName, 1, timing, timing.toString());
|
||||
}
|
||||
return read;
|
||||
}
|
||||
return import_files(args);
|
||||
}
|
||||
%s_import_files
|
||||
""" % (signature, signature)
|
||||
|
||||
replace_files = """function %s_replace_files(args)
|
||||
{
|
||||
var files = args[0];
|
||||
var _node = args[1];
|
||||
var start_frame = args[2];
|
||||
|
||||
var _column = node.linkedColumn(_node, "DRAWING.ELEMENT");
|
||||
|
||||
// Delete existing drawings.
|
||||
var timings = column.getDrawingTimings(_column);
|
||||
for( var i =0; i <= timings.length - 1; ++i)
|
||||
{
|
||||
column.deleteDrawingAt(_column, parseInt(timings[i]));
|
||||
}
|
||||
|
||||
// Create new drawings.
|
||||
for( var i =0; i <= files.length - 1; ++i)
|
||||
{
|
||||
timing = start_frame + i
|
||||
// Create a drawing; 'true' indicates that the file exists.
|
||||
Drawing.create(node.getElementId(_node), timing, true);
|
||||
// Get the actual path, in tmp folder.
|
||||
var drawingFilePath = Drawing.filename(
|
||||
node.getElementId(_node), timing.toString()
|
||||
);
|
||||
copyFile( files[i], drawingFilePath );
|
||||
|
||||
column.setEntry(_column, 1, timing, timing.toString());
|
||||
}
|
||||
}
|
||||
%s_replace_files
|
||||
""" % (signature, signature)
|
||||
|
||||
|
||||
class ImageSequenceLoader(api.Loader):
|
||||
"""Load images
|
||||
Stores the imported asset in a container named after the asset.
|
||||
"""
|
||||
families = ["mindbender.imagesequence"]
|
||||
representations = ["*"]
|
||||
|
||||
def load(self, context, name=None, namespace=None, data=None):
|
||||
files = []
|
||||
for f in context["version"]["data"]["files"]:
|
||||
files.append(
|
||||
os.path.join(
|
||||
context["version"]["data"]["stagingDir"], f
|
||||
).replace("\\", "/")
|
||||
)
|
||||
|
||||
read_node = harmony.send(
|
||||
{
|
||||
"function": copy_files + import_files,
|
||||
"args": ["Top", files, context["version"]["data"]["subset"], 1]
|
||||
}
|
||||
)["result"]
|
||||
|
||||
self[:] = [read_node]
|
||||
|
||||
return harmony.containerise(
|
||||
name,
|
||||
namespace,
|
||||
read_node,
|
||||
context,
|
||||
self.__class__.__name__
|
||||
)
|
||||
|
||||
def update(self, container, representation):
|
||||
node = container.pop("node")
|
||||
|
||||
version = io.find_one({"_id": representation["parent"]})
|
||||
files = []
|
||||
for f in version["data"]["files"]:
|
||||
files.append(
|
||||
os.path.join(
|
||||
version["data"]["stagingDir"], f
|
||||
).replace("\\", "/")
|
||||
)
|
||||
|
||||
harmony.send(
|
||||
{
|
||||
"function": copy_files + replace_files,
|
||||
"args": [files, node, 1]
|
||||
}
|
||||
)
|
||||
|
||||
harmony.imprint(
|
||||
node, {"representation": str(representation["_id"])}
|
||||
)
|
||||
|
||||
def remove(self, container):
|
||||
node = container.pop("node")
|
||||
signature = str(uuid4()).replace("-", "_")
|
||||
func = """function %s_deleteNode(_node)
|
||||
{
|
||||
node.deleteNode(_node, true, true);
|
||||
}
|
||||
%s_deleteNode
|
||||
""" % (signature, signature)
|
||||
harmony.send(
|
||||
{"function": func, "args": [node]}
|
||||
)
|
||||
|
||||
def switch(self, container, representation):
|
||||
self.update(container, representation)
|
||||
```
|
||||
|
||||
## Resources
|
||||
- https://github.com/diegogarciahuerta/tk-harmony
|
||||
- https://github.com/cfourney/OpenHarmony
|
||||
- [Toon Boom Discord](https://discord.gg/syAjy4H)
|
||||
- [Toon Boom TD](https://discord.gg/yAjyQtZ)
|
||||
531
openpype/hosts/harmony/api/TB_sceneOpened.js
Normal file
531
openpype/hosts/harmony/api/TB_sceneOpened.js
Normal file
|
|
@ -0,0 +1,531 @@
|
|||
/* global QTcpSocket, QByteArray, QDataStream, QTimer, QTextCodec, QIODevice, QApplication, include */
|
||||
/* global QTcpSocket, QByteArray, QDataStream, QTimer, QTextCodec, QIODevice, QApplication, include */
|
||||
/*
|
||||
Avalon Harmony Integration - Client
|
||||
-----------------------------------
|
||||
|
||||
This script implements client communication with Avalon server to bridge
|
||||
gap between Python and QtScript.
|
||||
|
||||
*/
|
||||
/* jshint proto: true */
|
||||
var LD_OPENHARMONY_PATH = System.getenv('LIB_OPENHARMONY_PATH');
|
||||
LD_OPENHARMONY_PATH = LD_OPENHARMONY_PATH + '/openHarmony.js';
|
||||
LD_OPENHARMONY_PATH = LD_OPENHARMONY_PATH.replace(/\\/g, "/");
|
||||
include(LD_OPENHARMONY_PATH);
|
||||
this.__proto__['$'] = $;
|
||||
|
||||
function Client() {
|
||||
var self = this;
|
||||
/** socket */
|
||||
self.socket = new QTcpSocket(this);
|
||||
/** receiving data buffer */
|
||||
self.received = '';
|
||||
self.messageId = 1;
|
||||
self.buffer = new QByteArray();
|
||||
self.waitingForData = 0;
|
||||
|
||||
|
||||
/**
|
||||
* pack number into bytes.
|
||||
* @function
|
||||
* @param {int} num 32 bit integer
|
||||
* @return {string}
|
||||
*/
|
||||
self.pack = function(num) {
|
||||
var ascii='';
|
||||
for (var i = 3; i >= 0; i--) {
|
||||
ascii += String.fromCharCode((num >> (8 * i)) & 255);
|
||||
}
|
||||
return ascii;
|
||||
};
|
||||
|
||||
/**
|
||||
* unpack number from string.
|
||||
* @function
|
||||
* @param {string} numString bytes to unpack
|
||||
* @return {int} 32bit unsigned integer.
|
||||
*/
|
||||
self.unpack = function(numString) {
|
||||
var result=0;
|
||||
for (var i = 3; i >= 0; i--) {
|
||||
result += numString.charCodeAt(3 - i) << (8 * i);
|
||||
}
|
||||
return result;
|
||||
};
|
||||
|
||||
/**
|
||||
* prettify json for easier debugging
|
||||
* @function
|
||||
* @param {object} json json to process
|
||||
* @return {string} prettified json string
|
||||
*/
|
||||
self.prettifyJson = function(json) {
|
||||
var jsonString = JSON.stringify(json);
|
||||
return JSON.stringify(JSON.parse(jsonString), null, 2);
|
||||
};
|
||||
|
||||
/**
|
||||
* Log message in debug level.
|
||||
* @function
|
||||
* @param {string} data - message
|
||||
*/
|
||||
self.logDebug = function(data) {
|
||||
var message = typeof(data.message) != 'undefined' ? data.message : data;
|
||||
MessageLog.trace('(DEBUG): ' + message.toString());
|
||||
};
|
||||
|
||||
/**
|
||||
* Log message in info level.
|
||||
* @function
|
||||
* @param {string} data - message
|
||||
*/
|
||||
self.logInfo = function(data) {
|
||||
var message = typeof(data.message) != 'undefined' ? data.message : data;
|
||||
MessageLog.trace('(DEBUG): ' + message.toString());
|
||||
};
|
||||
|
||||
/**
|
||||
* Log message in warning level.
|
||||
* @function
|
||||
* @param {string} data - message
|
||||
*/
|
||||
self.logWarning = function(data) {
|
||||
var message = typeof(data.message) != 'undefined' ? data.message : data;
|
||||
MessageLog.trace('(INFO): ' + message.toString());
|
||||
};
|
||||
|
||||
/**
|
||||
* Log message in error level.
|
||||
* @function
|
||||
* @param {string} data - message
|
||||
*/
|
||||
self.logError = function(data) {
|
||||
var message = typeof(data.message) != 'undefined' ? data.message : data;
|
||||
MessageLog.trace('(ERROR): ' +message.toString());
|
||||
};
|
||||
|
||||
/**
|
||||
* Show message in Harmony GUI as popup window.
|
||||
* @function
|
||||
* @param {string} msg - message
|
||||
*/
|
||||
self.showMessage = function(msg) {
|
||||
MessageBox.information(msg);
|
||||
};
|
||||
|
||||
/**
|
||||
* Implement missing setTimeout() in Harmony.
|
||||
* This calls once given function after some interval in milliseconds.
|
||||
* @function
|
||||
* @param {function} fc function to call.
|
||||
* @param {int} interval interval in milliseconds.
|
||||
* @param {boolean} single behave as setTimeout or setInterval.
|
||||
*/
|
||||
self.setTimeout = function(fc, interval, single) {
|
||||
var timer = new QTimer();
|
||||
if (!single) {
|
||||
timer.singleShot = true; // in-case if setTimout and false in-case of setInterval
|
||||
} else {
|
||||
timer.singleShot = single;
|
||||
}
|
||||
timer.interval = interval; // set the time in milliseconds
|
||||
timer.singleShot = true; // in-case if setTimout and false in-case of setInterval
|
||||
timer.timeout.connect(this, function(){
|
||||
fc.call();
|
||||
});
|
||||
timer.start();
|
||||
};
|
||||
|
||||
/**
|
||||
* Process received request. This will eval the received function and produce
|
||||
* results.
|
||||
* @function
|
||||
* @param {object} request - received request JSON
|
||||
* @return {object} result of the evaluated function.
|
||||
*/
|
||||
self.processRequest = function(request) {
|
||||
var mid = request.message_id;
|
||||
if (typeof request.reply !== 'undefined') {
|
||||
self.logDebug('['+ mid +'] *** received reply.');
|
||||
return;
|
||||
}
|
||||
self.logDebug('['+ mid +'] - Processing: ' + self.prettifyJson(request));
|
||||
var result = null;
|
||||
|
||||
if (typeof request.script !== 'undefined') {
|
||||
self.logDebug('[' + mid + '] Injecting script.');
|
||||
try {
|
||||
eval.call(null, request.script);
|
||||
} catch (error) {
|
||||
self.logError(error);
|
||||
}
|
||||
} else if (typeof request["function"] !== 'undefined') {
|
||||
try {
|
||||
var _func = eval.call(null, request["function"]);
|
||||
|
||||
if (request.args == null) {
|
||||
result = _func();
|
||||
} else {
|
||||
result = _func(request.args);
|
||||
}
|
||||
} catch (error) {
|
||||
result = 'Error processing request.\n' +
|
||||
'Request:\n' +
|
||||
self.prettifyJson(request) + '\n' +
|
||||
'Error:\n' + error;
|
||||
}
|
||||
} else {
|
||||
self.logError('Command type not implemented.');
|
||||
}
|
||||
|
||||
return result;
|
||||
};
|
||||
|
||||
/**
|
||||
* This gets called when socket received new data.
|
||||
* @function
|
||||
*/
|
||||
self.onReadyRead = function() {
|
||||
var currentSize = self.buffer.size();
|
||||
self.logDebug('--- Receiving data ( '+ currentSize + ' )...');
|
||||
var newData = self.socket.readAll();
|
||||
var newSize = newData.size();
|
||||
self.buffer.append(newData);
|
||||
self.logDebug(' - got ' + newSize + ' bytes of new data.');
|
||||
self.processBuffer();
|
||||
};
|
||||
|
||||
/**
|
||||
* Process data received in buffer.
|
||||
* This detects messages by looking for header and message length.
|
||||
* @function
|
||||
*/
|
||||
self.processBuffer = function() {
|
||||
var length = self.waitingForData;
|
||||
if (self.waitingForData == 0) {
|
||||
// read header from the buffer and remove it
|
||||
var header_data = self.buffer.mid(0, 6);
|
||||
self.buffer = self.buffer.remove(0, 6);
|
||||
|
||||
// convert header to string
|
||||
var header = '';
|
||||
for (var i = 0; i < header_data.size(); ++i) {
|
||||
// data in QByteArray come out as signed bytes.
|
||||
var unsigned = header_data.at(i) & 0xff;
|
||||
header = header.concat(String.fromCharCode(unsigned));
|
||||
}
|
||||
|
||||
// skip 'AH' and read only length, unpack it to integer
|
||||
header = header.substr(2);
|
||||
length = self.unpack(header);
|
||||
}
|
||||
|
||||
var data = self.buffer.mid(0, length);
|
||||
self.logDebug('--- Expected: ' + length + ' | Got: ' + data.size());
|
||||
if (length > data.size()) {
|
||||
// we haven't received the whole message yet.
|
||||
self.waitingForData = length;
|
||||
self.logDebug('... waiting for more data (' + length + ') ...');
|
||||
return;
|
||||
}
|
||||
self.waitingForData = 0;
|
||||
self.buffer.remove(0, length);
|
||||
|
||||
for (var j = 0; j < data.size(); ++j) {
|
||||
self.received = self.received.concat(String.fromCharCode(data.at(j)));
|
||||
}
|
||||
|
||||
// self.logDebug('--- Received: ' + self.received);
|
||||
var to_parse = self.received;
|
||||
var request = JSON.parse(to_parse);
|
||||
var mid = request.message_id;
|
||||
// self.logDebug('[' + mid + '] - Request: ' + '\n' + JSON.stringify(request));
|
||||
self.logDebug('[' + mid + '] Received.');
|
||||
|
||||
request.result = self.processRequest(request);
|
||||
self.logDebug('[' + mid + '] Processing done.');
|
||||
self.received = '';
|
||||
|
||||
if (request.reply !== true) {
|
||||
request.reply = true;
|
||||
self.logDebug('[' + mid + '] Replying.');
|
||||
self._send(JSON.stringify(request));
|
||||
}
|
||||
|
||||
if ((length < data.size()) || (length < self.buffer.size())) {
|
||||
// we've received more data.
|
||||
self.logDebug('--- Got more data to process ...');
|
||||
self.processBuffer();
|
||||
}
|
||||
};
|
||||
|
||||
/**
|
||||
* Run when Harmony connects to server.
|
||||
* @function
|
||||
*/
|
||||
self.onConnected = function() {
|
||||
self.logDebug('Connected to server ...');
|
||||
self.lock = false;
|
||||
self.socket.readyRead.connect(self.onReadyRead);
|
||||
var app = QCoreApplication.instance();
|
||||
|
||||
app.avalonClient.send(
|
||||
{
|
||||
'module': 'avalon.api',
|
||||
'method': 'emit',
|
||||
'args': ['application.launched']
|
||||
}, false);
|
||||
};
|
||||
|
||||
self._send = function(message) {
|
||||
var data = new QByteArray();
|
||||
var outstr = new QDataStream(data, QIODevice.WriteOnly);
|
||||
outstr.writeInt(0);
|
||||
data.append('UTF-8');
|
||||
outstr.device().seek(0);
|
||||
outstr.writeInt(data.size() - 4);
|
||||
var codec = QTextCodec.codecForUtfText(data);
|
||||
var msg = codec.fromUnicode(message);
|
||||
var l = msg.size();
|
||||
var coded = new QByteArray('AH').append(self.pack(l));
|
||||
coded = coded.append(msg);
|
||||
self.socket.write(new QByteArray(coded));
|
||||
self.logDebug('Sent.');
|
||||
};
|
||||
|
||||
self.waitForLock = function() {
|
||||
if (self._lock === false) {
|
||||
self.logDebug('>>> Unlocking ...');
|
||||
return;
|
||||
} else {
|
||||
self.logDebug('Setting timer.');
|
||||
self.setTimeout(self.waitForLock, 300);
|
||||
}
|
||||
};
|
||||
|
||||
/**
|
||||
* Send request to server.
|
||||
* @param {object} request - json encoded request.
|
||||
*/
|
||||
self.send = function(request) {
|
||||
request.message_id = self.messageId;
|
||||
if (typeof request.reply == 'undefined') {
|
||||
self.logDebug("[" + self.messageId + "] sending:\n" + self.prettifyJson(request));
|
||||
}
|
||||
self._send(JSON.stringify(request));
|
||||
};
|
||||
|
||||
/**
|
||||
* Executed on disconnection.
|
||||
*/
|
||||
self.onDisconnected = function() {
|
||||
self.socket.close();
|
||||
};
|
||||
|
||||
/**
|
||||
* Disconnect from server.
|
||||
*/
|
||||
self.disconnect = function() {
|
||||
self.socket.close();
|
||||
};
|
||||
|
||||
self.socket.connected.connect(self.onConnected);
|
||||
self.socket.disconnected.connect(self.onDisconnected);
|
||||
}
|
||||
|
||||
/**
|
||||
* Entry point, creating Avalon Client.
|
||||
*/
|
||||
function start() {
|
||||
var self = this;
|
||||
/** hostname or ip of server - should be localhost */
|
||||
var host = '127.0.0.1';
|
||||
/** port of the server */
|
||||
var port = parseInt(System.getenv('AVALON_HARMONY_PORT'));
|
||||
|
||||
// Attach the client to the QApplication to preserve.
|
||||
var app = QCoreApplication.instance();
|
||||
|
||||
if (app.avalonClient == null) {
|
||||
app.avalonClient = new Client();
|
||||
app.avalonClient.socket.connectToHost(host, port);
|
||||
}
|
||||
var menuBar = QApplication.activeWindow().menuBar();
|
||||
var actions = menuBar.actions();
|
||||
app.avalonMenu = null;
|
||||
|
||||
for (var i = 0 ; i < actions.length; i++) {
|
||||
label = System.getenv('AVALON_LABEL');
|
||||
if (actions[i].text == label) {
|
||||
app.avalonMenu = true;
|
||||
}
|
||||
}
|
||||
|
||||
var menu = null;
|
||||
if (app.avalonMenu == null) {
|
||||
menu = menuBar.addMenu(System.getenv('AVALON_LABEL'));
|
||||
}
|
||||
// menu = menuBar.addMenu('Avalon');
|
||||
|
||||
/**
|
||||
* Show creator
|
||||
*/
|
||||
self.onCreator = function() {
|
||||
app.avalonClient.send({
|
||||
'module': 'openpype.hosts.harmony.api.lib',
|
||||
'method': 'show',
|
||||
'args': ['creator']
|
||||
}, false);
|
||||
};
|
||||
|
||||
var action = menu.addAction('Create...');
|
||||
action.triggered.connect(self.onCreator);
|
||||
|
||||
|
||||
/**
|
||||
* Show Workfiles
|
||||
*/
|
||||
self.onWorkfiles = function() {
|
||||
app.avalonClient.send({
|
||||
'module': 'openpype.hosts.harmony.api.lib',
|
||||
'method': 'show',
|
||||
'args': ['workfiles']
|
||||
}, false);
|
||||
};
|
||||
if (app.avalonMenu == null) {
|
||||
action = menu.addAction('Workfiles...');
|
||||
action.triggered.connect(self.onWorkfiles);
|
||||
}
|
||||
|
||||
/**
|
||||
* Show Loader
|
||||
*/
|
||||
self.onLoad = function() {
|
||||
app.avalonClient.send({
|
||||
'module': 'openpype.hosts.harmony.api.lib',
|
||||
'method': 'show',
|
||||
'args': ['loader']
|
||||
}, false);
|
||||
};
|
||||
// add Loader item to menu
|
||||
if (app.avalonMenu == null) {
|
||||
action = menu.addAction('Load...');
|
||||
action.triggered.connect(self.onLoad);
|
||||
}
|
||||
|
||||
/**
|
||||
* Show Publisher
|
||||
*/
|
||||
self.onPublish = function() {
|
||||
app.avalonClient.send({
|
||||
'module': 'openpype.hosts.harmony.api.lib',
|
||||
'method': 'show',
|
||||
'args': ['publish']
|
||||
}, false);
|
||||
};
|
||||
// add Publisher item to menu
|
||||
if (app.avalonMenu == null) {
|
||||
action = menu.addAction('Publish...');
|
||||
action.triggered.connect(self.onPublish);
|
||||
}
|
||||
|
||||
/**
|
||||
* Show Scene Manager
|
||||
*/
|
||||
self.onManage = function() {
|
||||
app.avalonClient.send({
|
||||
'module': 'openpype.hosts.harmony.api.lib',
|
||||
'method': 'show',
|
||||
'args': ['sceneinventory']
|
||||
}, false);
|
||||
};
|
||||
// add Scene Manager item to menu
|
||||
if (app.avalonMenu == null) {
|
||||
action = menu.addAction('Manage...');
|
||||
action.triggered.connect(self.onManage);
|
||||
}
|
||||
|
||||
/**
|
||||
* Show Subset Manager
|
||||
*/
|
||||
self.onSubsetManage = function() {
|
||||
app.avalonClient.send({
|
||||
'module': 'openpype.hosts.harmony.api.lib',
|
||||
'method': 'show',
|
||||
'args': ['subsetmanager']
|
||||
}, false);
|
||||
};
|
||||
// add Subset Manager item to menu
|
||||
if (app.avalonMenu == null) {
|
||||
action = menu.addAction('Subset Manager...');
|
||||
action.triggered.connect(self.onSubsetManage);
|
||||
}
|
||||
|
||||
/**
|
||||
* Show Experimental dialog
|
||||
*/
|
||||
self.onExperimentalTools = function() {
|
||||
app.avalonClient.send({
|
||||
'module': 'openpype.hosts.harmony.api.lib',
|
||||
'method': 'show',
|
||||
'args': ['experimental_tools']
|
||||
}, false);
|
||||
};
|
||||
// add Subset Manager item to menu
|
||||
if (app.avalonMenu == null) {
|
||||
action = menu.addAction('Experimental Tools...');
|
||||
action.triggered.connect(self.onExperimentalTools);
|
||||
}
|
||||
|
||||
// FIXME(antirotor): We need to disable `on_file_changed` for now as it wreaks
// havoc when "Save" is called multiple times and zipping hasn't finished yet
|
||||
/*
|
||||
|
||||
// Watch scene file for changes.
|
||||
app.onFileChanged = function(path)
|
||||
{
|
||||
var app = QCoreApplication.instance();
|
||||
if (app.avalonOnFileChanged){
|
||||
app.avalonClient.send(
|
||||
{
|
||||
'module': 'avalon.harmony.lib',
|
||||
'method': 'on_file_changed',
|
||||
'args': [path]
|
||||
},
|
||||
false
|
||||
);
|
||||
}
|
||||
|
||||
app.watcher.addPath(path);
|
||||
};
|
||||
|
||||
|
||||
app.watcher = new QFileSystemWatcher();
|
||||
scene_path = scene.currentProjectPath() +"/" + scene.currentVersionName() + ".xstage";
|
||||
app.watcher.addPath(scenePath);
|
||||
app.watcher.fileChanged.connect(app.onFileChanged);
|
||||
app.avalonOnFileChanged = true;
|
||||
*/
|
||||
app.onFileChanged = function(path) {
|
||||
// empty stub
|
||||
return path;
|
||||
};
|
||||
}
|
||||
|
||||
function ensureSceneSettings() {
|
||||
var app = QCoreApplication.instance();
|
||||
app.avalonClient.send(
|
||||
{
|
||||
"module": "openpype.hosts.harmony.api",
|
||||
"method": "ensure_scene_settings",
|
||||
"args": []
|
||||
},
|
||||
false
|
||||
);
|
||||
}
|
||||
|
||||
function TB_sceneOpened()
|
||||
{
|
||||
start();
|
||||
}
|
||||
|
|
@ -1,209 +1,90 @@
|
|||
# -*- coding: utf-8 -*-
|
||||
"""Pype Harmony Host implementation."""
|
||||
import os
|
||||
from pathlib import Path
|
||||
import logging
|
||||
"""Public API
|
||||
|
||||
from openpype import lib
|
||||
import openpype.hosts.harmony
|
||||
Anything that isn't defined here is INTERNAL and unreliable for external use.
|
||||
|
||||
import pyblish.api
|
||||
"""
|
||||
from .pipeline import (
|
||||
ls,
|
||||
install,
|
||||
list_instances,
|
||||
remove_instance,
|
||||
select_instance,
|
||||
containerise,
|
||||
set_scene_settings,
|
||||
get_asset_settings,
|
||||
ensure_scene_settings,
|
||||
check_inventory,
|
||||
application_launch,
|
||||
export_template,
|
||||
on_pyblish_instance_toggled,
|
||||
inject_avalon_js,
|
||||
)
|
||||
|
||||
from avalon import io, harmony
|
||||
import avalon.api
|
||||
from .lib import (
|
||||
launch,
|
||||
maintained_selection,
|
||||
imprint,
|
||||
read,
|
||||
send,
|
||||
maintained_nodes_state,
|
||||
save_scene,
|
||||
save_scene_as,
|
||||
remove,
|
||||
delete_node,
|
||||
find_node_by_name,
|
||||
signature,
|
||||
select_nodes,
|
||||
get_scene_data
|
||||
)
|
||||
|
||||
from .workio import (
|
||||
open_file,
|
||||
save_file,
|
||||
current_file,
|
||||
has_unsaved_changes,
|
||||
file_extensions,
|
||||
work_root
|
||||
)
|
||||
|
||||
log = logging.getLogger("openpype.hosts.harmony")
|
||||
__all__ = [
|
||||
# pipeline
|
||||
"ls",
|
||||
"install",
|
||||
"list_instances",
|
||||
"remove_instance",
|
||||
"select_instance",
|
||||
"containerise",
|
||||
"set_scene_settings",
|
||||
"get_asset_settings",
|
||||
"ensure_scene_settings",
|
||||
"check_inventory",
|
||||
"application_launch",
|
||||
"export_template",
|
||||
"on_pyblish_instance_toggled",
|
||||
"inject_avalon_js",
|
||||
|
||||
HOST_DIR = os.path.dirname(os.path.abspath(openpype.hosts.harmony.__file__))
|
||||
PLUGINS_DIR = os.path.join(HOST_DIR, "plugins")
|
||||
PUBLISH_PATH = os.path.join(PLUGINS_DIR, "publish")
|
||||
LOAD_PATH = os.path.join(PLUGINS_DIR, "load")
|
||||
CREATE_PATH = os.path.join(PLUGINS_DIR, "create")
|
||||
INVENTORY_PATH = os.path.join(PLUGINS_DIR, "inventory")
|
||||
# lib
|
||||
"launch",
|
||||
"maintained_selection",
|
||||
"imprint",
|
||||
"read",
|
||||
"send",
|
||||
"maintained_nodes_state",
|
||||
"save_scene",
|
||||
"save_scene_as",
|
||||
"remove",
|
||||
"delete_node",
|
||||
"find_node_by_name",
|
||||
"signature",
|
||||
"select_nodes",
|
||||
"get_scene_data",
|
||||
|
||||
# Workfiles API
|
||||
"open_file",
|
||||
"save_file",
|
||||
"current_file",
|
||||
"has_unsaved_changes",
|
||||
"file_extensions",
|
||||
"work_root",
|
||||
]
|
||||
|
||||
def set_scene_settings(settings):
|
||||
"""Set correct scene settings in Harmony.
|
||||
|
||||
Args:
|
||||
settings (dict): Scene settings.
|
||||
|
||||
Returns:
|
||||
dict: Dictionary of settings to set.
|
||||
|
||||
"""
|
||||
harmony.send(
|
||||
{"function": "PypeHarmony.setSceneSettings", "args": settings})
|
||||
|
||||
|
||||
def get_asset_settings():
|
||||
"""Get settings on current asset from database.
|
||||
|
||||
Returns:
|
||||
dict: Scene data.
|
||||
|
||||
"""
|
||||
asset_data = lib.get_asset()["data"]
|
||||
fps = asset_data.get("fps")
|
||||
frame_start = asset_data.get("frameStart")
|
||||
frame_end = asset_data.get("frameEnd")
|
||||
handle_start = asset_data.get("handleStart")
|
||||
handle_end = asset_data.get("handleEnd")
|
||||
resolution_width = asset_data.get("resolutionWidth")
|
||||
resolution_height = asset_data.get("resolutionHeight")
|
||||
entity_type = asset_data.get("entityType")
|
||||
|
||||
scene_data = {
|
||||
"fps": fps,
|
||||
"frameStart": frame_start,
|
||||
"frameEnd": frame_end,
|
||||
"handleStart": handle_start,
|
||||
"handleEnd": handle_end,
|
||||
"resolutionWidth": resolution_width,
|
||||
"resolutionHeight": resolution_height,
|
||||
"entityType": entity_type
|
||||
}
|
||||
|
||||
return scene_data
|
||||
|
||||
|
||||
def ensure_scene_settings():
|
||||
"""Validate if Harmony scene has valid settings."""
|
||||
settings = get_asset_settings()
|
||||
|
||||
invalid_settings = []
|
||||
valid_settings = {}
|
||||
for key, value in settings.items():
|
||||
if value is None:
|
||||
invalid_settings.append(key)
|
||||
else:
|
||||
valid_settings[key] = value
|
||||
|
||||
# Warn about missing attributes.
|
||||
if invalid_settings:
|
||||
msg = "Missing attributes:"
|
||||
for item in invalid_settings:
|
||||
msg += f"\n{item}"
|
||||
|
||||
harmony.send(
|
||||
{"function": "PypeHarmony.message", "args": msg})
|
||||
|
||||
set_scene_settings(valid_settings)
|
||||
|
||||
|
||||
def check_inventory():
|
||||
"""Check is scene contains outdated containers.
|
||||
|
||||
If it does it will colorize outdated nodes and display warning message
|
||||
in Harmony.
|
||||
"""
|
||||
if not lib.any_outdated():
|
||||
return
|
||||
|
||||
host = avalon.api.registered_host()
|
||||
outdated_containers = []
|
||||
for container in host.ls():
|
||||
representation = container['representation']
|
||||
representation_doc = io.find_one(
|
||||
{
|
||||
"_id": io.ObjectId(representation),
|
||||
"type": "representation"
|
||||
},
|
||||
projection={"parent": True}
|
||||
)
|
||||
if representation_doc and not lib.is_latest(representation_doc):
|
||||
outdated_containers.append(container)
|
||||
|
||||
# Colour nodes.
|
||||
outdated_nodes = []
|
||||
for container in outdated_containers:
|
||||
if container["loader"] == "ImageSequenceLoader":
|
||||
outdated_nodes.append(
|
||||
harmony.find_node_by_name(container["name"], "READ")
|
||||
)
|
||||
harmony.send({"function": "PypeHarmony.setColor", "args": outdated_nodes})
|
||||
|
||||
# Warn about outdated containers.
|
||||
msg = "There are outdated containers in the scene."
|
||||
harmony.send({"function": "PypeHarmony.message", "args": msg})
|
||||
|
||||
|
||||
def application_launch():
|
||||
"""Event that is executed after Harmony is launched."""
|
||||
# FIXME: This is breaking server <-> client communication.
|
||||
# It is now moved so it is called manually.
|
||||
# ensure_scene_settings()
|
||||
# check_inventory()
|
||||
# fills OPENPYPE_HARMONY_JS
|
||||
pype_harmony_path = Path(__file__).parent.parent / "js" / "PypeHarmony.js"
|
||||
pype_harmony_js = pype_harmony_path.read_text()
|
||||
|
||||
# go through js/creators, loaders and publish folders and load all scripts
|
||||
script = ""
|
||||
for item in ["creators", "loaders", "publish"]:
|
||||
dir_to_scan = Path(__file__).parent.parent / "js" / item
|
||||
for child in dir_to_scan.iterdir():
|
||||
script += child.read_text()
|
||||
|
||||
# send scripts to Harmony
|
||||
harmony.send({"script": pype_harmony_js})
|
||||
harmony.send({"script": script})
|
||||
|
||||
|
||||
def export_template(backdrops, nodes, filepath):
|
||||
"""Export Template to file.
|
||||
|
||||
Args:
|
||||
backdrops (list): List of backdrops to export.
|
||||
nodes (list): List of nodes to export.
|
||||
filepath (str): Path where to save Template.
|
||||
|
||||
"""
|
||||
harmony.send({
|
||||
"function": "PypeHarmony.exportTemplate",
|
||||
"args": [
|
||||
backdrops,
|
||||
nodes,
|
||||
os.path.basename(filepath),
|
||||
os.path.dirname(filepath)
|
||||
]
|
||||
})
|
||||
|
||||
|
||||
def install():
|
||||
"""Install Pype as host config."""
|
||||
print("Installing Pype config ...")
|
||||
|
||||
pyblish.api.register_plugin_path(PUBLISH_PATH)
|
||||
avalon.api.register_plugin_path(avalon.api.Loader, LOAD_PATH)
|
||||
avalon.api.register_plugin_path(avalon.api.Creator, CREATE_PATH)
|
||||
log.info(PUBLISH_PATH)
|
||||
|
||||
# Register callbacks.
|
||||
pyblish.api.register_callback(
|
||||
"instanceToggled", on_pyblish_instance_toggled
|
||||
)
|
||||
|
||||
avalon.api.on("application.launched", application_launch)
|
||||
|
||||
|
||||
def uninstall():
|
||||
pyblish.api.deregister_plugin_path(PUBLISH_PATH)
|
||||
avalon.api.deregister_plugin_path(avalon.api.Loader, LOAD_PATH)
|
||||
avalon.api.deregister_plugin_path(avalon.api.Creator, CREATE_PATH)
|
||||
|
||||
|
||||
def on_pyblish_instance_toggled(instance, old_value, new_value):
|
||||
"""Toggle node enabling on instance toggles."""
|
||||
node = None
|
||||
if instance.data.get("setMembers"):
|
||||
node = instance.data["setMembers"][0]
|
||||
|
||||
if node:
|
||||
harmony.send(
|
||||
{
|
||||
"function": "PypeHarmony.toggleInstance",
|
||||
"args": [node, new_value]
|
||||
}
|
||||
)
|
||||
|
|
|
|||
117
openpype/hosts/harmony/api/js/.eslintrc.json
Normal file
117
openpype/hosts/harmony/api/js/.eslintrc.json
Normal file
|
|
@ -0,0 +1,117 @@
|
|||
{
|
||||
"env": {
|
||||
"browser": true
|
||||
},
|
||||
"extends": "eslint:recommended",
|
||||
"parserOptions": {
|
||||
"ecmaVersion": 3
|
||||
},
|
||||
"rules": {
|
||||
"indent": [
|
||||
"error",
|
||||
4
|
||||
],
|
||||
"linebreak-style": [
|
||||
"error",
|
||||
"unix"
|
||||
],
|
||||
"quotes": [
|
||||
"error",
|
||||
"single"
|
||||
],
|
||||
"semi": [
|
||||
"error",
|
||||
"always"
|
||||
]
|
||||
},
|
||||
"globals": {
|
||||
"Action": "readonly",
|
||||
"Backdrop": "readonly",
|
||||
"Button": "readonly",
|
||||
"Cel": "readonly",
|
||||
"Cel3d": "readonly",
|
||||
"CheckBox": "readonly",
|
||||
"ColorRGBA": "readonly",
|
||||
"ComboBox": "readonly",
|
||||
"DateEdit": "readonly",
|
||||
"DateEditEnum": "readonly",
|
||||
"Dialog": "readonly",
|
||||
"Dir": "readonly",
|
||||
"DirSpec": "readonly",
|
||||
"Drawing": "readonly",
|
||||
"DrawingToolParams": "readonly",
|
||||
"DrawingTools": "readonly",
|
||||
"EnvelopeCreator": "readonly",
|
||||
"ExportVideoDlg": "readonly",
|
||||
"File": "readonly",
|
||||
"FileAccess": "readonly",
|
||||
"FileDialog": "readonly",
|
||||
"GroupBox": "readonly",
|
||||
"ImportDrawingDlg": "readonly",
|
||||
"Input": "readonly",
|
||||
"KeyModifiers": "readonly",
|
||||
"Label": "readonly",
|
||||
"LayoutExports": "readonly",
|
||||
"LayoutExportsParams": "readonly",
|
||||
"LineEdit": "readonly",
|
||||
"Matrix4x4": "readonly",
|
||||
"MessageBox": "readonly",
|
||||
"MessageLog": "readonly",
|
||||
"Model3d": "readonly",
|
||||
"MovieImport": "readonly",
|
||||
"NumberEdit": "readonly",
|
||||
"PaletteManager": "readonly",
|
||||
"PaletteObjectManager": "readonly",
|
||||
"PermanentFile": "readonly",
|
||||
"Point2d": "readonly",
|
||||
"Point3d": "readonly",
|
||||
"Process": "readonly",
|
||||
"Process2": "readonly",
|
||||
"Quaternion": "readonly",
|
||||
"QuicktimeExporter": "readonly",
|
||||
"RadioButton": "readonly",
|
||||
"RemoteCmd": "readonly",
|
||||
"Scene": "readonly",
|
||||
"Settings": "readonly",
|
||||
"Slider": "readonly",
|
||||
"SpinBox": "readonly",
|
||||
"SubnodeData": "readonly",
|
||||
"System": "readonly",
|
||||
"TemporaryFile": "readonly",
|
||||
"TextEdit": "readonly",
|
||||
"TimeEdit": "readonly",
|
||||
"Timeline": "readonly",
|
||||
"ToolProperties": "readonly",
|
||||
"UiLoader": "readonly",
|
||||
"Vector2d": "readonly",
|
||||
"Vector3d": "readonly",
|
||||
"WebCCExporter": "readonly",
|
||||
"Workspaces": "readonly",
|
||||
"__scriptManager__": "readonly",
|
||||
"__temporaryFileContext__": "readonly",
|
||||
"about": "readonly",
|
||||
"column": "readonly",
|
||||
"compositionOrder": "readonly",
|
||||
"copyPaste": "readonly",
|
||||
"deformation": "readonly",
|
||||
"drawingExport": "readonly",
|
||||
"element": "readonly",
|
||||
"exporter": "readonly",
|
||||
"fileMapper": "readonly",
|
||||
"frame": "readonly",
|
||||
"func": "readonly",
|
||||
"library": "readonly",
|
||||
"node": "readonly",
|
||||
"preferences": "readonly",
|
||||
"render": "readonly",
|
||||
"scene": "readonly",
|
||||
"selection": "readonly",
|
||||
"sound": "readonly",
|
||||
"specialFolders": "readonly",
|
||||
"translator": "readonly",
|
||||
"view": "readonly",
|
||||
"waypoint": "readonly",
|
||||
"xsheet": "readonly",
|
||||
"QCoreApplication": "readonly"
|
||||
}
|
||||
}
|
||||
226
openpype/hosts/harmony/api/js/AvalonHarmony.js
Normal file
226
openpype/hosts/harmony/api/js/AvalonHarmony.js
Normal file
|
|
@ -0,0 +1,226 @@
|
|||
// ***************************************************************************
|
||||
// * Avalon Harmony Host *
|
||||
// ***************************************************************************
|
||||
|
||||
|
||||
/**
|
||||
* @namespace
|
||||
* @classdesc AvalonHarmony encapsulate all Avalon related functions.
|
||||
*/
|
||||
var AvalonHarmony = {};
|
||||
|
||||
|
||||
/**
|
||||
* Get scene metadata from Harmony.
|
||||
* @function
|
||||
* @return {object} Scene metadata.
|
||||
*/
|
||||
AvalonHarmony.getSceneData = function() {
|
||||
var metadata = scene.metadata('avalon');
|
||||
if (metadata){
|
||||
return JSON.parse(metadata.value);
|
||||
}else {
|
||||
return {};
|
||||
}
|
||||
};
|
||||
|
||||
|
||||
/**
|
||||
* Set scene metadata to Harmony.
|
||||
* @function
|
||||
* @param {object} metadata Object containing metadata.
|
||||
*/
|
||||
AvalonHarmony.setSceneData = function(metadata) {
|
||||
scene.setMetadata({
|
||||
'name' : 'avalon',
|
||||
'type' : 'string',
|
||||
'creator' : 'Avalon',
|
||||
'version' : '1.0',
|
||||
'value' : JSON.stringify(metadata)
|
||||
});
|
||||
};
|
||||
|
||||
|
||||
/**
|
||||
* Get selected nodes in Harmony.
|
||||
* @function
|
||||
* @return {array} Selected nodes paths.
|
||||
*/
|
||||
AvalonHarmony.getSelectedNodes = function () {
|
||||
var selectionLength = selection.numberOfNodesSelected();
|
||||
var selectedNodes = [];
|
||||
for (var i = 0 ; i < selectionLength; i++) {
|
||||
selectedNodes.push(selection.selectedNode(i));
|
||||
}
|
||||
return selectedNodes;
|
||||
};
|
||||
|
||||
|
||||
/**
|
||||
* Set selection of nodes.
|
||||
* @function
|
||||
* @param {array} nodes Array containing node paths to add to selection.
|
||||
*/
|
||||
AvalonHarmony.selectNodes = function(nodes) {
|
||||
selection.clearSelection();
|
||||
for (var i = 0 ; i < nodes.length; i++) {
|
||||
selection.addNodeToSelection(nodes[i]);
|
||||
}
|
||||
};
|
||||
|
||||
|
||||
/**
|
||||
* Is node enabled?
|
||||
* @function
|
||||
* @param {string} node Node path.
|
||||
* @return {boolean} state
|
||||
*/
|
||||
AvalonHarmony.isEnabled = function(_node) {
    return node.getEnable(_node);
|
||||
};
|
||||
|
||||
|
||||
/**
|
||||
* Are nodes enabled?
|
||||
* @function
|
||||
* @param {array} nodes Array of node paths.
|
||||
* @return {array} array of boolean states.
|
||||
*/
|
||||
AvalonHarmony.areEnabled = function(nodes) {
|
||||
var states = [];
|
||||
for (var i = 0 ; i < nodes.length; i++) {
|
||||
states.push(node.getEnable(nodes[i]));
|
||||
}
|
||||
return states;
|
||||
};
|
||||
|
||||
|
||||
/**
|
||||
* Set state on nodes.
|
||||
* @function
|
||||
* @param {array} args Array of nodes array and states array.
|
||||
*/
|
||||
AvalonHarmony.setState = function(args) {
|
||||
var nodes = args[0];
|
||||
var states = args[1];
|
||||
// length of both arrays must be equal.
|
||||
if (nodes.length !== states.length) {
|
||||
return false;
|
||||
}
|
||||
for (var i = 0 ; i < nodes.length; i++) {
|
||||
node.setEnable(nodes[i], states[i]);
|
||||
}
|
||||
return true;
|
||||
};
|
||||
|
||||
|
||||
/**
|
||||
* Disable specified nodes.
|
||||
* @function
|
||||
* @param {array} nodes Array of nodes.
|
||||
*/
|
||||
AvalonHarmony.disableNodes = function(nodes) {
|
||||
for (var i = 0 ; i < nodes.length; i++)
|
||||
{
|
||||
node.setEnable(nodes[i], false);
|
||||
}
|
||||
};
|
||||
|
||||
|
||||
/**
|
||||
* Save scene in Harmony.
|
||||
* @function
|
||||
* @return {string} Scene path.
|
||||
*/
|
||||
AvalonHarmony.saveScene = function() {
|
||||
var app = QCoreApplication.instance();
|
||||
app.avalon_on_file_changed = false;
|
||||
scene.saveAll();
|
||||
return (
|
||||
scene.currentProjectPath() + '/' +
|
||||
scene.currentVersionName() + '.xstage'
|
||||
);
|
||||
};
|
||||
|
||||
|
||||
/**
|
||||
* Enable Harmony file-watcher.
|
||||
* @function
|
||||
*/
|
||||
AvalonHarmony.enableFileWather = function() {
|
||||
var app = QCoreApplication.instance();
|
||||
app.avalon_on_file_changed = true;
|
||||
};
|
||||
|
||||
|
||||
/**
|
||||
* Add path to file-watcher.
|
||||
* @function
|
||||
* @param {string} path Path to watch.
|
||||
*/
|
||||
AvalonHarmony.addPathToWatcher = function(path) {
|
||||
var app = QCoreApplication.instance();
|
||||
app.watcher.addPath(path);
|
||||
};
|
||||
|
||||
|
||||
/**
|
||||
* Setup node for Creator.
|
||||
* @function
|
||||
* @param {string} node Node path.
|
||||
*/
|
||||
AvalonHarmony.setupNodeForCreator = function(_node) {
    node.setTextAttr(_node, 'COMPOSITE_MODE', 1, 'Pass Through');
|
||||
};
|
||||
|
||||
|
||||
/**
|
||||
* Get node names for specified node type.
|
||||
* @function
|
||||
* @param {string} nodeType Node type.
|
||||
* @return {array} Node names.
|
||||
*/
|
||||
AvalonHarmony.getNodesNamesByType = function(nodeType) {
|
||||
var nodes = node.getNodes(nodeType);
|
||||
var nodeNames = [];
|
||||
for (var i = 0; i < nodes.length; ++i) {
|
||||
nodeNames.push(node.getName(nodes[i]));
|
||||
}
|
||||
return nodeNames;
|
||||
};
|
||||
|
||||
|
||||
/**
|
||||
* Create container node in Harmony.
|
||||
* @function
|
||||
* @param {array} args Arguments, see example.
|
||||
* @return {string} Resulting node.
|
||||
*
|
||||
* @example
|
||||
* // arguments are in following order:
|
||||
* var args = [
|
||||
* nodeName,
|
||||
* nodeType,
|
||||
* selection
|
||||
* ];
|
||||
*/
|
||||
AvalonHarmony.createContainer = function(args) {
|
||||
var resultNode = node.add('Top', args[0], args[1], 0, 0, 0);
|
||||
if (args.length > 2) {
|
||||
node.link(args[2], 0, resultNode, 0, false, true);
|
||||
node.setCoord(resultNode,
|
||||
node.coordX(args[2]),
|
||||
node.coordY(args[2]) + 70);
|
||||
}
|
||||
return resultNode;
|
||||
};
|
||||
|
||||
|
||||
/**
|
||||
* Delete node.
|
||||
* @function
|
||||
* @param {string} node Node path.
|
||||
*/
|
||||
AvalonHarmony.deleteNode = function(_node) {
|
||||
node.deleteNode(_node, true, true);
|
||||
};
|
||||
18
openpype/hosts/harmony/api/js/package.json
Normal file
18
openpype/hosts/harmony/api/js/package.json
Normal file
|
|
@ -0,0 +1,18 @@
|
|||
{
|
||||
"name": "avalon-harmony",
|
||||
"version": "1.0.0",
|
||||
"description": "Avalon Harmony Host integration",
|
||||
"keywords": [
|
||||
"Avalon",
|
||||
"Harmony",
|
||||
"pipeline"
|
||||
],
|
||||
"license": "MIT",
|
||||
"main": "AvalonHarmony.js",
|
||||
"scripts": {
|
||||
"test": "echo \"Error: no test specified\" && exit 1"
|
||||
},
|
||||
"devDependencies": {
|
||||
"eslint": "^7.11.0"
|
||||
}
|
||||
}
|
||||
622
openpype/hosts/harmony/api/lib.py
Normal file
622
openpype/hosts/harmony/api/lib.py
Normal file
|
|
@ -0,0 +1,622 @@
|
|||
# -*- coding: utf-8 -*-
|
||||
"""Utility functions used for Avalon - Harmony integration."""
|
||||
import subprocess
|
||||
import threading
|
||||
import os
|
||||
import random
|
||||
import zipfile
|
||||
import sys
|
||||
import filecmp
|
||||
import shutil
|
||||
import logging
|
||||
import contextlib
|
||||
import json
|
||||
import signal
|
||||
import time
|
||||
from uuid import uuid4
|
||||
from Qt import QtWidgets, QtCore, QtGui
|
||||
import collections
|
||||
|
||||
from .server import Server
|
||||
|
||||
from openpype.tools.stdout_broker.app import StdOutBroker
|
||||
from openpype.tools.utils import host_tools
|
||||
from openpype import style
|
||||
from openpype.lib.applications import get_non_python_host_kwargs
|
||||
|
||||
# Setup logging.
|
||||
log = logging.getLogger(__name__)
|
||||
log.setLevel(logging.DEBUG)
|
||||
|
||||
|
||||
class ProcessContext:
|
||||
server = None
|
||||
pid = None
|
||||
process = None
|
||||
application_path = None
|
||||
callback_queue = collections.deque()
|
||||
workfile_path = None
|
||||
port = None
|
||||
stdout_broker = None
|
||||
workfile_tool = None
|
||||
|
||||
@classmethod
|
||||
def execute_in_main_thread(cls, func_to_call_from_main_thread):
|
||||
cls.callback_queue.append(func_to_call_from_main_thread)
|
||||
|
||||
@classmethod
|
||||
def main_thread_listen(cls):
|
||||
if cls.callback_queue:
|
||||
callback = cls.callback_queue.popleft()
|
||||
callback()
|
||||
if cls.process is not None and cls.process.poll() is not None:
|
||||
log.info("Server is not running, closing")
|
||||
ProcessContext.stdout_broker.stop()
|
||||
QtWidgets.QApplication.quit()
|
||||
|
||||
|
||||
def signature(postfix="func") -> str:
|
||||
"""Return random ECMA6 compatible function name.
|
||||
|
||||
Args:
|
||||
postfix (str): name to append to random string.
|
||||
Returns:
|
||||
str: random function name.
|
||||
|
||||
"""
|
||||
return "f{}_{}".format(str(uuid4()).replace("-", "_"), postfix)
|
||||
|
||||
|
||||
class _ZipFile(zipfile.ZipFile):
|
||||
"""Extended check for windows invalid characters."""
|
||||
|
||||
# this is extending default zipfile table for few invalid characters
|
||||
# that can come from Mac
|
||||
_windows_illegal_characters = ":<>|\"?*\r\n\x00"
|
||||
_windows_illegal_name_trans_table = str.maketrans(
|
||||
_windows_illegal_characters,
|
||||
"_" * len(_windows_illegal_characters)
|
||||
)
|
||||
|
||||
|
||||
def main(*subprocess_args):
|
||||
# coloring in StdOutBroker
|
||||
os.environ["OPENPYPE_LOG_NO_COLORS"] = "False"
|
||||
app = QtWidgets.QApplication([])
|
||||
app.setQuitOnLastWindowClosed(False)
|
||||
icon = QtGui.QIcon(style.get_app_icon_path())
|
||||
app.setWindowIcon(icon)
|
||||
|
||||
ProcessContext.stdout_broker = StdOutBroker('harmony')
|
||||
ProcessContext.stdout_broker.start()
|
||||
launch(*subprocess_args)
|
||||
|
||||
loop_timer = QtCore.QTimer()
|
||||
loop_timer.setInterval(20)
|
||||
|
||||
loop_timer.timeout.connect(ProcessContext.main_thread_listen)
|
||||
loop_timer.start()
|
||||
|
||||
sys.exit(app.exec_())
|
||||
|
||||
|
||||
def setup_startup_scripts():
|
||||
"""Manages installation of avalon's TB_sceneOpened.js for Harmony launch.
|
||||
|
||||
If a studio already has defined "TOONBOOM_GLOBAL_SCRIPT_LOCATION", copies
|
||||
the TB_sceneOpened.js to that location if the file is different.
|
||||
Otherwise, will set the env var to point to the avalon/harmony folder.
|
||||
|
||||
Admins should be aware that this will overwrite TB_sceneOpened in the
|
||||
"TOONBOOM_GLOBAL_SCRIPT_LOCATION", and that if they want to have additional
|
||||
logic, they will need to do one of the following:
|
||||
* Create a Harmony package to manage startup logic
|
||||
* Use TB_sceneOpenedUI.js instead to manage startup logic
|
||||
* Add their startup logic to avalon/harmony/TB_sceneOpened.js
|
||||
"""
|
||||
avalon_dcc_dir = os.path.join(os.path.dirname(os.path.dirname(__file__)),
|
||||
"api")
|
||||
startup_js = "TB_sceneOpened.js"
|
||||
|
||||
if os.getenv("TOONBOOM_GLOBAL_SCRIPT_LOCATION"):
|
||||
|
||||
avalon_harmony_startup = os.path.join(avalon_dcc_dir, startup_js)
|
||||
|
||||
env_harmony_startup = os.path.join(
|
||||
os.getenv("TOONBOOM_GLOBAL_SCRIPT_LOCATION"), startup_js)
|
||||
|
||||
if not filecmp.cmp(avalon_harmony_startup, env_harmony_startup):
|
||||
try:
|
||||
shutil.copy(avalon_harmony_startup, env_harmony_startup)
|
||||
except Exception as e:
|
||||
log.error(e)
|
||||
log.warning(
|
||||
"Failed to copy {0} to {1}! "
|
||||
"Defaulting to Avalon TOONBOOM_GLOBAL_SCRIPT_LOCATION."
|
||||
.format(avalon_harmony_startup, env_harmony_startup))
|
||||
|
||||
os.environ["TOONBOOM_GLOBAL_SCRIPT_LOCATION"] = avalon_dcc_dir
|
||||
else:
|
||||
os.environ["TOONBOOM_GLOBAL_SCRIPT_LOCATION"] = avalon_dcc_dir
|
||||
|
||||
|
||||
def check_libs():
|
||||
"""Check if `OpenHarmony`_ is available.
|
||||
|
||||
Avalon expects either path in `LIB_OPENHARMONY_PATH` or `openHarmony.js`
|
||||
present in `TOONBOOM_GLOBAL_SCRIPT_LOCATION`.
|
||||
|
||||
Throws:
|
||||
RuntimeError: If openHarmony is not found.
|
||||
|
||||
.. _OpenHarmony:
|
||||
https://github.com/cfourney/OpenHarmony
|
||||
|
||||
"""
|
||||
if not os.getenv("LIB_OPENHARMONY_PATH"):
|
||||
|
||||
if os.getenv("TOONBOOM_GLOBAL_SCRIPT_LOCATION"):
|
||||
if os.path.exists(
|
||||
os.path.join(
|
||||
os.getenv("TOONBOOM_GLOBAL_SCRIPT_LOCATION"),
|
||||
"openHarmony.js")):
|
||||
|
||||
os.environ["LIB_OPENHARMONY_PATH"] = \
|
||||
os.getenv("TOONBOOM_GLOBAL_SCRIPT_LOCATION")
|
||||
return
|
||||
|
||||
else:
|
||||
log.error(("Cannot find OpenHarmony library. "
|
||||
"Please set path to it in LIB_OPENHARMONY_PATH "
|
||||
"environment variable."))
|
||||
raise RuntimeError("Missing OpenHarmony library.")
|
||||
|
||||
|
||||
def launch(application_path, *args):
|
||||
"""Set Harmony for launch.
|
||||
|
||||
Launches Harmony and the server, then starts listening on the main thread
|
||||
for callbacks from the server. This is to have Qt applications run in the
|
||||
main thread.
|
||||
|
||||
Args:
|
||||
application_path (str): Path to Harmony.
|
||||
|
||||
"""
|
||||
from avalon import api
|
||||
from openpype.hosts.harmony import api as harmony
|
||||
|
||||
api.install(harmony)
|
||||
|
||||
ProcessContext.port = random.randrange(49152, 65535)
|
||||
os.environ["AVALON_HARMONY_PORT"] = str(ProcessContext.port)
|
||||
ProcessContext.application_path = application_path
|
||||
|
||||
# Launch Harmony.
|
||||
setup_startup_scripts()
|
||||
check_libs()
|
||||
|
||||
if not os.environ.get("AVALON_HARMONY_WORKFILES_ON_LAUNCH", False):
|
||||
open_empty_workfile()
|
||||
return
|
||||
|
||||
ProcessContext.workfile_tool = host_tools.get_tool_by_name("workfiles")
|
||||
host_tools.show_workfiles(save=False)
|
||||
ProcessContext.execute_in_main_thread(check_workfiles_tool)
|
||||
|
||||
|
||||
def check_workfiles_tool():
|
||||
if ProcessContext.workfile_tool.isVisible():
|
||||
ProcessContext.execute_in_main_thread(check_workfiles_tool)
|
||||
elif not ProcessContext.workfile_path:
|
||||
open_empty_workfile()
|
||||
|
||||
|
||||
def open_empty_workfile():
|
||||
zip_file = os.path.join(os.path.dirname(__file__), "temp.zip")
|
||||
temp_path = get_local_harmony_path(zip_file)
|
||||
if os.path.exists(temp_path):
|
||||
log.info(f"removing existing {temp_path}")
|
||||
try:
|
||||
shutil.rmtree(temp_path)
|
||||
except Exception as e:
|
||||
log.critical(f"cannot clear {temp_path}")
|
||||
raise Exception(f"cannot clear {temp_path}") from e
|
||||
|
||||
launch_zip_file(zip_file)
|
||||
|
||||
|
||||
def get_local_harmony_path(filepath):
|
||||
"""From the provided path get the equivalent local Harmony path."""
|
||||
basename = os.path.splitext(os.path.basename(filepath))[0]
|
||||
harmony_path = os.path.join(os.path.expanduser("~"), ".avalon", "harmony")
|
||||
return os.path.join(harmony_path, basename)
|
||||
|
||||
|
||||
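A minimal usage sketch of the mapping above; the input path is hypothetical and only illustrates how any workfile path resolves to a folder under the user's `.avalon/harmony` cache:

# Illustrative only: any zip/workfile path maps to a local working copy.
local_dir = get_local_harmony_path("/projects/show/work/sh010_v001.zip")
# -> <home>/.avalon/harmony/sh010_v001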
def launch_zip_file(filepath):
|
||||
"""Launch a Harmony application instance with the provided zip file.
|
||||
|
||||
Args:
|
||||
filepath (str): Path to file.
|
||||
"""
|
||||
print(f"Localizing {filepath}")
|
||||
|
||||
temp_path = get_local_harmony_path(filepath)
|
||||
scene_path = os.path.join(
|
||||
temp_path, os.path.basename(temp_path) + ".xstage"
|
||||
)
|
||||
unzip = False
|
||||
if os.path.exists(scene_path):
|
||||
# Check remote scene is newer than local.
|
||||
if os.path.getmtime(scene_path) < os.path.getmtime(filepath):
|
||||
try:
|
||||
shutil.rmtree(temp_path)
|
||||
except Exception as e:
|
||||
log.error(e)
|
||||
raise Exception("Cannot delete working folder") from e
|
||||
unzip = True
|
||||
else:
|
||||
unzip = True
|
||||
|
||||
if unzip:
|
||||
with _ZipFile(filepath, "r") as zip_ref:
|
||||
zip_ref.extractall(temp_path)
|
||||
|
||||
# Close existing scene.
|
||||
if ProcessContext.pid:
|
||||
os.kill(ProcessContext.pid, signal.SIGTERM)
|
||||
|
||||
# Stop server.
|
||||
if ProcessContext.server:
|
||||
ProcessContext.server.stop()
|
||||
|
||||
# Launch Avalon server.
|
||||
ProcessContext.server = Server(ProcessContext.port)
|
||||
ProcessContext.server.start()
|
||||
# thread = threading.Thread(target=self.server.start)
|
||||
# thread.daemon = True
|
||||
# thread.start()
|
||||
|
||||
# Save workfile path for later.
|
||||
ProcessContext.workfile_path = filepath
|
||||
|
||||
# find any xstage files in the directory, prefer the one with the same name
|
||||
# as directory (plus extension)
|
||||
xstage_files = []
|
||||
for _, _, files in os.walk(temp_path):
|
||||
for file in files:
|
||||
if os.path.splitext(file)[1] == ".xstage":
|
||||
xstage_files.append(file)
|
||||
|
||||
if not os.path.basename("temp.zip"):
|
||||
if not xstage_files:
|
||||
ProcessContext.server.stop()
|
||||
print("no xstage file was found")
|
||||
return
|
||||
|
||||
# try to use first available
|
||||
scene_path = os.path.join(
|
||||
temp_path, xstage_files[0]
|
||||
)
|
||||
|
||||
# prefer the one named as zip file
|
||||
zip_based_name = "{}.xstage".format(
|
||||
os.path.splitext(os.path.basename(filepath))[0])
|
||||
|
||||
if zip_based_name in xstage_files:
|
||||
scene_path = os.path.join(
|
||||
temp_path, zip_based_name
|
||||
)
|
||||
|
||||
if not os.path.exists(scene_path):
|
||||
print("error: cannot determine scene file")
|
||||
ProcessContext.server.stop()
|
||||
return
|
||||
|
||||
print("Launching {}".format(scene_path))
|
||||
kwargs = get_non_python_host_kwargs({}, False)
|
||||
process = subprocess.Popen(
|
||||
[ProcessContext.application_path, scene_path],
|
||||
**kwargs
|
||||
)
|
||||
ProcessContext.pid = process.pid
|
||||
ProcessContext.process = process
|
||||
ProcessContext.stdout_broker.host_connected()
|
||||
|
||||
|
||||
def on_file_changed(path, threaded=True):
|
||||
"""Threaded zipping and move of the project directory.
|
||||
|
||||
This method is called when the `.xstage` file is changed.
|
||||
"""
|
||||
log.debug("File changed: " + path)
|
||||
|
||||
if ProcessContext.workfile_path is None:
|
||||
return
|
||||
|
||||
if threaded:
|
||||
thread = threading.Thread(
|
||||
target=zip_and_move,
|
||||
args=(os.path.dirname(path), ProcessContext.workfile_path)
|
||||
)
|
||||
thread.start()
|
||||
else:
|
||||
zip_and_move(os.path.dirname(path), ProcessContext.workfile_path)
|
||||
|
||||
|
||||
def zip_and_move(source, destination):
|
||||
"""Zip a directory and move to `destination`.
|
||||
|
||||
Args:
|
||||
source (str): Directory to zip and move to destination.
|
||||
destination (str): Destination file path to zip file.
|
||||
|
||||
"""
|
||||
os.chdir(os.path.dirname(source))
|
||||
shutil.make_archive(os.path.basename(source), "zip", source)
|
||||
with _ZipFile(os.path.basename(source) + ".zip") as zr:
|
||||
if zr.testzip() is not None:
|
||||
raise Exception("File archive is corrupted.")
|
||||
shutil.move(os.path.basename(source) + ".zip", destination)
|
||||
log.debug(f"Saved '{source}' to '{destination}'")
|
||||
|
||||
|
||||
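A short, hedged sketch of how the pieces above fit together when Harmony saves: the file watcher reports the `.xstage` path and `on_file_changed` zips the local scene folder back over the remote workfile. The path below is hypothetical, and `ProcessContext.workfile_path` must already have been set by `launch_zip_file`:

# Synchronous variant, as used by save_scene() further below.
local_xstage = os.path.expanduser(
    "~/.avalon/harmony/sh010_v001/sh010_v001.xstage")
on_file_changed(local_xstage, threaded=False)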
def show(module_name):
|
||||
"""Call show on "module_name".
|
||||
|
||||
This allows us to make a QApplication ahead of time and always "exec_" to
|
||||
prevent crashing.
|
||||
|
||||
Args:
|
||||
module_name (str): Name of module to call "show" on.
|
||||
|
||||
"""
|
||||
# Requests often get doubled up when showing tools, so we wait a second for
|
||||
# requests to be received properly.
|
||||
time.sleep(1)
|
||||
|
||||
# Get tool name from module name
|
||||
# TODO: this is for backwards compatibility; not sure if `TB_sceneOpened.js`
|
||||
# is automatically updated.
|
||||
# Previous javascript sent 'module_name' which contained whole tool import
|
||||
# string, e.g. "avalon.tools.workfiles"; now it should be only "workfiles"
|
||||
tool_name = module_name.split(".")[-1]
|
||||
|
||||
kwargs = {}
|
||||
if tool_name == "loader":
|
||||
kwargs["use_context"] = True
|
||||
|
||||
ProcessContext.execute_in_main_thread(
|
||||
lambda: host_tools.show_tool_by_name(tool_name, **kwargs)
|
||||
)
|
||||
|
||||
# Required return statement.
|
||||
return "nothing"
|
||||
|
||||
|
||||
def get_scene_data():
|
||||
try:
|
||||
return send(
|
||||
{
|
||||
"function": "AvalonHarmony.getSceneData"
|
||||
})["result"]
|
||||
except json.decoder.JSONDecodeError:
|
||||
# Means no scene metadata has been made before.
|
||||
return {}
|
||||
except KeyError:
|
||||
# Means no existing scene metadata has been made.
|
||||
return {}
|
||||
|
||||
|
||||
def set_scene_data(data):
|
||||
"""Write scene data to metadata.
|
||||
|
||||
Args:
|
||||
data (dict): Data to write.
|
||||
|
||||
"""
|
||||
# Write scene data.
|
||||
send(
|
||||
{
|
||||
"function": "AvalonHarmony.setSceneData",
|
||||
"args": data
|
||||
})
|
||||
|
||||
|
||||
def read(node_id):
|
||||
"""Read object metadata in to a dictionary.
|
||||
|
||||
Args:
|
||||
node_id (str): Path to node or id of object.
|
||||
|
||||
Returns:
|
||||
dict
|
||||
"""
|
||||
scene_data = get_scene_data()
|
||||
if node_id in scene_data:
|
||||
return scene_data[node_id]
|
||||
|
||||
return {}
|
||||
|
||||
|
||||
def remove(node_id):
|
||||
"""
|
||||
Remove node data from scene metadata.
|
||||
|
||||
Args:
|
||||
node_id (str): full name (eg. 'Top/renderAnimation')
|
||||
"""
|
||||
data = get_scene_data()
|
||||
del data[node_id]
|
||||
set_scene_data(data)
|
||||
|
||||
|
||||
def delete_node(node):
|
||||
""" Physically delete node from scene. """
|
||||
send(
|
||||
{
|
||||
"function": "AvalonHarmony.deleteNode",
|
||||
"args": node
|
||||
}
|
||||
)
|
||||
|
||||
|
||||
def imprint(node_id, data, remove=False):
|
||||
"""Write `data` to the `node` as json.
|
||||
|
||||
Arguments:
|
||||
node_id (str): Path to node or id of object.
|
||||
data (dict): Dictionary of key/value pairs.
|
||||
remove (bool): Removes the data from the scene.
|
||||
|
||||
Example:
|
||||
>>> from avalon.harmony import lib
|
||||
>>> node = "Top/Display"
|
||||
>>> data = {"str": "something", "int": 1, "float": 0.32, "bool": True}
|
||||
>>> lib.imprint(node, data)
|
||||
"""
|
||||
scene_data = get_scene_data()
|
||||
|
||||
if remove and (node_id in scene_data):
|
||||
scene_data.pop(node_id, None)
|
||||
else:
|
||||
if node_id in scene_data:
|
||||
scene_data[node_id].update(data)
|
||||
else:
|
||||
scene_data[node_id] = data
|
||||
|
||||
set_scene_data(scene_data)
|
||||
|
||||
|
||||
@contextlib.contextmanager
|
||||
def maintained_selection():
|
||||
"""Maintain selection during context."""
|
||||
|
||||
selected_nodes = send(
|
||||
{
|
||||
"function": "AvalonHarmony.getSelectedNodes"
|
||||
})["result"]
|
||||
|
||||
try:
|
||||
yield selected_nodes
|
||||
finally:
|
||||
selected_nodes = send(
|
||||
{
|
||||
"function": "AvalonHarmony.selectNodes",
|
||||
"args": selected_nodes
|
||||
}
|
||||
)
|
||||
|
||||
|
||||
def send(request):
|
||||
"""Public method for sending requests to Harmony."""
|
||||
return ProcessContext.server.send(request)
|
||||
|
||||
|
||||
def select_nodes(nodes):
|
||||
""" Selects nodes in Node View """
|
||||
_ = send(
|
||||
{
|
||||
"function": "AvalonHarmony.selectNodes",
|
||||
"args": nodes
|
||||
}
|
||||
)
|
||||
|
||||
|
||||
@contextlib.contextmanager
|
||||
def maintained_nodes_state(nodes):
|
||||
"""Maintain nodes states during context."""
|
||||
# Collect current state.
|
||||
states = send(
|
||||
{
|
||||
"function": "AvalonHarmony.areEnabled", "args": nodes
|
||||
})["result"]
|
||||
|
||||
# Disable all nodes.
|
||||
send(
|
||||
{
|
||||
"function": "AvalonHarmony.disableNodes", "args": nodes
|
||||
})
|
||||
|
||||
try:
|
||||
yield
|
||||
finally:
|
||||
send(
|
||||
{
|
||||
"function": "AvalonHarmony.setState",
|
||||
"args": [nodes, states]
|
||||
})
|
||||
|
||||
|
||||
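A small usage sketch for the two context managers above; the node paths are hypothetical:

nodes = ["Top/renderAnimation", "Top/Comp"]
with maintained_nodes_state(nodes):    # nodes are disabled inside the block
    with maintained_selection():       # selection is restored on exit
        select_nodes(["Top/renderAnimation"])  # temporary selection change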
def save_scene():
|
||||
"""Save the Harmony scene safely.
|
||||
|
||||
The built-in (to Avalon) background zip and moving of the Harmony scene
|
||||
folder, interferes with server/client communication by sending two requests
|
||||
at the same time. This only happens when sending "scene.saveAll()". This
|
||||
method prevents this double request and safely saves the scene.
|
||||
|
||||
"""
|
||||
# Need to turn off the background watcher or else the communication with
|
||||
# the server gets spammed with two requests at the same time.
|
||||
scene_path = send(
|
||||
{"function": "AvalonHarmony.saveScene"})["result"]
|
||||
|
||||
# Manually update the remote file.
|
||||
on_file_changed(scene_path, threaded=False)
|
||||
|
||||
# Re-enable the background watcher.
|
||||
send({"function": "AvalonHarmony.enableFileWather"})
|
||||
|
||||
|
||||
def save_scene_as(filepath):
|
||||
"""Save Harmony scene as `filepath`."""
|
||||
scene_dir = os.path.dirname(filepath)
|
||||
destination = os.path.join(
|
||||
os.path.dirname(ProcessContext.workfile_path),
|
||||
os.path.splitext(os.path.basename(filepath))[0] + ".zip"
|
||||
)
|
||||
|
||||
if os.path.exists(scene_dir):
|
||||
try:
|
||||
shutil.rmtree(scene_dir)
|
||||
except Exception as e:
|
||||
log.error(f"Cannot remove {scene_dir}")
|
||||
raise Exception(f"Cannot remove {scene_dir}") from e
|
||||
|
||||
send(
|
||||
{"function": "scene.saveAs", "args": [scene_dir]}
|
||||
)["result"]
|
||||
|
||||
zip_and_move(scene_dir, destination)
|
||||
|
||||
ProcessContext.workfile_path = destination
|
||||
|
||||
send(
|
||||
{"function": "AvalonHarmony.addPathToWatcher", "args": filepath}
|
||||
)
|
||||
|
||||
|
||||
def find_node_by_name(name, node_type):
|
||||
"""Find node by its name.
|
||||
|
||||
Args:
|
||||
name (str): Name of the Node. (without part before '/')
|
||||
node_type (str): Type of the Node.
|
||||
'READ' - for loaded data with Loaders (background)
|
||||
'GROUP' - for loaded data with Loaders (templates)
|
||||
'WRITE' - render nodes
|
||||
|
||||
Returns:
|
||||
str: FQ Node name.
|
||||
|
||||
"""
|
||||
nodes = send(
|
||||
{"function": "node.getNodes", "args": [[node_type]]}
|
||||
)["result"]
|
||||
for node in nodes:
|
||||
node_name = node.split("/")[-1]
|
||||
if name == node_name:
|
||||
return node
|
||||
|
||||
return None
|
||||
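Illustrative usage of the lookup above; the node name is hypothetical:

# Returns the fully qualified node path, e.g. "Top/renderAnimation", or None.
write_node = find_node_by_name("renderAnimation", "WRITE")
if write_node is None:
    log.warning("Expected render node not found in the scene.")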
351
openpype/hosts/harmony/api/pipeline.py
Normal file
|
|
@ -0,0 +1,351 @@
|
|||
import os
|
||||
from pathlib import Path
|
||||
import logging
|
||||
|
||||
import pyblish.api
|
||||
|
||||
from avalon import io
|
||||
import avalon.api
|
||||
from avalon.pipeline import AVALON_CONTAINER_ID
|
||||
|
||||
from openpype import lib
|
||||
import openpype.hosts.harmony
|
||||
import openpype.hosts.harmony.api as harmony
|
||||
|
||||
|
||||
log = logging.getLogger("openpype.hosts.harmony")
|
||||
|
||||
HOST_DIR = os.path.dirname(os.path.abspath(openpype.hosts.harmony.__file__))
|
||||
PLUGINS_DIR = os.path.join(HOST_DIR, "plugins")
|
||||
PUBLISH_PATH = os.path.join(PLUGINS_DIR, "publish")
|
||||
LOAD_PATH = os.path.join(PLUGINS_DIR, "load")
|
||||
CREATE_PATH = os.path.join(PLUGINS_DIR, "create")
|
||||
INVENTORY_PATH = os.path.join(PLUGINS_DIR, "inventory")
|
||||
|
||||
|
||||
def set_scene_settings(settings):
|
||||
"""Set correct scene settings in Harmony.
|
||||
|
||||
Args:
|
||||
settings (dict): Scene settings.
|
||||
|
||||
Returns:
|
||||
dict: Dictionary of settings to set.
|
||||
|
||||
"""
|
||||
harmony.send(
|
||||
{"function": "PypeHarmony.setSceneSettings", "args": settings})
|
||||
|
||||
|
||||
def get_asset_settings():
|
||||
"""Get settings on current asset from database.
|
||||
|
||||
Returns:
|
||||
dict: Scene data.
|
||||
|
||||
"""
|
||||
asset_data = lib.get_asset()["data"]
|
||||
fps = asset_data.get("fps")
|
||||
frame_start = asset_data.get("frameStart")
|
||||
frame_end = asset_data.get("frameEnd")
|
||||
handle_start = asset_data.get("handleStart")
|
||||
handle_end = asset_data.get("handleEnd")
|
||||
resolution_width = asset_data.get("resolutionWidth")
|
||||
resolution_height = asset_data.get("resolutionHeight")
|
||||
entity_type = asset_data.get("entityType")
|
||||
|
||||
scene_data = {
|
||||
"fps": fps,
|
||||
"frameStart": frame_start,
|
||||
"frameEnd": frame_end,
|
||||
"handleStart": handle_start,
|
||||
"handleEnd": handle_end,
|
||||
"resolutionWidth": resolution_width,
|
||||
"resolutionHeight": resolution_height,
|
||||
"entityType": entity_type
|
||||
}
|
||||
|
||||
return scene_data
|
||||
|
||||
|
||||
def ensure_scene_settings():
|
||||
"""Validate if Harmony scene has valid settings."""
|
||||
settings = get_asset_settings()
|
||||
|
||||
invalid_settings = []
|
||||
valid_settings = {}
|
||||
for key, value in settings.items():
|
||||
if value is None:
|
||||
invalid_settings.append(key)
|
||||
else:
|
||||
valid_settings[key] = value
|
||||
|
||||
# Warn about missing attributes.
|
||||
if invalid_settings:
|
||||
msg = "Missing attributes:"
|
||||
for item in invalid_settings:
|
||||
msg += f"\n{item}"
|
||||
|
||||
harmony.send(
|
||||
{"function": "PypeHarmony.message", "args": msg})
|
||||
|
||||
set_scene_settings(valid_settings)
|
||||
|
||||
|
||||
def check_inventory():
|
||||
"""Check is scene contains outdated containers.
|
||||
|
||||
If it does, it will colorize outdated nodes and display a warning message
|
||||
in Harmony.
|
||||
"""
|
||||
if not lib.any_outdated():
|
||||
return
|
||||
|
||||
host = avalon.api.registered_host()
|
||||
outdated_containers = []
|
||||
for container in host.ls():
|
||||
representation = container['representation']
|
||||
representation_doc = io.find_one(
|
||||
{
|
||||
"_id": io.ObjectId(representation),
|
||||
"type": "representation"
|
||||
},
|
||||
projection={"parent": True}
|
||||
)
|
||||
if representation_doc and not lib.is_latest(representation_doc):
|
||||
outdated_containers.append(container)
|
||||
|
||||
# Colour nodes.
|
||||
outdated_nodes = []
|
||||
for container in outdated_containers:
|
||||
if container["loader"] == "ImageSequenceLoader":
|
||||
outdated_nodes.append(
|
||||
harmony.find_node_by_name(container["name"], "READ")
|
||||
)
|
||||
harmony.send({"function": "PypeHarmony.setColor", "args": outdated_nodes})
|
||||
|
||||
# Warn about outdated containers.
|
||||
msg = "There are outdated containers in the scene."
|
||||
harmony.send({"function": "PypeHarmony.message", "args": msg})
|
||||
|
||||
|
||||
def application_launch():
|
||||
"""Event that is executed after Harmony is launched."""
|
||||
# FIXME: This is breaking server <-> client communication.
|
||||
# It is now moved so it is manually called.
|
||||
# ensure_scene_settings()
|
||||
# check_inventory()
|
||||
# fills OPENPYPE_HARMONY_JS
|
||||
pype_harmony_path = Path(__file__).parent.parent / "js" / "PypeHarmony.js"
|
||||
pype_harmony_js = pype_harmony_path.read_text()
|
||||
|
||||
# go through js/creators, loaders and publish folders and load all scripts
|
||||
script = ""
|
||||
for item in ["creators", "loaders", "publish"]:
|
||||
dir_to_scan = Path(__file__).parent.parent / "js" / item
|
||||
for child in dir_to_scan.iterdir():
|
||||
script += child.read_text()
|
||||
|
||||
# send scripts to Harmony
|
||||
harmony.send({"script": pype_harmony_js})
|
||||
harmony.send({"script": script})
|
||||
inject_avalon_js()
|
||||
|
||||
|
||||
def export_template(backdrops, nodes, filepath):
|
||||
"""Export Template to file.
|
||||
|
||||
Args:
|
||||
backdrops (list): List of backdrops to export.
|
||||
nodes (list): List of nodes to export.
|
||||
filepath (str): Path where to save Template.
|
||||
|
||||
"""
|
||||
harmony.send({
|
||||
"function": "PypeHarmony.exportTemplate",
|
||||
"args": [
|
||||
backdrops,
|
||||
nodes,
|
||||
os.path.basename(filepath),
|
||||
os.path.dirname(filepath)
|
||||
]
|
||||
})
|
||||
|
||||
|
||||
def install():
|
||||
"""Install Pype as host config."""
|
||||
print("Installing Pype config ...")
|
||||
|
||||
pyblish.api.register_host("harmony")
|
||||
pyblish.api.register_plugin_path(PUBLISH_PATH)
|
||||
avalon.api.register_plugin_path(avalon.api.Loader, LOAD_PATH)
|
||||
avalon.api.register_plugin_path(avalon.api.Creator, CREATE_PATH)
|
||||
log.info(PUBLISH_PATH)
|
||||
|
||||
# Register callbacks.
|
||||
pyblish.api.register_callback(
|
||||
"instanceToggled", on_pyblish_instance_toggled
|
||||
)
|
||||
|
||||
avalon.api.on("application.launched", application_launch)
|
||||
|
||||
|
||||
def uninstall():
|
||||
pyblish.api.deregister_plugin_path(PUBLISH_PATH)
|
||||
avalon.api.deregister_plugin_path(avalon.api.Loader, LOAD_PATH)
|
||||
avalon.api.deregister_plugin_path(avalon.api.Creator, CREATE_PATH)
|
||||
|
||||
|
||||
def on_pyblish_instance_toggled(instance, old_value, new_value):
|
||||
"""Toggle node enabling on instance toggles."""
|
||||
node = None
|
||||
if instance.data.get("setMembers"):
|
||||
node = instance.data["setMembers"][0]
|
||||
|
||||
if node:
|
||||
harmony.send(
|
||||
{
|
||||
"function": "PypeHarmony.toggleInstance",
|
||||
"args": [node, new_value]
|
||||
}
|
||||
)
|
||||
|
||||
|
||||
def inject_avalon_js():
|
||||
"""Inject AvalonHarmony.js into Harmony."""
|
||||
avalon_harmony_js = Path(__file__).parent.joinpath("js/AvalonHarmony.js")
|
||||
script = avalon_harmony_js.read_text()
|
||||
# send AvalonHarmony.js to Harmony
|
||||
harmony.send({"script": script})
|
||||
|
||||
|
||||
def ls():
|
||||
"""Yields containers from Harmony scene.
|
||||
|
||||
This is the host-equivalent of api.ls(), but instead of listing
|
||||
assets on disk, it lists assets already loaded in Harmony; once loaded
|
||||
they are called 'containers'.
|
||||
|
||||
Yields:
|
||||
dict: container
|
||||
"""
|
||||
objects = harmony.get_scene_data() or {}
|
||||
for _, data in objects.items():
|
||||
# Skip non-tagged objects.
|
||||
if not data:
|
||||
continue
|
||||
|
||||
# Filter to only containers.
|
||||
if "container" not in data.get("id"):
|
||||
continue
|
||||
|
||||
if not data.get("objectName"): # backward compatibility
|
||||
data["objectName"] = data["name"]
|
||||
yield data
|
||||
|
||||
|
||||
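A brief sketch of consuming the generator above, e.g. when building an inventory view (purely illustrative):

for container in ls():
    # Each yielded dict carries the metadata written by containerise(),
    # including "objectName" and "representation".
    print(container["objectName"], container.get("representation"))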
def list_instances(remove_orphaned=True):
|
||||
"""
|
||||
List all created instances from current workfile which
|
||||
will be published.
|
||||
|
||||
Pulls from File > File Info
|
||||
|
||||
For SubsetManager: by default it checks if the instance has a matching node
|
||||
in the scene; if not, the instance gets deleted from metadata.
|
||||
|
||||
Returns:
|
||||
(list) of dictionaries matching instances format
|
||||
"""
|
||||
objects = harmony.get_scene_data() or {}
|
||||
instances = []
|
||||
for key, data in objects.items():
|
||||
# Skip non-tagged objects.
|
||||
if not data:
|
||||
continue
|
||||
|
||||
# Filter out containers.
|
||||
if "container" in data.get("id"):
|
||||
continue
|
||||
|
||||
data['uuid'] = key
|
||||
|
||||
if remove_orphaned:
|
||||
node_name = key.split("/")[-1]
|
||||
located_node = harmony.find_node_by_name(node_name, 'WRITE')
|
||||
if not located_node:
|
||||
print("Removing orphaned instance {}".format(key))
|
||||
harmony.remove(key)
|
||||
continue
|
||||
|
||||
instances.append(data)
|
||||
|
||||
return instances
|
||||
|
||||
|
||||
def remove_instance(instance):
|
||||
"""
|
||||
Remove instance from current workfile metadata and from scene!
|
||||
|
||||
Updates metadata of current file in File > File Info and removes
|
||||
icon highlight on group layer.
|
||||
|
||||
For SubsetManager
|
||||
|
||||
Args:
|
||||
instance (dict): instance representation from subsetmanager model
|
||||
"""
|
||||
node = instance.get("uuid")
|
||||
harmony.remove(node)
|
||||
harmony.delete_node(node)
|
||||
|
||||
|
||||
def select_instance(instance):
|
||||
"""
|
||||
Select instance in Node View
|
||||
|
||||
Args:
|
||||
instance (dict): instance representation from subsetmanager model
|
||||
"""
|
||||
harmony.select_nodes([instance.get("uuid")])
|
||||
|
||||
|
||||
def containerise(name,
|
||||
namespace,
|
||||
node,
|
||||
context,
|
||||
loader=None,
|
||||
suffix=None,
|
||||
nodes=None):
|
||||
"""Imprint node with metadata.
|
||||
|
||||
Containerisation enables a tracking of version, author and origin
|
||||
for loaded assets.
|
||||
|
||||
Arguments:
|
||||
name (str): Name of resulting assembly.
|
||||
namespace (str): Namespace under which to host container.
|
||||
node (str): Node to containerise.
|
||||
context (dict): Asset information.
|
||||
loader (str, optional): Name of loader used to produce this container.
|
||||
suffix (str, optional): Suffix of container, defaults to `_CON`.
|
||||
|
||||
Returns:
|
||||
container (str): Path of container assembly.
|
||||
"""
|
||||
if not nodes:
|
||||
nodes = []
|
||||
|
||||
data = {
|
||||
"schema": "openpype:container-2.0",
|
||||
"id": AVALON_CONTAINER_ID,
|
||||
"name": name,
|
||||
"namespace": namespace,
|
||||
"loader": str(loader),
|
||||
"representation": str(context["representation"]["_id"]),
|
||||
"nodes": nodes
|
||||
}
|
||||
|
||||
harmony.imprint(node, data)
|
||||
|
||||
return node
|
||||
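A hedged example of calling the function above from a loader plugin; the names and the context shape are only sketched, since a real loader receives its context from the load call:

container_node = containerise(
    name="backgroundMain",            # hypothetical subset name
    namespace="sh010",                # hypothetical namespace
    node="Top/backgroundMain",        # node created by the loader
    context={"representation": {"_id": "<representation id>"}},
    loader="ImageSequenceLoader",
)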
|
|
@ -1,6 +1,71 @@
|
|||
from avalon import harmony
|
||||
import avalon.api
|
||||
from openpype.api import PypeCreatorMixin
|
||||
import openpype.hosts.harmony.api as harmony
|
||||
|
||||
|
||||
class Creator(PypeCreatorMixin, harmony.Creator):
|
||||
pass
|
||||
class Creator(PypeCreatorMixin, avalon.api.Creator):
|
||||
"""Creator plugin to create instances in Harmony.
|
||||
|
||||
By default a Composite node is created to support any number of nodes in
|
||||
an instance, but any node type is supported.
|
||||
If the selection is used, the selected nodes will be connected to the
|
||||
created node.
|
||||
"""
|
||||
|
||||
defaults = ["Main"]
|
||||
node_type = "COMPOSITE"
|
||||
|
||||
def setup_node(self, node):
|
||||
"""Prepare node as container.
|
||||
|
||||
Args:
|
||||
node (str): Path to node.
|
||||
"""
|
||||
harmony.send(
|
||||
{
|
||||
"function": "AvalonHarmony.setupNodeForCreator",
|
||||
"args": node
|
||||
}
|
||||
)
|
||||
|
||||
def process(self):
|
||||
"""Plugin entry point."""
|
||||
existing_node_names = harmony.send(
|
||||
{
|
||||
"function": "AvalonHarmony.getNodesNamesByType",
|
||||
"args": self.node_type
|
||||
})["result"]
|
||||
|
||||
# Don't allow instances with the same name.
|
||||
msg = "Instance with name \"{}\" already exists.".format(self.name)
|
||||
for name in existing_node_names:
|
||||
if self.name.lower() == name.lower():
|
||||
harmony.send(
|
||||
{
|
||||
"function": "AvalonHarmony.message", "args": msg
|
||||
}
|
||||
)
|
||||
return False
|
||||
|
||||
with harmony.maintained_selection() as selection:
|
||||
node = None
|
||||
|
||||
if (self.options or {}).get("useSelection") and selection:
|
||||
node = harmony.send(
|
||||
{
|
||||
"function": "AvalonHarmony.createContainer",
|
||||
"args": [self.name, self.node_type, selection[-1]]
|
||||
}
|
||||
)["result"]
|
||||
else:
|
||||
node = harmony.send(
|
||||
{
|
||||
"function": "AvalonHarmony.createContainer",
|
||||
"args": [self.name, self.node_type]
|
||||
}
|
||||
)["result"]
|
||||
|
||||
harmony.imprint(node, self.data)
|
||||
self.setup_node(node)
|
||||
|
||||
return node
|
||||
|
|
|
|||
267
openpype/hosts/harmony/api/server.py
Normal file
|
|
@ -0,0 +1,267 @@
|
|||
# -*- coding: utf-8 -*-
|
||||
"""Server-side implementation of Toon Boon Harmony communication."""
|
||||
import socket
|
||||
import logging
|
||||
import json
|
||||
import traceback
|
||||
import importlib
|
||||
import functools
|
||||
import time
|
||||
import struct
|
||||
from datetime import datetime
|
||||
import threading
|
||||
from . import lib
|
||||
|
||||
|
||||
class Server(threading.Thread):
|
||||
"""Class for communication with Toon Boon Harmony.
|
||||
|
||||
Attributes:
|
||||
connection (Socket): connection holding object.
|
||||
received (str): received data buffer.
|
||||
port (int): port number.
|
||||
message_id (int): index of last message going out.
|
||||
queue (dict): dictionary holding queue of incoming messages.
|
||||
|
||||
"""
|
||||
|
||||
def __init__(self, port):
|
||||
"""Constructor."""
|
||||
super(Server, self).__init__()
|
||||
self.daemon = True
|
||||
self.connection = None
|
||||
self.received = ""
|
||||
self.port = port
|
||||
self.message_id = 1
|
||||
|
||||
# Setup logging.
|
||||
self.log = logging.getLogger(__name__)
|
||||
self.log.setLevel(logging.DEBUG)
|
||||
|
||||
# Create a TCP/IP socket
|
||||
self.socket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
|
||||
|
||||
# Bind the socket to the port
|
||||
server_address = ("127.0.0.1", port)
|
||||
self.log.debug(
|
||||
f"[{self.timestamp()}] Starting up on "
|
||||
f"{server_address[0]}:{server_address[1]}")
|
||||
self.socket.bind(server_address)
|
||||
|
||||
# Listen for incoming connections
|
||||
self.socket.listen(1)
|
||||
self.queue = {}
|
||||
|
||||
def process_request(self, request):
|
||||
"""Process incoming request.
|
||||
|
||||
Args:
|
||||
request (dict): {
|
||||
"module": (str), # Module of method.
|
||||
"method" (str), # Name of method in module.
|
||||
"args" (list), # Arguments to pass to method.
|
||||
"kwargs" (dict), # Keywork arguments to pass to method.
|
||||
"reply" (bool), # Optional wait for method completion.
|
||||
}
|
||||
"""
|
||||
pretty = self._pretty(request)
|
||||
self.log.debug(
|
||||
f"[{self.timestamp()}] Processing request:\n{pretty}")
|
||||
|
||||
try:
|
||||
module = importlib.import_module(request["module"])
|
||||
method = getattr(module, request["method"])
|
||||
|
||||
args = request.get("args", [])
|
||||
kwargs = request.get("kwargs", {})
|
||||
partial_method = functools.partial(method, *args, **kwargs)
|
||||
|
||||
lib.ProcessContext.execute_in_main_thread(partial_method)
|
||||
except Exception:
|
||||
self.log.error(traceback.format_exc())
|
||||
|
||||
def receive(self):
|
||||
"""Receives data from `self.connection`.
|
||||
|
||||
When the data is a JSON serializable string, a reply is sent and then
|
||||
the request is processed.
|
||||
"""
|
||||
current_time = time.time()
|
||||
while True:
|
||||
|
||||
# Receive the data in small chunks.
|
||||
request = None
|
||||
header = self.connection.recv(6)
|
||||
if len(header) == 0:
|
||||
# null data received, socket is closing.
|
||||
self.log.info(f"[{self.timestamp()}] Connection closing.")
|
||||
break
|
||||
if header[0:2] != b"AH":
|
||||
self.log.error("INVALID HEADER")
|
||||
length = struct.unpack(">I", header[2:])[0]
|
||||
data = self.connection.recv(length)
|
||||
while (len(data) < length):
|
||||
# we didn't receive everything on the first try, let's wait for
|
||||
# all data.
|
||||
time.sleep(0.1)
|
||||
if self.connection is None:
|
||||
self.log.error(f"[{self.timestamp()}] "
|
||||
"Connection is broken")
|
||||
break
|
||||
if time.time() > current_time + 30:
|
||||
self.log.error(f"[{self.timestamp()}] Connection timeout.")
|
||||
break
|
||||
|
||||
data += self.connection.recv(length - len(data))
|
||||
|
||||
self.received += data.decode("utf-8")
|
||||
pretty = self._pretty(self.received)
|
||||
self.log.debug(
|
||||
f"[{self.timestamp()}] Received:\n{pretty}")
|
||||
|
||||
try:
|
||||
request = json.loads(self.received)
|
||||
except json.decoder.JSONDecodeError as e:
|
||||
self.log.error(f"[{self.timestamp()}] "
|
||||
f"Invalid message received.\n{e}",
|
||||
exc_info=True)
|
||||
|
||||
self.received = ""
|
||||
if request is None:
|
||||
continue
|
||||
|
||||
if "message_id" in request.keys():
|
||||
message_id = request["message_id"]
|
||||
self.message_id = message_id + 1
|
||||
self.log.debug(f"--- storing request as {message_id}")
|
||||
self.queue[message_id] = request
|
||||
if "reply" not in request.keys():
|
||||
request["reply"] = True
|
||||
self.send(request)
|
||||
self.process_request(request)
|
||||
|
||||
if "message_id" in request.keys():
|
||||
try:
|
||||
self.log.debug(f"[{self.timestamp()}] "
|
||||
f"Removing from the queue {message_id}")
|
||||
del self.queue[message_id]
|
||||
except IndexError:
|
||||
self.log.debug(f"[{self.timestamp()}] "
|
||||
f"{message_id} is no longer in queue")
|
||||
else:
|
||||
self.log.debug(f"[{self.timestamp()}] "
|
||||
"received data was just a reply.")
|
||||
|
||||
def run(self):
|
||||
"""Entry method for server.
|
||||
|
||||
Waits for a connection on `self.port` before going into listen mode.
|
||||
"""
|
||||
# Wait for a connection
|
||||
timestamp = datetime.now().strftime("%H:%M:%S.%f")
|
||||
self.log.debug(f"[{timestamp}] Waiting for a connection.")
|
||||
self.connection, client_address = self.socket.accept()
|
||||
|
||||
timestamp = datetime.now().strftime("%H:%M:%S.%f")
|
||||
self.log.debug(f"[{timestamp}] Connection from: {client_address}")
|
||||
|
||||
self.receive()
|
||||
|
||||
def stop(self):
|
||||
"""Shutdown socket server gracefully."""
|
||||
timestamp = datetime.now().strftime("%H:%M:%S.%f")
|
||||
self.log.debug(f"[{timestamp}] Shutting down server.")
|
||||
if self.connection is None:
|
||||
self.log.debug("Connect to shutdown.")
|
||||
socket.socket(
|
||||
socket.AF_INET, socket.SOCK_STREAM
|
||||
).connect(("localhost", self.port))
|
||||
|
||||
self.connection.close()
|
||||
self.connection = None
|
||||
|
||||
self.socket.close()
|
||||
|
||||
def _send(self, message):
|
||||
"""Send a message to Harmony.
|
||||
|
||||
Args:
|
||||
message (str): Data to send to Harmony.
|
||||
"""
|
||||
# Wait for a connection.
|
||||
while not self.connection:
|
||||
pass
|
||||
|
||||
timestamp = datetime.now().strftime("%H:%M:%S.%f")
|
||||
encoded = message.encode("utf-8")
|
||||
coded_message = b"AH" + struct.pack('>I', len(encoded)) + encoded
|
||||
pretty = self._pretty(coded_message)
|
||||
self.log.debug(
|
||||
f"[{timestamp}] Sending [{self.message_id}]:\n{pretty}")
|
||||
self.log.debug(f"--- Message length: {len(encoded)}")
|
||||
self.connection.sendall(coded_message)
|
||||
self.message_id += 1
|
||||
|
||||
def send(self, request):
|
||||
"""Send a request in dictionary to Harmony.
|
||||
|
||||
Waits for a reply from Harmony.
|
||||
|
||||
Args:
|
||||
request (dict): Data to send to Harmony.
|
||||
"""
|
||||
request["message_id"] = self.message_id
|
||||
self._send(json.dumps(request))
|
||||
if request.get("reply"):
|
||||
timestamp = datetime.now().strftime("%H:%M:%S.%f")
|
||||
self.log.debug(
|
||||
f"[{timestamp}] sent reply, not waiting for anything.")
|
||||
return None
|
||||
result = None
|
||||
current_time = time.time()
|
||||
try_index = 1
|
||||
while True:
|
||||
time.sleep(0.1)
|
||||
if time.time() > current_time + 30:
|
||||
timestamp = datetime.now().strftime("%H:%M:%S.%f")
|
||||
self.log.error((f"[{timestamp}][{self.message_id}] "
|
||||
"No reply from Harmony in 30s. "
|
||||
f"Retrying {try_index}"))
|
||||
try_index += 1
|
||||
current_time = time.time()
|
||||
if try_index > 30:
|
||||
break
|
||||
try:
|
||||
result = self.queue[request["message_id"]]
|
||||
timestamp = datetime.now().strftime("%H:%M:%S.%f")
|
||||
self.log.debug((f"[{timestamp}] Got request "
|
||||
f"id {self.message_id}, "
|
||||
"removing from queue"))
|
||||
del self.queue[request["message_id"]]
|
||||
break
|
||||
except KeyError:
|
||||
# response not in received queue yet
|
||||
pass
|
||||
try:
|
||||
result = json.loads(self.received)
|
||||
break
|
||||
except json.decoder.JSONDecodeError:
|
||||
pass
|
||||
|
||||
self.received = ""
|
||||
|
||||
return result
|
||||
|
||||
def _pretty(self, message) -> str:
|
||||
# result = pformat(message, indent=2)
|
||||
# return result.replace("\\n", "\n")
|
||||
return "{}{}".format(4 * " ", message)
|
||||
|
||||
def timestamp(self):
|
||||
"""Return current timestamp as a string.
|
||||
|
||||
Returns:
|
||||
str: current timestamp.
|
||||
|
||||
"""
|
||||
return datetime.now().strftime("%H:%M:%S.%f")
|
||||
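A minimal sketch of the wire format implemented by `_send()` and `receive()` above: a two-byte "AH" magic, a four-byte big-endian payload length, then the UTF-8 encoded JSON body:

import json
import struct

payload = json.dumps({"function": "AvalonHarmony.getSceneData", "message_id": 1})
encoded = payload.encode("utf-8")
packet = b"AH" + struct.pack(">I", len(encoded)) + encoded

# Decoding mirrors Server.receive(): verify the magic, read the length, then the body.
assert packet[0:2] == b"AH"
length = struct.unpack(">I", packet[2:6])[0]
decoded = json.loads(packet[6:6 + length].decode("utf-8"))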
BIN
openpype/hosts/harmony/api/temp.zip
Normal file
Binary file not shown.
78
openpype/hosts/harmony/api/workio.py
Normal file
|
|
@ -0,0 +1,78 @@
|
|||
"""Host API required Work Files tool"""
|
||||
import os
|
||||
import shutil
|
||||
|
||||
from .lib import (
|
||||
ProcessContext,
|
||||
get_local_harmony_path,
|
||||
zip_and_move,
|
||||
launch_zip_file
|
||||
)
|
||||
from avalon import api
|
||||
|
||||
# used to lock saving until previous save is done.
|
||||
save_disabled = False
|
||||
|
||||
|
||||
def file_extensions():
|
||||
return api.HOST_WORKFILE_EXTENSIONS["harmony"]
|
||||
|
||||
|
||||
def has_unsaved_changes():
|
||||
if ProcessContext.server:
|
||||
return ProcessContext.server.send(
|
||||
{"function": "scene.isDirty"})["result"]
|
||||
|
||||
return False
|
||||
|
||||
|
||||
def save_file(filepath):
|
||||
global save_disabled
|
||||
if save_disabled:
|
||||
return ProcessContext.server.send(
|
||||
{
|
||||
"function": "show_message",
|
||||
"args": "Saving in progress, please wait until it finishes."
|
||||
})["result"]
|
||||
|
||||
save_disabled = True
|
||||
temp_path = get_local_harmony_path(filepath)
|
||||
|
||||
if ProcessContext.server:
|
||||
if os.path.exists(temp_path):
|
||||
try:
|
||||
shutil.rmtree(temp_path)
|
||||
except Exception as e:
|
||||
raise Exception(f"cannot delete {temp_path}") from e
|
||||
|
||||
ProcessContext.server.send(
|
||||
{"function": "scene.saveAs", "args": [temp_path]}
|
||||
)["result"]
|
||||
|
||||
zip_and_move(temp_path, filepath)
|
||||
|
||||
ProcessContext.workfile_path = filepath
|
||||
|
||||
scene_path = os.path.join(
|
||||
temp_path, os.path.basename(temp_path) + ".xstage"
|
||||
)
|
||||
ProcessContext.server.send(
|
||||
{"function": "AvalonHarmony.addPathToWatcher", "args": scene_path}
|
||||
)
|
||||
else:
|
||||
os.environ["HARMONY_NEW_WORKFILE_PATH"] = filepath.replace("\\", "/")
|
||||
|
||||
save_disabled = False
|
||||
|
||||
|
||||
def open_file(filepath):
|
||||
launch_zip_file(filepath)
|
||||
|
||||
|
||||
def current_file():
|
||||
"""Returning None to make Workfiles app look at first file extension."""
|
||||
return None
|
||||
|
||||
|
||||
def work_root(session):
|
||||
return os.path.normpath(session["AVALON_WORKDIR"]).replace("\\", "/")
|
||||
|
|
@ -1,6 +1,6 @@
|
|||
# -*- coding: utf-8 -*-
|
||||
"""Create Composite node for render on farm."""
|
||||
from avalon import harmony
|
||||
import openpype.hosts.harmony.api as harmony
|
||||
from openpype.hosts.harmony.api import plugin
|
||||
|
||||
|
||||
|
|
|
|||
|
|
@ -1,6 +1,6 @@
|
|||
# -*- coding: utf-8 -*-
|
||||
"""Create render node."""
|
||||
from avalon import harmony
|
||||
import openpype.hosts.harmony.api as harmony
|
||||
from openpype.hosts.harmony.api import plugin
|
||||
|
||||
|
||||
|
|
|
|||
|
|
@ -1,4 +1,5 @@
|
|||
from avalon import api, harmony
|
||||
from avalon import api
|
||||
import openpype.hosts.harmony.api as harmony
|
||||
|
||||
sig = harmony.signature()
|
||||
func = """
|
||||
|
|
|
|||
|
|
@ -1,7 +1,8 @@
|
|||
import os
|
||||
import json
|
||||
|
||||
from avalon import api, harmony
|
||||
from avalon import api
|
||||
import openpype.hosts.harmony.api as harmony
|
||||
import openpype.lib
|
||||
|
||||
|
||||
|
|
|
|||
|
|
@ -6,7 +6,8 @@ from pathlib import Path
|
|||
|
||||
import clique
|
||||
|
||||
from avalon import api, harmony
|
||||
from avalon import api
|
||||
import openpype.hosts.harmony.api as harmony
|
||||
import openpype.lib
|
||||
|
||||
|
||||
|
|
|
|||
|
|
@ -1,7 +1,8 @@
|
|||
import os
|
||||
import shutil
|
||||
|
||||
from avalon import api, harmony
|
||||
from avalon import api
|
||||
import openpype.hosts.harmony.api as harmony
|
||||
|
||||
|
||||
class ImportPaletteLoader(api.Loader):
|
||||
|
|
@ -41,7 +42,9 @@ class ImportPaletteLoader(api.Loader):
|
|||
harmony.save_scene()
|
||||
|
||||
msg = "Updated {}.".format(subset_name)
|
||||
msg += " You need to reload the scene to see the changes."
|
||||
msg += " You need to reload the scene to see the changes.\n"
|
||||
msg += "Please save workfile when ready and use Workfiles "
|
||||
msg += "to reopen it."
|
||||
|
||||
harmony.send(
|
||||
{
|
||||
|
|
|
|||
|
|
@ -6,7 +6,8 @@ import os
|
|||
import shutil
|
||||
import uuid
|
||||
|
||||
from avalon import api, harmony
|
||||
from avalon import api
|
||||
import openpype.hosts.harmony.api as harmony
|
||||
import openpype.lib
|
||||
|
||||
|
||||
|
|
|
|||
|
|
@ -3,7 +3,8 @@ import zipfile
|
|||
import os
|
||||
import shutil
|
||||
|
||||
from avalon import api, harmony
|
||||
from avalon import api
|
||||
import openpype.hosts.harmony.api as harmony
|
||||
|
||||
|
||||
class ImportTemplateLoader(api.Loader):
|
||||
|
|
|
|||
|
|
@ -3,7 +3,7 @@
|
|||
import os
|
||||
|
||||
import pyblish.api
|
||||
from avalon import harmony
|
||||
import openpype.hosts.harmony.api as harmony
|
||||
|
||||
|
||||
class CollectCurrentFile(pyblish.api.ContextPlugin):
|
||||
|
|
|
|||
|
|
@ -3,9 +3,10 @@
|
|||
from pathlib import Path
|
||||
|
||||
import attr
|
||||
from avalon import harmony, api
|
||||
from avalon import api
|
||||
|
||||
import openpype.lib.abstract_collect_render
|
||||
import openpype.hosts.harmony.api as harmony
|
||||
from openpype.lib.abstract_collect_render import RenderInstance
|
||||
import openpype.lib
|
||||
|
||||
|
|
@ -176,6 +177,7 @@ class CollectFarmRender(openpype.lib.abstract_collect_render.
|
|||
ignoreFrameHandleCheck=True
|
||||
|
||||
)
|
||||
render_instance.context = context
|
||||
self.log.debug(render_instance)
|
||||
instances.append(render_instance)
|
||||
|
||||
|
|
|
|||
|
|
@ -3,7 +3,7 @@
|
|||
import json
|
||||
|
||||
import pyblish.api
|
||||
from avalon import harmony
|
||||
import openpype.hosts.harmony.api as harmony
|
||||
|
||||
|
||||
class CollectInstances(pyblish.api.ContextPlugin):
|
||||
|
|
|
|||
|
|
@ -5,7 +5,7 @@ import json
|
|||
import re
|
||||
|
||||
import pyblish.api
|
||||
from avalon import harmony
|
||||
import openpype.hosts.harmony.api as harmony
|
||||
|
||||
|
||||
class CollectPalettes(pyblish.api.ContextPlugin):
|
||||
|
|
|
|||
|
|
@ -3,7 +3,7 @@
|
|||
import os
|
||||
|
||||
import pyblish.api
|
||||
from avalon import harmony
|
||||
import openpype.hosts.harmony.api as harmony
|
||||
|
||||
|
||||
class CollectScene(pyblish.api.ContextPlugin):
|
||||
|
|
|
|||
|
|
@ -5,7 +5,7 @@ import csv
|
|||
|
||||
from PIL import Image, ImageDraw, ImageFont
|
||||
|
||||
from avalon import harmony
|
||||
import openpype.hosts.harmony.api as harmony
|
||||
import openpype.api
|
||||
|
||||
|
||||
|
|
|
|||
|
|
@ -3,7 +3,7 @@ import tempfile
|
|||
import subprocess
|
||||
|
||||
import pyblish.api
|
||||
from avalon import harmony
|
||||
import openpype.hosts.harmony.api as harmony
|
||||
import openpype.lib
|
||||
|
||||
import clique
|
||||
|
|
|
|||
|
|
@ -1,5 +1,5 @@
|
|||
import pyblish.api
|
||||
from avalon import harmony
|
||||
import openpype.hosts.harmony.api as harmony
|
||||
|
||||
|
||||
class ExtractSaveScene(pyblish.api.ContextPlugin):
|
||||
|
|
|
|||
|
|
@ -4,7 +4,7 @@ import os
|
|||
import shutil
|
||||
|
||||
import openpype.api
|
||||
from avalon import harmony
|
||||
import openpype.hosts.harmony.api as harmony
|
||||
import openpype.hosts.harmony
|
||||
|
||||
|
||||
|
|
|
|||
|
|
@ -3,7 +3,7 @@ import os
|
|||
import pyblish.api
|
||||
from openpype.action import get_errored_plugins_from_data
|
||||
from openpype.lib import version_up
|
||||
from avalon import harmony
|
||||
import openpype.hosts.harmony.api as harmony
|
||||
|
||||
|
||||
class IncrementWorkfile(pyblish.api.InstancePlugin):
|
||||
|
|
|
|||
|
|
@ -2,7 +2,7 @@ import os
|
|||
|
||||
import pyblish.api
|
||||
|
||||
from avalon import harmony
|
||||
import openpype.hosts.harmony.api as harmony
|
||||
|
||||
|
||||
class ValidateAudio(pyblish.api.InstancePlugin):
|
||||
|
|
|
|||
|
|
@ -2,7 +2,7 @@ import os
|
|||
|
||||
import pyblish.api
|
||||
import openpype.api
|
||||
from avalon import harmony
|
||||
import openpype.hosts.harmony.api as harmony
|
||||
|
||||
|
||||
class ValidateInstanceRepair(pyblish.api.Action):
|
||||
|
|
|
|||
|
|
@ -6,7 +6,7 @@ import re
|
|||
|
||||
import pyblish.api
|
||||
|
||||
from avalon import harmony
|
||||
import openpype.hosts.harmony.api as harmony
|
||||
import openpype.hosts.harmony
|
||||
|
||||
|
||||
|
|
|
|||
Some files were not shown because too many files have changed in this diff.