Mirror of https://github.com/ynput/ayon-core.git (synced 2025-12-24 21:04:40 +01:00)

Merge branch 'develop' into feature/OP-5207_Global-host-color-management-activation

Commit 2c31dc1f21
231 changed files with 5311 additions and 5134 deletions

14  .github/ISSUE_TEMPLATE/bug_report.yml  (vendored)

@@ -35,6 +35,13 @@ body:
label: Version
description: What version are you running? Look to OpenPype Tray
options:
- 3.15.8-nightly.3
- 3.15.8-nightly.2
- 3.15.8-nightly.1
- 3.15.7
- 3.15.7-nightly.3
- 3.15.7-nightly.2
- 3.15.7-nightly.1
- 3.15.6
- 3.15.6-nightly.3
- 3.15.6-nightly.2

@@ -128,13 +135,6 @@ body:
- 3.14.2-nightly.3
- 3.14.2-nightly.2
- 3.14.2-nightly.1
- 3.14.1
- 3.14.1-nightly.4
- 3.14.1-nightly.3
- 3.14.1-nightly.2
- 3.14.1-nightly.1
- 3.14.0
- 3.14.0-nightly.1
validations:
required: true
- type: dropdown

6  .gitignore  (vendored)

@@ -112,3 +112,9 @@ tools/run_eventserver.*
tools/dev_*

.github_changelog_generator

# Addons
########
/openpype/addons/*
!/openpype/addons/README.md

5  .gitmodules  (vendored)

@@ -4,4 +4,7 @@
[submodule "tools/modules/powershell/PSWriteColor"]
path = tools/modules/powershell/PSWriteColor
url = https://github.com/EvotecIT/PSWriteColor.git
url = https://github.com/EvotecIT/PSWriteColor.git
[submodule "openpype/hosts/unreal/integration"]
path = openpype/hosts/unreal/integration
url = https://github.com/ynput/ayon-unreal-plugin.git

297  CHANGELOG.md

@@ -1,6 +1,303 @@
# Changelog

## [3.15.7](https://github.com/ynput/OpenPype/tree/3.15.7)

[Full Changelog](https://github.com/ynput/OpenPype/compare/3.15.6...3.15.7)

### **🆕 New features**

<details>
<summary>Addons directory <a href="https://github.com/ynput/OpenPype/pull/4893">#4893</a></summary>

This adds a directory for Addons, for easier distribution of studio-specific code.

___

</details>

<details>
<summary>Kitsu - Add "image", "online" and "plate" to review families <a href="https://github.com/ynput/OpenPype/pull/4923">#4923</a></summary>

This PR adds "image", "online" and "plate" to the review families so they can also be uploaded to Kitsu. It also adds the `Add review to Kitsu` tag to the default PNG review; without it, users would have to add the tag manually for single-image uploads to Kitsu, which could be confusing (movies already worked, so the missing tag was confusing at first).

___

</details>

<details>
<summary>Feature/remove and load inv action <a href="https://github.com/ynput/OpenPype/pull/4930">#4930</a></summary>

Added the ability to remove and load a container as a way to reset it. This can be useful when a container breaks in a way that can be fixed by removing it and then reloading it. Also added the ability to add `InventoryAction` plugins by placing them in `openpype/plugins/inventory`.

___

</details>
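
As a rough illustration of the plugin mechanism described above, a file dropped into `openpype/plugins/inventory` could look like the minimal sketch below. The base-class import path, attribute names and the example action itself are assumptions for illustration, not taken from this changelog entry.

```python
from openpype.pipeline import InventoryAction  # import path assumed


class ReportContainers(InventoryAction):
    """Hypothetical action: print basic info about the selected containers."""

    label = "Report Containers"
    icon = "info-circle"
    order = 100

    @staticmethod
    def is_compatible(container):
        # Offer the action for every container.
        return True

    def process(self, containers):
        for container in containers:
            print(container.get("objectName"), container.get("representation"))
```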

### **🚀 Enhancements**

<details>
<summary>Load Rig References - Change Rig to Animation in Animation instance <a href="https://github.com/ynput/OpenPype/pull/4877">#4877</a></summary>

We are using the template builder to build an animation scene. All the rig placeholders are imported correctly, but the automatically created animation instances retain the rig family in their names and subsets. In our example we need `animationMain` instead of `rigMain`, because this name is used in following steps such as lighting. I checked, and it is not a template builder problem: even if I load a rig as a reference, the result is the same. Since we are in the animation instance, it makes more sense to have "animation" instead of "rig" in the name. The naming is fine when creating from the OpenPype menu.

___

</details>

<details>
<summary>Maya template builder - preserve all references when importing a template <a href="https://github.com/ynput/OpenPype/pull/4797">#4797</a></summary>

When building a template with the Maya template builder, we import the template and also the references inside the template file. This causes some problems:
- We cannot use the references to version assets imported by the template.
- When we import the file, the internal reference files are also imported. As a side effect, Maya complains about a reference that no longer exists: `// Error: file: /xxx/maya/2023.3/linux/scripts/AETemplates/AEtransformRelated.mel line 58: Reference node 'turntable_mayaSceneMain_01_RN' is not associated with a reference file.`

___

</details>

<details>
<summary>Unreal: Renaming the integration plugin to Ayon. <a href="https://github.com/ynput/OpenPype/pull/4646">#4646</a></summary>

Renamed the .h and .cpp files to Ayon, and renamed the classes to use the Ayon keyword.

___

</details>

<details>
<summary>3dsMax: render dialogue needs to be closed <a href="https://github.com/ynput/OpenPype/pull/4729">#4729</a></summary>

Make sure the render setup dialog is in a closed state before updating the resolution and other render settings.

___

</details>

<details>
<summary>Maya Template Builder - Remove default cameras from renderable cameras <a href="https://github.com/ynput/OpenPype/pull/4815">#4815</a></summary>

When we build an asset workfile with Build Workfile From Template inside Maya, we load our turntable camera, but we then end up with two renderable cameras: **persp** and the one imported from the template. We need to remove the **persp** camera (or any other default camera) from the renderable cameras when building the work file.

___

</details>

<details>
<summary>Validators for Frame Range in Max <a href="https://github.com/ynput/OpenPype/pull/4914">#4914</a></summary>

- Switch the Render Frame Range Type to 3 for specific ranges (the initial setup for the range type is 4).
- Reset Frame Range also sets the frame range in the render settings.
- The Render Collector no longer takes the frame range from context data but reads it directly from the render settings.
- Adds validators for the render frame range type and the frame range respectively, each with a repair action.

___

</details>

<details>
<summary>Fusion: Saver creator settings <a href="https://github.com/ynput/OpenPype/pull/4943">#4943</a></summary>

Adds Saver creator settings and an enhanced rendering path built from a template.

___

</details>

<details>
<summary>General: Project Anatomy on creators <a href="https://github.com/ynput/OpenPype/pull/4962">#4962</a></summary>

The Anatomy object of the current project is available on `CreateContext` and on create plugins.

___

</details>
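
A minimal sketch of how a create plugin might use this; only the availability of the Anatomy object on `CreateContext` is stated above, so the `create_context` attribute and the `get_current_project_anatomy()` accessor name are assumptions for illustration.

```python
from openpype.pipeline import Creator


class ExampleCreator(Creator):
    """Hypothetical creator that reads project anatomy during create()."""

    identifier = "io.example.creator"
    family = "example"

    def create(self, subset_name, instance_data, pre_create_data):
        # Anatomy of the current project, exposed via CreateContext (#4962).
        anatomy = self.create_context.get_current_project_anatomy()  # name assumed
        frame_padding = int(
            anatomy.templates["render"].get("frame_padding", 4)
        )
        self.log.debug("Render frame padding: %s", frame_padding)
```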

### **🐛 Bug fixes**

<details>
<summary>Maya: Validate shader name - OP-5903 <a href="https://github.com/ynput/OpenPype/pull/4971">#4971</a></summary>

Running the plugin would error with:

```
// TypeError: 'str' object cannot be interpreted as an integer
```

Fixed, and added an `active` setting.

___

</details>

<details>
<summary>Houdini: Fix slow Houdini launch due to shelves generation <a href="https://github.com/ynput/OpenPype/pull/4829">#4829</a></summary>

Shelf generation during Houdini startup added a large delay before the Houdini UI launched correctly. By deferring the shelf generation, this removes the 5+ minutes of delay before the Houdini UI launches.

___

</details>

<details>
<summary>Fusion - Fixed "optional validation" <a href="https://github.com/ynput/OpenPype/pull/4912">#4912</a></summary>

Added `OptionalPyblishPluginMixin` and `is_active` checks for all publish plugins that should be optional.

___

</details>

<details>
<summary>Bug: add missing `pyblish.util` import <a href="https://github.com/ynput/OpenPype/pull/4937">#4937</a></summary>

Remote publishing was missing the import of `remote_publish`; this adds it back.

___

</details>

<details>
<summary>Unreal: Fix missing 'object_path' property <a href="https://github.com/ynput/OpenPype/pull/4938">#4938</a></summary>

Epic removed the `object_path` property from `AssetData`. This PR fixes usages of that property. Fixes #4936.

___

</details>

<details>
<summary>Remove obsolete global validator <a href="https://github.com/ynput/OpenPype/pull/4939">#4939</a></summary>

Removes the `Validate Sequence Frames` validator from global plugins, as it did not handle many cases correctly and had been enabled by mistake, breaking functionality on Deadline.

___

</details>

<details>
<summary>General: fix build_workfile get_linked_assets missing project_name arg <a href="https://github.com/ynput/OpenPype/pull/4940">#4940</a></summary>

Linked-assets collection doesn't work within `build_workfile` because the `get_linked_assets` function call is missing the `project_name` argument.
- Added the `project_name` arg to the `get_linked_assets` function call.

___

</details>

<details>
<summary>General: fix Scene Inventory switch version error dialog missing parent arg on init <a href="https://github.com/ynput/OpenPype/pull/4941">#4941</a></summary>

Quick fix for the switch-version error dialog to set the inventory widget as its parent.

___

</details>

<details>
<summary>Unreal: Fix camera frame range <a href="https://github.com/ynput/OpenPype/pull/4956">#4956</a></summary>

Fix the frame range of the level sequence for the Camera in Unreal.

___

</details>

<details>
<summary>Unreal: Fix missing parameter when updating Alembic StaticMesh <a href="https://github.com/ynput/OpenPype/pull/4957">#4957</a></summary>

Fix an error when updating an Alembic StaticMesh in Unreal, due to a missing parameter in a function call.

___

</details>

<details>
<summary>Unreal: Fix render extraction <a href="https://github.com/ynput/OpenPype/pull/4963">#4963</a></summary>

Fix a problem with the extraction of renders in Unreal.

___

</details>

<details>
<summary>Unreal: Remove Python 3.8 syntax from addon <a href="https://github.com/ynput/OpenPype/pull/4965">#4965</a></summary>

Removed Python 3.8 syntax from the addon.

___

</details>

<details>
<summary>Ftrack: Fix editorial task creation <a href="https://github.com/ynput/OpenPype/pull/4966">#4966</a></summary>

Fix key assignment on instance data during editorial publishing in the ftrack hierarchy integration.

___

</details>

### **Merged pull requests**

<details>
<summary>Add "shortcut" to Scripts Menu Definition <a href="https://github.com/ynput/OpenPype/pull/4927">#4927</a></summary>

Adds the possibility to associate a shortcut with an entry in the scripts menu definition via the key "shortcut".

___

</details>
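
A sketch of what a menu entry using the new key could look like; the surrounding keys ("type", "title", "command", "sourcetype", "tooltip") are assumptions based on typical scripts-menu entries and are not taken from this PR. Only the "shortcut" key is described above.

```python
# Hypothetical scripts menu definition entry with the new "shortcut" key.
menu_definition = [
    {
        "type": "action",                                     # assumed key
        "title": "Open Work Folder",                          # assumed key
        "command": "import webbrowser;webbrowser.open('.')",  # assumed key
        "sourcetype": "python",                               # assumed key
        "tooltip": "Open the current work folder",            # assumed key
        "shortcut": "Ctrl+Shift+O",                           # key added by #4927
    }
]
```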

## [3.15.6](https://github.com/ynput/OpenPype/tree/3.15.6)

3  openpype/addons/README.md  (new file)

@@ -0,0 +1,3 @@
This directory is for storing external addons that need to be included in the pipeline when distributed.

The directory is ignored by Git, but included in the zip and installation files.

@ -4,9 +4,8 @@ Anything that isn't defined here is INTERNAL and unreliable for external use.
|
|||
|
||||
"""
|
||||
|
||||
from .launch_logic import (
|
||||
from .ws_stub import (
|
||||
get_stub,
|
||||
stub,
|
||||
)
|
||||
|
||||
from .pipeline import (
|
||||
|
|
@ -18,7 +17,8 @@ from .pipeline import (
|
|||
from .lib import (
|
||||
maintained_selection,
|
||||
get_extension_manifest_path,
|
||||
get_asset_settings
|
||||
get_asset_settings,
|
||||
set_settings
|
||||
)
|
||||
|
||||
from .plugin import (
|
||||
|
|
@ -27,9 +27,8 @@ from .plugin import (
|
|||
|
||||
|
||||
__all__ = [
|
||||
# launch_logic
|
||||
# ws_stub
|
||||
"get_stub",
|
||||
"stub",
|
||||
|
||||
# pipeline
|
||||
"ls",
|
||||
|
|
@ -39,6 +38,7 @@ __all__ = [
|
|||
"maintained_selection",
|
||||
"get_extension_manifest_path",
|
||||
"get_asset_settings",
|
||||
"set_settings",
|
||||
|
||||
# plugin
|
||||
"AfterEffectsLoader"
|
||||
|
|
|
|||
Binary file not shown.
|
|
@ -1,6 +1,6 @@
|
|||
<?xml version="1.0" encoding="UTF-8"?>
|
||||
<ExtensionManifest Version="8.0" ExtensionBundleId="com.openpype.AE.panel" ExtensionBundleVersion="1.0.24"
|
||||
ExtensionBundleName="openpype" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
|
||||
<ExtensionManifest Version="8.0" ExtensionBundleId="com.openpype.AE.panel" ExtensionBundleVersion="1.0.25"
|
||||
ExtensionBundleName="com.openpype.AE.panel" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
|
||||
<ExtensionList>
|
||||
<Extension Id="com.openpype.AE.panel" Version="1.0" />
|
||||
</ExtensionList>
|
||||
|
|
|
|||
|
|
@ -2,7 +2,7 @@
|
|||
<html>
|
||||
<head>
|
||||
<meta charset="utf-8">
|
||||
|
||||
|
||||
<link rel="stylesheet" href="css/topcoat-desktop-dark.min.css"/>
|
||||
<link id="hostStyle" rel="stylesheet" href="css/styles.css"/>
|
||||
|
||||
|
|
@ -25,11 +25,11 @@
|
|||
|
||||
<title></title>
|
||||
<script src="js/libs/jquery-2.0.2.min.js"></script>
|
||||
|
||||
|
||||
<script type=text/javascript>
|
||||
$(function() {
|
||||
$("a#workfiles-button").bind("click", function() {
|
||||
|
||||
|
||||
RPC.call('AfterEffects.workfiles_route').then(function (data) {
|
||||
}, function (error) {
|
||||
alert(error);
|
||||
|
|
@ -37,7 +37,7 @@
|
|||
});
|
||||
});
|
||||
</script>
|
||||
|
||||
|
||||
<script type=text/javascript>
|
||||
$(function() {
|
||||
$("a#loader-button").bind("click", function() {
|
||||
|
|
@ -48,7 +48,7 @@
|
|||
});
|
||||
});
|
||||
</script>
|
||||
|
||||
|
||||
<script type=text/javascript>
|
||||
$(function() {
|
||||
$("a#publish-button").bind("click", function() {
|
||||
|
|
@ -59,7 +59,7 @@
|
|||
});
|
||||
});
|
||||
</script>
|
||||
|
||||
|
||||
<script type=text/javascript>
|
||||
$(function() {
|
||||
$("a#sceneinventory-button").bind("click", function() {
|
||||
|
|
@ -70,7 +70,40 @@
|
|||
});
|
||||
});
|
||||
</script>
|
||||
|
||||
|
||||
<script type=text/javascript>
|
||||
$(function() {
|
||||
$("a#setresolution-button").bind("click", function() {
|
||||
RPC.call('AfterEffects.setresolution_route').then(function (data) {
|
||||
}, function (error) {
|
||||
alert(error);
|
||||
});
|
||||
});
|
||||
});
|
||||
</script>
|
||||
|
||||
<script type=text/javascript>
|
||||
$(function() {
|
||||
$("a#setframes-button").bind("click", function() {
|
||||
RPC.call('AfterEffects.setframes_route').then(function (data) {
|
||||
}, function (error) {
|
||||
alert(error);
|
||||
});
|
||||
});
|
||||
});
|
||||
</script>
|
||||
|
||||
<script type=text/javascript>
|
||||
$(function() {
|
||||
$("a#setall-button").bind("click", function() {
|
||||
RPC.call('AfterEffects.setall_route').then(function (data) {
|
||||
}, function (error) {
|
||||
alert(error);
|
||||
});
|
||||
});
|
||||
});
|
||||
</script>
|
||||
|
||||
<script type=text/javascript>
|
||||
$(function() {
|
||||
$("a#experimental-button").bind("click", function() {
|
||||
|
|
@ -80,25 +113,28 @@
|
|||
});
|
||||
});
|
||||
});
|
||||
</script>
|
||||
|
||||
|
||||
</script>
|
||||
|
||||
|
||||
</head>
|
||||
|
||||
<body class="hostElt">
|
||||
|
||||
<div id="content">
|
||||
|
||||
<div>
|
||||
<div id="content">
|
||||
|
||||
<div>
|
||||
<div></div><a href=# id=workfiles-button><button class="hostFontSize">Workfiles...</button></a></div>
|
||||
<div><a href=# id=loader-button><button class="hostFontSize">Load...</button></a></div>
|
||||
<div><a href=# id=publish-button><button class="hostFontSize">Publish...</button></a></div>
|
||||
<div><a href=# id=sceneinventory-button><button class="hostFontSize">Manage...</button></a></div>
|
||||
<div><a href=# id=setresolution-button><button class="hostFontSize">Set Resolution</button></a></div>
|
||||
<div><a href=# id=setframes-button><button class="hostFontSize">Set Frame Range</button></a></div>
|
||||
<div><a href=# id=setall-button><button class="hostFontSize">Apply All Settings</button></a></div>
|
||||
<div><a href=# id=experimental-button><button class="hostFontSize">Experimental Tools...</button></a></div>
|
||||
</div>
|
||||
|
||||
</div>
|
||||
|
||||
</div>
|
||||
|
||||
|
||||
<!-- <script src="js/libs/PlayerDebugMode"></script> -->
|
||||
<script src="js/libs/wsrpc.js"></script>
|
||||
<script src="js/libs/loglevel.min.js"></script>
|
||||
|
|
@ -107,6 +143,6 @@
|
|||
<script src="js/themeManager.js"></script>
|
||||
<script src="js/main.js"></script>
|
||||
|
||||
|
||||
|
||||
</body>
|
||||
</html>
|
||||
</html>
|
||||
|
|
|
|||
|
|
@ -4,7 +4,7 @@ indent: 4, maxerr: 50 */
|
|||
|
||||
|
||||
var csInterface = new CSInterface();
|
||||
|
||||
|
||||
log.warn("script start");
|
||||
|
||||
WSRPC.DEBUG = false;
|
||||
|
|
@ -14,7 +14,7 @@ WSRPC.TRACE = false;
|
|||
async function startUp(url){
|
||||
promis = runEvalScript("getEnv('" + url + "')");
|
||||
|
||||
var res = await promis;
|
||||
var res = await promis;
|
||||
log.warn("res: " + res);
|
||||
|
||||
promis = runEvalScript("getEnv('OPENPYPE_DEBUG')");
|
||||
|
|
@ -56,7 +56,7 @@ function get_extension_version(){
|
|||
}
|
||||
|
||||
function main(websocket_url){
|
||||
// creates connection to 'websocket_url', registers routes
|
||||
// creates connection to 'websocket_url', registers routes
|
||||
var default_url = 'ws://localhost:8099/ws/';
|
||||
|
||||
if (websocket_url == ''){
|
||||
|
|
@ -66,7 +66,7 @@ function main(websocket_url){
|
|||
|
||||
RPC.connect();
|
||||
|
||||
log.warn("connected");
|
||||
log.warn("connected");
|
||||
|
||||
RPC.addRoute('AfterEffects.open', function (data) {
|
||||
log.warn('Server called client route "open":', data);
|
||||
|
|
@ -88,7 +88,7 @@ function main(websocket_url){
|
|||
});
|
||||
|
||||
RPC.addRoute('AfterEffects.get_active_document_name', function (data) {
|
||||
log.warn('Server called client route ' +
|
||||
log.warn('Server called client route ' +
|
||||
'"get_active_document_name":', data);
|
||||
return runEvalScript("getActiveDocumentName()")
|
||||
.then(function(result){
|
||||
|
|
@ -98,7 +98,7 @@ function main(websocket_url){
|
|||
});
|
||||
|
||||
RPC.addRoute('AfterEffects.get_active_document_full_name', function (data){
|
||||
log.warn('Server called client route ' +
|
||||
log.warn('Server called client route ' +
|
||||
'"get_active_document_full_name":', data);
|
||||
return runEvalScript("getActiveDocumentFullName()")
|
||||
.then(function(result){
|
||||
|
|
@ -118,7 +118,7 @@ function main(websocket_url){
|
|||
});
|
||||
});
|
||||
|
||||
|
||||
|
||||
RPC.addRoute('AfterEffects.get_selected_items', function (data) {
|
||||
log.warn('Server called client route "get_selected_items":', data);
|
||||
return runEvalScript("getSelectedItems(" + data.comps + "," +
|
||||
|
|
@ -194,23 +194,25 @@ function main(websocket_url){
|
|||
});
|
||||
});
|
||||
|
||||
RPC.addRoute('AfterEffects.get_work_area', function (data) {
|
||||
log.warn('Server called client route "get_work_area":', data);
|
||||
return runEvalScript("getWorkArea(" + data.item_id + ")")
|
||||
RPC.addRoute('AfterEffects.get_comp_properties', function (data) {
|
||||
log.warn('Server called client route "get_comp_properties":', data);
|
||||
return runEvalScript("getCompProperties(" + data.item_id + ")")
|
||||
.then(function(result){
|
||||
log.warn("getWorkArea: " + result);
|
||||
log.warn("get_comp_properties: " + result);
|
||||
return result;
|
||||
});
|
||||
});
|
||||
|
||||
RPC.addRoute('AfterEffects.set_work_area', function (data) {
|
||||
RPC.addRoute('AfterEffects.set_comp_properties', function (data) {
|
||||
log.warn('Server called client route "set_work_area":', data);
|
||||
return runEvalScript("setWorkArea(" + data.item_id + ',' +
|
||||
return runEvalScript("setCompProperties(" + data.item_id + ',' +
|
||||
data.start + ',' +
|
||||
data.duration + ',' +
|
||||
data.frame_rate + ")")
|
||||
data.frame_rate + ',' +
|
||||
data.width + ',' +
|
||||
data.height + ")")
|
||||
.then(function(result){
|
||||
log.warn("getWorkArea: " + result);
|
||||
log.warn("set_comp_properties: " + result);
|
||||
return result;
|
||||
});
|
||||
});
|
||||
|
|
@ -255,7 +257,7 @@ function main(websocket_url){
|
|||
|
||||
RPC.addRoute('AfterEffects.import_background', function (data) {
|
||||
log.warn('Server called client route "import_background":', data);
|
||||
return runEvalScript("importBackground(" + data.comp_id + ", " +
|
||||
return runEvalScript("importBackground(" + data.comp_id + ", " +
|
||||
"'" + data.comp_name + "', " +
|
||||
JSON.stringify(data.files) + ")")
|
||||
.then(function(result){
|
||||
|
|
@ -266,7 +268,7 @@ function main(websocket_url){
|
|||
|
||||
RPC.addRoute('AfterEffects.reload_background', function (data) {
|
||||
log.warn('Server called client route "reload_background":', data);
|
||||
return runEvalScript("reloadBackground(" + data.comp_id + ", " +
|
||||
return runEvalScript("reloadBackground(" + data.comp_id + ", " +
|
||||
"'" + data.comp_name + "', " +
|
||||
JSON.stringify(data.files) + ")")
|
||||
.then(function(result){
|
||||
|
|
@ -314,6 +316,16 @@ function main(websocket_url){
|
|||
log.warn('Server called client route "close":', data);
|
||||
return runEvalScript("close()");
|
||||
});
|
||||
|
||||
RPC.addRoute('AfterEffects.print_msg', function (data) {
|
||||
log.warn('Server called client route "print_msg":', data);
|
||||
var escaped_msg = EscapeStringForJSX(data.msg);
|
||||
return runEvalScript("printMsg('" + escaped_msg +"')")
|
||||
.then(function(result){
|
||||
log.warn("print_msg: " + result);
|
||||
return result;
|
||||
});
|
||||
});
|
||||
}
|
||||
|
||||
/** main entry point **/
|
||||
|
|
@ -323,17 +335,17 @@ startUp("WEBSOCKET_URL");
|
|||
'use strict';
|
||||
|
||||
var csInterface = new CSInterface();
|
||||
|
||||
|
||||
|
||||
|
||||
function init() {
|
||||
|
||||
|
||||
themeManager.init();
|
||||
|
||||
|
||||
$("#btn_test").click(function () {
|
||||
csInterface.evalScript('sayHello()');
|
||||
});
|
||||
}
|
||||
|
||||
|
||||
init();
|
||||
|
||||
}());
|
||||
|
|
|
|||
|
|
@ -1,7 +1,7 @@
|
|||
/*jslint vars: true, plusplus: true, devel: true, nomen: true, regexp: true,
|
||||
indent: 4, maxerr: 50 */
|
||||
/*global $, Folder*/
|
||||
#include "../js/libs/json.js";
|
||||
//@include "../js/libs/json.js"
|
||||
|
||||
/* All public API function should return JSON! */
|
||||
|
||||
|
|
@ -29,13 +29,13 @@ function getEnv(variable){
|
|||
function getMetadata(){
|
||||
/**
|
||||
* Returns payload in 'Label' field of project's metadata
|
||||
*
|
||||
*
|
||||
**/
|
||||
if (ExternalObject.AdobeXMPScript === undefined){
|
||||
ExternalObject.AdobeXMPScript =
|
||||
new ExternalObject('lib:AdobeXMPScript');
|
||||
}
|
||||
|
||||
|
||||
var proj = app.project;
|
||||
var meta = new XMPMeta(app.project.xmpPacket);
|
||||
var schemaNS = XMPMeta.getNamespaceURI("xmp");
|
||||
|
|
@ -53,7 +53,7 @@ function getMetadata(){
|
|||
function imprint(payload){
|
||||
/**
|
||||
* Stores payload in 'Label' field of project's metadata
|
||||
*
|
||||
*
|
||||
* Args:
|
||||
* payload (string): json content
|
||||
*/
|
||||
|
|
@ -61,14 +61,14 @@ function imprint(payload){
|
|||
ExternalObject.AdobeXMPScript =
|
||||
new ExternalObject('lib:AdobeXMPScript');
|
||||
}
|
||||
|
||||
|
||||
var proj = app.project;
|
||||
var meta = new XMPMeta(app.project.xmpPacket);
|
||||
var schemaNS = XMPMeta.getNamespaceURI("xmp");
|
||||
var label = "xmp:Label";
|
||||
|
||||
meta.setProperty(schemaNS, label, payload);
|
||||
|
||||
|
||||
app.project.xmpPacket = meta.serialize();
|
||||
|
||||
}
|
||||
|
|
@ -116,14 +116,14 @@ function getItems(comps, folders, footages){
|
|||
/**
|
||||
* Returns JSON representation of compositions and
|
||||
* if 'collectLayers' then layers in comps too.
|
||||
*
|
||||
*
|
||||
* Args:
|
||||
* comps (bool): return selected compositions
|
||||
* folders (bool): return folders
|
||||
* footages (bool): return FootageItem
|
||||
* Returns:
|
||||
* (list) of JSON items
|
||||
*/
|
||||
*/
|
||||
var items = []
|
||||
for (i = 1; i <= app.project.items.length; ++i){
|
||||
var item = app.project.items[i];
|
||||
|
|
@ -142,14 +142,14 @@ function getItems(comps, folders, footages){
|
|||
function getSelectedItems(comps, folders, footages){
|
||||
/**
|
||||
* Returns list of selected items from Project menu
|
||||
*
|
||||
*
|
||||
* Args:
|
||||
* comps (bool): return selected compositions
|
||||
* folders (bool): return folders
|
||||
* footages (bool): return FootageItem
|
||||
* Returns:
|
||||
* (list) of JSON items
|
||||
*/
|
||||
*/
|
||||
var items = []
|
||||
for (i = 0; i < app.project.selection.length; ++i){
|
||||
var item = app.project.selection[i];
|
||||
|
|
@ -166,9 +166,9 @@ function getSelectedItems(comps, folders, footages){
|
|||
|
||||
function _getItem(item, comps, folders, footages){
|
||||
/**
|
||||
* Auxiliary function as project items and selections
|
||||
* Auxiliary function as project items and selections
|
||||
* are indexed in different way :/
|
||||
* Refactor
|
||||
* Refactor
|
||||
*/
|
||||
var item_type = '';
|
||||
if (item instanceof FolderItem){
|
||||
|
|
@ -189,7 +189,7 @@ function _getItem(item, comps, folders, footages){
|
|||
return "{}";
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
var item = {"name": item.name,
|
||||
"id": item.id,
|
||||
"type": item_type};
|
||||
|
|
@ -200,7 +200,7 @@ function importFile(path, item_name, import_options){
|
|||
/**
|
||||
* Imports file (image tested for now) as a FootageItem.
|
||||
* Creates new composition
|
||||
*
|
||||
*
|
||||
* Args:
|
||||
* path (string): absolute path to image file
|
||||
* item_name (string): label for composition
|
||||
|
|
@ -218,7 +218,7 @@ function importFile(path, item_name, import_options){
|
|||
app.beginUndoGroup("Import File");
|
||||
fp = new File(path);
|
||||
if (fp.exists){
|
||||
try {
|
||||
try {
|
||||
im_opt = new ImportOptions(fp);
|
||||
importAsType = import_options["ImportAsType"];
|
||||
|
||||
|
|
@ -234,18 +234,18 @@ function importFile(path, item_name, import_options){
|
|||
}
|
||||
if (importAsType.indexOf('PROJECT') > 0){
|
||||
im_opt.importAs = ImportAsType.PROJECT;
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
}
|
||||
if ('sequence' in import_options){
|
||||
im_opt.sequence = true;
|
||||
}
|
||||
|
||||
|
||||
comp = app.project.importFile(im_opt);
|
||||
|
||||
if (app.project.selection.length == 2 &&
|
||||
app.project.selection[0] instanceof FolderItem){
|
||||
comp.parentFolder = app.project.selection[0]
|
||||
comp.parentFolder = app.project.selection[0]
|
||||
}
|
||||
} catch (error) {
|
||||
return _prepareError(error.toString() + importOptions.file.fsName);
|
||||
|
|
@ -283,14 +283,14 @@ function setLabelColor(comp_id, color_idx){
|
|||
function replaceItem(comp_id, path, item_name){
|
||||
/**
|
||||
* Replaces loaded file with new file and updates name
|
||||
*
|
||||
*
|
||||
* Args:
|
||||
* comp_id (int): id of composition, not a index!
|
||||
* path (string): absolute path to new file
|
||||
* item_name (string): new composition name
|
||||
*/
|
||||
app.beginUndoGroup("Replace File");
|
||||
|
||||
|
||||
fp = new File(path);
|
||||
if (!fp.exists){
|
||||
return _prepareError("File " + path + " not found.");
|
||||
|
|
@ -303,7 +303,7 @@ function replaceItem(comp_id, path, item_name){
|
|||
}else{
|
||||
item.replace(fp);
|
||||
}
|
||||
|
||||
|
||||
item.name = item_name;
|
||||
} catch (error) {
|
||||
return _prepareError(error.toString() + path);
|
||||
|
|
@ -319,7 +319,7 @@ function replaceItem(comp_id, path, item_name){
|
|||
function renameItem(item_id, new_name){
|
||||
/**
|
||||
* Renames item with 'item_id' to 'new_name'
|
||||
*
|
||||
*
|
||||
* Args:
|
||||
* item_id (int): id to search item
|
||||
* new_name (str)
|
||||
|
|
@ -335,7 +335,7 @@ function renameItem(item_id, new_name){
|
|||
function deleteItem(item_id){
|
||||
/**
|
||||
* Delete any 'item_id'
|
||||
*
|
||||
*
|
||||
* Not restricted only to comp, it could delete
|
||||
* any item with 'id'
|
||||
*/
|
||||
|
|
@ -347,38 +347,76 @@ function deleteItem(item_id){
|
|||
}
|
||||
}
|
||||
|
||||
function getWorkArea(comp_id){
|
||||
function getCompProperties(comp_id){
|
||||
/**
|
||||
* Returns information about workarea - are that will be
|
||||
* rendered. All calculation will be done in OpenPype,
|
||||
* easier to modify without redeploy of extension.
|
||||
*
|
||||
* Returns information about composition - area that will be
|
||||
* rendered.
|
||||
*
|
||||
* Returns
|
||||
* (dict)
|
||||
*/
|
||||
var item = app.project.itemByID(comp_id);
|
||||
if (item){
|
||||
return JSON.stringify({
|
||||
"workAreaStart": item.displayStartFrame,
|
||||
"workAreaDuration": item.duration,
|
||||
"frameRate": item.frameRate});
|
||||
}else{
|
||||
var comp = app.project.itemByID(comp_id);
|
||||
if (!comp){
|
||||
return _prepareError("There is no composition with "+ comp_id);
|
||||
}
|
||||
|
||||
return JSON.stringify({
|
||||
"id": comp.id,
|
||||
"name": comp.name,
|
||||
"frameStart": comp.displayStartFrame,
|
||||
"framesDuration": comp.duration * comp.frameRate,
|
||||
"frameRate": comp.frameRate,
|
||||
"width": comp.width,
|
||||
"height": comp.height});
|
||||
}
|
||||
|
||||
function setWorkArea(comp_id, workAreaStart, workAreaDuration, frameRate){
|
||||
function setCompProperties(comp_id, frameStart, framesCount, frameRate,
|
||||
width, height){
|
||||
/**
|
||||
* Sets work area info from outside (from Ftrack via OpenPype)
|
||||
*/
|
||||
var item = app.project.itemByID(comp_id);
|
||||
if (item){
|
||||
item.displayStartTime = workAreaStart;
|
||||
item.duration = workAreaDuration;
|
||||
item.frameRate = frameRate;
|
||||
}else{
|
||||
var comp = app.project.itemByID(comp_id);
|
||||
if (!comp){
|
||||
return _prepareError("There is no composition with "+ comp_id);
|
||||
}
|
||||
|
||||
app.beginUndoGroup('change comp properties');
|
||||
if (frameStart && framesCount && frameRate){
|
||||
comp.displayStartFrame = frameStart;
|
||||
comp.duration = framesCount / frameRate;
|
||||
comp.frameRate = frameRate;
|
||||
}
|
||||
if (width && height){
|
||||
var widthOld = comp.width;
|
||||
var widthNew = width;
|
||||
var widthDelta = widthNew - widthOld;
|
||||
|
||||
var heightOld = comp.height;
|
||||
var heightNew = height;
|
||||
var heightDelta = heightNew - heightOld;
|
||||
|
||||
var offset = [widthDelta / 2, heightDelta / 2];
|
||||
|
||||
comp.width = widthNew;
|
||||
comp.height = heightNew;
|
||||
|
||||
for (var i = 1, il = comp.numLayers; i <= il; i++) {
|
||||
var layer = comp.layer(i);
|
||||
var positionProperty = layer.property('ADBE Transform Group').property('ADBE Position');
|
||||
|
||||
if (positionProperty.numKeys > 0) {
|
||||
for (var j = 1, jl = positionProperty.numKeys; j <= jl; j++) {
|
||||
var keyValue = positionProperty.keyValue(j);
|
||||
positionProperty.setValueAtKey(j, keyValue + offset);
|
||||
}
|
||||
} else {
|
||||
var positionValue = positionProperty.value;
|
||||
positionProperty.setValue(positionValue + offset);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
app.endUndoGroup();
|
||||
}
|
||||
|
||||
function save(){
|
||||
|
|
@ -504,7 +542,7 @@ function addItemAsLayerToComp(comp_id, item_id, found_comp){
|
|||
* Args:
|
||||
* comp_id (int): id of target composition
|
||||
* item_id (int): FootageItem.id
|
||||
* found_comp (CompItem, optional): to limit querying if
|
||||
* found_comp (CompItem, optional): to limit quering if
|
||||
* comp already found previously
|
||||
*/
|
||||
var comp = found_comp || app.project.itemByID(comp_id);
|
||||
|
|
@ -749,7 +787,7 @@ function render(target_folder, comp_id){
|
|||
|
||||
var om1 = app.project.renderQueue.item(i).outputModule(1);
|
||||
var file_name = File.decode( om1.file.name ).replace('℗', ''); // Name contains special character, space?
|
||||
|
||||
|
||||
var omItem1_settable_str = app.project.renderQueue.item(i).outputModule(1).getSettings( GetSettingsFormat.STRING_SETTABLE );
|
||||
|
||||
var targetFolder = new Folder(target_folder);
|
||||
|
|
@ -763,7 +801,7 @@ function render(target_folder, comp_id){
|
|||
render_item.render = false;
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
}
|
||||
app.beginSuppressDialogs();
|
||||
app.project.renderQueue.render();
|
||||
|
|
@ -779,6 +817,10 @@ function getAppVersion(){
|
|||
return _prepareSingleValue(app.version);
|
||||
}
|
||||
|
||||
function printMsg(msg){
|
||||
alert(msg);
|
||||
}
|
||||
|
||||
function _prepareSingleValue(value){
|
||||
return JSON.stringify({"result": value})
|
||||
}
|
||||
|
|
|
|||
|
|
@ -1,49 +1,77 @@
|
|||
import os
|
||||
import sys
|
||||
import subprocess
|
||||
import collections
|
||||
import logging
|
||||
import asyncio
|
||||
import functools
|
||||
import traceback
|
||||
|
||||
|
||||
from wsrpc_aiohttp import (
|
||||
WebSocketRoute,
|
||||
WebSocketAsync
|
||||
)
|
||||
|
||||
from qtpy import QtCore
|
||||
from qtpy import QtCore, QtWidgets
|
||||
|
||||
from openpype.lib import Logger
|
||||
from openpype.pipeline import legacy_io
|
||||
from openpype.tools.utils import host_tools
|
||||
from openpype.tests.lib import is_in_tests
|
||||
from openpype.pipeline import install_host, legacy_io
|
||||
from openpype.modules import ModulesManager
|
||||
from openpype.tools.adobe_webserver.app import WebServerTool
|
||||
|
||||
from .ws_stub import AfterEffectsServerStub
|
||||
from .ws_stub import get_stub
|
||||
from .lib import set_settings
|
||||
|
||||
log = logging.getLogger(__name__)
|
||||
log.setLevel(logging.DEBUG)
|
||||
|
||||
|
||||
class ConnectionNotEstablishedYet(Exception):
|
||||
pass
|
||||
def safe_excepthook(*args):
|
||||
traceback.print_exception(*args)
|
||||
|
||||
|
||||
def get_stub():
|
||||
"""
|
||||
Convenience function to get server RPC stub to call methods directed
|
||||
for host (Photoshop).
|
||||
It expects already created connection, started from client.
|
||||
Currently created when panel is opened (PS: Window>Extensions>Avalon)
|
||||
:return: <PhotoshopClientStub> where functions could be called from
|
||||
"""
|
||||
ae_stub = AfterEffectsServerStub()
|
||||
if not ae_stub.client:
|
||||
raise ConnectionNotEstablishedYet("Connection is not created yet")
|
||||
def main(*subprocess_args):
|
||||
"""Main entrypoint to AE launching, called from pre hook."""
|
||||
sys.excepthook = safe_excepthook
|
||||
|
||||
return ae_stub
|
||||
from openpype.hosts.aftereffects.api import AfterEffectsHost
|
||||
|
||||
host = AfterEffectsHost()
|
||||
install_host(host)
|
||||
|
||||
def stub():
|
||||
return get_stub()
|
||||
os.environ["OPENPYPE_LOG_NO_COLORS"] = "False"
|
||||
app = QtWidgets.QApplication([])
|
||||
app.setQuitOnLastWindowClosed(False)
|
||||
|
||||
launcher = ProcessLauncher(subprocess_args)
|
||||
launcher.start()
|
||||
|
||||
if os.environ.get("HEADLESS_PUBLISH"):
|
||||
manager = ModulesManager()
|
||||
webpublisher_addon = manager["webpublisher"]
|
||||
|
||||
launcher.execute_in_main_thread(
|
||||
functools.partial(
|
||||
webpublisher_addon.headless_publish,
|
||||
log,
|
||||
"CloseAE",
|
||||
is_in_tests()
|
||||
)
|
||||
)
|
||||
|
||||
elif os.environ.get("AVALON_PHOTOSHOP_WORKFILES_ON_LAUNCH", True):
|
||||
save = False
|
||||
if os.getenv("WORKFILES_SAVE_AS"):
|
||||
save = True
|
||||
|
||||
launcher.execute_in_main_thread(
|
||||
lambda: host_tools.show_tool_by_name("workfiles", save=save)
|
||||
)
|
||||
|
||||
sys.exit(app.exec_())
|
||||
|
||||
|
||||
def show_tool_by_name(tool_name):
|
||||
|
|
@ -55,6 +83,7 @@ def show_tool_by_name(tool_name):
|
|||
|
||||
|
||||
class ProcessLauncher(QtCore.QObject):
|
||||
"""Launches webserver, connects to it, runs main thread."""
|
||||
route_name = "AfterEffects"
|
||||
_main_thread_callbacks = collections.deque()
|
||||
|
||||
|
|
@ -296,6 +325,15 @@ class AfterEffectsRoute(WebSocketRoute):
|
|||
async def sceneinventory_route(self):
|
||||
self._tool_route("sceneinventory")
|
||||
|
||||
async def setresolution_route(self):
|
||||
self._settings_route(False, True)
|
||||
|
||||
async def setframes_route(self):
|
||||
self._settings_route(True, False)
|
||||
|
||||
async def setall_route(self):
|
||||
self._settings_route(True, True)
|
||||
|
||||
async def experimental_tools_route(self):
|
||||
self._tool_route("experimental_tools")
|
||||
|
||||
|
|
@ -309,3 +347,13 @@ class AfterEffectsRoute(WebSocketRoute):
|
|||
|
||||
# Required return statement.
|
||||
return "nothing"
|
||||
|
||||
def _settings_route(self, frames, resolution):
|
||||
partial_method = functools.partial(set_settings,
|
||||
frames,
|
||||
resolution)
|
||||
|
||||
ProcessLauncher.execute_in_main_thread(partial_method)
|
||||
|
||||
# Required return statement.
|
||||
return "nothing"
|
||||
|
|
|
|||
|
|
@ -1,69 +1,17 @@
|
|||
import os
|
||||
import sys
|
||||
import re
|
||||
import json
|
||||
import contextlib
|
||||
import traceback
|
||||
import logging
|
||||
from functools import partial
|
||||
|
||||
from qtpy import QtWidgets
|
||||
|
||||
from openpype.pipeline import install_host
|
||||
from openpype.modules import ModulesManager
|
||||
|
||||
from openpype.tools.utils import host_tools
|
||||
from openpype.tests.lib import is_in_tests
|
||||
from .launch_logic import ProcessLauncher, get_stub
|
||||
from openpype.pipeline.context_tools import get_current_context
|
||||
from openpype.client import get_asset_by_name
|
||||
from .ws_stub import get_stub
|
||||
|
||||
log = logging.getLogger(__name__)
|
||||
log.setLevel(logging.DEBUG)
|
||||
|
||||
|
||||
def safe_excepthook(*args):
|
||||
traceback.print_exception(*args)
|
||||
|
||||
|
||||
def main(*subprocess_args):
|
||||
sys.excepthook = safe_excepthook
|
||||
|
||||
from openpype.hosts.aftereffects.api import AfterEffectsHost
|
||||
|
||||
host = AfterEffectsHost()
|
||||
install_host(host)
|
||||
|
||||
os.environ["OPENPYPE_LOG_NO_COLORS"] = "False"
|
||||
app = QtWidgets.QApplication([])
|
||||
app.setQuitOnLastWindowClosed(False)
|
||||
|
||||
launcher = ProcessLauncher(subprocess_args)
|
||||
launcher.start()
|
||||
|
||||
if os.environ.get("HEADLESS_PUBLISH"):
|
||||
manager = ModulesManager()
|
||||
webpublisher_addon = manager["webpublisher"]
|
||||
|
||||
launcher.execute_in_main_thread(
|
||||
partial(
|
||||
webpublisher_addon.headless_publish,
|
||||
log,
|
||||
"CloseAE",
|
||||
is_in_tests()
|
||||
)
|
||||
)
|
||||
|
||||
elif os.environ.get("AVALON_PHOTOSHOP_WORKFILES_ON_LAUNCH", True):
|
||||
save = False
|
||||
if os.getenv("WORKFILES_SAVE_AS"):
|
||||
save = True
|
||||
|
||||
launcher.execute_in_main_thread(
|
||||
lambda: host_tools.show_tool_by_name("workfiles", save=save)
|
||||
)
|
||||
|
||||
sys.exit(app.exec_())
|
||||
|
||||
|
||||
@contextlib.contextmanager
|
||||
def maintained_selection():
|
||||
"""Maintain selection during context."""
|
||||
|
|
@ -145,13 +93,13 @@ def get_asset_settings(asset_doc):
|
|||
|
||||
"""
|
||||
asset_data = asset_doc["data"]
|
||||
fps = asset_data.get("fps")
|
||||
frame_start = asset_data.get("frameStart")
|
||||
frame_end = asset_data.get("frameEnd")
|
||||
handle_start = asset_data.get("handleStart")
|
||||
handle_end = asset_data.get("handleEnd")
|
||||
resolution_width = asset_data.get("resolutionWidth")
|
||||
resolution_height = asset_data.get("resolutionHeight")
|
||||
fps = asset_data.get("fps", 0)
|
||||
frame_start = asset_data.get("frameStart", 0)
|
||||
frame_end = asset_data.get("frameEnd", 0)
|
||||
handle_start = asset_data.get("handleStart", 0)
|
||||
handle_end = asset_data.get("handleEnd", 0)
|
||||
resolution_width = asset_data.get("resolutionWidth", 0)
|
||||
resolution_height = asset_data.get("resolutionHeight", 0)
|
||||
duration = (frame_end - frame_start + 1) + handle_start + handle_end
|
||||
|
||||
return {
|
||||
|
|
@ -164,3 +112,49 @@ def get_asset_settings(asset_doc):
|
|||
"resolutionHeight": resolution_height,
|
||||
"duration": duration
|
||||
}
|
||||
|
||||
|
||||
def set_settings(frames, resolution, comp_ids=None, print_msg=True):
|
||||
"""Sets number of frames and resolution to selected comps.
|
||||
|
||||
Args:
|
||||
frames (bool): True if set frame info
|
||||
resolution (bool): True if set resolution
|
||||
comp_ids (list): specific composition ids, if empty
|
||||
it tries to look for currently selected
|
||||
print_msg (bool): True throw JS alert with msg
|
||||
"""
|
||||
frame_start = frames_duration = fps = width = height = None
|
||||
current_context = get_current_context()
|
||||
|
||||
asset_doc = get_asset_by_name(current_context["project_name"],
|
||||
current_context["asset_name"])
|
||||
settings = get_asset_settings(asset_doc)
|
||||
|
||||
msg = ''
|
||||
if frames:
|
||||
frame_start = settings["frameStart"] - settings["handleStart"]
|
||||
frames_duration = settings["duration"]
|
||||
fps = settings["fps"]
|
||||
msg += f"frame start:{frame_start}, duration:{frames_duration}, "\
|
||||
f"fps:{fps}"
|
||||
if resolution:
|
||||
width = settings["resolutionWidth"]
|
||||
height = settings["resolutionHeight"]
|
||||
msg += f"width:{width} and height:{height}"
|
||||
|
||||
stub = get_stub()
|
||||
if not comp_ids:
|
||||
comps = stub.get_selected_items(True, False, False)
|
||||
comp_ids = [comp.id for comp in comps]
|
||||
if not comp_ids:
|
||||
stub.print_msg("Select at least one composition to apply settings.")
|
||||
return
|
||||
|
||||
for comp_id in comp_ids:
|
||||
msg = f"Setting for comp {comp_id} " + msg
|
||||
log.debug(msg)
|
||||
stub.set_comp_properties(comp_id, frame_start, frames_duration,
|
||||
fps, width, height)
|
||||
if print_msg:
|
||||
stub.print_msg(msg)
|
||||
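
For context, a minimal usage sketch of the `set_settings` helper added above. It must run inside a host session where the After Effects stub connection already exists; the composition id is a placeholder.

```python
from openpype.hosts.aftereffects.api.lib import set_settings

# Apply the asset's frame range (with handles) but keep the current resolution
# on one specific composition (id 42 is a placeholder).
set_settings(frames=True, resolution=False, comp_ids=[42], print_msg=False)

# Apply both frame range and resolution to whatever compositions are selected.
set_settings(frames=True, resolution=True)
```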
|
|
|
|||
|
|
@ -8,10 +8,7 @@ from openpype.lib import Logger, register_event_callback
|
|||
from openpype.pipeline import (
|
||||
register_loader_plugin_path,
|
||||
register_creator_plugin_path,
|
||||
deregister_loader_plugin_path,
|
||||
deregister_creator_plugin_path,
|
||||
AVALON_CONTAINER_ID,
|
||||
legacy_io,
|
||||
)
|
||||
from openpype.pipeline.load import any_outdated_containers
|
||||
import openpype.hosts.aftereffects
|
||||
|
|
@ -23,7 +20,8 @@ from openpype.host import (
|
|||
IPublishHost
|
||||
)
|
||||
|
||||
from .launch_logic import get_stub, ConnectionNotEstablishedYet
|
||||
from .launch_logic import get_stub
|
||||
from .ws_stub import ConnectionNotEstablishedYet
|
||||
|
||||
log = Logger.get_logger(__name__)
|
||||
|
||||
|
|
@ -60,9 +58,6 @@ class AfterEffectsHost(HostBase, IWorkfileHost, ILoadHost, IPublishHost):
|
|||
print("Not connected yet, ignoring")
|
||||
return
|
||||
|
||||
if not stub.get_active_document_name():
|
||||
return
|
||||
|
||||
self._stub = stub
|
||||
return self._stub
|
||||
|
||||
|
|
|
|||
|
|
@ -11,6 +11,10 @@ from wsrpc_aiohttp import WebSocketAsync
|
|||
from openpype.tools.adobe_webserver.app import WebServerTool
|
||||
|
||||
|
||||
class ConnectionNotEstablishedYet(Exception):
|
||||
pass
|
||||
|
||||
|
||||
@attr.s
|
||||
class AEItem(object):
|
||||
"""
|
||||
|
|
@ -24,8 +28,8 @@ class AEItem(object):
|
|||
# all imported elements, single for
|
||||
# regular image, array for Backgrounds
|
||||
members = attr.ib(factory=list)
|
||||
workAreaStart = attr.ib(default=None)
|
||||
workAreaDuration = attr.ib(default=None)
|
||||
frameStart = attr.ib(default=None)
|
||||
framesDuration = attr.ib(default=None)
|
||||
frameRate = attr.ib(default=None)
|
||||
file_name = attr.ib(default=None)
|
||||
instance_id = attr.ib(default=None) # New Publisher
|
||||
|
|
@ -355,42 +359,50 @@ class AfterEffectsServerStub():
|
|||
|
||||
return self._handle_return(res)
|
||||
|
||||
def get_work_area(self, item_id):
|
||||
""" Get work are information for render purposes
|
||||
def get_comp_properties(self, comp_id):
|
||||
""" Get composition information for render purposes
|
||||
|
||||
Returns startFrame, frameDuration, fps, width, height.
|
||||
|
||||
Args:
|
||||
item_id (int):
|
||||
comp_id (int):
|
||||
|
||||
Returns:
|
||||
(AEItem)
|
||||
|
||||
"""
|
||||
res = self.websocketserver.call(self.client.call
|
||||
('AfterEffects.get_work_area',
|
||||
item_id=item_id
|
||||
('AfterEffects.get_comp_properties',
|
||||
item_id=comp_id
|
||||
))
|
||||
|
||||
records = self._to_records(self._handle_return(res))
|
||||
if records:
|
||||
return records.pop()
|
||||
|
||||
def set_work_area(self, item, start, duration, frame_rate):
|
||||
def set_comp_properties(self, comp_id, start, duration, frame_rate,
|
||||
width, height):
|
||||
"""
|
||||
Set work area to predefined values (from Ftrack).
|
||||
Work area directs what gets rendered.
|
||||
Beware of rounding, AE expects seconds, not frames directly.
|
||||
|
||||
Args:
|
||||
item (dict):
|
||||
start (float): workAreaStart in seconds
|
||||
duration (float): in seconds
|
||||
comp_id (int):
|
||||
start (int): workAreaStart in frames
|
||||
duration (int): in frames
|
||||
frame_rate (float): frames in seconds
|
||||
width (int): resolution width
|
||||
height (int): resolution height
|
||||
"""
|
||||
res = self.websocketserver.call(self.client.call
|
||||
('AfterEffects.set_work_area',
|
||||
item_id=item.id,
|
||||
('AfterEffects.set_comp_properties',
|
||||
item_id=comp_id,
|
||||
start=start,
|
||||
duration=duration,
|
||||
frame_rate=frame_rate))
|
||||
frame_rate=frame_rate,
|
||||
width=width,
|
||||
height=height))
|
||||
return self._handle_return(res)
|
||||
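
For reference, a minimal call sketch for the renamed stub method above. Values are placeholders, a live WebSocket connection to the After Effects extension is assumed, and the import path is an assumption.

```python
from openpype.hosts.aftereffects.api import get_stub  # import path assumed

stub = get_stub()
comp = stub.get_selected_items(comps=True, folders=False, footages=False)[0]

# Push frame range and resolution onto the selected composition.
stub.set_comp_properties(
    comp.id,
    start=1001,        # first frame (placeholder)
    duration=48,       # length in frames (placeholder)
    frame_rate=24.0,
    width=1920,
    height=1080,
)
```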
|
||||
def save(self):
|
||||
|
|
@ -554,6 +566,12 @@ class AfterEffectsServerStub():
|
|||
|
||||
return self._handle_return(res)
|
||||
|
||||
def print_msg(self, msg):
|
||||
"""Triggers Javascript alert dialog."""
|
||||
self.websocketserver.call(self.client.call
|
||||
('AfterEffects.print_msg',
|
||||
msg=msg))
|
||||
|
||||
def _handle_return(self, res):
|
||||
"""Wraps return, throws ValueError if 'error' key is present."""
|
||||
if res and isinstance(res, str) and res != "undefined":
|
||||
|
|
@ -608,8 +626,8 @@ class AfterEffectsServerStub():
|
|||
d.get('name'),
|
||||
d.get('type'),
|
||||
d.get('members'),
|
||||
d.get('workAreaStart'),
|
||||
d.get('workAreaDuration'),
|
||||
d.get('frameStart'),
|
||||
d.get('framesDuration'),
|
||||
d.get('frameRate'),
|
||||
d.get('file_name'),
|
||||
d.get("instance_id"),
|
||||
|
|
@ -618,3 +636,18 @@ class AfterEffectsServerStub():
|
|||
|
||||
ret.append(item)
|
||||
return ret
|
||||
|
||||
|
||||
def get_stub():
|
||||
"""
|
||||
Convenience function to get server RPC stub to call methods directed
|
||||
for the host (After Effects).
|
||||
It expects already created connection, started from client.
|
||||
Currently created when panel is opened (PS: Window>Extensions>Avalon)
|
||||
:return: <PhotoshopClientStub> where functions could be called from
|
||||
"""
|
||||
ae_stub = AfterEffectsServerStub()
|
||||
if not ae_stub.client:
|
||||
raise ConnectionNotEstablishedYet("Connection is not created yet")
|
||||
|
||||
return ae_stub
|
||||
|
|
|
|||
|
|
@ -9,6 +9,7 @@ from openpype.pipeline import (
|
|||
CreatorError
|
||||
)
|
||||
from openpype.hosts.aftereffects.api.pipeline import cache_and_get_instances
|
||||
from openpype.hosts.aftereffects.api.lib import set_settings
|
||||
from openpype.lib import prepare_template_data
|
||||
from openpype.pipeline.create import SUBSET_NAME_ALLOWED_SYMBOLS
|
||||
|
||||
|
|
@ -32,6 +33,14 @@ class RenderCreator(Creator):
|
|||
|
||||
def create(self, subset_name_from_ui, data, pre_create_data):
|
||||
stub = api.get_stub() # only after After Effects is up
|
||||
|
||||
try:
|
||||
_ = stub.get_active_document_full_name()
|
||||
except ValueError:
|
||||
raise CreatorError(
|
||||
"Please save workfile via Workfile app first!"
|
||||
)
|
||||
|
||||
if pre_create_data.get("use_selection"):
|
||||
comps = stub.get_selected_items(
|
||||
comps=True, folders=False, footages=False
|
||||
|
|
@ -41,8 +50,8 @@ class RenderCreator(Creator):
|
|||
|
||||
if not comps:
|
||||
raise CreatorError(
|
||||
"Nothing to create. Select composition "
|
||||
"if 'useSelection' or create at least "
|
||||
"Nothing to create. Select composition in Project Bin if "
|
||||
"'Use selection' is toggled or create at least "
|
||||
"one composition."
|
||||
)
|
||||
use_composition_name = (pre_create_data.get("use_composition_name") or
|
||||
|
|
@ -87,10 +96,14 @@ class RenderCreator(Creator):
|
|||
self._add_instance_to_context(new_instance)
|
||||
|
||||
stub.rename_item(comp.id, subset_name)
|
||||
set_settings(True, True, [comp.id], print_msg=False)
|
||||
|
||||
def get_pre_create_attr_defs(self):
|
||||
output = [
|
||||
BoolDef("use_selection", default=True, label="Use selection"),
|
||||
BoolDef("use_selection",
|
||||
tooltip="Composition for publishable instance should be "
|
||||
"selected by default.",
|
||||
default=True, label="Use selection"),
|
||||
BoolDef("use_composition_name",
|
||||
label="Use composition name in subset"),
|
||||
UISeparatorDef(),
|
||||
|
|
|
|||
|
|
@ -66,19 +66,19 @@ class CollectAERender(publish.AbstractCollectRender):
|
|||
|
||||
comp_id = int(inst.data["members"][0])
|
||||
|
||||
work_area_info = CollectAERender.get_stub().get_work_area(comp_id)
|
||||
comp_info = CollectAERender.get_stub().get_comp_properties(
|
||||
comp_id)
|
||||
|
||||
if not work_area_info:
|
||||
if not comp_info:
|
||||
self.log.warning("Orphaned instance, deleting metadata")
|
||||
inst_id = inst.get("instance_id") or str(comp_id)
|
||||
inst_id = inst.data.get("instance_id") or str(comp_id)
|
||||
CollectAERender.get_stub().remove_instance(inst_id)
|
||||
continue
|
||||
|
||||
frame_start = work_area_info.workAreaStart
|
||||
frame_end = round(work_area_info.workAreaStart +
|
||||
float(work_area_info.workAreaDuration) *
|
||||
float(work_area_info.frameRate)) - 1
|
||||
fps = work_area_info.frameRate
|
||||
frame_start = comp_info.frameStart
|
||||
frame_end = round(comp_info.frameStart +
|
||||
comp_info.framesDuration) - 1
|
||||
fps = comp_info.frameRate
|
||||
# TODO add resolution when supported by extension
|
||||
|
||||
task_name = inst.data.get("task") # legacy
|
||||
|
|
|
|||
|
|
@ -65,37 +65,19 @@ class CacheModelLoader(plugin.AssetLoader):
|
|||
|
||||
imported = lib.get_selection()
|
||||
|
||||
empties = [obj for obj in imported if obj.type == 'EMPTY']
|
||||
|
||||
container = None
|
||||
|
||||
for empty in empties:
|
||||
if not empty.parent:
|
||||
container = empty
|
||||
break
|
||||
|
||||
assert container, "No asset group found"
|
||||
|
||||
# Children must be linked before parents,
|
||||
# otherwise the hierarchy will break
|
||||
objects = []
|
||||
nodes = list(container.children)
|
||||
|
||||
for obj in nodes:
|
||||
for obj in imported:
|
||||
obj.parent = asset_group
|
||||
|
||||
bpy.data.objects.remove(container)
|
||||
|
||||
for obj in nodes:
|
||||
for obj in imported:
|
||||
objects.append(obj)
|
||||
nodes.extend(list(obj.children))
|
||||
imported.extend(list(obj.children))
|
||||
|
||||
objects.reverse()
|
||||
|
||||
for obj in objects:
|
||||
parent.objects.link(obj)
|
||||
collection.objects.unlink(obj)
|
||||
|
||||
for obj in objects:
|
||||
name = obj.name
|
||||
obj.name = f"{group_name}:{name}"
|
||||
|
|
@ -138,13 +120,14 @@ class CacheModelLoader(plugin.AssetLoader):
|
|||
group_name = plugin.asset_name(asset, subset, unique_number)
|
||||
namespace = namespace or f"{asset}_{unique_number}"
|
||||
|
||||
avalon_container = bpy.data.collections.get(AVALON_CONTAINERS)
|
||||
if not avalon_container:
|
||||
avalon_container = bpy.data.collections.new(name=AVALON_CONTAINERS)
|
||||
bpy.context.scene.collection.children.link(avalon_container)
|
||||
avalon_containers = bpy.data.collections.get(AVALON_CONTAINERS)
|
||||
if not avalon_containers:
|
||||
avalon_containers = bpy.data.collections.new(
|
||||
name=AVALON_CONTAINERS)
|
||||
bpy.context.scene.collection.children.link(avalon_containers)
|
||||
|
||||
asset_group = bpy.data.objects.new(group_name, object_data=None)
|
||||
avalon_container.objects.link(asset_group)
|
||||
avalon_containers.objects.link(asset_group)
|
||||
|
||||
objects = self._process(libpath, asset_group, group_name)
|
||||
|
||||
|
|
|
|||
|
|
@ -13,6 +13,7 @@ from .lib import (
|
|||
update_frame_range,
|
||||
set_asset_framerange,
|
||||
get_current_comp,
|
||||
get_bmd_library,
|
||||
comp_lock_and_undo_chunk
|
||||
)
|
||||
|
||||
|
|
|
|||
|
|
@ -256,8 +256,11 @@ def switch_item(container,
|
|||
|
||||
|
||||
@contextlib.contextmanager
|
||||
def maintained_selection():
|
||||
comp = get_current_comp()
|
||||
def maintained_selection(comp=None):
|
||||
"""Reset comp selection from before the context after the context"""
|
||||
if comp is None:
|
||||
comp = get_current_comp()
|
||||
|
||||
previous_selection = comp.GetToolList(True).values()
|
||||
try:
|
||||
yield
|
||||
|
|
@ -269,6 +272,33 @@ def maintained_selection():
|
|||
flow.Select(tool, True)
|
||||
|
||||
|
||||
@contextlib.contextmanager
|
||||
def maintained_comp_range(comp=None,
|
||||
global_start=True,
|
||||
global_end=True,
|
||||
render_start=True,
|
||||
render_end=True):
|
||||
"""Reset comp frame ranges from before the context after the context"""
|
||||
if comp is None:
|
||||
comp = get_current_comp()
|
||||
|
||||
comp_attrs = comp.GetAttrs()
|
||||
preserve_attrs = {}
|
||||
if global_start:
|
||||
preserve_attrs["COMPN_GlobalStart"] = comp_attrs["COMPN_GlobalStart"]
|
||||
if global_end:
|
||||
preserve_attrs["COMPN_GlobalEnd"] = comp_attrs["COMPN_GlobalEnd"]
|
||||
if render_start:
|
||||
preserve_attrs["COMPN_RenderStart"] = comp_attrs["COMPN_RenderStart"]
|
||||
if render_end:
|
||||
preserve_attrs["COMPN_RenderEnd"] = comp_attrs["COMPN_RenderEnd"]
|
||||
|
||||
try:
|
||||
yield
|
||||
finally:
|
||||
comp.SetAttrs(preserve_attrs)
|
||||
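
As a quick usage sketch of the `maintained_comp_range` context manager added above (the render-range values are placeholders and the import path is assumed):

```python
from openpype.hosts.fusion.api.lib import (  # import path assumed
    get_current_comp,
    maintained_comp_range,
)

comp = get_current_comp()

# Temporarily narrow the render range for a local test render; the original
# global/render ranges are restored when the context exits.
with maintained_comp_range(comp):
    comp.SetAttrs({"COMPN_RenderStart": 1001, "COMPN_RenderEnd": 1010})
    comp.Render()  # assumed Fusion API call on the composition
```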
|
||||
|
||||
def get_frame_path(path):
|
||||
"""Get filename for the Fusion Saver with padded number as '#'
|
||||
|
||||
|
|
@ -309,6 +339,12 @@ def get_fusion_module():
|
|||
return fusion
|
||||
|
||||
|
||||
def get_bmd_library():
|
||||
"""Get bmd library"""
|
||||
bmd = getattr(sys.modules["__main__"], "bmd", None)
|
||||
return bmd
|
||||
|
||||
|
||||
def get_current_comp():
|
||||
"""Get current comp in this session"""
|
||||
fusion = get_fusion_module()
|
||||
|
|
|
|||
|
|
@ -1,3 +1,4 @@
|
|||
from copy import deepcopy
|
||||
import os
|
||||
|
||||
from openpype.hosts.fusion.api import (
|
||||
|
|
@ -11,15 +12,13 @@ from openpype.lib import (
|
|||
)
|
||||
from openpype.pipeline import (
|
||||
legacy_io,
|
||||
Creator,
|
||||
Creator as NewCreator,
|
||||
CreatedInstance,
|
||||
)
|
||||
from openpype.client import (
|
||||
get_asset_by_name,
|
||||
Anatomy
|
||||
)
|
||||
|
||||
|
||||
class CreateSaver(Creator):
|
||||
class CreateSaver(NewCreator):
|
||||
identifier = "io.openpype.creators.fusion.saver"
|
||||
label = "Render (saver)"
|
||||
name = "render"
|
||||
|
|
@ -28,9 +27,29 @@ class CreateSaver(Creator):
|
|||
description = "Fusion Saver to generate image sequence"
|
||||
icon = "fa5.eye"
|
||||
|
||||
instance_attributes = ["reviewable"]
|
||||
instance_attributes = [
|
||||
"reviewable"
|
||||
]
|
||||
default_variants = [
|
||||
"Main",
|
||||
"Mask"
|
||||
]
|
||||
|
||||
# TODO: This should be renamed together with Nuke so it is aligned
|
||||
temp_rendering_path_template = (
|
||||
"{workdir}/renders/fusion/{subset}/{subset}.{frame}.{ext}")
|
||||
|
||||
def create(self, subset_name, instance_data, pre_create_data):
|
||||
self.pass_pre_attributes_to_instance(
|
||||
instance_data,
|
||||
pre_create_data
|
||||
)
|
||||
|
||||
instance_data.update({
|
||||
"id": "pyblish.avalon.instance",
|
||||
"subset": subset_name
|
||||
})
|
||||
|
||||
# TODO: Add pre_create attributes to choose file format?
|
||||
file_format = "OpenEXRFormat"
|
||||
|
||||
|
|
@ -39,7 +58,6 @@ class CreateSaver(Creator):
|
|||
args = (-32768, -32768) # Magical position numbers
|
||||
saver = comp.AddTool("Saver", *args)
|
||||
|
||||
instance_data["subset"] = subset_name
|
||||
self._update_tool_with_data(saver, data=instance_data)
|
||||
|
||||
saver["OutputFormat"] = file_format
|
||||
|
|
@ -78,7 +96,7 @@ class CreateSaver(Creator):
|
|||
for tool in tools:
|
||||
data = self.get_managed_tool_data(tool)
|
||||
if not data:
|
||||
data = self._collect_unmanaged_saver(tool)
|
||||
continue
|
||||
|
||||
# Add instance
|
||||
created_instance = CreatedInstance.from_existing(data, self)
|
||||
|
|
@ -125,60 +143,35 @@ class CreateSaver(Creator):
|
|||
original_subset = tool.GetData("openpype.subset")
|
||||
subset = data["subset"]
|
||||
if original_subset != subset:
|
||||
# Subset change detected
|
||||
# Update output filepath
|
||||
workdir = os.path.normpath(legacy_io.Session["AVALON_WORKDIR"])
|
||||
filename = f"{subset}..exr"
|
||||
filepath = os.path.join(workdir, "render", subset, filename)
|
||||
tool["Clip"] = filepath
|
||||
self._configure_saver_tool(data, tool, subset)
|
||||
|
||||
# Rename tool
|
||||
if tool.Name != subset:
|
||||
print(f"Renaming {tool.Name} -> {subset}")
|
||||
tool.SetAttrs({"TOOLS_Name": subset})
|
||||
def _configure_saver_tool(self, data, tool, subset):
|
||||
formatting_data = deepcopy(data)
|
||||
|
||||
def _collect_unmanaged_saver(self, tool):
|
||||
# TODO: this should not be done this way - this should actually
|
||||
# get the data as stored on the tool explicitly (however)
|
||||
# that would disallow any 'regular saver' to be collected
|
||||
# unless the instance data is stored on it to begin with
|
||||
|
||||
print("Collecting unmanaged saver..")
|
||||
comp = tool.Comp()
|
||||
|
||||
# Allow regular non-managed savers to also be picked up
|
||||
project = legacy_io.Session["AVALON_PROJECT"]
|
||||
asset = legacy_io.Session["AVALON_ASSET"]
|
||||
task = legacy_io.Session["AVALON_TASK"]
|
||||
|
||||
asset_doc = get_asset_by_name(project_name=project, asset_name=asset)
|
||||
|
||||
path = tool["Clip"][comp.TIME_UNDEFINED]
|
||||
fname = os.path.basename(path)
|
||||
fname, _ext = os.path.splitext(fname)
|
||||
variant = fname.rstrip(".")
|
||||
subset = self.get_subset_name(
|
||||
variant=variant,
|
||||
task_name=task,
|
||||
asset_doc=asset_doc,
|
||||
project_name=project,
|
||||
# get frame padding from anatomy templates
|
||||
anatomy = Anatomy()
|
||||
frame_padding = int(
|
||||
anatomy.templates["render"].get("frame_padding", 4)
|
||||
)
|
||||
|
||||
attrs = tool.GetAttrs()
|
||||
passthrough = attrs["TOOLB_PassThrough"]
|
||||
return {
|
||||
# Required data
|
||||
"project": project,
|
||||
"asset": asset,
|
||||
"subset": subset,
|
||||
"task": task,
|
||||
"variant": variant,
|
||||
"active": not passthrough,
|
||||
"family": self.family,
|
||||
# Unique identifier for instance and this creator
|
||||
"id": "pyblish.avalon.instance",
|
||||
"creator_identifier": self.identifier,
|
||||
}
|
||||
# Subset change detected
|
||||
workdir = os.path.normpath(legacy_io.Session["AVALON_WORKDIR"])
|
||||
formatting_data.update({
|
||||
"workdir": workdir,
|
||||
"frame": "0" * frame_padding,
|
||||
"ext": "exr"
|
||||
})
|
||||
|
||||
# build file path to render
|
||||
filepath = self.temp_rendering_path_template.format(
|
||||
**formatting_data)
|
||||
|
||||
tool["Clip"] = os.path.normpath(filepath)
|
||||
|
||||
# Rename tool
|
||||
if tool.Name != subset:
|
||||
print(f"Renaming {tool.Name} -> {subset}")
|
||||
tool.SetAttrs({"TOOLS_Name": subset})
|
||||
|
||||
def get_managed_tool_data(self, tool):
|
||||
"""Return data of the tool if it matches creator identifier"""
|
||||
|
|
@ -206,20 +199,25 @@ class CreateSaver(Creator):
|
|||
attr_defs = [
|
||||
self._get_render_target_enum(),
|
||||
self._get_reviewable_bool(),
|
||||
self._get_frame_range_enum()
|
||||
]
|
||||
return attr_defs
|
||||
|
||||
def get_instance_attr_defs(self):
|
||||
"""Settings for publish page"""
|
||||
attr_defs = [
|
||||
self._get_render_target_enum(),
|
||||
self._get_reviewable_bool(),
|
||||
]
|
||||
return attr_defs
|
||||
return self.get_pre_create_attr_defs()
|
||||
|
||||
def pass_pre_attributes_to_instance(
|
||||
self,
|
||||
instance_data,
|
||||
pre_create_data
|
||||
):
|
||||
creator_attrs = instance_data["creator_attributes"] = {}
|
||||
for pass_key in pre_create_data.keys():
|
||||
creator_attrs[pass_key] = pre_create_data[pass_key]
|
||||
|
||||
# These functions below should be moved to another file
|
||||
# so it can be used by other plugins. plugin.py ?
|
||||
|
||||
def _get_render_target_enum(self):
|
||||
rendering_targets = {
|
||||
"local": "Local machine rendering",
|
||||
|
|
@ -232,9 +230,44 @@ class CreateSaver(Creator):
|
|||
"render_target", items=rendering_targets, label="Render target"
|
||||
)
|
||||
|
||||
def _get_frame_range_enum(self):
|
||||
frame_range_options = {
|
||||
"asset_db": "Current asset context",
|
||||
"render_range": "From render in/out",
|
||||
"comp_range": "From composition timeline"
|
||||
}
|
||||
|
||||
return EnumDef(
|
||||
"frame_range_source",
|
||||
items=frame_range_options,
|
||||
label="Frame range source"
|
||||
)
|
||||
|
||||
def _get_reviewable_bool(self):
|
||||
return BoolDef(
|
||||
"review",
|
||||
default=("reviewable" in self.instance_attributes),
|
||||
label="Review",
|
||||
)
|
||||
|
||||
def apply_settings(
|
||||
self,
|
||||
project_settings,
|
||||
system_settings
|
||||
):
|
||||
"""Method called on initialization of plugin to apply settings."""
|
||||
|
||||
# plugin settings
|
||||
plugin_settings = (
|
||||
project_settings["fusion"]["create"][self.__class__.__name__]
|
||||
)
|
||||
|
||||
# individual attributes
|
||||
self.instance_attributes = plugin_settings.get(
|
||||
"instance_attributes") or self.instance_attributes
|
||||
self.default_variants = plugin_settings.get(
|
||||
"default_variants") or self.default_variants
|
||||
self.temp_rendering_path_template = (
|
||||
plugin_settings.get("temp_rendering_path_template")
|
||||
or self.temp_rendering_path_template
|
||||
)
|
||||
|
|
|
|||
|
|
@ -1,4 +1,3 @@
|
|||
|
||||
from openpype.pipeline import (
|
||||
load,
|
||||
get_representation_path,
|
||||
|
|
@ -6,7 +5,7 @@ from openpype.pipeline import (
|
|||
from openpype.hosts.fusion.api import (
|
||||
imprint_container,
|
||||
get_current_comp,
|
||||
comp_lock_and_undo_chunk
|
||||
comp_lock_and_undo_chunk,
|
||||
)
|
||||
|
||||
|
||||
|
|
@ -15,7 +14,21 @@ class FusionLoadFBXMesh(load.LoaderPlugin):
|
|||
|
||||
families = ["*"]
|
||||
representations = ["*"]
|
||||
extensions = {"fbx"}
|
||||
extensions = {
|
||||
"3ds",
|
||||
"amc",
|
||||
"aoa",
|
||||
"asf",
|
||||
"bvh",
|
||||
"c3d",
|
||||
"dae",
|
||||
"dxf",
|
||||
"fbx",
|
||||
"htr",
|
||||
"mcd",
|
||||
"obj",
|
||||
"trc",
|
||||
}
|
||||
|
||||
label = "Load FBX mesh"
|
||||
order = -10
|
||||
|
|
@ -27,23 +40,24 @@ class FusionLoadFBXMesh(load.LoaderPlugin):
|
|||
def load(self, context, name, namespace, data):
|
||||
# Fallback to asset name when namespace is None
|
||||
if namespace is None:
|
||||
namespace = context['asset']['name']
|
||||
namespace = context["asset"]["name"]
|
||||
|
||||
# Create the Loader with the filename path set
|
||||
comp = get_current_comp()
|
||||
with comp_lock_and_undo_chunk(comp, "Create tool"):
|
||||
|
||||
path = self.fname
|
||||
|
||||
args = (-32768, -32768)
|
||||
tool = comp.AddTool(self.tool_type, *args)
|
||||
tool["ImportFile"] = path
|
||||
|
||||
imprint_container(tool,
|
||||
name=name,
|
||||
namespace=namespace,
|
||||
context=context,
|
||||
loader=self.__class__.__name__)
|
||||
imprint_container(
|
||||
tool,
|
||||
name=name,
|
||||
namespace=namespace,
|
||||
context=context,
|
||||
loader=self.__class__.__name__,
|
||||
)
|
||||
|
||||
def switch(self, container, representation):
|
||||
self.update(container, representation)
|
||||
|
|
|
|||
|
|
@ -3,17 +3,14 @@ import contextlib
|
|||
import openpype.pipeline.load as load
|
||||
from openpype.pipeline.load import (
|
||||
get_representation_context,
|
||||
get_representation_path_from_context
|
||||
get_representation_path_from_context,
|
||||
)
|
||||
from openpype.hosts.fusion.api import (
|
||||
imprint_container,
|
||||
get_current_comp,
|
||||
comp_lock_and_undo_chunk
|
||||
)
|
||||
from openpype.lib.transcoding import (
|
||||
IMAGE_EXTENSIONS,
|
||||
VIDEO_EXTENSIONS
|
||||
comp_lock_and_undo_chunk,
|
||||
)
|
||||
from openpype.lib.transcoding import IMAGE_EXTENSIONS, VIDEO_EXTENSIONS
|
||||
|
||||
comp = get_current_comp()
|
||||
|
||||
|
|
@ -57,20 +54,23 @@ def preserve_trim(loader, log=None):
|
|||
try:
|
||||
yield
|
||||
finally:
|
||||
|
||||
length = loader.GetAttrs()["TOOLIT_Clip_Length"][1] - 1
|
||||
if trim_from_start > length:
|
||||
trim_from_start = length
|
||||
if log:
|
||||
log.warning("Reducing trim in to %d "
|
||||
"(because of less frames)" % trim_from_start)
|
||||
log.warning(
|
||||
"Reducing trim in to %d "
|
||||
"(because of less frames)" % trim_from_start
|
||||
)
|
||||
|
||||
remainder = length - trim_from_start
|
||||
if trim_from_end > remainder:
|
||||
trim_from_end = remainder
|
||||
if log:
|
||||
log.warning("Reducing trim in to %d "
|
||||
"(because of less frames)" % trim_from_end)
|
||||
log.warning(
|
||||
"Reducing trim in to %d "
|
||||
"(because of less frames)" % trim_from_end
|
||||
)
|
||||
|
||||
loader["ClipTimeStart"][time] = trim_from_start
|
||||
loader["ClipTimeEnd"][time] = length - trim_from_end
|
||||
|
|
@ -109,11 +109,15 @@ def loader_shift(loader, frame, relative=True):
|
|||
# Shifting global in will try to automatically compensate for the change
|
||||
# in the "ClipTimeStart" and "HoldFirstFrame" inputs, so we preserve those
|
||||
# input values to "just shift" the clip
|
||||
with preserve_inputs(loader, inputs=["ClipTimeStart",
|
||||
"ClipTimeEnd",
|
||||
"HoldFirstFrame",
|
||||
"HoldLastFrame"]):
|
||||
|
||||
with preserve_inputs(
|
||||
loader,
|
||||
inputs=[
|
||||
"ClipTimeStart",
|
||||
"ClipTimeEnd",
|
||||
"HoldFirstFrame",
|
||||
"HoldLastFrame",
|
||||
],
|
||||
):
|
||||
# GlobalIn cannot be set past GlobalOut or vice versa
|
||||
# so we must apply them in the order of the shift.
|
||||
if shift > 0:
|
||||
|
|
@ -129,7 +133,14 @@ def loader_shift(loader, frame, relative=True):
|
|||
class FusionLoadSequence(load.LoaderPlugin):
|
||||
"""Load image sequence into Fusion"""
|
||||
|
||||
families = ["imagesequence", "review", "render", "plate"]
|
||||
families = [
|
||||
"imagesequence",
|
||||
"review",
|
||||
"render",
|
||||
"plate",
|
||||
"image",
|
||||
"onilne",
|
||||
]
|
||||
representations = ["*"]
|
||||
extensions = set(
|
||||
ext.lstrip(".") for ext in IMAGE_EXTENSIONS.union(VIDEO_EXTENSIONS)
|
||||
|
|
@ -143,7 +154,7 @@ class FusionLoadSequence(load.LoaderPlugin):
|
|||
def load(self, context, name, namespace, data):
|
||||
# Fallback to asset name when namespace is None
|
||||
if namespace is None:
|
||||
namespace = context['asset']['name']
|
||||
namespace = context["asset"]["name"]
|
||||
|
||||
# Use the first file for now
|
||||
path = get_representation_path_from_context(context)
|
||||
|
|
@ -151,7 +162,6 @@ class FusionLoadSequence(load.LoaderPlugin):
|
|||
# Create the Loader with the filename path set
|
||||
comp = get_current_comp()
|
||||
with comp_lock_and_undo_chunk(comp, "Create Loader"):
|
||||
|
||||
args = (-32768, -32768)
|
||||
tool = comp.AddTool("Loader", *args)
|
||||
tool["Clip"] = path
|
||||
|
|
@ -160,11 +170,13 @@ class FusionLoadSequence(load.LoaderPlugin):
|
|||
start = self._get_start(context["version"], tool)
|
||||
loader_shift(tool, start, relative=False)
|
||||
|
||||
imprint_container(tool,
|
||||
name=name,
|
||||
namespace=namespace,
|
||||
context=context,
|
||||
loader=self.__class__.__name__)
|
||||
imprint_container(
|
||||
tool,
|
||||
name=name,
|
||||
namespace=namespace,
|
||||
context=context,
|
||||
loader=self.__class__.__name__,
|
||||
)
|
||||
|
||||
def switch(self, container, representation):
|
||||
self.update(container, representation)
|
||||
|
|
@ -222,24 +234,28 @@ class FusionLoadSequence(load.LoaderPlugin):
|
|||
start = self._get_start(context["version"], tool)
|
||||
|
||||
with comp_lock_and_undo_chunk(comp, "Update Loader"):
|
||||
|
||||
# Update the loader's path whilst preserving some values
|
||||
with preserve_trim(tool, log=self.log):
|
||||
with preserve_inputs(tool,
|
||||
inputs=("HoldFirstFrame",
|
||||
"HoldLastFrame",
|
||||
"Reverse",
|
||||
"Depth",
|
||||
"KeyCode",
|
||||
"TimeCodeOffset")):
|
||||
with preserve_inputs(
|
||||
tool,
|
||||
inputs=(
|
||||
"HoldFirstFrame",
|
||||
"HoldLastFrame",
|
||||
"Reverse",
|
||||
"Depth",
|
||||
"KeyCode",
|
||||
"TimeCodeOffset",
|
||||
),
|
||||
):
|
||||
tool["Clip"] = path
|
||||
|
||||
# Set the global in to the start frame of the sequence
|
||||
global_in_changed = loader_shift(tool, start, relative=False)
|
||||
if global_in_changed:
|
||||
# Log this change to the user
|
||||
self.log.debug("Changed '%s' global in: %d" % (tool.Name,
|
||||
start))
|
||||
self.log.debug(
|
||||
"Changed '%s' global in: %d" % (tool.Name, start)
|
||||
)
|
||||
|
||||
# Update the imprinted representation
|
||||
tool.SetData("avalon.representation", str(representation["_id"]))
|
||||
|
|
@ -264,9 +280,11 @@ class FusionLoadSequence(load.LoaderPlugin):
|
|||
# Get frame start without handles
|
||||
start = data.get("frameStart")
|
||||
if start is None:
|
||||
self.log.warning("Missing start frame for version "
|
||||
"assuming starts at frame 0 for: "
|
||||
"{}".format(tool.Name))
|
||||
self.log.warning(
|
||||
"Missing start frame for version "
|
||||
"assuming starts at frame 0 for: "
|
||||
"{}".format(tool.Name)
|
||||
)
|
||||
return 0
|
||||
|
||||
# Use `handleStart` if the data is available
|
||||
|
|
|
|||
32
openpype/hosts/fusion/plugins/load/load_workfile.py
Normal file
32
openpype/hosts/fusion/plugins/load/load_workfile.py
Normal file
|
|
@ -0,0 +1,32 @@
|
|||
"""Import workfiles into your current comp.
|
||||
As all imported nodes are free floating and will probably be changed, there
|
||||
is no update or reload function added for this plugin
|
||||
"""
|
||||
|
||||
from openpype.pipeline import load
|
||||
|
||||
from openpype.hosts.fusion.api import (
|
||||
get_current_comp,
|
||||
get_bmd_library,
|
||||
)
|
||||
|
||||
|
||||
class FusionLoadWorkfile(load.LoaderPlugin):
|
||||
"""Load the content of a workfile into Fusion"""
|
||||
|
||||
families = ["workfile"]
|
||||
representations = ["*"]
|
||||
extensions = {"comp"}
|
||||
|
||||
label = "Load Workfile"
|
||||
order = -10
|
||||
icon = "code-fork"
|
||||
color = "orange"
|
||||
|
||||
def load(self, context, name, namespace, data):
|
||||
# Get needed elements
|
||||
bmd = get_bmd_library()
|
||||
comp = get_current_comp()
|
||||
|
||||
# Paste the content of the file into the current comp
|
||||
comp.Paste(bmd.readfile(self.fname))
|
||||
|
|
@ -35,9 +35,10 @@ class CollectFusionCompFrameRanges(pyblish.api.ContextPlugin):
|
|||
|
||||
# Store comp render ranges
|
||||
start, end, global_start, global_end = get_comp_render_range(comp)
|
||||
context.data["frameStart"] = int(start)
|
||||
context.data["frameEnd"] = int(end)
|
||||
context.data["frameStartHandle"] = int(global_start)
|
||||
context.data["frameEndHandle"] = int(global_end)
|
||||
context.data["handleStart"] = int(start) - int(global_start)
|
||||
context.data["handleEnd"] = int(global_end) - int(end)
|
||||
|
||||
context.data.update({
|
||||
"renderFrameStart": int(start),
|
||||
"renderFrameEnd": int(end),
|
||||
"compFrameStart": int(global_start),
|
||||
"compFrameEnd": int(global_end)
|
||||
})
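A quick worked example of the handle math above (numbers are illustrative): with a comp render range of 1001-1100 and a global range of 990-1110, the collector ends up storing:

    # get_comp_render_range(comp) -> (1001, 1100, 990, 1110)
    context.data["frameStart"] = 1001          # render start
    context.data["frameEnd"] = 1100            # render end
    context.data["frameStartHandle"] = 990     # comp global start
    context.data["frameEndHandle"] = 1110      # comp global end
    context.data["handleStart"] = 1001 - 990   # 11
    context.data["handleEnd"] = 1110 - 1100    # 10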
|
||||
|
|
|
|||
|
|
@ -1,50 +0,0 @@
|
|||
import pyblish.api
|
||||
from openpype.pipeline import publish
|
||||
import os
|
||||
|
||||
|
||||
class CollectFusionExpectedFrames(
|
||||
pyblish.api.InstancePlugin, publish.ColormanagedPyblishPluginMixin
|
||||
):
|
||||
"""Collect all frames needed to publish expected frames"""
|
||||
|
||||
order = pyblish.api.CollectorOrder + 0.5
|
||||
label = "Collect Expected Frames"
|
||||
hosts = ["fusion"]
|
||||
families = ["render"]
|
||||
|
||||
def process(self, instance):
|
||||
context = instance.context
|
||||
|
||||
frame_start = context.data["frameStartHandle"]
|
||||
frame_end = context.data["frameEndHandle"]
|
||||
path = instance.data["path"]
|
||||
output_dir = instance.data["outputDir"]
|
||||
|
||||
basename = os.path.basename(path)
|
||||
head, ext = os.path.splitext(basename)
|
||||
files = [
|
||||
f"{head}{str(frame).zfill(4)}{ext}"
|
||||
for frame in range(frame_start, frame_end + 1)
|
||||
]
|
||||
repre = {
|
||||
"name": ext[1:],
|
||||
"ext": ext[1:],
|
||||
"frameStart": f"%0{len(str(frame_end))}d" % frame_start,
|
||||
"files": files,
|
||||
"stagingDir": output_dir,
|
||||
}
|
||||
|
||||
self.set_representation_colorspace(
|
||||
representation=repre,
|
||||
context=context,
|
||||
)
|
||||
|
||||
# review representation
|
||||
if instance.data.get("review", False):
|
||||
repre["tags"] = ["review"]
|
||||
|
||||
# add the repre to the instance
|
||||
if "representations" not in instance.data:
|
||||
instance.data["representations"] = []
|
||||
instance.data["representations"].append(repre)
|
||||
|
|
@ -1,22 +0,0 @@
|
|||
import pyblish.api
|
||||
|
||||
|
||||
class CollectFusionVersion(pyblish.api.ContextPlugin):
|
||||
"""Collect current comp"""
|
||||
|
||||
order = pyblish.api.CollectorOrder
|
||||
label = "Collect Fusion Version"
|
||||
hosts = ["fusion"]
|
||||
|
||||
def process(self, context):
|
||||
"""Collect all image sequence tools"""
|
||||
|
||||
comp = context.data.get("currentComp")
|
||||
if not comp:
|
||||
raise RuntimeError("No comp previously collected, unable to "
|
||||
"retrieve Fusion version.")
|
||||
|
||||
version = comp.GetApp().Version
|
||||
context.data["fusionVersion"] = version
|
||||
|
||||
self.log.info("Fusion version: %s" % version)
|
||||
|
|
@ -1,5 +1,3 @@
|
|||
import os
|
||||
|
||||
import pyblish.api
|
||||
|
||||
|
||||
|
|
@ -24,23 +22,63 @@ class CollectInstanceData(pyblish.api.InstancePlugin):
|
|||
creator_attributes = instance.data["creator_attributes"]
|
||||
instance.data.update(creator_attributes)
|
||||
|
||||
# Include start and end render frame in label
|
||||
subset = instance.data["subset"]
|
||||
frame_range_source = creator_attributes.get("frame_range_source")
|
||||
instance.data["frame_range_source"] = frame_range_source
|
||||
|
||||
# get asset frame ranges to all instances
|
||||
# render family instances `asset_db` render target
|
||||
start = context.data["frameStart"]
|
||||
end = context.data["frameEnd"]
|
||||
label = "{subset} ({start}-{end})".format(subset=subset,
|
||||
start=int(start),
|
||||
end=int(end))
|
||||
handle_start = context.data["handleStart"]
|
||||
handle_end = context.data["handleEnd"]
|
||||
start_with_handle = start - handle_start
|
||||
end_with_handle = end + handle_end
|
||||
|
||||
# conditions for render family instances
|
||||
if frame_range_source == "render_range":
|
||||
# set comp render frame ranges
|
||||
start = context.data["renderFrameStart"]
|
||||
end = context.data["renderFrameEnd"]
|
||||
handle_start = 0
|
||||
handle_end = 0
|
||||
start_with_handle = start
|
||||
end_with_handle = end
|
||||
|
||||
if frame_range_source == "comp_range":
|
||||
comp_start = context.data["compFrameStart"]
|
||||
comp_end = context.data["compFrameEnd"]
|
||||
render_start = context.data["renderFrameStart"]
|
||||
render_end = context.data["renderFrameEnd"]
|
||||
# set comp frame ranges
|
||||
start = render_start
|
||||
end = render_end
|
||||
handle_start = render_start - comp_start
|
||||
handle_end = comp_end - render_end
|
||||
start_with_handle = comp_start
|
||||
end_with_handle = comp_end
|
||||
|
||||
# Include start and end render frame in label
|
||||
subset = instance.data["subset"]
|
||||
label = (
|
||||
"{subset} ({start}-{end}) [{handle_start}-{handle_end}]"
|
||||
).format(
|
||||
subset=subset,
|
||||
start=int(start),
|
||||
end=int(end),
|
||||
handle_start=int(handle_start),
|
||||
handle_end=int(handle_end)
|
||||
)
|
||||
|
||||
instance.data.update({
|
||||
"label": label,
|
||||
|
||||
# todo: Allow custom frame range per instance
|
||||
"frameStart": context.data["frameStart"],
|
||||
"frameEnd": context.data["frameEnd"],
|
||||
"frameStartHandle": context.data["frameStartHandle"],
|
||||
"frameEndHandle": context.data["frameStartHandle"],
|
||||
"handleStart": context.data["handleStart"],
|
||||
"handleEnd": context.data["handleEnd"],
|
||||
"frameStart": start,
|
||||
"frameEnd": end,
|
||||
"frameStartHandle": start_with_handle,
|
||||
"frameEndHandle": end_with_handle,
|
||||
"handleStart": handle_start,
|
||||
"handleEnd": handle_end,
|
||||
"fps": context.data["fps"],
|
||||
})
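With the same style of illustrative numbers, the three frame range sources above resolve roughly as follows (asset range 1001-1100, comp timeline 990-1110, comp render range 1005-1050):

    # "asset_db"     -> frameStart/frameEnd 1001-1100, handles from the context
    # "render_range" -> frameStart/frameEnd 1005-1050, handleStart = handleEnd = 0
    # "comp_range"   -> frameStart/frameEnd 1005-1050,
    #                   handleStart = 1005 - 990 = 15,
    #                   handleEnd   = 1110 - 1050 = 60,
    #                   frameStartHandle/frameEndHandle = 990/1110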
|
||||
|
||||
|
|
@ -49,31 +87,3 @@ class CollectInstanceData(pyblish.api.InstancePlugin):
|
|||
if instance.data.get("review", False):
|
||||
self.log.info("Adding review family..")
|
||||
instance.data["families"].append("review")
|
||||
|
||||
if instance.data["family"] == "render":
|
||||
# TODO: This should probably move into a collector of
|
||||
# its own for the "render" family
|
||||
from openpype.hosts.fusion.api.lib import get_frame_path
|
||||
comp = context.data["currentComp"]
|
||||
|
||||
# This is only the case for savers currently but not
|
||||
# for workfile instances. So we assume saver here.
|
||||
tool = instance.data["transientData"]["tool"]
|
||||
path = tool["Clip"][comp.TIME_UNDEFINED]
|
||||
|
||||
filename = os.path.basename(path)
|
||||
head, padding, tail = get_frame_path(filename)
|
||||
ext = os.path.splitext(path)[1]
|
||||
assert tail == ext, ("Tail does not match %s" % ext)
|
||||
|
||||
instance.data.update({
|
||||
"path": path,
|
||||
"outputDir": os.path.dirname(path),
|
||||
"ext": ext, # todo: should be redundant?
|
||||
|
||||
# Backwards compatibility: embed tool in instance.data
|
||||
"tool": tool
|
||||
})
|
||||
|
||||
# Add tool itself as member
|
||||
instance.append(tool)
|
||||
|
|
|
|||
209
openpype/hosts/fusion/plugins/publish/collect_render.py
Normal file
209
openpype/hosts/fusion/plugins/publish/collect_render.py
Normal file
|
|
@ -0,0 +1,209 @@
|
|||
import os
|
||||
import attr
|
||||
import pyblish.api
|
||||
|
||||
from openpype.pipeline import publish
|
||||
from openpype.pipeline.publish import RenderInstance
|
||||
from openpype.hosts.fusion.api.lib import get_frame_path
|
||||
|
||||
|
||||
@attr.s
|
||||
class FusionRenderInstance(RenderInstance):
|
||||
# extend generic, composition name is needed
|
||||
fps = attr.ib(default=None)
|
||||
projectEntity = attr.ib(default=None)
|
||||
stagingDir = attr.ib(default=None)
|
||||
app_version = attr.ib(default=None)
|
||||
tool = attr.ib(default=None)
|
||||
workfileComp = attr.ib(default=None)
|
||||
publish_attributes = attr.ib(default={})
|
||||
frameStartHandle = attr.ib(default=None)
|
||||
frameEndHandle = attr.ib(default=None)
|
||||
|
||||
|
||||
class CollectFusionRender(
|
||||
publish.AbstractCollectRender,
|
||||
publish.ColormanagedPyblishPluginMixin
|
||||
):
|
||||
|
||||
order = pyblish.api.CollectorOrder + 0.09
|
||||
label = "Collect Fusion Render"
|
||||
hosts = ["fusion"]
|
||||
|
||||
def get_instances(self, context):
|
||||
|
||||
comp = context.data.get("currentComp")
|
||||
comp_frame_format_prefs = comp.GetPrefs("Comp.FrameFormat")
|
||||
aspect_x = comp_frame_format_prefs["AspectX"]
|
||||
aspect_y = comp_frame_format_prefs["AspectY"]
|
||||
|
||||
instances = []
|
||||
instances_to_remove = []
|
||||
|
||||
current_file = context.data["currentFile"]
|
||||
version = context.data["version"]
|
||||
|
||||
project_entity = context.data["projectEntity"]
|
||||
|
||||
for inst in context:
|
||||
if not inst.data.get("active", True):
|
||||
continue
|
||||
|
||||
family = inst.data["family"]
|
||||
if family != "render":
|
||||
continue
|
||||
|
||||
task_name = context.data["task"]
|
||||
tool = inst.data["transientData"]["tool"]
|
||||
|
||||
instance_families = inst.data.get("families", [])
|
||||
subset_name = inst.data["subset"]
|
||||
instance = FusionRenderInstance(
|
||||
family="render",
|
||||
tool=tool,
|
||||
workfileComp=comp,
|
||||
families=instance_families,
|
||||
version=version,
|
||||
time="",
|
||||
source=current_file,
|
||||
label=inst.data["label"],
|
||||
subset=subset_name,
|
||||
asset=inst.data["asset"],
|
||||
task=task_name,
|
||||
attachTo=False,
|
||||
setMembers='',
|
||||
publish=True,
|
||||
name=subset_name,
|
||||
resolutionWidth=comp_frame_format_prefs.get("Width"),
|
||||
resolutionHeight=comp_frame_format_prefs.get("Height"),
|
||||
pixelAspect=aspect_x / aspect_y,
|
||||
tileRendering=False,
|
||||
tilesX=0,
|
||||
tilesY=0,
|
||||
review="review" in instance_families,
|
||||
frameStart=inst.data["frameStart"],
|
||||
frameEnd=inst.data["frameEnd"],
|
||||
handleStart=inst.data["handleStart"],
|
||||
handleEnd=inst.data["handleEnd"],
|
||||
frameStartHandle=inst.data["frameStartHandle"],
|
||||
frameEndHandle=inst.data["frameEndHandle"],
|
||||
frameStep=1,
|
||||
fps=comp_frame_format_prefs.get("Rate"),
|
||||
app_version=comp.GetApp().Version,
|
||||
publish_attributes=inst.data.get("publish_attributes", {})
|
||||
)
|
||||
|
||||
render_target = inst.data["creator_attributes"]["render_target"]
|
||||
|
||||
# Add render target family
|
||||
render_target_family = f"render.{render_target}"
|
||||
if render_target_family not in instance.families:
|
||||
instance.families.append(render_target_family)
|
||||
|
||||
# Add render target specific data
|
||||
if render_target in {"local", "frames"}:
|
||||
instance.projectEntity = project_entity
|
||||
|
||||
if render_target == "farm":
|
||||
fam = "render.farm"
|
||||
if fam not in instance.families:
|
||||
instance.families.append(fam)
|
||||
instance.toBeRenderedOn = "deadline"
|
||||
instance.farm = True # to skip integrate
|
||||
if "review" in instance.families:
|
||||
# to skip ExtractReview locally
|
||||
instance.families.remove("review")
|
||||
|
||||
# add new instance to the list and remove the original
|
||||
# instance since it is not needed anymore
|
||||
instances.append(instance)
|
||||
instances_to_remove.append(inst)
|
||||
|
||||
for instance in instances_to_remove:
|
||||
context.remove(instance)
|
||||
|
||||
return instances
|
||||
|
||||
def post_collecting_action(self):
|
||||
for instance in self._context:
|
||||
if "render.frames" in instance.data.get("families", []):
|
||||
# adding representation data to the instance
|
||||
self._update_for_frames(instance)
|
||||
|
||||
def get_expected_files(self, render_instance):
|
||||
"""
|
||||
Returns list of rendered files that should be created by
|
||||
Deadline. These are not published directly, they are source
|
||||
for later 'submit_publish_job'.
|
||||
|
||||
Args:
|
||||
render_instance (RenderInstance): to pull anatomy and parts used
|
||||
in url
|
||||
|
||||
Returns:
|
||||
(list) of absolute urls to rendered file
|
||||
"""
|
||||
start = render_instance.frameStart - render_instance.handleStart
|
||||
end = render_instance.frameEnd + render_instance.handleEnd
|
||||
|
||||
path = (
|
||||
render_instance.tool["Clip"]
|
||||
[render_instance.workfileComp.TIME_UNDEFINED]
|
||||
)
|
||||
output_dir = os.path.dirname(path)
|
||||
render_instance.outputDir = output_dir
|
||||
|
||||
basename = os.path.basename(path)
|
||||
|
||||
head, padding, ext = get_frame_path(basename)
|
||||
|
||||
expected_files = []
|
||||
for frame in range(start, end + 1):
|
||||
expected_files.append(
|
||||
os.path.join(
|
||||
output_dir,
|
||||
f"{head}{str(frame).zfill(padding)}{ext}"
|
||||
)
|
||||
)
|
||||
|
||||
return expected_files
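A rough illustration of the expansion above, assuming get_frame_path() splits "beauty.0001.exr" into ("beauty.", 4, ".exr"); that split is inferred from how head, padding and ext are used here, not documented by this patch:

    # Saver clip: {workdir}/renders/fusion/beauty/beauty.0001.exr
    # frame range with handles: 1001-1003
    head, padding, ext = "beauty.", 4, ".exr"
    expected_files = [
        os.path.join(output_dir, f"{head}{str(frame).zfill(padding)}{ext}")
        for frame in range(1001, 1003 + 1)
    ]
    # -> [".../beauty.1001.exr", ".../beauty.1002.exr", ".../beauty.1003.exr"]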
|
||||
|
||||
def _update_for_frames(self, instance):
|
||||
"""Updating instance for render.frames family
|
||||
|
||||
Adding representation data to the instance. Also setting
|
||||
colorspaceData to the representation based on file rules.
|
||||
"""
|
||||
|
||||
expected_files = instance.data["expectedFiles"]
|
||||
|
||||
start = instance.data["frameStart"] - instance.data["handleStart"]
|
||||
|
||||
path = expected_files[0]
|
||||
basename = os.path.basename(path)
|
||||
staging_dir = os.path.dirname(path)
|
||||
_, padding, ext = get_frame_path(basename)
|
||||
|
||||
repre = {
|
||||
"name": ext[1:],
|
||||
"ext": ext[1:],
|
||||
"frameStart": f"%0{padding}d" % start,
|
||||
"files": [os.path.basename(f) for f in expected_files],
|
||||
"stagingDir": staging_dir,
|
||||
}
|
||||
|
||||
self.set_representation_colorspace(
|
||||
representation=repre,
|
||||
context=instance.context,
|
||||
)
|
||||
|
||||
# review representation
|
||||
if instance.data.get("review", False):
|
||||
repre["tags"] = ["review"]
|
||||
|
||||
# add the repre to the instance
|
||||
if "representations" not in instance.data:
|
||||
instance.data["representations"] = []
|
||||
instance.data["representations"].append(repre)
|
||||
|
||||
return instance
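For the render.frames case this yields a representation roughly like the following (paths illustrative):

    repre = {
        "name": "exr",
        "ext": "exr",
        "frameStart": "1001",   # "%04d" % start
        "files": ["beauty.1001.exr", "beauty.1002.exr", "beauty.1003.exr"],
        "stagingDir": "{workdir}/renders/fusion/beauty",
        "tags": ["review"],     # only added when the instance is reviewable
    }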
|
||||
|
|
@ -1,25 +0,0 @@
|
|||
import pyblish.api
|
||||
|
||||
|
||||
class CollectFusionRenders(pyblish.api.InstancePlugin):
|
||||
"""Collect current saver node's render Mode
|
||||
|
||||
Options:
|
||||
local (Render locally)
|
||||
frames (Use existing frames)
|
||||
|
||||
"""
|
||||
|
||||
order = pyblish.api.CollectorOrder + 0.4
|
||||
label = "Collect Renders"
|
||||
hosts = ["fusion"]
|
||||
families = ["render"]
|
||||
|
||||
def process(self, instance):
|
||||
render_target = instance.data["render_target"]
|
||||
family = instance.data["family"]
|
||||
|
||||
# add targeted family to families
|
||||
instance.data["families"].append(
|
||||
"{}.{}".format(family, render_target)
|
||||
)
|
||||
|
|
@ -1,8 +1,12 @@
|
|||
import os
|
||||
import logging
|
||||
import contextlib
|
||||
import collections
|
||||
import pyblish.api
|
||||
from openpype.hosts.fusion.api import comp_lock_and_undo_chunk
|
||||
|
||||
from openpype.pipeline import publish
|
||||
from openpype.hosts.fusion.api import comp_lock_and_undo_chunk
|
||||
from openpype.hosts.fusion.api.lib import get_frame_path, maintained_comp_range
|
||||
|
||||
log = logging.getLogger(__name__)
|
||||
|
||||
|
|
@ -38,7 +42,10 @@ def enabled_savers(comp, savers):
|
|||
saver.SetAttrs({"TOOLB_PassThrough": original_state})
|
||||
|
||||
|
||||
class FusionRenderLocal(pyblish.api.InstancePlugin):
|
||||
class FusionRenderLocal(
|
||||
pyblish.api.InstancePlugin,
|
||||
publish.ColormanagedPyblishPluginMixin
|
||||
):
|
||||
"""Render the current Fusion composition locally."""
|
||||
|
||||
order = pyblish.api.ExtractorOrder - 0.2
|
||||
|
|
@ -46,11 +53,16 @@ class FusionRenderLocal(pyblish.api.InstancePlugin):
|
|||
hosts = ["fusion"]
|
||||
families = ["render.local"]
|
||||
|
||||
is_rendered_key = "_fusionrenderlocal_has_rendered"
|
||||
|
||||
def process(self, instance):
|
||||
context = instance.context
|
||||
|
||||
# Start render
|
||||
self.render_once(context)
|
||||
result = self.render(instance)
|
||||
if result is False:
|
||||
raise RuntimeError(f"Comp render failed for {instance}")
|
||||
|
||||
self._add_representation(instance)
|
||||
|
||||
# Log render status
|
||||
self.log.info(
|
||||
|
|
@ -61,39 +73,48 @@ class FusionRenderLocal(pyblish.api.InstancePlugin):
|
|||
)
|
||||
)
|
||||
|
||||
def render_once(self, context):
|
||||
"""Render context comp only once, even with more render instances"""
|
||||
def render(self, instance):
|
||||
"""Render instance.
|
||||
|
||||
# This plug-in assumes all render nodes get rendered at the same time
|
||||
# to speed up the rendering. The check below makes sure that we only
|
||||
# execute the rendering once and not for each instance.
|
||||
key = f"__hasRun{self.__class__.__name__}"
|
||||
We try to render the minimal number of times by combining the instances
|
||||
that have a matching frame range in one Fusion render. Then for the
|
||||
batch of instances we store whether the render succeeded or failed.
|
||||
|
||||
savers_to_render = [
|
||||
# Get the saver tool from the instance
|
||||
instance[0] for instance in context if
|
||||
# Only active instances
|
||||
instance.data.get("publish", True) and
|
||||
# Only render.local instances
|
||||
"render.local" in instance.data["families"]
|
||||
]
|
||||
"""
|
||||
|
||||
if key not in context.data:
|
||||
# We initialize as false to indicate it wasn't successful yet
|
||||
# so we can keep track of whether Fusion succeeded
|
||||
context.data[key] = False
|
||||
if self.is_rendered_key in instance.data:
|
||||
# This instance was already processed in batch with another
|
||||
# instance, so we just return the render result directly
|
||||
self.log.debug(f"Instance {instance} was already rendered")
|
||||
return instance.data[self.is_rendered_key]
|
||||
|
||||
current_comp = context.data["currentComp"]
|
||||
frame_start = context.data["frameStartHandle"]
|
||||
frame_end = context.data["frameEndHandle"]
|
||||
instances_by_frame_range = self.get_render_instances_by_frame_range(
|
||||
instance.context
|
||||
)
|
||||
|
||||
self.log.info("Starting Fusion render")
|
||||
self.log.info(f"Start frame: {frame_start}")
|
||||
self.log.info(f"End frame: {frame_end}")
|
||||
saver_names = ", ".join(saver.Name for saver in savers_to_render)
|
||||
self.log.info(f"Rendering tools: {saver_names}")
|
||||
# Render matching batch of instances that share the same frame range
|
||||
frame_range = self.get_instance_render_frame_range(instance)
|
||||
render_instances = instances_by_frame_range[frame_range]
|
||||
|
||||
with comp_lock_and_undo_chunk(current_comp):
|
||||
# We initialize render state false to indicate it wasn't successful
|
||||
# yet to keep track of whether Fusion succeeded. This is for cases
|
||||
# where an error below this might cause the comp render result not
|
||||
# to be stored for the instances of this batch
|
||||
for render_instance in render_instances:
|
||||
render_instance.data[self.is_rendered_key] = False
|
||||
|
||||
savers_to_render = [inst.data["tool"] for inst in render_instances]
|
||||
current_comp = instance.context.data["currentComp"]
|
||||
frame_start, frame_end = frame_range
|
||||
|
||||
self.log.info(
|
||||
f"Starting Fusion render frame range {frame_start}-{frame_end}"
|
||||
)
|
||||
saver_names = ", ".join(saver.Name for saver in savers_to_render)
|
||||
self.log.info(f"Rendering tools: {saver_names}")
|
||||
|
||||
with comp_lock_and_undo_chunk(current_comp):
|
||||
with maintained_comp_range(current_comp):
|
||||
with enabled_savers(current_comp, savers_to_render):
|
||||
result = current_comp.Render(
|
||||
{
|
||||
|
|
@ -103,7 +124,76 @@ class FusionRenderLocal(pyblish.api.InstancePlugin):
|
|||
}
|
||||
)
|
||||
|
||||
context.data[key] = bool(result)
|
||||
# Store the render state for all the rendered instances
|
||||
for render_instance in render_instances:
|
||||
render_instance.data[self.is_rendered_key] = bool(result)
|
||||
|
||||
if context.data[key] is False:
|
||||
raise RuntimeError("Comp render failed")
|
||||
return result
|
||||
|
||||
def _add_representation(self, instance):
|
||||
"""Add representation to instance"""
|
||||
|
||||
expected_files = instance.data["expectedFiles"]
|
||||
|
||||
start = instance.data["frameStart"] - instance.data["handleStart"]
|
||||
|
||||
path = expected_files[0]
|
||||
_, padding, ext = get_frame_path(path)
|
||||
|
||||
staging_dir = os.path.dirname(path)
|
||||
|
||||
repre = {
|
||||
"name": ext[1:],
|
||||
"ext": ext[1:],
|
||||
"frameStart": f"%0{padding}d" % start,
|
||||
"files": [os.path.basename(f) for f in expected_files],
|
||||
"stagingDir": staging_dir,
|
||||
}
|
||||
|
||||
self.set_representation_colorspace(
|
||||
representation=repre,
|
||||
context=instance.context,
|
||||
)
|
||||
|
||||
# review representation
|
||||
if instance.data.get("review", False):
|
||||
repre["tags"] = ["review"]
|
||||
|
||||
# add the repre to the instance
|
||||
if "representations" not in instance.data:
|
||||
instance.data["representations"] = []
|
||||
instance.data["representations"].append(repre)
|
||||
|
||||
return instance
|
||||
|
||||
def get_render_instances_by_frame_range(self, context):
|
||||
"""Return enabled render.local instances grouped by their frame range.
|
||||
|
||||
Arguments:
|
||||
context (pyblish.Context): The pyblish context
|
||||
|
||||
Returns:
|
||||
dict: (start, end): instances mapping
|
||||
|
||||
"""
|
||||
|
||||
instances_to_render = [
|
||||
instance for instance in context if
|
||||
# Only active instances
|
||||
instance.data.get("publish", True) and
|
||||
# Only render.local instances
|
||||
"render.local" in instance.data.get("families", [])
|
||||
]
|
||||
|
||||
# Instances by frame ranges
|
||||
instances_by_frame_range = collections.defaultdict(list)
|
||||
for instance in instances_to_render:
|
||||
start, end = self.get_instance_render_frame_range(instance)
|
||||
instances_by_frame_range[(start, end)].append(instance)
|
||||
|
||||
return dict(instances_by_frame_range)
|
||||
|
||||
def get_instance_render_frame_range(self, instance):
|
||||
start = instance.data["frameStartHandle"]
|
||||
end = instance.data["frameEndHandle"]
|
||||
return start, end
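A small sketch of how this grouping lets a single Fusion render serve several savers (instance names are illustrative):

    instances_by_frame_range = {
        (990, 1110): [renderMain, renderMask],   # rendered together, once
        (1001, 1010): [renderPreview],           # separate comp render
    }
    # The first instance of a batch triggers comp.Render(); every instance in
    # that batch then shares the stored _fusionrenderlocal_has_rendered result.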
|
||||
|
|
|
|||
|
|
@ -1,29 +1,39 @@
|
|||
import pyblish.api
|
||||
|
||||
from openpype.pipeline import OptionalPyblishPluginMixin
|
||||
from openpype.pipeline import KnownPublishError
|
||||
|
||||
class FusionIncrementCurrentFile(pyblish.api.ContextPlugin):
|
||||
|
||||
class FusionIncrementCurrentFile(
|
||||
pyblish.api.ContextPlugin, OptionalPyblishPluginMixin
|
||||
):
|
||||
"""Increment the current file.
|
||||
|
||||
Saves the current file with an increased version number.
|
||||
|
||||
"""
|
||||
|
||||
label = "Increment current file"
|
||||
label = "Increment workfile version"
|
||||
order = pyblish.api.IntegratorOrder + 9.0
|
||||
hosts = ["fusion"]
|
||||
families = ["workfile"]
|
||||
optional = True
|
||||
|
||||
def process(self, context):
|
||||
if not self.is_active(context.data):
|
||||
return
|
||||
|
||||
from openpype.lib import version_up
|
||||
from openpype.pipeline.publish import get_errored_plugins_from_context
|
||||
|
||||
errored_plugins = get_errored_plugins_from_context(context)
|
||||
if any(plugin.__name__ == "FusionSubmitDeadline"
|
||||
for plugin in errored_plugins):
|
||||
raise RuntimeError("Skipping incrementing current file because "
|
||||
"submission to render farm failed.")
|
||||
if any(
|
||||
plugin.__name__ == "FusionSubmitDeadline"
|
||||
for plugin in errored_plugins
|
||||
):
|
||||
raise KnownPublishError(
|
||||
"Skipping incrementing current file because "
|
||||
"submission to render farm failed."
|
||||
)
|
||||
|
||||
comp = context.data.get("currentComp")
|
||||
assert comp, "Must have comp"
|
||||
|
|
|
|||
|
|
@ -1,12 +1,17 @@
|
|||
import pyblish.api
|
||||
|
||||
from openpype.pipeline.publish import RepairAction
|
||||
from openpype.pipeline import PublishValidationError
|
||||
from openpype.pipeline import (
|
||||
publish,
|
||||
OptionalPyblishPluginMixin,
|
||||
PublishValidationError,
|
||||
)
|
||||
|
||||
from openpype.hosts.fusion.api.action import SelectInvalidAction
|
||||
|
||||
|
||||
class ValidateBackgroundDepth(pyblish.api.InstancePlugin):
|
||||
class ValidateBackgroundDepth(
|
||||
pyblish.api.InstancePlugin, OptionalPyblishPluginMixin
|
||||
):
|
||||
"""Validate if all Background tool are set to float32 bit"""
|
||||
|
||||
order = pyblish.api.ValidatorOrder
|
||||
|
|
@ -15,11 +20,10 @@ class ValidateBackgroundDepth(pyblish.api.InstancePlugin):
|
|||
families = ["render"]
|
||||
optional = True
|
||||
|
||||
actions = [SelectInvalidAction, RepairAction]
|
||||
actions = [SelectInvalidAction, publish.RepairAction]
|
||||
|
||||
@classmethod
|
||||
def get_invalid(cls, instance):
|
||||
|
||||
context = instance.context
|
||||
comp = context.data.get("currentComp")
|
||||
assert comp, "Must have Comp object"
|
||||
|
|
@ -31,12 +35,16 @@ class ValidateBackgroundDepth(pyblish.api.InstancePlugin):
|
|||
return [i for i in backgrounds if i.GetInput("Depth") != 4.0]
|
||||
|
||||
def process(self, instance):
|
||||
if not self.is_active(instance.data):
|
||||
return
|
||||
|
||||
invalid = self.get_invalid(instance)
|
||||
if invalid:
|
||||
raise PublishValidationError(
|
||||
"Found {} Backgrounds tools which"
|
||||
" are not set to float32".format(len(invalid)),
|
||||
title=self.label)
|
||||
title=self.label,
|
||||
)
|
||||
|
||||
@classmethod
|
||||
def repair(cls, instance):
|
||||
|
|
|
|||
|
|
@ -21,7 +21,7 @@ class ValidateCreateFolderChecked(pyblish.api.InstancePlugin):
|
|||
|
||||
@classmethod
|
||||
def get_invalid(cls, instance):
|
||||
tool = instance[0]
|
||||
tool = instance.data["tool"]
|
||||
create_dir = tool.GetInput("CreateDir")
|
||||
if create_dir == 0.0:
|
||||
cls.log.error(
|
||||
|
|
|
|||
|
|
@ -14,7 +14,7 @@ class ValidateLocalFramesExistence(pyblish.api.InstancePlugin):
|
|||
|
||||
order = pyblish.api.ValidatorOrder
|
||||
label = "Validate Expected Frames Exists"
|
||||
families = ["render"]
|
||||
families = ["render.frames"]
|
||||
hosts = ["fusion"]
|
||||
actions = [RepairAction, SelectInvalidAction]
|
||||
|
||||
|
|
@ -23,31 +23,20 @@ class ValidateLocalFramesExistence(pyblish.api.InstancePlugin):
|
|||
if non_existing_frames is None:
|
||||
non_existing_frames = []
|
||||
|
||||
if instance.data.get("render_target") == "frames":
|
||||
tool = instance[0]
|
||||
tool = instance.data["tool"]
|
||||
|
||||
frame_start = instance.data["frameStart"]
|
||||
frame_end = instance.data["frameEnd"]
|
||||
path = instance.data["path"]
|
||||
output_dir = instance.data["outputDir"]
|
||||
expected_files = instance.data["expectedFiles"]
|
||||
|
||||
basename = os.path.basename(path)
|
||||
head, ext = os.path.splitext(basename)
|
||||
files = [
|
||||
f"{head}{str(frame).zfill(4)}{ext}"
|
||||
for frame in range(frame_start, frame_end + 1)
|
||||
]
|
||||
for file in expected_files:
|
||||
if not os.path.exists(file):
|
||||
cls.log.error(
|
||||
f"Missing file: {file}"
|
||||
)
|
||||
non_existing_frames.append(file)
|
||||
|
||||
for file in files:
|
||||
if not os.path.exists(os.path.join(output_dir, file)):
|
||||
cls.log.error(
|
||||
f"Missing file: {os.path.join(output_dir, file)}"
|
||||
)
|
||||
non_existing_frames.append(file)
|
||||
|
||||
if len(non_existing_frames) > 0:
|
||||
cls.log.error(f"Some of {tool.Name}'s files does not exist")
|
||||
return [tool]
|
||||
if len(non_existing_frames) > 0:
|
||||
cls.log.error(f"Some of {tool.Name}'s files does not exist")
|
||||
return [tool]
|
||||
|
||||
def process(self, instance):
|
||||
non_existing_frames = []
|
||||
|
|
@ -67,8 +56,7 @@ class ValidateLocalFramesExistence(pyblish.api.InstancePlugin):
|
|||
def repair(cls, instance):
|
||||
invalid = cls.get_invalid(instance)
|
||||
if invalid:
|
||||
tool = invalid[0]
|
||||
|
||||
tool = instance.data["tool"]
|
||||
# Change render target to local to render locally
|
||||
tool.SetData("openpype.creator_attributes.render_target", "local")
|
||||
|
||||
|
|
|
|||
|
|
@ -30,11 +30,11 @@ class ValidateFilenameHasExtension(pyblish.api.InstancePlugin):
|
|||
@classmethod
|
||||
def get_invalid(cls, instance):
|
||||
|
||||
path = instance.data["path"]
|
||||
path = instance.data["expectedFiles"][0]
|
||||
fname, ext = os.path.splitext(path)
|
||||
|
||||
if not ext:
|
||||
tool = instance[0]
|
||||
tool = instance.data["tool"]
|
||||
cls.log.error("%s has no extension specified" % tool.Name)
|
||||
return [tool]
|
||||
|
||||
|
|
|
|||
|
|
@ -0,0 +1,41 @@
import pyblish.api

from openpype.pipeline import PublishValidationError


class ValidateInstanceFrameRange(pyblish.api.InstancePlugin):
    """Validate instance frame range is within comp's global render range."""

    order = pyblish.api.ValidatorOrder
    label = "Validate Frame Range"
    families = ["render"]
    hosts = ["fusion"]

    def process(self, instance):

        context = instance.context
        global_start = context.data["compFrameStart"]
        global_end = context.data["compFrameEnd"]

        render_start = instance.data["frameStartHandle"]
        render_end = instance.data["frameEndHandle"]

        if render_start < global_start or render_end > global_end:

            message = (
                f"Instance {instance} render frame range "
                f"({render_start}-{render_end}) is outside of the comp's "
                f"global render range ({global_start}-{global_end}) and thus "
                f"can't be rendered."
            )
            description = (
                f"{message}\n\n"
                f"Either update the comp's global range or the instance's "
                f"frame range to ensure the comp's frame range includes the "
                f"frame range to render for the instance."
            )
            raise PublishValidationError(
                title="Frame range outside of comp range",
                message=message,
                description=description
            )
|
||||
|
|
@ -20,7 +20,7 @@ class ValidateSaverHasInput(pyblish.api.InstancePlugin):
|
|||
@classmethod
|
||||
def get_invalid(cls, instance):
|
||||
|
||||
saver = instance[0]
|
||||
saver = instance.data["tool"]
|
||||
if not saver.Input.GetConnectedOutput():
|
||||
return [saver]
|
||||
|
||||
|
|
|
|||
|
|
@ -37,7 +37,7 @@ class ValidateSaverPassthrough(pyblish.api.ContextPlugin):
|
|||
|
||||
def is_invalid(self, instance):
|
||||
|
||||
saver = instance[0]
|
||||
saver = instance.data["tool"]
|
||||
attr = saver.GetAttrs()
|
||||
active = not attr["TOOLB_PassThrough"]
|
||||
|
||||
|
|
|
|||
|
|
@ -81,7 +81,13 @@ class HoudiniHost(HostBase, IWorkfileHost, ILoadHost, IPublishHost):
|
|||
# TODO: make sure this doesn't trigger when
|
||||
# opening with last workfile.
|
||||
_set_context_settings()
|
||||
shelves.generate_shelves()
|
||||
|
||||
if not IS_HEADLESS:
|
||||
import hdefereval # noqa, hdefereval is only available in ui mode
|
||||
# Defer generation of shelves due to issue on Windows where shelf
|
||||
# initialization during start up delays Houdini UI by minutes
|
||||
# making it extremely slow to launch.
|
||||
hdefereval.executeDeferred(shelves.generate_shelves)
|
||||
|
||||
if not IS_HEADLESS:
|
||||
import hdefereval # noqa, hdefereval is only available in ui mode
|
||||
|
|
|
|||
|
|
@ -138,7 +138,7 @@ def get_default_render_folder(project_setting=None):
|
|||
["default_render_image_folder"])
|
||||
|
||||
|
||||
def set_framerange(start_frame, end_frame):
|
||||
def set_render_frame_range(start_frame, end_frame):
|
||||
"""
|
||||
Note:
|
||||
Frame range can be specified in different types. Possible values are:
|
||||
|
|
@ -150,10 +150,10 @@ def set_framerange(start_frame, end_frame):
|
|||
Todo:
|
||||
Current type is hard-coded, there should be a custom setting for this.
|
||||
"""
|
||||
rt.rendTimeType = 4
|
||||
rt.rendTimeType = 3
|
||||
if start_frame is not None and end_frame is not None:
|
||||
frame_range = "{0}-{1}".format(start_frame, end_frame)
|
||||
rt.rendPickupFrames = frame_range
|
||||
rt.rendStart = int(start_frame)
|
||||
rt.rendEnd = int(end_frame)
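For reference, the rendTimeType values involved, taken from the MAXScript documentation (background information, not something this patch defines):

    # rendTimeType: 1 = single frame, 2 = active time segment,
    #               3 = start/end range, 4 = pickup frames ("1,3,5-12")
    rt.rendTimeType = 3      # explicit range instead of the old pickup-frames mode
    rt.rendStart = 1001
    rt.rendEnd = 1100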
|
||||
|
||||
|
||||
def get_multipass_setting(project_setting=None):
|
||||
|
|
@ -173,10 +173,16 @@ def set_scene_resolution(width: int, height: int):
|
|||
None
|
||||
|
||||
"""
|
||||
# make sure the render dialog is closed
|
||||
# for the update of resolution
|
||||
# Changing the Render Setup dialog settings should be done
|
||||
# with the actual Render Setup dialog in a closed state.
|
||||
if rt.renderSceneDialog.isOpen():
|
||||
rt.renderSceneDialog.close()
|
||||
|
||||
rt.renderWidth = width
|
||||
rt.renderHeight = height
|
||||
|
||||
|
||||
def reset_scene_resolution():
|
||||
"""Apply the scene resolution from the project definition
|
||||
|
||||
|
|
@ -239,10 +245,15 @@ def reset_frame_range(fps: bool = True):
|
|||
fps_number = float(data_fps["data"]["fps"])
|
||||
rt.frameRate = fps_number
|
||||
frame_range = get_frame_range()
|
||||
frame_start = frame_range["frameStart"] - int(frame_range["handleStart"])
|
||||
frame_end = frame_range["frameEnd"] + int(frame_range["handleEnd"])
|
||||
frange_cmd = f"animationRange = interval {frame_start} {frame_end}"
|
||||
frame_start_handle = frame_range["frameStart"] - int(
|
||||
frame_range["handleStart"]
|
||||
)
|
||||
frame_end_handle = frame_range["frameEnd"] + int(frame_range["handleEnd"])
|
||||
frange_cmd = (
|
||||
f"animationRange = interval {frame_start_handle} {frame_end_handle}"
|
||||
)
|
||||
rt.execute(frange_cmd)
|
||||
set_render_frame_range(frame_start_handle, frame_end_handle)
|
||||
|
||||
|
||||
def set_context_setting():
|
||||
|
|
@ -259,6 +270,7 @@ def set_context_setting():
|
|||
None
|
||||
"""
|
||||
reset_scene_resolution()
|
||||
reset_frame_range()
|
||||
|
||||
|
||||
def get_max_version():
|
||||
|
|
|
|||
|
|
@ -36,8 +36,9 @@ class RenderProducts(object):
|
|||
container)
|
||||
|
||||
context = get_current_project_asset()
|
||||
startFrame = context["data"].get("frameStart")
|
||||
endFrame = context["data"].get("frameEnd") + 1
|
||||
# TODO: change the frame range follows the current render setting
|
||||
startFrame = int(rt.rendStart)
|
||||
endFrame = int(rt.rendEnd) + 1
|
||||
|
||||
img_fmt = self._project_settings["max"]["RenderSettings"]["image_format"] # noqa
|
||||
full_render_list = self.beauty_render_product(output_file,
|
||||
|
|
|
|||
|
|
@ -6,7 +6,7 @@ from openpype.pipeline import legacy_io
|
|||
from openpype.pipeline.context_tools import get_current_project_asset
|
||||
|
||||
from openpype.hosts.max.api.lib import (
|
||||
set_framerange,
|
||||
set_render_frame_range,
|
||||
get_current_renderer,
|
||||
get_default_render_folder
|
||||
)
|
||||
|
|
@ -68,7 +68,7 @@ class RenderSettings(object):
|
|||
# Set Frame Range
|
||||
frame_start = context["data"].get("frame_start")
|
||||
frame_end = context["data"].get("frame_end")
|
||||
set_framerange(frame_start, frame_end)
|
||||
set_render_frame_range(frame_start, frame_end)
|
||||
# get the production render
|
||||
renderer_class = get_current_renderer()
|
||||
renderer = str(renderer_class).split(":")[0]
|
||||
|
|
@ -105,6 +105,9 @@ class RenderSettings(object):
|
|||
|
||||
rt.rendSaveFile = True
|
||||
|
||||
if rt.renderSceneDialog.isOpen():
|
||||
rt.renderSceneDialog.close()
|
||||
|
||||
def arnold_setup(self):
|
||||
# get Arnold RenderView run in the background
|
||||
# for setting up renderable camera
|
||||
|
|
|
|||
|
|
@ -52,6 +52,7 @@ class MaxHost(HostBase, IWorkfileHost, ILoadHost, INewPublisher):
|
|||
|
||||
def context_setting():
|
||||
return lib.set_context_setting()
|
||||
|
||||
rt.callbacks.addScript(rt.Name('systemPostNew'),
|
||||
context_setting)
|
||||
|
||||
|
|
|
|||
|
|
@ -27,6 +27,11 @@ class CreateRender(plugin.MaxCreator):
|
|||
# for additional work on the node:
|
||||
# instance_node = rt.getNodeByName(instance.get("instance_node"))
|
||||
|
||||
# make sure the render dialog is closed
|
||||
# for the update of resolution
|
||||
# Changing the Render Setup dialog settings should be done
|
||||
# with the actual Render Setup dialog in a closed state.
|
||||
|
||||
# set viewport camera for rendering(mandatory for deadline)
|
||||
RenderSettings().set_render_camera(sel_obj)
|
||||
# set output paths for rendering(mandatory for deadline)
|
||||
|
|
|
|||
|
|
@ -20,28 +20,25 @@ class FbxLoader(load.LoaderPlugin):
|
|||
from pymxs import runtime as rt
|
||||
|
||||
filepath = os.path.normpath(self.fname)
|
||||
rt.FBXImporterSetParam("Animation", True)
|
||||
rt.FBXImporterSetParam("Camera", True)
|
||||
rt.FBXImporterSetParam("AxisConversionMethod", True)
|
||||
rt.FBXImporterSetParam("Preserveinstances", True)
|
||||
rt.importFile(
|
||||
filepath,
|
||||
rt.name("noPrompt"),
|
||||
using=rt.FBXIMP)
|
||||
|
||||
fbx_import_cmd = (
|
||||
f"""
|
||||
container = rt.getNodeByName(f"{name}")
|
||||
if not container:
|
||||
container = rt.container()
|
||||
container.name = f"{name}"
|
||||
|
||||
FBXImporterSetParam "Animation" true
|
||||
FBXImporterSetParam "Cameras" true
|
||||
FBXImporterSetParam "AxisConversionMethod" true
|
||||
FbxExporterSetParam "UpAxis" "Y"
|
||||
FbxExporterSetParam "Preserveinstances" true
|
||||
|
||||
importFile @"{filepath}" #noPrompt using:FBXIMP
|
||||
""")
|
||||
|
||||
self.log.debug(f"Executing command: {fbx_import_cmd}")
|
||||
rt.execute(fbx_import_cmd)
|
||||
|
||||
container_name = f"{name}_CON"
|
||||
|
||||
asset = rt.getNodeByName(f"{name}")
|
||||
for selection in rt.getCurrentSelection():
|
||||
selection.Parent = container
|
||||
|
||||
return containerise(
|
||||
name, [asset], context, loader=self.__class__.__name__)
|
||||
name, [container], context, loader=self.__class__.__name__)
|
||||
|
||||
def update(self, container, representation):
|
||||
from pymxs import runtime as rt
|
||||
|
|
|
|||
|
|
@ -21,26 +21,24 @@ class FbxModelLoader(load.LoaderPlugin):
|
|||
from pymxs import runtime as rt
|
||||
|
||||
filepath = os.path.normpath(self.fname)
|
||||
rt.FBXImporterSetParam("Animation", False)
|
||||
rt.FBXImporterSetParam("Cameras", False)
|
||||
rt.FBXImporterSetParam("Preserveinstances", True)
|
||||
rt.importFile(
|
||||
filepath,
|
||||
rt.name("noPrompt"),
|
||||
using=rt.FBXIMP)
|
||||
|
||||
fbx_import_cmd = (
|
||||
f"""
|
||||
container = rt.getNodeByName(f"{name}")
|
||||
if not container:
|
||||
container = rt.container()
|
||||
container.name = f"{name}"
|
||||
|
||||
FBXImporterSetParam "Animation" false
|
||||
FBXImporterSetParam "Cameras" false
|
||||
FBXImporterSetParam "AxisConversionMethod" true
|
||||
FbxExporterSetParam "UpAxis" "Y"
|
||||
FbxExporterSetParam "Preserveinstances" true
|
||||
|
||||
importFile @"{filepath}" #noPrompt using:FBXIMP
|
||||
""")
|
||||
|
||||
self.log.debug(f"Executing command: {fbx_import_cmd}")
|
||||
rt.execute(fbx_import_cmd)
|
||||
|
||||
asset = rt.getNodeByName(f"{name}")
|
||||
for selection in rt.getCurrentSelection():
|
||||
selection.Parent = container
|
||||
|
||||
return containerise(
|
||||
name, [asset], context, loader=self.__class__.__name__)
|
||||
name, [container], context, loader=self.__class__.__name__)
|
||||
|
||||
def update(self, container, representation):
|
||||
from pymxs import runtime as rt
|
||||
|
|
|
|||
|
|
@ -46,7 +46,6 @@ class CollectRender(pyblish.api.InstancePlugin):
|
|||
|
||||
self.log.debug(f"Setting {version_int} to context.")
|
||||
context.data["version"] = version_int
|
||||
|
||||
# setup the plugin as 3dsmax for the internal renderer
|
||||
data = {
|
||||
"subset": instance.name,
|
||||
|
|
@ -59,8 +58,8 @@ class CollectRender(pyblish.api.InstancePlugin):
|
|||
"source": filepath,
|
||||
"expectedFiles": render_layer_files,
|
||||
"plugin": "3dsmax",
|
||||
"frameStart": context.data['frameStart'],
|
||||
"frameEnd": context.data['frameEnd'],
|
||||
"frameStart": int(rt.rendStart),
|
||||
"frameEnd": int(rt.rendEnd),
|
||||
"version": version_int,
|
||||
"farm": True
|
||||
}
|
||||
|
|
|
|||
|
|
@ -1,18 +1,11 @@
|
|||
import os
|
||||
import pyblish.api
|
||||
from openpype.pipeline import (
|
||||
publish,
|
||||
OptionalPyblishPluginMixin
|
||||
)
|
||||
from openpype.pipeline import publish, OptionalPyblishPluginMixin
|
||||
from pymxs import runtime as rt
|
||||
from openpype.hosts.max.api import (
|
||||
maintained_selection,
|
||||
get_all_children
|
||||
)
|
||||
from openpype.hosts.max.api import maintained_selection, get_all_children
|
||||
|
||||
|
||||
class ExtractCameraAlembic(publish.Extractor,
|
||||
OptionalPyblishPluginMixin):
|
||||
class ExtractCameraAlembic(publish.Extractor, OptionalPyblishPluginMixin):
|
||||
"""
|
||||
Extract Camera with AlembicExport
|
||||
"""
|
||||
|
|
@ -38,38 +31,33 @@ class ExtractCameraAlembic(publish.Extractor,
|
|||
path = os.path.join(stagingdir, filename)
|
||||
|
||||
# We run the render
|
||||
self.log.info("Writing alembic '%s' to '%s'" % (filename,
|
||||
stagingdir))
|
||||
self.log.info("Writing alembic '%s' to '%s'" % (filename, stagingdir))
|
||||
|
||||
export_cmd = (
|
||||
f"""
|
||||
AlembicExport.ArchiveType = #ogawa
|
||||
AlembicExport.CoordinateSystem = #maya
|
||||
AlembicExport.StartFrame = {start}
|
||||
AlembicExport.EndFrame = {end}
|
||||
AlembicExport.CustomAttributes = true
|
||||
|
||||
exportFile @"{path}" #noPrompt selectedOnly:on using:AlembicExport
|
||||
|
||||
""")
|
||||
|
||||
self.log.debug(f"Executing command: {export_cmd}")
|
||||
rt.AlembicExport.ArchiveType = rt.name("ogawa")
|
||||
rt.AlembicExport.CoordinateSystem = rt.name("maya")
|
||||
rt.AlembicExport.StartFrame = start
|
||||
rt.AlembicExport.EndFrame = end
|
||||
rt.AlembicExport.CustomAttributes = True
|
||||
|
||||
with maintained_selection():
|
||||
# select and export
|
||||
rt.select(get_all_children(rt.getNodeByName(container)))
|
||||
rt.execute(export_cmd)
|
||||
rt.exportFile(
|
||||
path,
|
||||
rt.name("noPrompt"),
|
||||
selectedOnly=True,
|
||||
using=rt.AlembicExport,
|
||||
)
|
||||
|
||||
self.log.info("Performing Extraction ...")
|
||||
if "representations" not in instance.data:
|
||||
instance.data["representations"] = []
|
||||
|
||||
representation = {
|
||||
'name': 'abc',
|
||||
'ext': 'abc',
|
||||
'files': filename,
|
||||
"name": "abc",
|
||||
"ext": "abc",
|
||||
"files": filename,
|
||||
"stagingDir": stagingdir,
|
||||
}
|
||||
instance.data["representations"].append(representation)
|
||||
self.log.info("Extracted instance '%s' to: %s" % (instance.name,
|
||||
path))
|
||||
self.log.info("Extracted instance '%s' to: %s" % (instance.name, path))
|
||||
|
|
|
|||
|
|
@ -1,18 +1,11 @@
|
|||
import os
|
||||
import pyblish.api
|
||||
from openpype.pipeline import (
|
||||
publish,
|
||||
OptionalPyblishPluginMixin
|
||||
)
|
||||
from openpype.pipeline import publish, OptionalPyblishPluginMixin
|
||||
from pymxs import runtime as rt
|
||||
from openpype.hosts.max.api import (
|
||||
maintained_selection,
|
||||
get_all_children
|
||||
)
|
||||
from openpype.hosts.max.api import maintained_selection, get_all_children
|
||||
|
||||
|
||||
class ExtractCameraFbx(publish.Extractor,
|
||||
OptionalPyblishPluginMixin):
|
||||
class ExtractCameraFbx(publish.Extractor, OptionalPyblishPluginMixin):
|
||||
"""
|
||||
Extract Camera with FbxExporter
|
||||
"""
|
||||
|
|
@ -33,43 +26,35 @@ class ExtractCameraFbx(publish.Extractor,
|
|||
filename = "{name}.fbx".format(**instance.data)
|
||||
|
||||
filepath = os.path.join(stagingdir, filename)
|
||||
self.log.info("Writing fbx file '%s' to '%s'" % (filename,
|
||||
filepath))
|
||||
self.log.info("Writing fbx file '%s' to '%s'" % (filename, filepath))
|
||||
|
||||
# Need to export:
|
||||
# Animation = True
|
||||
# Cameras = True
|
||||
# AxisConversionMethod
|
||||
fbx_export_cmd = (
|
||||
f"""
|
||||
|
||||
FBXExporterSetParam "Animation" true
|
||||
FBXExporterSetParam "Cameras" true
|
||||
FBXExporterSetParam "AxisConversionMethod" "Animation"
|
||||
FbxExporterSetParam "UpAxis" "Y"
|
||||
FbxExporterSetParam "Preserveinstances" true
|
||||
|
||||
exportFile @"{filepath}" #noPrompt selectedOnly:true using:FBXEXP
|
||||
|
||||
""")
|
||||
|
||||
self.log.debug(f"Executing command: {fbx_export_cmd}")
|
||||
rt.FBXExporterSetParam("Animation", True)
|
||||
rt.FBXExporterSetParam("Cameras", True)
|
||||
rt.FBXExporterSetParam("AxisConversionMethod", "Animation")
|
||||
rt.FBXExporterSetParam("UpAxis", "Y")
|
||||
rt.FBXExporterSetParam("Preserveinstances", True)
|
||||
|
||||
with maintained_selection():
|
||||
# select and export
|
||||
rt.select(get_all_children(rt.getNodeByName(container)))
|
||||
rt.execute(fbx_export_cmd)
|
||||
rt.exportFile(
|
||||
filepath,
|
||||
rt.name("noPrompt"),
|
||||
selectedOnly=True,
|
||||
using=rt.FBXEXP,
|
||||
)
|
||||
|
||||
self.log.info("Performing Extraction ...")
|
||||
if "representations" not in instance.data:
|
||||
instance.data["representations"] = []
|
||||
|
||||
representation = {
|
||||
'name': 'fbx',
|
||||
'ext': 'fbx',
|
||||
'files': filename,
|
||||
"name": "fbx",
|
||||
"ext": "fbx",
|
||||
"files": filename,
|
||||
"stagingDir": stagingdir,
|
||||
}
|
||||
instance.data["representations"].append(representation)
|
||||
self.log.info("Extracted instance '%s' to: %s" % (instance.name,
|
||||
filepath))
|
||||
self.log.info(
|
||||
"Extracted instance '%s' to: %s" % (instance.name, filepath)
|
||||
)
|
||||
|
|
|
|||
|
|
@ -1,18 +1,11 @@
|
|||
import os
|
||||
import pyblish.api
|
||||
from openpype.pipeline import (
|
||||
publish,
|
||||
OptionalPyblishPluginMixin
|
||||
)
|
||||
from openpype.pipeline import publish, OptionalPyblishPluginMixin
|
||||
from pymxs import runtime as rt
|
||||
from openpype.hosts.max.api import (
|
||||
maintained_selection,
|
||||
get_all_children
|
||||
)
|
||||
from openpype.hosts.max.api import get_all_children
|
||||
|
||||
|
||||
class ExtractMaxSceneRaw(publish.Extractor,
|
||||
OptionalPyblishPluginMixin):
|
||||
class ExtractMaxSceneRaw(publish.Extractor, OptionalPyblishPluginMixin):
|
||||
"""
|
||||
Extract Raw Max Scene with SaveSelected
|
||||
"""
|
||||
|
|
@ -20,9 +13,7 @@ class ExtractMaxSceneRaw(publish.Extractor,
|
|||
order = pyblish.api.ExtractorOrder - 0.2
|
||||
label = "Extract Max Scene (Raw)"
|
||||
hosts = ["max"]
|
||||
families = ["camera",
|
||||
"maxScene",
|
||||
"model"]
|
||||
families = ["camera", "maxScene", "model"]
|
||||
optional = True
|
||||
|
||||
def process(self, instance):
|
||||
|
|
@ -37,26 +28,23 @@ class ExtractMaxSceneRaw(publish.Extractor,
|
|||
filename = "{name}.max".format(**instance.data)
|
||||
|
||||
max_path = os.path.join(stagingdir, filename)
|
||||
self.log.info("Writing max file '%s' to '%s'" % (filename,
|
||||
max_path))
|
||||
self.log.info("Writing max file '%s' to '%s'" % (filename, max_path))
|
||||
|
||||
if "representations" not in instance.data:
|
||||
instance.data["representations"] = []
|
||||
|
||||
# saving max scene
|
||||
with maintained_selection():
|
||||
# need to figure out how to select the camera
|
||||
rt.select(get_all_children(rt.getNodeByName(container)))
|
||||
rt.execute(f'saveNodes selection "{max_path}" quiet:true')
|
||||
nodes = get_all_children(rt.getNodeByName(container))
|
||||
rt.saveNodes(nodes, max_path, quiet=True)
|
||||
|
||||
self.log.info("Performing Extraction ...")
|
||||
|
||||
representation = {
|
||||
'name': 'max',
|
||||
'ext': 'max',
|
||||
'files': filename,
|
||||
"name": "max",
|
||||
"ext": "max",
|
||||
"files": filename,
|
||||
"stagingDir": stagingdir,
|
||||
}
|
||||
instance.data["representations"].append(representation)
|
||||
self.log.info("Extracted instance '%s' to: %s" % (instance.name,
|
||||
max_path))
|
||||
self.log.info(
|
||||
"Extracted instance '%s' to: %s" % (instance.name, max_path)
|
||||
)
|
||||
|
|
|
|||
|
|
@ -1,18 +1,11 @@
|
|||
import os
|
||||
import pyblish.api
|
||||
from openpype.pipeline import (
|
||||
publish,
|
||||
OptionalPyblishPluginMixin
|
||||
)
|
||||
from openpype.pipeline import publish, OptionalPyblishPluginMixin
|
||||
from pymxs import runtime as rt
|
||||
from openpype.hosts.max.api import (
|
||||
maintained_selection,
|
||||
get_all_children
|
||||
)
|
||||
from openpype.hosts.max.api import maintained_selection, get_all_children
|
||||
|
||||
|
||||
class ExtractModel(publish.Extractor,
|
||||
OptionalPyblishPluginMixin):
|
||||
class ExtractModel(publish.Extractor, OptionalPyblishPluginMixin):
|
||||
"""
|
||||
Extract Geometry in Alembic Format
|
||||
"""
|
||||
|
|
@ -36,39 +29,36 @@ class ExtractModel(publish.Extractor,
|
|||
filepath = os.path.join(stagingdir, filename)
|
||||
|
||||
# We run the render
|
||||
self.log.info("Writing alembic '%s' to '%s'" % (filename,
|
||||
stagingdir))
|
||||
self.log.info("Writing alembic '%s' to '%s'" % (filename, stagingdir))
|
||||
|
||||
export_cmd = (
|
||||
f"""
|
||||
AlembicExport.ArchiveType = #ogawa
|
||||
AlembicExport.CoordinateSystem = #maya
|
||||
AlembicExport.CustomAttributes = true
|
||||
AlembicExport.UVs = true
|
||||
AlembicExport.VertexColors = true
|
||||
AlembicExport.PreserveInstances = true
|
||||
|
||||
exportFile @"{filepath}" #noPrompt selectedOnly:on using:AlembicExport
|
||||
|
||||
""")
|
||||
|
||||
self.log.debug(f"Executing command: {export_cmd}")
|
||||
rt.AlembicExport.ArchiveType = rt.name("ogawa")
|
||||
rt.AlembicExport.CoordinateSystem = rt.name("maya")
|
||||
rt.AlembicExport.CustomAttributes = True
|
||||
rt.AlembicExport.UVs = True
|
||||
rt.AlembicExport.VertexColors = True
|
||||
rt.AlembicExport.PreserveInstances = True
|
||||
|
||||
with maintained_selection():
|
||||
# select and export
|
||||
rt.select(get_all_children(rt.getNodeByName(container)))
|
||||
rt.execute(export_cmd)
|
||||
rt.exportFile(
|
||||
filepath,
|
||||
rt.name("noPrompt"),
|
||||
selectedOnly=True,
|
||||
using=rt.AlembicExport,
|
||||
)
|
||||
|
||||
self.log.info("Performing Extraction ...")
|
||||
if "representations" not in instance.data:
|
||||
instance.data["representations"] = []
|
||||
|
||||
representation = {
|
||||
'name': 'abc',
|
||||
'ext': 'abc',
|
||||
'files': filename,
|
||||
"name": "abc",
|
||||
"ext": "abc",
|
||||
"files": filename,
|
||||
"stagingDir": stagingdir,
|
||||
}
|
||||
instance.data["representations"].append(representation)
|
||||
self.log.info("Extracted instance '%s' to: %s" % (instance.name,
|
||||
filepath))
|
||||
self.log.info(
|
||||
"Extracted instance '%s' to: %s" % (instance.name, filepath)
|
||||
)
|
||||
|
|
|
|||
|
|
@ -1,18 +1,11 @@
|
|||
import os
|
||||
import pyblish.api
|
||||
from openpype.pipeline import (
|
||||
publish,
|
||||
OptionalPyblishPluginMixin
|
||||
)
|
||||
from openpype.pipeline import publish, OptionalPyblishPluginMixin
|
||||
from pymxs import runtime as rt
|
||||
from openpype.hosts.max.api import (
|
||||
maintained_selection,
|
||||
get_all_children
|
||||
)
|
||||
from openpype.hosts.max.api import maintained_selection, get_all_children
|
||||
|
||||
|
||||
class ExtractModelFbx(publish.Extractor,
|
||||
OptionalPyblishPluginMixin):
|
||||
class ExtractModelFbx(publish.Extractor, OptionalPyblishPluginMixin):
|
||||
"""
|
||||
Extract Geometry in FBX Format
|
||||
"""
|
||||
|
|
@ -33,42 +26,38 @@ class ExtractModelFbx(publish.Extractor,
|
|||
|
||||
stagingdir = self.staging_dir(instance)
|
||||
filename = "{name}.fbx".format(**instance.data)
|
||||
filepath = os.path.join(stagingdir,
|
||||
filename)
|
||||
self.log.info("Writing FBX '%s' to '%s'" % (filepath,
|
||||
stagingdir))
|
||||
filepath = os.path.join(stagingdir, filename)
|
||||
self.log.info("Writing FBX '%s' to '%s'" % (filepath, stagingdir))
|
||||
|
||||
export_fbx_cmd = (
|
||||
f"""
|
||||
FBXExporterSetParam "Animation" false
|
||||
FBXExporterSetParam "Cameras" false
|
||||
FBXExporterSetParam "Lights" false
|
||||
FBXExporterSetParam "PointCache" false
|
||||
FBXExporterSetParam "AxisConversionMethod" "Animation"
|
||||
FbxExporterSetParam "UpAxis" "Y"
|
||||
FbxExporterSetParam "Preserveinstances" true
|
||||
|
||||
exportFile @"{filepath}" #noPrompt selectedOnly:true using:FBXEXP
|
||||
|
||||
""")
|
||||
|
||||
self.log.debug(f"Executing command: {export_fbx_cmd}")
|
||||
rt.FBXExporterSetParam("Animation", False)
|
||||
rt.FBXExporterSetParam("Cameras", False)
|
||||
rt.FBXExporterSetParam("Lights", False)
|
||||
rt.FBXExporterSetParam("PointCache", False)
|
||||
rt.FBXExporterSetParam("AxisConversionMethod", "Animation")
|
||||
rt.FBXExporterSetParam("UpAxis", "Y")
|
||||
rt.FBXExporterSetParam("Preserveinstances", True)
|
||||
|
||||
with maintained_selection():
|
||||
# select and export
|
||||
rt.select(get_all_children(rt.getNodeByName(container)))
|
||||
rt.execute(export_fbx_cmd)
|
||||
rt.exportFile(
|
||||
filepath,
|
||||
rt.name("noPrompt"),
|
||||
selectedOnly=True,
|
||||
using=rt.FBXEXP,
|
||||
)
|
||||
|
||||
self.log.info("Performing Extraction ...")
|
||||
if "representations" not in instance.data:
|
||||
instance.data["representations"] = []
|
||||
|
||||
representation = {
|
||||
'name': 'fbx',
|
||||
'ext': 'fbx',
|
||||
'files': filename,
|
||||
"name": "fbx",
|
||||
"ext": "fbx",
|
||||
"files": filename,
|
||||
"stagingDir": stagingdir,
|
||||
}
|
||||
instance.data["representations"].append(representation)
|
||||
self.log.info("Extracted instance '%s' to: %s" % (instance.name,
|
||||
filepath))
|
||||
self.log.info(
|
||||
"Extracted instance '%s' to: %s" % (instance.name, filepath)
|
||||
)
|
||||
|
|
|
|||
|
|
@ -1,18 +1,11 @@
|
|||
import os
|
||||
import pyblish.api
|
||||
from openpype.pipeline import (
|
||||
publish,
|
||||
OptionalPyblishPluginMixin
|
||||
)
|
||||
from openpype.pipeline import publish, OptionalPyblishPluginMixin
|
||||
from pymxs import runtime as rt
|
||||
from openpype.hosts.max.api import (
|
||||
maintained_selection,
|
||||
get_all_children
|
||||
)
|
||||
from openpype.hosts.max.api import maintained_selection, get_all_children
|
||||
|
||||
|
||||
class ExtractModelObj(publish.Extractor,
|
||||
OptionalPyblishPluginMixin):
|
||||
class ExtractModelObj(publish.Extractor, OptionalPyblishPluginMixin):
|
||||
"""
|
||||
Extract Geometry in OBJ Format
|
||||
"""
|
||||
|
|
@ -33,27 +26,31 @@ class ExtractModelObj(publish.Extractor,
|
|||
|
||||
stagingdir = self.staging_dir(instance)
|
||||
filename = "{name}.obj".format(**instance.data)
|
||||
filepath = os.path.join(stagingdir,
|
||||
filename)
|
||||
self.log.info("Writing OBJ '%s' to '%s'" % (filepath,
|
||||
stagingdir))
|
||||
filepath = os.path.join(stagingdir, filename)
|
||||
self.log.info("Writing OBJ '%s' to '%s'" % (filepath, stagingdir))
|
||||
|
||||
with maintained_selection():
|
||||
# select and export
|
||||
rt.select(get_all_children(rt.getNodeByName(container)))
|
||||
rt.execute(f'exportFile @"{filepath}" #noPrompt selectedOnly:true using:ObjExp') # noqa
|
||||
rt.exportFile(
|
||||
filepath,
|
||||
rt.name("noPrompt"),
|
||||
selectedOnly=True,
|
||||
using=rt.ObjExp,
|
||||
)
|
||||
|
||||
self.log.info("Performing Extraction ...")
|
||||
if "representations" not in instance.data:
|
||||
instance.data["representations"] = []
|
||||
|
||||
representation = {
|
||||
'name': 'obj',
|
||||
'ext': 'obj',
|
||||
'files': filename,
|
||||
"name": "obj",
|
||||
"ext": "obj",
|
||||
"files": filename,
|
||||
"stagingDir": stagingdir,
|
||||
}
|
||||
|
||||
instance.data["representations"].append(representation)
|
||||
self.log.info("Extracted instance '%s' to: %s" % (instance.name,
|
||||
filepath))
|
||||
self.log.info(
|
||||
"Extracted instance '%s' to: %s" % (instance.name, filepath)
|
||||
)
|
||||
|
|
|
|||
|
|
@ -41,10 +41,7 @@ import os
|
|||
import pyblish.api
|
||||
from openpype.pipeline import publish
|
||||
from pymxs import runtime as rt
|
||||
from openpype.hosts.max.api import (
|
||||
maintained_selection,
|
||||
get_all_children
|
||||
)
|
||||
from openpype.hosts.max.api import maintained_selection, get_all_children
|
||||
|
||||
|
||||
class ExtractAlembic(publish.Extractor):
|
||||
|
|
@ -66,35 +63,30 @@ class ExtractAlembic(publish.Extractor):
|
|||
path = os.path.join(parent_dir, file_name)
|
||||
|
||||
# We run the render
|
||||
self.log.info("Writing alembic '%s' to '%s'" % (file_name,
|
||||
parent_dir))
|
||||
self.log.info("Writing alembic '%s' to '%s'" % (file_name, parent_dir))
|
||||
|
||||
abc_export_cmd = (
|
||||
f"""
|
||||
AlembicExport.ArchiveType = #ogawa
|
||||
AlembicExport.CoordinateSystem = #maya
|
||||
AlembicExport.StartFrame = {start}
|
||||
AlembicExport.EndFrame = {end}
|
||||
|
||||
exportFile @"{path}" #noPrompt selectedOnly:on using:AlembicExport
|
||||
|
||||
""")
|
||||
|
||||
self.log.debug(f"Executing command: {abc_export_cmd}")
|
||||
rt.AlembicExport.ArchiveType = rt.name("ogawa")
|
||||
rt.AlembicExport.CoordinateSystem = rt.name("maya")
|
||||
rt.AlembicExport.StartFrame = start
|
||||
rt.AlembicExport.EndFrame = end
|
||||
|
||||
with maintained_selection():
|
||||
# select and export
|
||||
|
||||
rt.select(get_all_children(rt.getNodeByName(container)))
|
||||
rt.execute(abc_export_cmd)
|
||||
rt.exportFile(
|
||||
path,
|
||||
rt.name("noPrompt"),
|
||||
selectedOnly=True,
|
||||
using=rt.AlembicExport,
|
||||
)
|
||||
|
||||
if "representations" not in instance.data:
|
||||
instance.data["representations"] = []
|
||||
|
||||
representation = {
|
||||
'name': 'abc',
|
||||
'ext': 'abc',
|
||||
'files': file_name,
|
||||
"name": "abc",
|
||||
"ext": "abc",
|
||||
"files": file_name,
|
||||
"stagingDir": parent_dir,
|
||||
}
|
||||
instance.data["representations"].append(representation)
|
||||
|
|
|
|||
64
openpype/hosts/max/plugins/publish/validate_frame_range.py
Normal file
|
|
@ -0,0 +1,64 @@
import pyblish.api

from pymxs import runtime as rt
from openpype.pipeline import (
    OptionalPyblishPluginMixin
)
from openpype.pipeline.publish import (
    RepairAction,
    ValidateContentsOrder,
    PublishValidationError
)


class ValidateFrameRange(pyblish.api.InstancePlugin,
                         OptionalPyblishPluginMixin):
    """Validates the frame ranges.

    This is an optional validator checking if the frame range on instance
    matches the frame range specified for the asset.

    It also validates render frame ranges of render layers.

    Repair action will change everything to match the asset frame range.

    This can be turned off by the artist to allow custom ranges.
    """

    label = "Validate Frame Range"
    order = ValidateContentsOrder
    families = ["maxrender"]
    hosts = ["max"]
    optional = True
    actions = [RepairAction]

    def process(self, instance):
        if not self.is_active(instance.data):
            self.log.info("Skipping validation...")
            return
        context = instance.context

        frame_start = int(context.data.get("frameStart"))
        frame_end = int(context.data.get("frameEnd"))

        inst_frame_start = int(instance.data.get("frameStart"))
        inst_frame_end = int(instance.data.get("frameEnd"))

        errors = []
        if frame_start != inst_frame_start:
            errors.append(
                f"Start frame ({inst_frame_start}) on instance does not match "  # noqa
                f"with the start frame ({frame_start}) set on the asset data. ")  # noqa
        if frame_end != inst_frame_end:
            errors.append(
                f"End frame ({inst_frame_end}) on instance does not match "
                f"with the end frame ({frame_end}) set on the asset data. ")

        if errors:
            errors.append("You can use repair action to fix it.")
            raise PublishValidationError("\n".join(errors))

    @classmethod
    def repair(cls, instance):
        rt.rendStart = instance.context.data.get("frameStart")
        rt.rendEnd = instance.context.data.get("frameEnd")
|
||||
|
|
@ -0,0 +1,65 @@
|
|||
import pyblish.api
|
||||
from openpype.pipeline import (
|
||||
PublishValidationError,
|
||||
OptionalPyblishPluginMixin
|
||||
)
|
||||
from pymxs import runtime as rt
|
||||
from openpype.hosts.max.api.lib import reset_scene_resolution
|
||||
|
||||
from openpype.pipeline.context_tools import (
|
||||
get_current_project_asset,
|
||||
get_current_project
|
||||
)
|
||||
|
||||
|
||||
class ValidateResolutionSetting(pyblish.api.InstancePlugin,
|
||||
OptionalPyblishPluginMixin):
|
||||
"""Validate the resolution setting aligned with DB"""
|
||||
|
||||
order = pyblish.api.ValidatorOrder - 0.01
|
||||
families = ["maxrender"]
|
||||
hosts = ["max"]
|
||||
label = "Validate Resolution Setting"
|
||||
optional = True
|
||||
|
||||
def process(self, instance):
|
||||
if not self.is_active(instance.data):
|
||||
return
|
||||
width, height = self.get_db_resolution(instance)
|
||||
current_width = rt.renderwidth
|
||||
current_height = rt.renderHeight
|
||||
if current_width != width and current_height != height:
|
||||
raise PublishValidationError("Resolution Setting "
|
||||
"not matching resolution "
|
||||
"set on asset or shot.")
|
||||
if current_width != width:
|
||||
raise PublishValidationError("Width in Resolution Setting "
|
||||
"not matching resolution set "
|
||||
"on asset or shot.")
|
||||
|
||||
if current_height != height:
|
||||
raise PublishValidationError("Height in Resolution Setting "
|
||||
"not matching resolution set "
|
||||
"on asset or shot.")
|
||||
|
||||
def get_db_resolution(self, instance):
|
||||
data = ["data.resolutionWidth", "data.resolutionHeight"]
|
||||
project_resolution = get_current_project(fields=data)
|
||||
project_resolution_data = project_resolution["data"]
|
||||
asset_resolution = get_current_project_asset(fields=data)
|
||||
asset_resolution_data = asset_resolution["data"]
|
||||
# Set project resolution
|
||||
project_width = int(
|
||||
project_resolution_data.get("resolutionWidth", 1920))
|
||||
project_height = int(
|
||||
project_resolution_data.get("resolutionHeight", 1080))
|
||||
width = int(
|
||||
asset_resolution_data.get("resolutionWidth", project_width))
|
||||
height = int(
|
||||
asset_resolution_data.get("resolutionHeight", project_height))
|
||||
|
||||
return width, height
|
||||
|
||||
@classmethod
|
||||
def repair(cls, instance):
|
||||
reset_scene_resolution()
|
||||
|
|
@ -191,6 +191,44 @@ def maintained_selection():
|
|||
cmds.select(clear=True)
|
||||
|
||||
|
||||
def get_custom_namespace(custom_namespace):
|
||||
"""Return unique namespace.
|
||||
|
||||
The input namespace can contain a single group
|
||||
of '#' number tokens to indicate where the namespace's
|
||||
unique index should go. The amount of tokens defines
|
||||
the zero padding of the number, e.g ### turns into 001.
|
||||
|
||||
Warning: Note that a namespace will always be
|
||||
prefixed with a _ if it starts with a digit
|
||||
|
||||
Example:
|
||||
>>> get_custom_namespace("myspace_##_")
|
||||
# myspace_01_
|
||||
>>> get_custom_namespace("##_myspace")
|
||||
# _01_myspace
|
||||
>>> get_custom_namespace("myspace##")
|
||||
# myspace01
|
||||
|
||||
"""
|
||||
split = re.split("([#]+)", custom_namespace, 1)
|
||||
|
||||
if len(split) == 3:
|
||||
base, padding, suffix = split
|
||||
padding = "%0{}d".format(len(padding))
|
||||
else:
|
||||
base = split[0]
|
||||
padding = "%02d" # default padding
|
||||
suffix = ""
|
||||
|
||||
return unique_namespace(
|
||||
base,
|
||||
format=padding,
|
||||
prefix="_" if not base or base[0].isdigit() else "",
|
||||
suffix=suffix
|
||||
)
|
||||
|
||||
|
||||
def unique_namespace(namespace, format="%02d", prefix="", suffix=""):
|
||||
"""Return unique namespace
|
||||
|
||||
|
|
@ -317,11 +355,13 @@ def collect_animation_data(fps=False):
    # get scene values as defaults
    frame_start = cmds.playbackOptions(query=True, minTime=True)
    frame_end = cmds.playbackOptions(query=True, maxTime=True)
    handle_start = cmds.playbackOptions(query=True, animationStartTime=True)
    handle_end = cmds.playbackOptions(query=True, animationEndTime=True)
    frame_start_handle = cmds.playbackOptions(
        query=True, animationStartTime=True
    )
    frame_end_handle = cmds.playbackOptions(query=True, animationEndTime=True)

    handle_start = frame_start - handle_start
    handle_end = handle_end - frame_end
    handle_start = frame_start - frame_start_handle
    handle_end = frame_end_handle - frame_end

    # build attributes
    data = OrderedDict()
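The renamed variables make the handle arithmetic read correctly: the handles are the difference between the playback range and the wider animation range. A worked example with made-up values:

    # Illustrative values only: playback range 1001-1100, animation range 996-1105.
    frame_start, frame_end = 1001, 1100
    frame_start_handle, frame_end_handle = 996, 1105

    handle_start = frame_start - frame_start_handle  # 5 frames of head handle
    handle_end = frame_end_handle - frame_end        # 5 frames of tail handle
    assert (handle_start, handle_end) == (5, 5)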
|
||||
|
|
@ -3926,7 +3966,9 @@ def get_capture_preset(task_name, task_type, subset, project_settings, log):
|
|||
return capture_preset or {}
|
||||
|
||||
|
||||
def create_rig_animation_instance(nodes, context, namespace, log=None):
|
||||
def create_rig_animation_instance(
|
||||
nodes, context, namespace, options=None, log=None
|
||||
):
|
||||
"""Create an animation publish instance for loaded rigs.
|
||||
|
||||
See the RecreateRigAnimationInstance inventory action on how to use this
|
||||
|
|
@ -3936,12 +3978,16 @@ def create_rig_animation_instance(nodes, context, namespace, log=None):
|
|||
nodes (list): Member nodes of the rig instance.
|
||||
context (dict): Representation context of the rig container
|
||||
namespace (str): Namespace of the rig container
|
||||
options (dict, optional): Additional loader data
|
||||
log (logging.Logger, optional): Logger to log to if provided
|
||||
|
||||
Returns:
|
||||
None
|
||||
|
||||
"""
|
||||
if options is None:
|
||||
options = {}
|
||||
|
||||
output = next((node for node in nodes if
|
||||
node.endswith("out_SET")), None)
|
||||
controls = next((node for node in nodes if
|
||||
|
|
@ -3960,6 +4006,23 @@ def create_rig_animation_instance(nodes, context, namespace, log=None):
    asset = legacy_io.Session["AVALON_ASSET"]
    dependency = str(context["representation"]["_id"])

    custom_subset = options.get("animationSubsetName")
    if custom_subset:
        formatting_data = {
            "asset_name": context['asset']['name'],
            "asset_type": context['asset']['type'],
            "subset": context['subset']['name'],
            "family": (
                context['subset']['data'].get('family') or
                context['subset']['data']['families'][0]
            )
        }
        namespace = get_custom_namespace(
            custom_subset.format(
                **formatting_data
            )
        )

    if log:
        log.info("Creating subset: {}".format(namespace))
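A hedged sketch of how the new options argument feeds the namespace; the option key follows the diff, while the context values are made-up stand-ins for real database documents:

    options = {"animationSubsetName": "animation{subset}_{asset_name}"}

    formatting_data = {
        "asset_name": "heroA",
        "asset_type": "asset",
        "subset": "rigMain",
        "family": "rig",
    }
    custom_subset = options.get("animationSubsetName")
    if custom_subset:
        # The formatted string is then passed to get_custom_namespace().
        namespace = custom_subset.format(**formatting_data)
        print(namespace)  # animationrigMain_heroA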
|
||||
|
||||
|
|
|
|||
|
|
@ -84,44 +84,6 @@ def get_reference_node_parents(ref):
|
|||
return parents
|
||||
|
||||
|
||||
def get_custom_namespace(custom_namespace):
|
||||
"""Return unique namespace.
|
||||
|
||||
The input namespace can contain a single group
|
||||
of '#' number tokens to indicate where the namespace's
|
||||
unique index should go. The amount of tokens defines
|
||||
the zero padding of the number, e.g ### turns into 001.
|
||||
|
||||
Warning: Note that a namespace will always be
|
||||
prefixed with a _ if it starts with a digit
|
||||
|
||||
Example:
|
||||
>>> get_custom_namespace("myspace_##_")
|
||||
# myspace_01_
|
||||
>>> get_custom_namespace("##_myspace")
|
||||
# _01_myspace
|
||||
>>> get_custom_namespace("myspace##")
|
||||
# myspace01
|
||||
|
||||
"""
|
||||
split = re.split("([#]+)", custom_namespace, 1)
|
||||
|
||||
if len(split) == 3:
|
||||
base, padding, suffix = split
|
||||
padding = "%0{}d".format(len(padding))
|
||||
else:
|
||||
base = split[0]
|
||||
padding = "%02d" # default padding
|
||||
suffix = ""
|
||||
|
||||
return lib.unique_namespace(
|
||||
base,
|
||||
format=padding,
|
||||
prefix="_" if not base or base[0].isdigit() else "",
|
||||
suffix=suffix
|
||||
)
|
||||
|
||||
|
||||
class Creator(LegacyCreator):
|
||||
defaults = ['Main']
|
||||
|
||||
|
|
@ -216,7 +178,7 @@ class ReferenceLoader(Loader):
|
|||
count = options.get("count") or 1
|
||||
|
||||
for c in range(0, count):
|
||||
namespace = get_custom_namespace(custom_namespace)
|
||||
namespace = lib.get_custom_namespace(custom_namespace)
|
||||
group_name = "{}:{}".format(
|
||||
namespace,
|
||||
custom_group_name
|
||||
|
|
|
|||
|
|
@ -43,7 +43,24 @@ class MayaTemplateBuilder(AbstractTemplateBuilder):
|
|||
))
|
||||
|
||||
cmds.sets(name=PLACEHOLDER_SET, empty=True)
|
||||
new_nodes = cmds.file(path, i=True, returnNewNodes=True)
|
||||
new_nodes = cmds.file(
|
||||
path,
|
||||
i=True,
|
||||
returnNewNodes=True,
|
||||
preserveReferences=True,
|
||||
loadReferenceDepth="all",
|
||||
)
|
||||
|
||||
# make default cameras non-renderable
|
||||
default_cameras = [cam for cam in cmds.ls(cameras=True)
|
||||
if cmds.camera(cam, query=True, startupCamera=True)]
|
||||
for cam in default_cameras:
|
||||
if not cmds.attributeQuery("renderable", node=cam, exists=True):
|
||||
self.log.debug(
|
||||
"Camera {} has no attribute 'renderable'".format(cam)
|
||||
)
|
||||
continue
|
||||
cmds.setAttr("{}.renderable".format(cam), 0)
|
||||
|
||||
cmds.setAttr(PLACEHOLDER_SET + ".hiddenInOutliner", True)
|
||||
|
||||
|
|
|
|||
|
|
@ -223,7 +223,7 @@ class ReferenceLoader(openpype.hosts.maya.api.plugin.ReferenceLoader):
|
|||
def _post_process_rig(self, name, namespace, context, options):
|
||||
nodes = self[:]
|
||||
create_rig_animation_instance(
|
||||
nodes, context, namespace, log=self.log
|
||||
nodes, context, namespace, options=options, log=self.log
|
||||
)
|
||||
|
||||
def _lock_camera_transforms(self, nodes):
|
||||
|
|
|
|||
|
|
@ -50,7 +50,8 @@ class ValidateShaderName(pyblish.api.InstancePlugin):
        asset_name = instance.data.get("asset", None)

        # Check the number of connected shadingEngines per shape
        r = re.compile(cls.regex)
        regex_compile = re.compile(cls.regex)
        error_message = "object {0} has invalid shader name {1}"
        for shape in shapes:
            shading_engines = cmds.listConnections(shape,
                                                   destination=True,

@ -60,19 +61,18 @@ class ValidateShaderName(pyblish.api.InstancePlugin):
                                                   )

            for shader in shaders:
                m = r.match(cls.regex, shader)
                m = regex_compile.match(shader)
                if m is None:
                    invalid.append(shape)
                    cls.log.error(
                        "object {0} has invalid shader name {1}".format(shape,
                                                                        shader)
                    )
                    cls.log.error(error_message.format(shape, shader))
                else:
                    if 'asset' in r.groupindex:
                    if 'asset' in regex_compile.groupindex:
                        if m.group('asset') != asset_name:
                            invalid.append(shape)
                            cls.log.error(("object {0} has invalid "
                                           "shader name {1}").format(shape,
                                                                     shader))
                            message = error_message
                            message += " with missing asset name \"{2}\""
                            cls.log.error(
                                message.format(shape, shader, asset_name)
                            )

        return invalid
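For reference, a sketch of a shader-name regex with the optional named group "asset" that this validator inspects; the pattern itself is illustrative, not a project setting shipped with this commit:

    import re

    regex = r"^(?P<asset>[A-Za-z0-9]+)_(?P<shader>[A-Za-z0-9]+)_SHD$"
    regex_compile = re.compile(regex)

    m = regex_compile.match("heroA_skin_SHD")
    if m and "asset" in regex_compile.groupindex:
        print(m.group("asset"))  # "heroA", compared against the instance asset name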
|
||||
|
|
|
|||
|
|
@ -2220,13 +2220,13 @@ class WorkfileSettings(object):
        handle_end = data["handleEnd"]

        fps = float(data["fps"])
        frame_start = int(data["frameStart"]) - handle_start
        frame_end = int(data["frameEnd"]) + handle_end
        frame_start_handle = int(data["frameStart"]) - handle_start
        frame_end_handle = int(data["frameEnd"]) + handle_end

        self._root_node["lock_range"].setValue(False)
        self._root_node["fps"].setValue(fps)
        self._root_node["first_frame"].setValue(frame_start)
        self._root_node["last_frame"].setValue(frame_end)
        self._root_node["first_frame"].setValue(frame_start_handle)
        self._root_node["last_frame"].setValue(frame_end_handle)
        self._root_node["lock_range"].setValue(True)

        # setting active viewers
|
||||
|
|
|
|||
|
|
@ -133,11 +133,11 @@ class CollectNukeWrites(pyblish.api.InstancePlugin,
|
|||
else:
|
||||
representation['files'] = collected_frames
|
||||
|
||||
# inject colorspace data
|
||||
self.set_representation_colorspace(
|
||||
representation, instance.context,
|
||||
colorspace=colorspace
|
||||
)
|
||||
# inject colorspace data
|
||||
self.set_representation_colorspace(
|
||||
representation, instance.context,
|
||||
colorspace=colorspace
|
||||
)
|
||||
|
||||
instance.data["representations"].append(representation)
|
||||
self.log.info("Publishing rendered frames ...")
|
||||
|
|
|
|||
|
|
@ -1,4 +1,5 @@
|
|||
import os
|
||||
from pathlib import Path
|
||||
import platform
|
||||
from openpype.lib import PreLaunchHook
|
||||
from openpype.hosts.resolve.utils import setup
|
||||
|
|
@ -6,33 +7,57 @@ from openpype.hosts.resolve.utils import setup
|
|||
|
||||
class ResolvePrelaunch(PreLaunchHook):
|
||||
"""
|
||||
This hook will check if current workfile path has Resolve
|
||||
project inside. IF not, it initialize it and finally it pass
|
||||
path to the project by environment variable to Premiere launcher
|
||||
shell script.
|
||||
This hook will set up the Resolve scripting environment as described in
|
||||
Resolve's documentation found with the installed application at
|
||||
{resolve}/Support/Developer/Scripting/README.txt
|
||||
|
||||
Prepares the following environment variables:
|
||||
- `RESOLVE_SCRIPT_API`
|
||||
- `RESOLVE_SCRIPT_LIB`
|
||||
|
||||
It adds $RESOLVE_SCRIPT_API/Modules to PYTHONPATH.
|
||||
|
||||
Additionally it sets up the Python home for Python 3 based on the
|
||||
RESOLVE_PYTHON3_HOME in the environment (usually defined in OpenPype's
|
||||
Application environment for Resolve by the admin). For this it sets
|
||||
PYTHONHOME and PATH variables.
|
||||
|
||||
It also defines:
|
||||
- `RESOLVE_UTILITY_SCRIPTS_DIR`: Destination directory for OpenPype
|
||||
Fusion scripts to be copied to for Resolve to pick them up.
|
||||
- `OPENPYPE_LOG_NO_COLORS` to True to ensure OP doesn't try to
|
||||
use logging with terminal colors as it fails in Resolve.
|
||||
|
||||
"""
|
||||
|
||||
app_groups = ["resolve"]
|
||||
|
||||
def execute(self):
|
||||
current_platform = platform.system().lower()
|
||||
|
||||
PROGRAMDATA = self.launch_context.env.get("PROGRAMDATA", "")
|
||||
RESOLVE_SCRIPT_API_ = {
|
||||
programdata = self.launch_context.env.get("PROGRAMDATA", "")
|
||||
resolve_script_api_locations = {
|
||||
"windows": (
|
||||
f"{PROGRAMDATA}/Blackmagic Design/"
|
||||
f"{programdata}/Blackmagic Design/"
|
||||
"DaVinci Resolve/Support/Developer/Scripting"
|
||||
),
|
||||
"darwin": (
|
||||
"/Library/Application Support/Blackmagic Design"
|
||||
"/DaVinci Resolve/Developer/Scripting"
|
||||
),
|
||||
"linux": "/opt/resolve/Developer/Scripting"
|
||||
"linux": "/opt/resolve/Developer/Scripting",
|
||||
}
|
||||
RESOLVE_SCRIPT_API = os.path.normpath(
|
||||
RESOLVE_SCRIPT_API_[current_platform])
|
||||
self.launch_context.env["RESOLVE_SCRIPT_API"] = RESOLVE_SCRIPT_API
|
||||
resolve_script_api = Path(
|
||||
resolve_script_api_locations[current_platform]
|
||||
)
|
||||
self.log.info(
|
||||
f"setting RESOLVE_SCRIPT_API variable to {resolve_script_api}"
|
||||
)
|
||||
self.launch_context.env[
|
||||
"RESOLVE_SCRIPT_API"
|
||||
] = resolve_script_api.as_posix()
|
||||
|
||||
RESOLVE_SCRIPT_LIB_ = {
|
||||
resolve_script_lib_dirs = {
|
||||
"windows": (
|
||||
"C:/Program Files/Blackmagic Design"
|
||||
"/DaVinci Resolve/fusionscript.dll"
|
||||
|
|
@ -41,61 +66,69 @@ class ResolvePrelaunch(PreLaunchHook):
|
|||
"/Applications/DaVinci Resolve/DaVinci Resolve.app"
|
||||
"/Contents/Libraries/Fusion/fusionscript.so"
|
||||
),
|
||||
"linux": "/opt/resolve/libs/Fusion/fusionscript.so"
|
||||
"linux": "/opt/resolve/libs/Fusion/fusionscript.so",
|
||||
}
|
||||
RESOLVE_SCRIPT_LIB = os.path.normpath(
|
||||
RESOLVE_SCRIPT_LIB_[current_platform])
|
||||
self.launch_context.env["RESOLVE_SCRIPT_LIB"] = RESOLVE_SCRIPT_LIB
|
||||
resolve_script_lib = Path(resolve_script_lib_dirs[current_platform])
|
||||
self.launch_context.env[
|
||||
"RESOLVE_SCRIPT_LIB"
|
||||
] = resolve_script_lib.as_posix()
|
||||
self.log.info(
|
||||
f"setting RESOLVE_SCRIPT_LIB variable to {resolve_script_lib}"
|
||||
)
|
||||
|
||||
# TODO: add OTIO installation from `openpype/requirements.py`
|
||||
# TODO: add OTIO installation from `openpype/requirements.py`
|
||||
# making sure python <3.9.* is installed at provided path
|
||||
python3_home = os.path.normpath(
|
||||
self.launch_context.env.get("RESOLVE_PYTHON3_HOME", ""))
|
||||
python3_home = Path(
|
||||
self.launch_context.env.get("RESOLVE_PYTHON3_HOME", "")
|
||||
)
|
||||
|
||||
assert os.path.isdir(python3_home), (
|
||||
assert python3_home.is_dir(), (
|
||||
"Python 3 is not installed at the provided folder path. Either "
|
||||
"make sure the `environments\resolve.json` is having correctly "
|
||||
"set `RESOLVE_PYTHON3_HOME` or make sure Python 3 is installed "
|
||||
f"in given path. \nRESOLVE_PYTHON3_HOME: `{python3_home}`"
|
||||
)
|
||||
self.launch_context.env["PYTHONHOME"] = python3_home
|
||||
self.log.info(f"Path to Resolve Python folder: `{python3_home}`...")
|
||||
|
||||
# add to the python path to path
|
||||
env_path = self.launch_context.env["PATH"]
|
||||
self.launch_context.env["PATH"] = os.pathsep.join([
|
||||
python3_home,
|
||||
os.path.join(python3_home, "Scripts")
|
||||
] + env_path.split(os.pathsep))
|
||||
|
||||
self.log.debug(f"PATH: {self.launch_context.env['PATH']}")
|
||||
python3_home_str = python3_home.as_posix()
|
||||
self.launch_context.env["PYTHONHOME"] = python3_home_str
|
||||
self.log.info(f"Path to Resolve Python folder: `{python3_home_str}`")
|
||||
|
||||
# add to the PYTHONPATH
|
||||
env_pythonpath = self.launch_context.env["PYTHONPATH"]
|
||||
self.launch_context.env["PYTHONPATH"] = os.pathsep.join([
|
||||
os.path.join(python3_home, "Lib", "site-packages"),
|
||||
os.path.join(RESOLVE_SCRIPT_API, "Modules"),
|
||||
] + env_pythonpath.split(os.pathsep))
|
||||
modules_path = Path(resolve_script_api, "Modules").as_posix()
|
||||
self.launch_context.env[
|
||||
"PYTHONPATH"
|
||||
] = f"{modules_path}{os.pathsep}{env_pythonpath}"
|
||||
|
||||
self.log.debug(f"PYTHONPATH: {self.launch_context.env['PYTHONPATH']}")
|
||||
|
||||
RESOLVE_UTILITY_SCRIPTS_DIR_ = {
|
||||
# add the pythonhome folder to PATH because on Windows
|
||||
# this is needed for Py3 to be correctly detected within Resolve
|
||||
env_path = self.launch_context.env["PATH"]
|
||||
self.log.info(f"Adding `{python3_home_str}` to the PATH variable")
|
||||
self.launch_context.env[
|
||||
"PATH"
|
||||
] = f"{python3_home_str}{os.pathsep}{env_path}"
|
||||
|
||||
self.log.debug(f"PATH: {self.launch_context.env['PATH']}")
|
||||
|
||||
resolve_utility_scripts_dirs = {
|
||||
"windows": (
|
||||
f"{PROGRAMDATA}/Blackmagic Design"
|
||||
f"{programdata}/Blackmagic Design"
|
||||
"/DaVinci Resolve/Fusion/Scripts/Comp"
|
||||
),
|
||||
"darwin": (
|
||||
"/Library/Application Support/Blackmagic Design"
|
||||
"/DaVinci Resolve/Fusion/Scripts/Comp"
|
||||
),
|
||||
"linux": "/opt/resolve/Fusion/Scripts/Comp"
|
||||
"linux": "/opt/resolve/Fusion/Scripts/Comp",
|
||||
}
|
||||
RESOLVE_UTILITY_SCRIPTS_DIR = os.path.normpath(
|
||||
RESOLVE_UTILITY_SCRIPTS_DIR_[current_platform]
|
||||
resolve_utility_scripts_dir = Path(
|
||||
resolve_utility_scripts_dirs[current_platform]
|
||||
)
|
||||
# setting utility scripts dir for scripts syncing
|
||||
self.launch_context.env["RESOLVE_UTILITY_SCRIPTS_DIR"] = (
|
||||
RESOLVE_UTILITY_SCRIPTS_DIR)
|
||||
self.launch_context.env[
|
||||
"RESOLVE_UTILITY_SCRIPTS_DIR"
|
||||
] = resolve_utility_scripts_dir.as_posix()
|
||||
|
||||
# remove terminal coloring tags
|
||||
self.launch_context.env["OPENPYPE_LOG_NO_COLORS"] = "True"
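A condensed sketch of the environment this hook ends up preparing on Windows; the paths are illustrative defaults, not values read from any settings:

    import os

    env = {
        "RESOLVE_SCRIPT_API": (
            "C:/ProgramData/Blackmagic Design/DaVinci Resolve"
            "/Support/Developer/Scripting"
        ),
        "RESOLVE_PYTHON3_HOME": "C:/Python39",
        "PYTHONPATH": "",
        "PATH": os.environ.get("PATH", ""),
    }
    env["PYTHONHOME"] = env["RESOLVE_PYTHON3_HOME"]
    modules = env["RESOLVE_SCRIPT_API"] + "/Modules"
    env["PYTHONPATH"] = f"{modules}{os.pathsep}{env['PYTHONPATH']}"
    env["PATH"] = f"{env['RESOLVE_PYTHON3_HOME']}{os.pathsep}{env['PATH']}"
    env["OPENPYPE_LOG_NO_COLORS"] = "True"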
|
||||
|
|
|
|||
|
|
@ -8,30 +8,30 @@ RESOLVE_ROOT_DIR = os.path.dirname(os.path.abspath(__file__))
|
|||
def setup(env):
|
||||
log = Logger.get_logger("ResolveSetup")
|
||||
scripts = {}
|
||||
us_env = env.get("RESOLVE_UTILITY_SCRIPTS_SOURCE_DIR")
|
||||
us_dir = env["RESOLVE_UTILITY_SCRIPTS_DIR"]
|
||||
util_scripts_env = env.get("RESOLVE_UTILITY_SCRIPTS_SOURCE_DIR")
|
||||
util_scripts_dir = env["RESOLVE_UTILITY_SCRIPTS_DIR"]
|
||||
|
||||
us_paths = [os.path.join(
|
||||
util_scripts_paths = [os.path.join(
|
||||
RESOLVE_ROOT_DIR,
|
||||
"utility_scripts"
|
||||
)]
|
||||
|
||||
# collect script dirs
|
||||
if us_env:
|
||||
log.info("Utility Scripts Env: `{}`".format(us_env))
|
||||
us_paths = us_env.split(
|
||||
os.pathsep) + us_paths
|
||||
if util_scripts_env:
|
||||
log.info("Utility Scripts Env: `{}`".format(util_scripts_env))
|
||||
util_scripts_paths = util_scripts_env.split(
|
||||
os.pathsep) + util_scripts_paths
|
||||
|
||||
# collect scripts from dirs
|
||||
for path in us_paths:
|
||||
for path in util_scripts_paths:
|
||||
scripts.update({path: os.listdir(path)})
|
||||
|
||||
log.info("Utility Scripts Dir: `{}`".format(us_paths))
|
||||
log.info("Utility Scripts Dir: `{}`".format(util_scripts_paths))
|
||||
log.info("Utility Scripts: `{}`".format(scripts))
|
||||
|
||||
# make sure no script file is in folder
|
||||
for s in os.listdir(us_dir):
|
||||
path = os.path.join(us_dir, s)
|
||||
for script in os.listdir(util_scripts_dir):
|
||||
path = os.path.join(util_scripts_dir, script)
|
||||
log.info("Removing `{}`...".format(path))
|
||||
if os.path.isdir(path):
|
||||
shutil.rmtree(path, onerror=None)
|
||||
|
|
@ -39,12 +39,10 @@ def setup(env):
|
|||
os.remove(path)
|
||||
|
||||
# copy scripts into Resolve's utility scripts dir
|
||||
for d, sl in scripts.items():
|
||||
# directory and scripts list
|
||||
for s in sl:
|
||||
# script in script list
|
||||
src = os.path.join(d, s)
|
||||
dst = os.path.join(us_dir, s)
|
||||
for directory, scripts in scripts.items():
|
||||
for script in scripts:
|
||||
src = os.path.join(directory, script)
|
||||
dst = os.path.join(util_scripts_dir, script)
|
||||
log.info("Copying `{}` to `{}`...".format(src, dst))
|
||||
if os.path.isdir(src):
|
||||
shutil.copytree(
|
||||
|
|
|
|||
|
|
@ -4,6 +4,6 @@ Supported Unreal Engine version is 4.26+ (mainly because of major Python changes

### Project naming
Unreal doesn't support project names starting with non-alphabetic character. So names like `123_myProject` are
invalid. If OpenPype detects such name it automatically prepends letter **P** to make it valid name, so `123_myProject`
invalid. If Ayon detects such name it automatically prepends letter **P** to make it valid name, so `123_myProject`
will become `P123_myProject`. There is also soft-limit on project name length to be shorter than 20 characters.
Longer names will issue warning in Unreal Editor that there might be possible side effects.
Longer names will issue warning in Unreal Editor that there might be possible side effects.
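A small sketch of the renaming rule described above; this is an editorial illustration, not the function the integration actually uses:

    def sanitize_unreal_project_name(name):
        # Unreal requires the project name to start with an alphabetic character.
        if not name[:1].isalpha():
            name = "P" + name
        if len(name) >= 20:
            print("Warning: long project names may cause side effects in Unreal.")
        return name

    print(sanitize_unreal_project_name("123_myProject"))  # P123_myProject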
|
||||
|
|
|
|||
|
|
@ -1,5 +1,5 @@
|
|||
import os
|
||||
from openpype.modules import OpenPypeModule, IHostAddon
|
||||
from openpype.modules import IHostAddon, OpenPypeModule
|
||||
|
||||
UNREAL_ROOT_DIR = os.path.dirname(os.path.abspath(__file__))
|
||||
|
||||
|
|
@ -13,15 +13,27 @@ class UnrealAddon(OpenPypeModule, IHostAddon):

    def add_implementation_envs(self, env, app):
        """Modify environments to contain all required for implementation."""
        # Set OPENPYPE_UNREAL_PLUGIN required for Unreal implementation
        # Set AYON_UNREAL_PLUGIN required for Unreal implementation
        # Imports are in this method for Python 2 compatibility of an addon
        from pathlib import Path

        ue_plugin = "UE_5.0" if app.name[:1] == "5" else "UE_4.7"
        from .lib import get_compatible_integration

        ue_version = app.name.replace("-", ".")
        unreal_plugin_path = os.path.join(
            UNREAL_ROOT_DIR, "integration", ue_plugin, "OpenPype"
            UNREAL_ROOT_DIR, "integration", "UE_{}".format(ue_version), "Ayon"
        )
        if not env.get("OPENPYPE_UNREAL_PLUGIN") or \
                env.get("OPENPYPE_UNREAL_PLUGIN") != unreal_plugin_path:
            env["OPENPYPE_UNREAL_PLUGIN"] = unreal_plugin_path
        if not Path(unreal_plugin_path).exists():
            compatible_versions = get_compatible_integration(
                ue_version, Path(UNREAL_ROOT_DIR) / "integration"
            )
            if compatible_versions:
                unreal_plugin_path = compatible_versions[-1] / "Ayon"
                unreal_plugin_path = unreal_plugin_path.as_posix()

        if not env.get("AYON_UNREAL_PLUGIN") or \
                env.get("AYON_UNREAL_PLUGIN") != unreal_plugin_path:
            env["AYON_UNREAL_PLUGIN"] = unreal_plugin_path

        # Set default environments if they are not set via settings
        defaults = {
|
||||
|
|
|
|||
|
|
@ -1,5 +1,5 @@
|
|||
# -*- coding: utf-8 -*-
|
||||
"""Unreal Editor OpenPype host API."""
|
||||
"""Unreal Editor Ayon host API."""
|
||||
|
||||
from .plugin import (
|
||||
UnrealActorCreator,
|
||||
|
|
|
|||
|
|
@ -2,15 +2,15 @@
|
|||
import unreal # noqa
|
||||
|
||||
|
||||
class OpenPypeUnrealException(Exception):
|
||||
class AyonUnrealException(Exception):
|
||||
pass
|
||||
|
||||
|
||||
@unreal.uclass()
|
||||
class OpenPypeHelpers(unreal.OpenPypeLib):
|
||||
"""Class wrapping some useful functions for OpenPype.
|
||||
class AyonHelpers(unreal.AyonLib):
|
||||
"""Class wrapping some useful functions for Ayon.
|
||||
|
||||
This class is extending native BP class in OpenPype Integration Plugin.
|
||||
This class is extending native BP class in Ayon Integration Plugin.
|
||||
|
||||
"""
|
||||
|
||||
|
|
@ -29,13 +29,13 @@ class OpenPypeHelpers(unreal.OpenPypeLib):
|
|||
|
||||
Example:
|
||||
|
||||
OpenPypeHelpers().set_folder_color(
|
||||
AyonHelpers().set_folder_color(
|
||||
"/Game/Path", unreal.LinearColor(a=1.0, r=1.0, g=0.5, b=0)
|
||||
)
|
||||
|
||||
Note:
|
||||
This will take effect only after Editor is restarted. I couldn't
|
||||
find a way to refresh it. Also this saves the color definition
|
||||
find a way to refresh it. Also, this saves the color definition
|
||||
into the project config, binding this path with color. So if you
|
||||
delete this path and later re-create, it will set this color
|
||||
again.
|
||||
|
|
|
|||
|
|
@ -14,7 +14,7 @@ from openpype.pipeline import (
|
|||
register_creator_plugin_path,
|
||||
deregister_loader_plugin_path,
|
||||
deregister_creator_plugin_path,
|
||||
AVALON_CONTAINER_ID,
|
||||
AYON_CONTAINER_ID,
|
||||
)
|
||||
from openpype.tools.utils import host_tools
|
||||
import openpype.hosts.unreal
|
||||
|
|
@ -22,12 +22,13 @@ from openpype.host import HostBase, ILoadHost, IPublishHost
|
|||
|
||||
import unreal # noqa
|
||||
|
||||
# Rename to Ayon once parent module renames
|
||||
logger = logging.getLogger("openpype.hosts.unreal")
|
||||
|
||||
OPENPYPE_CONTAINERS = "OpenPypeContainers"
|
||||
CONTEXT_CONTAINER = "OpenPype/context.json"
|
||||
AYON_CONTAINERS = "AyonContainers"
|
||||
CONTEXT_CONTAINER = "Ayon/context.json"
|
||||
UNREAL_VERSION = semver.VersionInfo(
|
||||
*os.getenv("OPENPYPE_UNREAL_VERSION").split(".")
|
||||
*os.getenv("AYON_UNREAL_VERSION").split(".")
|
||||
)
|
||||
|
||||
HOST_DIR = os.path.dirname(os.path.abspath(openpype.hosts.unreal.__file__))
|
||||
|
|
@ -53,14 +54,14 @@ class UnrealHost(HostBase, ILoadHost, IPublishHost):
|
|||
def get_containers(self):
|
||||
return ls()
|
||||
|
||||
def show_tools_popup(self):
|
||||
@staticmethod
|
||||
def show_tools_popup():
|
||||
"""Show tools popup with actions leading to show other tools."""
|
||||
|
||||
show_tools_popup()
|
||||
|
||||
def show_tools_dialog(self):
|
||||
@staticmethod
|
||||
def show_tools_dialog():
|
||||
"""Show tools dialog with actions leading to show other tools."""
|
||||
|
||||
show_tools_dialog()
|
||||
|
||||
def update_context_data(self, data, changes):
|
||||
|
|
@ -72,9 +73,10 @@ class UnrealHost(HostBase, ILoadHost, IPublishHost):
|
|||
with open(op_ctx, "w+") as f:
|
||||
json.dump(data, f)
|
||||
break
|
||||
except IOError:
|
||||
except IOError as e:
|
||||
if i == attempts - 1:
|
||||
raise Exception("Failed to write context data. Aborting.")
|
||||
raise Exception(
|
||||
"Failed to write context data. Aborting.") from e
|
||||
unreal.log_warning("Failed to write context data. Retrying...")
|
||||
i += 1
|
||||
time.sleep(3)
|
||||
|
|
@ -95,19 +97,30 @@ def install():
|
|||
print("-=" * 40)
|
||||
logo = '''.
|
||||
.
|
||||
____________
|
||||
/ \\ __ \\
|
||||
\\ \\ \\/_\\ \\
|
||||
\\ \\ _____/ ______
|
||||
\\ \\ \\___// \\ \\
|
||||
\\ \\____\\ \\ \\_____\\
|
||||
\\/_____/ \\/______/ PYPE Club .
|
||||
·
|
||||
│
|
||||
·∙/
|
||||
·-∙•∙-·
|
||||
/ \\ /∙· / \\
|
||||
∙ \\ │ / ∙
|
||||
\\ \\ · / /
|
||||
\\\\ ∙ ∙ //
|
||||
\\\\/ \\//
|
||||
___
|
||||
│ │
|
||||
│ │
|
||||
│ │
|
||||
│___│
|
||||
-·
|
||||
|
||||
·-─═─-∙ A Y O N ∙-─═─-·
|
||||
by YNPUT
|
||||
.
|
||||
'''
|
||||
print(logo)
|
||||
print("installing OpenPype for Unreal ...")
|
||||
print("installing Ayon for Unreal ...")
|
||||
print("-=" * 40)
|
||||
logger.info("installing OpenPype for Unreal")
|
||||
logger.info("installing Ayon for Unreal")
|
||||
pyblish.api.register_host("unreal")
|
||||
pyblish.api.register_plugin_path(str(PUBLISH_PATH))
|
||||
register_loader_plugin_path(str(LOAD_PATH))
|
||||
|
|
@ -117,7 +130,7 @@ def install():
|
|||
|
||||
|
||||
def uninstall():
|
||||
"""Uninstall Unreal configuration for Avalon."""
|
||||
"""Uninstall Unreal configuration for Ayon."""
|
||||
pyblish.api.deregister_plugin_path(str(PUBLISH_PATH))
|
||||
deregister_loader_plugin_path(str(LOAD_PATH))
|
||||
deregister_creator_plugin_path(str(CREATE_PATH))
|
||||
|
|
@ -125,14 +138,14 @@ def uninstall():
|
|||
|
||||
def _register_callbacks():
|
||||
"""
|
||||
TODO: Implement callbacks if supported by UE4
|
||||
TODO: Implement callbacks if supported by UE
|
||||
"""
|
||||
pass
|
||||
|
||||
|
||||
def _register_events():
|
||||
"""
|
||||
TODO: Implement callbacks if supported by UE4
|
||||
TODO: Implement callbacks if supported by UE
|
||||
"""
|
||||
pass
|
||||
|
||||
|
|
@ -146,32 +159,30 @@ def ls():
|
|||
"""
|
||||
ar = unreal.AssetRegistryHelpers.get_asset_registry()
|
||||
# UE 5.1 changed how class name is specified
|
||||
class_name = ["/Script/OpenPype", "AssetContainer"] if UNREAL_VERSION.major == 5 and UNREAL_VERSION.minor > 0 else "AssetContainer" # noqa
|
||||
openpype_containers = ar.get_assets_by_class(class_name, True)
|
||||
class_name = ["/Script/Ayon", "AyonAssetContainer"] if UNREAL_VERSION.major == 5 and UNREAL_VERSION.minor > 0 else "AyonAssetContainer" # noqa
|
||||
ayon_containers = ar.get_assets_by_class(class_name, True)
|
||||
|
||||
# get_asset_by_class returns AssetData. To get all metadata we need to
|
||||
# load asset. get_tag_values() work only on metadata registered in
|
||||
# Asset Registry Project settings (and there is no way to set it with
|
||||
# python short of editing ini configuration file).
|
||||
for asset_data in openpype_containers:
|
||||
for asset_data in ayon_containers:
|
||||
asset = asset_data.get_asset()
|
||||
data = unreal.EditorAssetLibrary.get_metadata_tag_values(asset)
|
||||
data["objectName"] = asset_data.asset_name
|
||||
data = cast_map_to_str_dict(data)
|
||||
|
||||
yield data
|
||||
yield cast_map_to_str_dict(data)
|
||||
|
||||
|
||||
def ls_inst():
|
||||
ar = unreal.AssetRegistryHelpers.get_asset_registry()
|
||||
# UE 5.1 changed how class name is specified
|
||||
class_name = [
|
||||
"/Script/OpenPype",
|
||||
"OpenPypePublishInstance"
|
||||
"/Script/Ayon",
|
||||
"AyonPublishInstance"
|
||||
] if (
|
||||
UNREAL_VERSION.major == 5
|
||||
and UNREAL_VERSION.minor > 0
|
||||
) else "OpenPypePublishInstance" # noqa
|
||||
) else "AyonPublishInstance" # noqa
|
||||
instances = ar.get_assets_by_class(class_name, True)
|
||||
|
||||
# get_asset_by_class returns AssetData. To get all metadata we need to
|
||||
|
|
@ -182,13 +193,11 @@ def ls_inst():
|
|||
asset = asset_data.get_asset()
|
||||
data = unreal.EditorAssetLibrary.get_metadata_tag_values(asset)
|
||||
data["objectName"] = asset_data.asset_name
|
||||
data = cast_map_to_str_dict(data)
|
||||
|
||||
yield data
|
||||
yield cast_map_to_str_dict(data)
|
||||
|
||||
|
||||
def parse_container(container):
|
||||
"""To get data from container, AssetContainer must be loaded.
|
||||
"""To get data from container, AyonAssetContainer must be loaded.
|
||||
|
||||
Args:
|
||||
container(str): path to container
|
||||
|
|
@ -217,7 +226,7 @@ def containerise(name, namespace, nodes, context, loader=None, suffix="_CON"):
|
|||
Unreal doesn't support *groups* of assets that you can add metadata to.
|
||||
But it does support folders that helps to organize asset. Unfortunately
|
||||
those folders are just that - you cannot add any additional information
|
||||
to them. OpenPype Integration Plugin is providing way out - Implementing
|
||||
to them. Ayon Integration Plugin is providing way out - Implementing
|
||||
`AssetContainer` Blueprint class. This class when added to folder can
|
||||
handle metadata on it using standard
|
||||
:func:`unreal.EditorAssetLibrary.set_metadata_tag()` and
|
||||
|
|
@ -226,30 +235,30 @@ def containerise(name, namespace, nodes, context, loader=None, suffix="_CON"):
|
|||
those assets is available as `assets` property.
|
||||
|
||||
This is list of strings starting with asset type and ending with its path:
|
||||
`Material /Game/OpenPype/Test/TestMaterial.TestMaterial`
|
||||
`Material /Game/Ayon/Test/TestMaterial.TestMaterial`
|
||||
|
||||
"""
|
||||
# 1 - create directory for container
|
||||
root = "/Game"
|
||||
container_name = "{}{}".format(name, suffix)
|
||||
container_name = f"{name}{suffix}"
|
||||
new_name = move_assets_to_path(root, container_name, nodes)
|
||||
|
||||
# 2 - create Asset Container there
|
||||
path = "{}/{}".format(root, new_name)
|
||||
path = f"{root}/{new_name}"
|
||||
create_container(container=container_name, path=path)
|
||||
|
||||
namespace = path
|
||||
|
||||
data = {
|
||||
"schema": "openpype:container-2.0",
|
||||
"id": AVALON_CONTAINER_ID,
|
||||
"schema": "ayon:container-2.0",
|
||||
"id": AYON_CONTAINER_ID,
|
||||
"name": new_name,
|
||||
"namespace": namespace,
|
||||
"loader": str(loader),
|
||||
"representation": context["representation"]["_id"],
|
||||
}
|
||||
# 3 - imprint data
|
||||
imprint("{}/{}".format(path, container_name), data)
|
||||
imprint(f"{path}/{container_name}", data)
|
||||
return path
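A hedged example of how a loader might call containerise() after importing assets; it only runs inside Unreal with the Ayon plugin loaded, and the asset path and representation id are placeholders:

    nodes = ["/Game/Ayon/Test/TestMaterial.TestMaterial"]
    context = {"representation": {"_id": "633f1c7a9f0e0c1b2a3d4e5f"}}

    container_path = containerise(
        name="TestMaterial",
        namespace="",
        nodes=nodes,
        context=context,
        loader="MaterialLoader",
    )
    # Returns something like "/Game/TestMaterial_CON"; an AyonAssetContainer in
    # that folder is imprinted with the schema, id, loader and representation.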
|
||||
|
||||
|
||||
|
|
@ -257,7 +266,7 @@ def instantiate(root, name, data, assets=None, suffix="_INS"):
|
|||
"""Bundles *nodes* into *container*.
|
||||
|
||||
Marking it with metadata as publishable instance. If assets are provided,
|
||||
they are moved to new path where `OpenPypePublishInstance` class asset is
|
||||
they are moved to new path where `AyonPublishInstance` class asset is
|
||||
created and imprinted with metadata.
|
||||
|
||||
This can then be collected for publishing by Pyblish for example.
|
||||
|
|
@ -271,7 +280,7 @@ def instantiate(root, name, data, assets=None, suffix="_INS"):
|
|||
suffix (str): suffix string to append to instance name
|
||||
|
||||
"""
|
||||
container_name = "{}{}".format(name, suffix)
|
||||
container_name = f"{name}{suffix}"
|
||||
|
||||
# if we specify assets, create new folder and move them there. If not,
|
||||
# just create empty folder
|
||||
|
|
@ -280,10 +289,10 @@ def instantiate(root, name, data, assets=None, suffix="_INS"):
|
|||
else:
|
||||
new_name = create_folder(root, name)
|
||||
|
||||
path = "{}/{}".format(root, new_name)
|
||||
path = f"{root}/{new_name}"
|
||||
create_publish_instance(instance=container_name, path=path)
|
||||
|
||||
imprint("{}/{}".format(path, container_name), data)
|
||||
imprint(f"{path}/{container_name}", data)
|
||||
|
||||
|
||||
def imprint(node, data):
|
||||
|
|
@ -299,7 +308,7 @@ def imprint(node, data):
|
|||
loaded_asset, key, str(value)
|
||||
)
|
||||
|
||||
with unreal.ScopedEditorTransaction("OpenPype containerising"):
|
||||
with unreal.ScopedEditorTransaction("Ayon containerising"):
|
||||
unreal.EditorAssetLibrary.save_asset(node)
|
||||
|
||||
|
||||
|
|
@ -366,11 +375,11 @@ def create_folder(root: str, name: str) -> str:
|
|||
eal = unreal.EditorAssetLibrary
|
||||
index = 1
|
||||
while True:
|
||||
if eal.does_directory_exist("{}/{}".format(root, name)):
|
||||
name = "{}{}".format(name, index)
|
||||
if eal.does_directory_exist(f"{root}/{name}"):
|
||||
name = f"{name}{index}"
|
||||
index += 1
|
||||
else:
|
||||
eal.make_directory("{}/{}".format(root, name))
|
||||
eal.make_directory(f"{root}/{name}")
|
||||
break
|
||||
|
||||
return name
|
||||
|
|
@ -403,9 +412,7 @@ def move_assets_to_path(root: str, name: str, assets: List[str]) -> str:
|
|||
unreal.log(assets)
|
||||
for asset in assets:
|
||||
loaded = eal.load_asset(asset)
|
||||
eal.rename_asset(
|
||||
asset, "{}/{}/{}".format(root, name, loaded.get_name())
|
||||
)
|
||||
eal.rename_asset(asset, f"{root}/{name}/{loaded.get_name()}")
|
||||
|
||||
return name
|
||||
|
||||
|
|
@ -432,17 +439,16 @@ def create_container(container: str, path: str) -> unreal.Object:
|
|||
)
|
||||
|
||||
"""
|
||||
factory = unreal.AssetContainerFactory()
|
||||
factory = unreal.AyonAssetContainerFactory()
|
||||
tools = unreal.AssetToolsHelpers().get_asset_tools()
|
||||
|
||||
asset = tools.create_asset(container, path, None, factory)
|
||||
return asset
|
||||
return tools.create_asset(container, path, None, factory)
|
||||
|
||||
|
||||
def create_publish_instance(instance: str, path: str) -> unreal.Object:
|
||||
"""Helper function to create OpenPype Publish Instance on given path.
|
||||
"""Helper function to create Ayon Publish Instance on given path.
|
||||
|
||||
This behaves similarly as :func:`create_openpype_container`.
|
||||
This behaves similarly as :func:`create_ayon_container`.
|
||||
|
||||
Args:
|
||||
path (str): Path where to create Publish Instance.
|
||||
|
|
@ -460,10 +466,9 @@ def create_publish_instance(instance: str, path: str) -> unreal.Object:
|
|||
)
|
||||
|
||||
"""
|
||||
factory = unreal.OpenPypePublishInstanceFactory()
|
||||
factory = unreal.AyonPublishInstanceFactory()
|
||||
tools = unreal.AssetToolsHelpers().get_asset_tools()
|
||||
asset = tools.create_asset(instance, path, None, factory)
|
||||
return asset
|
||||
return tools.create_asset(instance, path, None, factory)
|
||||
|
||||
|
||||
def cast_map_to_str_dict(umap) -> dict:
|
||||
|
|
@ -494,11 +499,14 @@ def get_subsequences(sequence: unreal.LevelSequence):
|
|||
|
||||
"""
|
||||
tracks = sequence.get_master_tracks()
|
||||
subscene_track = None
|
||||
for t in tracks:
|
||||
if t.get_class() == unreal.MovieSceneSubTrack.static_class():
|
||||
subscene_track = t
|
||||
break
|
||||
subscene_track = next(
|
||||
(
|
||||
t
|
||||
for t in tracks
|
||||
if t.get_class() == unreal.MovieSceneSubTrack.static_class()
|
||||
),
|
||||
None,
|
||||
)
|
||||
if subscene_track is not None and subscene_track.get_sections():
|
||||
return subscene_track.get_sections()
|
||||
return []
|
||||
|
|
|
|||
|
|
@ -31,7 +31,7 @@ from openpype.pipeline import (
|
|||
@six.add_metaclass(ABCMeta)
|
||||
class UnrealBaseCreator(Creator):
|
||||
"""Base class for Unreal creator plugins."""
|
||||
root = "/Game/OpenPype/PublishInstances"
|
||||
root = "/Game/Ayon/AyonPublishInstances"
|
||||
suffix = "_INS"
|
||||
|
||||
@staticmethod
|
||||
|
|
@ -243,5 +243,5 @@ class UnrealActorCreator(UnrealBaseCreator):
|
|||
|
||||
|
||||
class Loader(LoaderPlugin, ABC):
|
||||
"""This serves as skeleton for future OpenPype specific functionality"""
|
||||
"""This serves as skeleton for future Ayon specific functionality"""
|
||||
pass
|
||||
|
|
|
|||
|
|
@ -51,7 +51,7 @@ def start_rendering():
|
|||
# instances = pipeline.ls_inst()
|
||||
instances = [
|
||||
a for a in assets
|
||||
if a.get_class().get_name() == "OpenPypePublishInstance"]
|
||||
if a.get_class().get_name() == "AyonPublishInstance"]
|
||||
|
||||
inst_data = []
|
||||
|
||||
|
|
@ -64,8 +64,9 @@ def start_rendering():
|
|||
project = os.environ.get("AVALON_PROJECT")
|
||||
anatomy = Anatomy(project)
|
||||
root = anatomy.roots['renders']
|
||||
except Exception:
|
||||
raise Exception("Could not find render root in anatomy settings.")
|
||||
except Exception as e:
|
||||
raise Exception(
|
||||
"Could not find render root in anatomy settings.") from e
|
||||
|
||||
render_dir = f"{root}/{project}"
|
||||
|
||||
|
|
@ -121,7 +122,7 @@ def start_rendering():
|
|||
job = queue.allocate_new_job(unreal.MoviePipelineExecutorJob)
|
||||
job.sequence = unreal.SoftObjectPath(i["master_sequence"])
|
||||
job.map = unreal.SoftObjectPath(i["master_level"])
|
||||
job.author = "OpenPype"
|
||||
job.author = "Ayon"
|
||||
|
||||
# If we have a saved configuration, copy it to the job.
|
||||
if config:
|
||||
|
|
@ -129,7 +130,7 @@ def start_rendering():
|
|||
|
||||
# User data could be used to pass data to the job, that can be
|
||||
# read in the job's OnJobFinished callback. We could,
|
||||
# for instance, pass the AvalonPublishInstance's path to the job.
|
||||
# for instance, pass the AyonPublishInstance's path to the job.
|
||||
# job.user_data = ""
|
||||
|
||||
output_dir = render_setting.get('output')
|
||||
|
|
|
|||
|
|
@ -64,7 +64,7 @@ class ToolsDialog(QtWidgets.QDialog):
|
|||
def __init__(self, *args, **kwargs):
|
||||
super(ToolsDialog, self).__init__(*args, **kwargs)
|
||||
|
||||
self.setWindowTitle("OpenPype tools")
|
||||
self.setWindowTitle("Ayon tools")
|
||||
icon = QtGui.QIcon(resources.get_openpype_icon_filepath())
|
||||
self.setWindowIcon(icon)
|
||||
|
||||
|
|
|
|||
|
|
@ -186,15 +186,15 @@ class UnrealPrelaunchHook(PreLaunchHook):
|
|||
|
||||
project_path.mkdir(parents=True, exist_ok=True)
|
||||
|
||||
# Set "OPENPYPE_UNREAL_PLUGIN" to current process environment for
|
||||
# Set "AYON_UNREAL_PLUGIN" to current process environment for
|
||||
# execution of `create_unreal_project`
|
||||
|
||||
if self.launch_context.env.get("OPENPYPE_UNREAL_PLUGIN"):
|
||||
if self.launch_context.env.get("AYON_UNREAL_PLUGIN"):
|
||||
self.log.info((
|
||||
f"{self.signature} using OpenPype plugin from "
|
||||
f"{self.launch_context.env.get('OPENPYPE_UNREAL_PLUGIN')}"
|
||||
f"{self.signature} using Ayon plugin from "
|
||||
f"{self.launch_context.env.get('AYON_UNREAL_PLUGIN')}"
|
||||
))
|
||||
env_key = "OPENPYPE_UNREAL_PLUGIN"
|
||||
env_key = "AYON_UNREAL_PLUGIN"
|
||||
if self.launch_context.env.get(env_key):
|
||||
os.environ[env_key] = self.launch_context.env[env_key]
|
||||
|
||||
|
|
@ -213,7 +213,7 @@ class UnrealPrelaunchHook(PreLaunchHook):
|
|||
engine_path,
|
||||
project_path)
|
||||
|
||||
self.launch_context.env["OPENPYPE_UNREAL_VERSION"] = engine_version
|
||||
self.launch_context.env["AYON_UNREAL_VERSION"] = engine_version
|
||||
# Append project file to launch arguments
|
||||
self.launch_context.launch_args.append(
|
||||
f"\"{project_file.as_posix()}\"")
|
||||
|
|
|
|||
1
openpype/hosts/unreal/integration
Submodule

@ -0,0 +1 @@
Subproject commit ff15c700771e719cc5f3d561ac5d6f7590623986
@ -1,8 +0,0 @@
|
|||
/Saved
|
||||
/DerivedDataCache
|
||||
/Intermediate
|
||||
/Content
|
||||
/Config
|
||||
/Binaries
|
||||
/.idea
|
||||
/.vs
|
||||
|
|
@ -1,12 +0,0 @@
|
|||
{
|
||||
"FileVersion": 3,
|
||||
"EngineAssociation": "4.27",
|
||||
"Category": "",
|
||||
"Description": "",
|
||||
"Plugins": [
|
||||
{
|
||||
"Name": "OpenPype",
|
||||
"Enabled": true
|
||||
}
|
||||
]
|
||||
}
|
||||
|
|
@ -1,35 +0,0 @@
|
|||
# Prerequisites
|
||||
*.d
|
||||
|
||||
# Compiled Object files
|
||||
*.slo
|
||||
*.lo
|
||||
*.o
|
||||
*.obj
|
||||
|
||||
# Precompiled Headers
|
||||
*.gch
|
||||
*.pch
|
||||
|
||||
# Compiled Dynamic libraries
|
||||
*.so
|
||||
*.dylib
|
||||
*.dll
|
||||
|
||||
# Fortran module files
|
||||
*.mod
|
||||
*.smod
|
||||
|
||||
# Compiled Static libraries
|
||||
*.lai
|
||||
*.la
|
||||
*.a
|
||||
*.lib
|
||||
|
||||
# Executables
|
||||
*.exe
|
||||
*.out
|
||||
*.app
|
||||
|
||||
/Binaries
|
||||
/Intermediate
|
||||
|
|
@ -1,2 +0,0 @@
|
|||
[/Script/OpenPype.OpenPypeSettings]
|
||||
FolderColor=(R=91,G=197,B=220,A=255)
|
||||
|
|
@ -1,8 +0,0 @@
|
|||
[FilterPlugin]
|
||||
; This section lists additional files which will be packaged along with your plugin. Paths should be listed relative to the root plugin directory, and
|
||||
; may include "...", "*", and "?" wildcards to match directories, files, and individual characters respectively.
|
||||
;
|
||||
; Examples:
|
||||
; /README.txt
|
||||
; /Extras/...
|
||||
; /Binaries/ThirdParty/*.dll
|
||||
|
|
@ -1,30 +0,0 @@
|
|||
import unreal
|
||||
|
||||
openpype_detected = True
|
||||
try:
|
||||
from openpype.pipeline import install_host
|
||||
from openpype.hosts.unreal.api import UnrealHost
|
||||
|
||||
openpype_host = UnrealHost()
|
||||
except ImportError as exc:
|
||||
openpype_host = None
|
||||
openpype_detected = False
|
||||
unreal.log_error("OpenPype: cannot load OpenPype [ {} ]".format(exc))
|
||||
|
||||
if openpype_detected:
|
||||
install_host(openpype_host)
|
||||
|
||||
|
||||
@unreal.uclass()
|
||||
class OpenPypeIntegration(unreal.OpenPypePythonBridge):
|
||||
@unreal.ufunction(override=True)
|
||||
def RunInPython_Popup(self):
|
||||
unreal.log_warning("OpenPype: showing tools popup")
|
||||
if openpype_detected:
|
||||
openpype_host.show_tools_popup()
|
||||
|
||||
@unreal.ufunction(override=True)
|
||||
def RunInPython_Dialog(self):
|
||||
unreal.log_warning("OpenPype: showing tools dialog")
|
||||
if openpype_detected:
|
||||
openpype_host.show_tools_dialog()
|
||||
|
|
@ -1,23 +0,0 @@
|
|||
{
|
||||
"FileVersion": 3,
|
||||
"Version": 1,
|
||||
"VersionName": "1.0",
|
||||
"FriendlyName": "OpenPype",
|
||||
"Description": "OpenPype Integration",
|
||||
"Category": "OpenPype.Integration",
|
||||
"CreatedBy": "Ondrej Samohel",
|
||||
"CreatedByURL": "https://openpype.io",
|
||||
"DocsURL": "https://openpype.io/docs/artist_hosts_unreal",
|
||||
"MarketplaceURL": "",
|
||||
"SupportURL": "https://pype.club/",
|
||||
"EngineVersion": "4.27",
|
||||
"CanContainContent": true,
|
||||
"Installed": true,
|
||||
"Modules": [
|
||||
{
|
||||
"Name": "OpenPype",
|
||||
"Type": "Editor",
|
||||
"LoadingPhase": "Default"
|
||||
}
|
||||
]
|
||||
}
|
||||
|
|
@ -1,11 +0,0 @@
|
|||
# OpenPype Unreal Integration plugin - UE 4.x
|
||||
|
||||
This is plugin for Unreal Editor, creating menu for [OpenPype](https://github.com/getavalon) tools to run.
|
||||
|
||||
## How does this work
|
||||
|
||||
Plugin is creating basic menu items in **Window/OpenPype** section of Unreal Editor main menu and a button
|
||||
on the main toolbar with associated menu. Clicking on those menu items is calling callbacks that are
|
||||
declared in c++ but needs to be implemented during Unreal Editor
|
||||
startup in `Plugins/OpenPype/Content/Python/init_unreal.py` - this should be executed by Unreal Editor
|
||||
automatically.
|
||||
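(Aside, not part of this commit's diff: the README above describes C++ menu callbacks that are overridden from Python at editor startup; the Python side is the init_unreal.py shown earlier and the bridge's .cpp appears further below, but its header is not included here. A minimal sketch of how such a Python-overridable bridge class is typically declared is shown next; the class name, file name and category are illustrative assumptions, not the actual plugin header.)

// Illustrative sketch only: a UObject whose BlueprintImplementableEvent functions
// can be overridden from Python with @unreal.ufunction(override=True).
#pragma once

#include "CoreMinimal.h"
#include "UObject/Object.h"
#include "ExamplePythonBridge.generated.h"

UCLASS(Blueprintable)
class UExamplePythonBridge : public UObject
{
    GENERATED_BODY()

public:
    // A Python subclass (as in init_unreal.py above) provides the implementation.
    UFUNCTION(BlueprintImplementableEvent, Category = Python)
    void RunInPython_Popup() const;

    UFUNCTION(BlueprintImplementableEvent, Category = Python)
    void RunInPython_Dialog() const;
};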
Binary file not shown.
Before Width: | Height: | Size: 14 KiB |
Binary file not shown.
Before Width: | Height: | Size: 4.8 KiB |
Binary file not shown.
Before Width: | Height: | Size: 84 KiB |
@ -1,59 +0,0 @@
// Copyright 2023, Ayon, All rights reserved.

using UnrealBuildTool;

public class OpenPype : ModuleRules
{
    public OpenPype(ReadOnlyTargetRules Target) : base(Target)
    {
        PCHUsage = PCHUsageMode.UseExplicitOrSharedPCHs;

        PublicIncludePaths.AddRange(
            new string[] {
                // ... add public include paths required here ...
            }
        );

        PrivateIncludePaths.AddRange(
            new string[] {
                // ... add other private include paths required here ...
            }
        );

        PublicDependencyModuleNames.AddRange(
            new string[]
            {
                "Core",
                // ... add other public dependencies that you statically link with here ...
            }
        );

        PrivateDependencyModuleNames.AddRange(
            new string[]
            {
                "GameProjectGeneration",
                "Projects",
                "InputCore",
                "UnrealEd",
                "LevelEditor",
                "CoreUObject",
                "Engine",
                "Slate",
                "SlateCore",
                "AssetTools"
                // ... add private dependencies that you statically link with here ...
            }
        );

        DynamicallyLoadedModuleNames.AddRange(
            new string[]
            {
                // ... add any modules that your module loads dynamically here ...
            }
        );
    }
}
@ -1,141 +0,0 @@
// Copyright 2023, Ayon, All rights reserved.
#include "Commandlets/Implementations/OPGenerateProjectCommandlet.h"

#include "Editor.h"
#include "GameProjectUtils.h"
#include "OPConstants.h"
#include "Commandlets/OPActionResult.h"
#include "ProjectDescriptor.h"

int32 UOPGenerateProjectCommandlet::Main(const FString& CommandLineParams)
{
    //Parses command line parameters & creates structure FProjectInformation
    const FOPGenerateProjectParams ParsedParams = FOPGenerateProjectParams(CommandLineParams);
    ProjectInformation = ParsedParams.GenerateUEProjectInformation();

    //Creates .uproject & other UE files
    EVALUATE_OP_ACTION_RESULT(TryCreateProject());

    //Loads created .uproject
    EVALUATE_OP_ACTION_RESULT(TryLoadProjectDescriptor());

    //Adds needed plugin to .uproject
    AttachPluginsToProjectDescriptor();

    //Saves .uproject
    EVALUATE_OP_ACTION_RESULT(TrySave());

    //When we are here, there should not be problems in generating Unreal Project for OpenPype
    return 0;
}


FOPGenerateProjectParams::FOPGenerateProjectParams(): FOPGenerateProjectParams("")
{
}

FOPGenerateProjectParams::FOPGenerateProjectParams(const FString& CommandLineParams): CommandLineParams(
    CommandLineParams)
{
    UCommandlet::ParseCommandLine(*CommandLineParams, Tokens, Switches);
}

FProjectInformation FOPGenerateProjectParams::GenerateUEProjectInformation() const
{
    FProjectInformation ProjectInformation = FProjectInformation();
    ProjectInformation.ProjectFilename = GetProjectFileName();

    ProjectInformation.bShouldGenerateCode = IsSwitchPresent("GenerateCode");

    return ProjectInformation;
}

FString FOPGenerateProjectParams::TryGetToken(const int32 Index) const
{
    return Tokens.IsValidIndex(Index) ? Tokens[Index] : "";
}

FString FOPGenerateProjectParams::GetProjectFileName() const
{
    return TryGetToken(0);
}

bool FOPGenerateProjectParams::IsSwitchPresent(const FString& Switch) const
{
    return INDEX_NONE != Switches.IndexOfByPredicate([&Switch](const FString& Item) -> bool
        {
            return Item.Equals(Switch);
        }
    );
}


UOPGenerateProjectCommandlet::UOPGenerateProjectCommandlet()
{
    LogToConsole = true;
}

FOP_ActionResult UOPGenerateProjectCommandlet::TryCreateProject() const
{
    FText FailReason;
    FText FailLog;
    TArray<FString> OutCreatedFiles;

    if (!GameProjectUtils::CreateProject(ProjectInformation, FailReason, FailLog, &OutCreatedFiles))
        return FOP_ActionResult(EOP_ActionResult::ProjectNotCreated, FailReason);
    return FOP_ActionResult();
}

FOP_ActionResult UOPGenerateProjectCommandlet::TryLoadProjectDescriptor()
{
    FText FailReason;
    const bool bLoaded = ProjectDescriptor.Load(ProjectInformation.ProjectFilename, FailReason);

    return FOP_ActionResult(bLoaded ? EOP_ActionResult::Ok : EOP_ActionResult::ProjectNotLoaded, FailReason);
}

void UOPGenerateProjectCommandlet::AttachPluginsToProjectDescriptor()
{
    FPluginReferenceDescriptor OPPluginDescriptor;
    OPPluginDescriptor.bEnabled = true;
    OPPluginDescriptor.Name = OPConstants::OP_PluginName;
    ProjectDescriptor.Plugins.Add(OPPluginDescriptor);

    FPluginReferenceDescriptor PythonPluginDescriptor;
    PythonPluginDescriptor.bEnabled = true;
    PythonPluginDescriptor.Name = OPConstants::PythonScript_PluginName;
    ProjectDescriptor.Plugins.Add(PythonPluginDescriptor);

    FPluginReferenceDescriptor SequencerScriptingPluginDescriptor;
    SequencerScriptingPluginDescriptor.bEnabled = true;
    SequencerScriptingPluginDescriptor.Name = OPConstants::SequencerScripting_PluginName;
    ProjectDescriptor.Plugins.Add(SequencerScriptingPluginDescriptor);

    FPluginReferenceDescriptor MovieRenderPipelinePluginDescriptor;
    MovieRenderPipelinePluginDescriptor.bEnabled = true;
    MovieRenderPipelinePluginDescriptor.Name = OPConstants::MovieRenderPipeline_PluginName;
    ProjectDescriptor.Plugins.Add(MovieRenderPipelinePluginDescriptor);

    FPluginReferenceDescriptor EditorScriptingPluginDescriptor;
    EditorScriptingPluginDescriptor.bEnabled = true;
    EditorScriptingPluginDescriptor.Name = OPConstants::EditorScriptingUtils_PluginName;
    ProjectDescriptor.Plugins.Add(EditorScriptingPluginDescriptor);
}

FOP_ActionResult UOPGenerateProjectCommandlet::TrySave()
{
    FText FailReason;
    const bool bSaved = ProjectDescriptor.Save(ProjectInformation.ProjectFilename, FailReason);

    return FOP_ActionResult(bSaved ? EOP_ActionResult::Ok : EOP_ActionResult::ProjectNotSaved, FailReason);
}

FOPGenerateProjectParams UOPGenerateProjectCommandlet::ParseParameters(const FString& Params) const
{
    FOPGenerateProjectParams ParamsResult;

    TArray<FString> Tokens, Switches;
    ParseCommandLine(*Params, Tokens, Switches);

    return ParamsResult;
}

@ -1,41 +0,0 @@
// Copyright 2023, Ayon, All rights reserved.

#include "Commandlets/OPActionResult.h"
#include "Logging/OP_Log.h"

EOP_ActionResult::Type& FOP_ActionResult::GetStatus()
{
    return Status;
}

FText& FOP_ActionResult::GetReason()
{
    return Reason;
}

FOP_ActionResult::FOP_ActionResult():Status(EOP_ActionResult::Type::Ok)
{

}

FOP_ActionResult::FOP_ActionResult(const EOP_ActionResult::Type& InEnum):Status(InEnum)
{
    TryLog();
}

FOP_ActionResult::FOP_ActionResult(const EOP_ActionResult::Type& InEnum, const FText& InReason):Status(InEnum), Reason(InReason)
{
    TryLog();
};

bool FOP_ActionResult::IsProblem() const
{
    return Status != EOP_ActionResult::Ok;
}

void FOP_ActionResult::TryLog() const
{
    if(IsProblem())
        UE_LOG(LogCommandletOPGenerateProject, Error, TEXT("%s"), *Reason.ToString());
}

@ -1 +0,0 @@
#include "Logging/OP_Log.h"
@ -1,155 +0,0 @@
// Copyright 2023, Ayon, All rights reserved.
#include "OpenPype.h"

#include "ISettingsContainer.h"
#include "ISettingsModule.h"
#include "ISettingsSection.h"
#include "LevelEditor.h"
#include "OpenPypePythonBridge.h"
#include "OpenPypeSettings.h"
#include "OpenPypeStyle.h"


static const FName OpenPypeTabName("OpenPype");

#define LOCTEXT_NAMESPACE "FOpenPypeModule"

// This function is triggered when the plugin is staring up
void FOpenPypeModule::StartupModule()
{
    if (!IsRunningCommandlet()) {
        FOpenPypeStyle::Initialize();
        FOpenPypeStyle::SetIcon("Logo", "openpype40");

        // Create the Extender that will add content to the menu
        FLevelEditorModule& LevelEditorModule = FModuleManager::LoadModuleChecked<FLevelEditorModule>("LevelEditor");

        TSharedPtr<FExtender> MenuExtender = MakeShareable(new FExtender());
        TSharedPtr<FExtender> ToolbarExtender = MakeShareable(new FExtender());

        MenuExtender->AddMenuExtension(
            "LevelEditor",
            EExtensionHook::After,
            NULL,
            FMenuExtensionDelegate::CreateRaw(this, &FOpenPypeModule::AddMenuEntry)
        );
        ToolbarExtender->AddToolBarExtension(
            "Settings",
            EExtensionHook::After,
            NULL,
            FToolBarExtensionDelegate::CreateRaw(this, &FOpenPypeModule::AddToobarEntry));


        LevelEditorModule.GetMenuExtensibilityManager()->AddExtender(MenuExtender);
        LevelEditorModule.GetToolBarExtensibilityManager()->AddExtender(ToolbarExtender);

        RegisterSettings();
    }
}

void FOpenPypeModule::ShutdownModule()
{
    FOpenPypeStyle::Shutdown();
}


void FOpenPypeModule::AddMenuEntry(FMenuBuilder& MenuBuilder)
{
    // Create Section
    MenuBuilder.BeginSection("OpenPype", TAttribute<FText>(FText::FromString("OpenPype")));
    {
        // Create a Submenu inside of the Section
        MenuBuilder.AddMenuEntry(
            FText::FromString("Tools..."),
            FText::FromString("Pipeline tools"),
            FSlateIcon(FOpenPypeStyle::GetStyleSetName(), "OpenPype.Logo"),
            FUIAction(FExecuteAction::CreateRaw(this, &FOpenPypeModule::MenuPopup))
        );

        MenuBuilder.AddMenuEntry(
            FText::FromString("Tools dialog..."),
            FText::FromString("Pipeline tools dialog"),
            FSlateIcon(FOpenPypeStyle::GetStyleSetName(), "OpenPype.Logo"),
            FUIAction(FExecuteAction::CreateRaw(this, &FOpenPypeModule::MenuDialog))
        );
    }
    MenuBuilder.EndSection();
}

void FOpenPypeModule::AddToobarEntry(FToolBarBuilder& ToolbarBuilder)
{
    ToolbarBuilder.BeginSection(TEXT("OpenPype"));
    {
        ToolbarBuilder.AddToolBarButton(
            FUIAction(
                FExecuteAction::CreateRaw(this, &FOpenPypeModule::MenuPopup),
                NULL,
                FIsActionChecked()

            ),
            NAME_None,
            LOCTEXT("OpenPype_label", "OpenPype"),
            LOCTEXT("OpenPype_tooltip", "OpenPype Tools"),
            FSlateIcon(FOpenPypeStyle::GetStyleSetName(), "OpenPype.Logo")
        );
    }
    ToolbarBuilder.EndSection();
}

void FOpenPypeModule::RegisterSettings()
{
    ISettingsModule& SettingsModule = FModuleManager::LoadModuleChecked<ISettingsModule>("Settings");

    // Create the new category
    // TODO: After the movement of the plugin from the game to editor, it might be necessary to move this!
    ISettingsContainerPtr SettingsContainer = SettingsModule.GetContainer("Project");

    UOpenPypeSettings* Settings = GetMutableDefault<UOpenPypeSettings>();

    // Register the settings
    ISettingsSectionPtr SettingsSection = SettingsModule.RegisterSettings("Project", "OpenPype", "General",
        LOCTEXT("RuntimeGeneralSettingsName",
            "General"),
        LOCTEXT("RuntimeGeneralSettingsDescription",
            "Base configuration for Open Pype Module"),
        Settings
    );

    // Register the save handler to your settings, you might want to use it to
    // validate those or just act to settings changes.
    if (SettingsSection.IsValid())
    {
        SettingsSection->OnModified().BindRaw(this, &FOpenPypeModule::HandleSettingsSaved);
    }
}

bool FOpenPypeModule::HandleSettingsSaved()
{
    UOpenPypeSettings* Settings = GetMutableDefault<UOpenPypeSettings>();
    bool ResaveSettings = false;

    // You can put any validation code in here and resave the settings in case an invalid
    // value has been entered

    if (ResaveSettings)
    {
        Settings->SaveConfig();
    }

    return true;
}


void FOpenPypeModule::MenuPopup()
{
    UOpenPypePythonBridge* bridge = UOpenPypePythonBridge::Get();
    bridge->RunInPython_Popup();
}

void FOpenPypeModule::MenuDialog()
{
    UOpenPypePythonBridge* bridge = UOpenPypePythonBridge::Get();
    bridge->RunInPython_Dialog();
}

IMPLEMENT_MODULE(FOpenPypeModule, OpenPype)
@ -1,53 +0,0 @@
// Copyright 2023, Ayon, All rights reserved.
#include "OpenPypeLib.h"

#include "AssetViewUtils.h"
#include "Misc/Paths.h"
#include "Misc/ConfigCacheIni.h"
#include "UObject/UnrealType.h"

/**
 * Sets color on folder icon on given path
 * @param InPath - path to folder
 * @param InFolderColor - color of the folder
 * @warning This color will appear only after Editor restart. Is there a better way?
 */

bool UOpenPypeLib::SetFolderColor(const FString& FolderPath, const FLinearColor& FolderColor, const bool& bForceAdd)
{
    if (AssetViewUtils::DoesFolderExist(FolderPath))
    {
        const TSharedPtr<FLinearColor> LinearColor = MakeShared<FLinearColor>(FolderColor);

        AssetViewUtils::SaveColor(FolderPath, LinearColor, true);
        UE_LOG(LogAssetData, Display, TEXT("A color {%s} has been set to folder \"%s\""), *LinearColor->ToString(),
            *FolderPath)
        return true;
    }

    UE_LOG(LogAssetData, Display, TEXT("Setting a color {%s} to folder \"%s\" has failed! Directory doesn't exist!"),
        *FolderColor.ToString(), *FolderPath)
    return false;
}

/**
 * Returns all properties on given object
 * @param cls - class
 * @return TArray of properties
 */
TArray<FString> UOpenPypeLib::GetAllProperties(UClass* cls)
{
    TArray<FString> Ret;
    if (cls != nullptr)
    {
        for (TFieldIterator<FProperty> It(cls); It; ++It)
        {
            FProperty* Property = *It;
            if (Property->HasAnyPropertyFlags(EPropertyFlags::CPF_Edit))
            {
                Ret.Add(Property->GetName());
            }
        }
    }
    return Ret;
}
@ -1,201 +0,0 @@
// Copyright 2023, Ayon, All rights reserved.
#pragma once

#include "OpenPypePublishInstance.h"
#include "AssetRegistryModule.h"
#include "OpenPypeLib.h"
#include "OpenPypeSettings.h"
#include "Framework/Notifications/NotificationManager.h"
#include "Widgets/Notifications/SNotificationList.h"

//Moves all the invalid pointers to the end to prepare them for the shrinking
#define REMOVE_INVALID_ENTRIES(VAR) VAR.CompactStable(); \
    VAR.Shrink();

UOpenPypePublishInstance::UOpenPypePublishInstance(const FObjectInitializer& ObjectInitializer)
    : UPrimaryDataAsset(ObjectInitializer)
{
    const FAssetRegistryModule& AssetRegistryModule = FModuleManager::LoadModuleChecked<
        FAssetRegistryModule>("AssetRegistry");

    const FPropertyEditorModule& PropertyEditorModule = FModuleManager::LoadModuleChecked<FPropertyEditorModule>(
        "PropertyEditor");

    FString Left, Right;
    GetPathName().Split("/" + GetName(), &Left, &Right);

    FARFilter Filter;
    Filter.PackagePaths.Emplace(FName(Left));

    TArray<FAssetData> FoundAssets;
    AssetRegistryModule.GetRegistry().GetAssets(Filter, FoundAssets);

    for (const FAssetData& AssetData : FoundAssets)
        OnAssetCreated(AssetData);

    REMOVE_INVALID_ENTRIES(AssetDataInternal)
    REMOVE_INVALID_ENTRIES(AssetDataExternal)

    AssetRegistryModule.Get().OnAssetAdded().AddUObject(this, &UOpenPypePublishInstance::OnAssetCreated);
    AssetRegistryModule.Get().OnAssetRemoved().AddUObject(this, &UOpenPypePublishInstance::OnAssetRemoved);
    AssetRegistryModule.Get().OnAssetUpdated().AddUObject(this, &UOpenPypePublishInstance::OnAssetUpdated);

#ifdef WITH_EDITOR
    ColorOpenPypeDirs();
#endif

}

void UOpenPypePublishInstance::OnAssetCreated(const FAssetData& InAssetData)
{
    TArray<FString> split;

    UObject* Asset = InAssetData.GetAsset();

    if (!IsValid(Asset))
    {
        UE_LOG(LogAssetData, Warning, TEXT("Asset \"%s\" is not valid! Skipping the addition."),
            *InAssetData.ObjectPath.ToString());
        return;
    }

    const bool result = IsUnderSameDir(Asset) && Cast<UOpenPypePublishInstance>(Asset) == nullptr;

    if (result)
    {
        if (AssetDataInternal.Emplace(Asset).IsValidId())
        {
            UE_LOG(LogTemp, Log, TEXT("Added an Asset to PublishInstance - Publish Instance: %s, Asset %s"),
                *this->GetName(), *Asset->GetName());
        }
    }
}

void UOpenPypePublishInstance::OnAssetRemoved(const FAssetData& InAssetData)
{
    if (Cast<UOpenPypePublishInstance>(InAssetData.GetAsset()) == nullptr)
    {
        if (AssetDataInternal.Contains(nullptr))
        {
            AssetDataInternal.Remove(nullptr);
            REMOVE_INVALID_ENTRIES(AssetDataInternal)
        }
        else
        {
            AssetDataExternal.Remove(nullptr);
            REMOVE_INVALID_ENTRIES(AssetDataExternal)
        }
    }
}

void UOpenPypePublishInstance::OnAssetUpdated(const FAssetData& InAssetData)
{
    REMOVE_INVALID_ENTRIES(AssetDataInternal);
    REMOVE_INVALID_ENTRIES(AssetDataExternal);
}

bool UOpenPypePublishInstance::IsUnderSameDir(const UObject* InAsset) const
{
    FString ThisLeft, ThisRight;
    this->GetPathName().Split(this->GetName(), &ThisLeft, &ThisRight);

    return InAsset->GetPathName().StartsWith(ThisLeft);
}

#ifdef WITH_EDITOR

void UOpenPypePublishInstance::ColorOpenPypeDirs()
{
    FString PathName = this->GetPathName();

    //Check whether the path contains the defined OpenPype folder
    if (!PathName.Contains(TEXT("OpenPype"))) return;

    //Get the base path for open pype
    FString PathLeft, PathRight;
    PathName.Split(FString("OpenPype"), &PathLeft, &PathRight);

    if (PathLeft.IsEmpty() || PathRight.IsEmpty())
    {
        UE_LOG(LogAssetData, Error, TEXT("Failed to retrieve the base OpenPype directory!"))
        return;
    }

    PathName.RemoveFromEnd(PathRight, ESearchCase::CaseSensitive);

    //Get the current settings
    const UOpenPypeSettings* Settings = GetMutableDefault<UOpenPypeSettings>();

    //Color the base folder
    UOpenPypeLib::SetFolderColor(PathName, Settings->GetFolderFColor(), false);

    //Get Sub paths, iterate through them and color them according to the folder color in UOpenPypeSettings
    const FAssetRegistryModule& AssetRegistryModule = FModuleManager::LoadModuleChecked<FAssetRegistryModule>(
        "AssetRegistry");

    TArray<FString> PathList;

    AssetRegistryModule.Get().GetSubPaths(PathName, PathList, true);

    if (PathList.Num() > 0)
    {
        for (const FString& Path : PathList)
        {
            UOpenPypeLib::SetFolderColor(Path, Settings->GetFolderFColor(), false);
        }
    }
}

void UOpenPypePublishInstance::SendNotification(const FString& Text) const
{
    FNotificationInfo Info{FText::FromString(Text)};

    Info.bFireAndForget = true;
    Info.bUseLargeFont = false;
    Info.bUseThrobber = false;
    Info.bUseSuccessFailIcons = false;
    Info.ExpireDuration = 4.f;
    Info.FadeOutDuration = 2.f;

    FSlateNotificationManager::Get().AddNotification(Info);

    UE_LOG(LogAssetData, Warning,
        TEXT(
            "Removed duplicated asset from the AssetsDataExternal in Container \"%s\", Asset is already included in the AssetDataInternal!"
        ), *GetName()
    )
}


void UOpenPypePublishInstance::PostEditChangeProperty(FPropertyChangedEvent& PropertyChangedEvent)
{
    Super::PostEditChangeProperty(PropertyChangedEvent);

    if (PropertyChangedEvent.ChangeType == EPropertyChangeType::ValueSet &&
        PropertyChangedEvent.Property->GetFName() == GET_MEMBER_NAME_CHECKED(
            UOpenPypePublishInstance, AssetDataExternal))
    {
        // Check for duplicated assets
        for (const auto& Asset : AssetDataInternal)
        {
            if (AssetDataExternal.Contains(Asset))
            {
                AssetDataExternal.Remove(Asset);
                return SendNotification(
                    "You are not allowed to add assets into AssetDataExternal which are already included in AssetDataInternal!");
            }
        }

        // Check if no UOpenPypePublishInstance type assets are included
        for (const auto& Asset : AssetDataExternal)
        {
            if (Cast<UOpenPypePublishInstance>(Asset.Get()) != nullptr)
            {
                AssetDataExternal.Remove(Asset);
                return SendNotification("You are not allowed to add publish instances!");
            }
        }
    }
}

#endif
@ -1,21 +0,0 @@
// Copyright 2023, Ayon, All rights reserved.
#include "OpenPypePublishInstanceFactory.h"
#include "OpenPypePublishInstance.h"

UOpenPypePublishInstanceFactory::UOpenPypePublishInstanceFactory(const FObjectInitializer& ObjectInitializer)
    : UFactory(ObjectInitializer)
{
    SupportedClass = UOpenPypePublishInstance::StaticClass();
    bCreateNew = false;
    bEditorImport = true;
}

UObject* UOpenPypePublishInstanceFactory::FactoryCreateNew(UClass* InClass, UObject* InParent, FName InName, EObjectFlags Flags, UObject* Context, FFeedbackContext* Warn)
{
    check(InClass->IsChildOf(UOpenPypePublishInstance::StaticClass()));
    return NewObject<UOpenPypePublishInstance>(InParent, InClass, InName, Flags);
}

bool UOpenPypePublishInstanceFactory::ShouldShowInNewMenu() const {
    return false;
}
@ -1,14 +0,0 @@
// Copyright 2023, Ayon, All rights reserved.
#include "OpenPypePythonBridge.h"

UOpenPypePythonBridge* UOpenPypePythonBridge::Get()
{
    TArray<UClass*> OpenPypePythonBridgeClasses;
    GetDerivedClasses(UOpenPypePythonBridge::StaticClass(), OpenPypePythonBridgeClasses);
    int32 NumClasses = OpenPypePythonBridgeClasses.Num();
    if (NumClasses > 0)
    {
        return Cast<UOpenPypePythonBridge>(OpenPypePythonBridgeClasses[NumClasses - 1]->GetDefaultObject());
    }
    return nullptr;
};
@ -1,20 +0,0 @@
// Copyright 2023, Ayon, All rights reserved.

#include "OpenPypeSettings.h"

#include "Interfaces/IPluginManager.h"

/**
 * Mainly is used for initializing default values if the DefaultOpenPypeSettings.ini file does not exist in the saved config
 */
UOpenPypeSettings::UOpenPypeSettings(const FObjectInitializer& ObjectInitializer)
{

    const FString ConfigFilePath = OPENPYPE_SETTINGS_FILEPATH;

    // This has to be probably in the future set using the UE Reflection system
    FColor Color;
    GConfig->GetColor(TEXT("/Script/OpenPype.OpenPypeSettings"), TEXT("FolderColor"), Color, ConfigFilePath);

    FolderColor = Color;
}
@ -1,70 +0,0 @@
// Copyright 2023, Ayon, All rights reserved.
#include "OpenPypeStyle.h"
#include "Framework/Application/SlateApplication.h"
#include "Styling/SlateStyle.h"
#include "Styling/SlateStyleRegistry.h"


TUniquePtr< FSlateStyleSet > FOpenPypeStyle::OpenPypeStyleInstance = nullptr;

void FOpenPypeStyle::Initialize()
{
    if (!OpenPypeStyleInstance.IsValid())
    {
        OpenPypeStyleInstance = Create();
        FSlateStyleRegistry::RegisterSlateStyle(*OpenPypeStyleInstance);
    }
}

void FOpenPypeStyle::Shutdown()
{
    if (OpenPypeStyleInstance.IsValid())
    {
        FSlateStyleRegistry::UnRegisterSlateStyle(*OpenPypeStyleInstance);
        OpenPypeStyleInstance.Reset();
    }
}

FName FOpenPypeStyle::GetStyleSetName()
{
    static FName StyleSetName(TEXT("OpenPypeStyle"));
    return StyleSetName;
}

FName FOpenPypeStyle::GetContextName()
{
    static FName ContextName(TEXT("OpenPype"));
    return ContextName;
}

#define IMAGE_BRUSH(RelativePath, ...) FSlateImageBrush( Style->RootToContentDir( RelativePath, TEXT(".png") ), __VA_ARGS__ )

const FVector2D Icon40x40(40.0f, 40.0f);

TUniquePtr< FSlateStyleSet > FOpenPypeStyle::Create()
{
    TUniquePtr< FSlateStyleSet > Style = MakeUnique<FSlateStyleSet>(GetStyleSetName());
    Style->SetContentRoot(FPaths::EnginePluginsDir() / TEXT("Marketplace/OpenPype/Resources"));

    return Style;
}

void FOpenPypeStyle::SetIcon(const FString& StyleName, const FString& ResourcePath)
{
    FSlateStyleSet* Style = OpenPypeStyleInstance.Get();

    FString Name(GetContextName().ToString());
    Name = Name + "." + StyleName;
    Style->Set(*Name, new FSlateImageBrush(Style->RootToContentDir(ResourcePath, TEXT(".png")), Icon40x40));


    FSlateApplication::Get().GetRenderer()->ReloadTextureResources();
}

#undef IMAGE_BRUSH

const ISlateStyle& FOpenPypeStyle::Get()
{
    check(OpenPypeStyleInstance);
    return *OpenPypeStyleInstance;
}
@ -1,60 +0,0 @@
// Copyright 2023, Ayon, All rights reserved.
#pragma once

#include "GameProjectUtils.h"
#include "Commandlets/OPActionResult.h"
#include "ProjectDescriptor.h"
#include "Commandlets/Commandlet.h"
#include "OPGenerateProjectCommandlet.generated.h"

struct FProjectDescriptor;
struct FProjectInformation;

/**
 * @brief Structure which parses command line parameters and generates FProjectInformation
 */
USTRUCT()
struct FOPGenerateProjectParams
{
    GENERATED_BODY()

private:
    FString CommandLineParams;
    TArray<FString> Tokens;
    TArray<FString> Switches;

public:
    FOPGenerateProjectParams();
    FOPGenerateProjectParams(const FString& CommandLineParams);

    FProjectInformation GenerateUEProjectInformation() const;

private:
    FString TryGetToken(const int32 Index) const;
    FString GetProjectFileName() const;

    bool IsSwitchPresent(const FString& Switch) const;
};

UCLASS()
class OPENPYPE_API UOPGenerateProjectCommandlet : public UCommandlet
{
    GENERATED_BODY()

private:
    FProjectInformation ProjectInformation;
    FProjectDescriptor ProjectDescriptor;

public:
    UOPGenerateProjectCommandlet();

    virtual int32 Main(const FString& CommandLineParams) override;

private:
    FOPGenerateProjectParams ParseParameters(const FString& Params) const;
    FOP_ActionResult TryCreateProject() const;
    FOP_ActionResult TryLoadProjectDescriptor();
    void AttachPluginsToProjectDescriptor();
    FOP_ActionResult TrySave();
};
@ -1,83 +0,0 @@
// Copyright 2023, Ayon, All rights reserved.

#pragma once

#include "CoreMinimal.h"
#include "OPActionResult.generated.h"

/**
 * @brief This macro returns error code when is problem or does nothing when there is no problem.
 * @param ActionResult FOP_ActionResult structure
 */
#define EVALUATE_OP_ACTION_RESULT(ActionResult) \
    if(ActionResult.IsProblem()) \
        return ActionResult.GetStatus();

/**
 * @brief This enum values are humanly readable mapping of error codes.
 * Here should be all error codes to be possible find what went wrong.
 * TODO: In the future a web document should exists with the mapped error code & what problem occurred & how to repair it...
 */
UENUM()
namespace EOP_ActionResult
{
    enum Type
    {
        Ok,
        ProjectNotCreated,
        ProjectNotLoaded,
        ProjectNotSaved,
        //....Here insert another values

        //Do not remove!
        //Usable for looping through enum values
        __Last UMETA(Hidden)
    };
}


/**
 * @brief This struct holds action result enum and optionally reason of fail
 */
USTRUCT()
struct FOP_ActionResult
{
    GENERATED_BODY()

public:
    /** @brief Default constructor usable when there is no problem */
    FOP_ActionResult();

    /**
     * @brief This constructor initializes variables & attempts to log when is error
     * @param InEnum Status
     */
    FOP_ActionResult(const EOP_ActionResult::Type& InEnum);

    /**
     * @brief This constructor initializes variables & attempts to log when is error
     * @param InEnum Status
     * @param InReason Reason of potential fail
     */
    FOP_ActionResult(const EOP_ActionResult::Type& InEnum, const FText& InReason);

private:
    /** @brief Action status */
    EOP_ActionResult::Type Status;

    /** @brief Optional reason of fail */
    FText Reason;

public:
    /**
     * @brief Checks if there is problematic state
     * @return true when status is not equal to EOP_ActionResult::Ok
     */
    bool IsProblem() const;
    EOP_ActionResult::Type& GetStatus();
    FText& GetReason();

private:
    void TryLog() const;
};
Some files were not shown because too many files have changed in this diff.