Mirror of https://github.com/ynput/ayon-core.git, synced 2025-12-24 21:04:40 +01:00

Merged in release/v2.1.0 (pull request #259)
Approved-by: Milan Kolar <milan@orbi.tools>
Commit: c20f4ba5ee
4057 changed files with 416063 additions and 11995 deletions
.flake8 (new file, 9 lines):

```ini
[flake8]
# ignore = D203
exclude =
    .git,
    __pycache__,
    docs,
    */vendor

max-complexity = 30
```
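As a quick sanity check of the config above, the multi-line `exclude` value can be parsed with Python's `configparser`; this is a standalone sketch, not part of the repository:

```python
import configparser

# The [flake8] config from above, embedded as a string for the sketch.
CONFIG_TEXT = """
[flake8]
exclude =
    .git,
    __pycache__,
    docs,
    */vendor
max-complexity = 30
"""

parser = configparser.ConfigParser()
parser.read_string(CONFIG_TEXT)

# Split the multi-line value and strip the trailing commas.
excludes = [part.rstrip(",") for part in parser["flake8"]["exclude"].split()]
print(excludes)                            # ['.git', '__pycache__', 'docs', '*/vendor']
print(parser["flake8"]["max-complexity"])  # 30
```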
.gitignore (vendored, 23 lines added):

```
__pycache__/
*.py[cod]
*$py.class

# Documentation
###############
/docs/build

# Editor backup files #
#######################
*~

# Unit test / coverage reports
##############################
htmlcov/
.tox/
.nox/
.coverage
.coverage.*
/coverage
.cache
nosetests.xml
coverage.xml
*.cover
.hypothesis/
.pytest_cache/
```
README.md (27 lines changed; resulting version):

```markdown
Pype
====

<br>
The base studio _config_ for [Avalon](https://getavalon.github.io/)

Currently this config depends on our customised Avalon installation, so it won't work with vanilla avalon-core. We're working on open-sourcing all of the necessary code, though. You can still get inspiration from, or take, our individual validators and scripts, which should work just fine in other pipelines.

_This configuration acts as a starting point for all pype club clients with an Avalon deployment._

Code convention
---------------

Below are some of the standard practices applied to this repository.

- **Etiquette: PEP8**

  All code is written in PEP8. It is recommended you use a linter as you work; flake8 and pylint are both good options.

- **Etiquette: Napoleon docstrings**

  All docstrings are written in Google Napoleon format. See [Napoleon](https://sphinxcontrib-napoleon.readthedocs.io/en/latest/example_google.html) for details.

- **Etiquette: Semantic Versioning**

  This project follows [semantic versioning](http://semver.org).

- **Etiquette: Underscore means private**

  Anything prefixed with an underscore is internal to wherever it is used. For example, a variable name is only ever used in its parent function or class, and a module is not for use by the end user. In contrast, anything without an underscore is public, but not necessarily part of the API. Members of the API reside in `api.py`.

- **API: Idempotence**

  A public function must be able to be called twice and produce the exact same result. This means no changing of state without restoring the previous state when finishing. For example, if a function requires changing the current selection in Autodesk Maya, it must restore the previous selection prior to completing.
```
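The idempotence rule above can be sketched with a stand-in for Maya's selection API; `get_selection`/`set_selection` below are hypothetical substitutes for `cmds.ls(selection=True)`/`cmds.select`, used only to make the sketch self-contained:

```python
_selection = ["pCube1"]  # simulated current scene selection

def get_selection():
    return list(_selection)

def set_selection(nodes):
    global _selection
    _selection = list(nodes)

def export_nodes(nodes):
    """Public function: changes the selection, then restores it on exit."""
    previous = get_selection()
    try:
        set_selection(nodes)
        return sorted(get_selection())  # stand-in for the actual export work
    finally:
        set_selection(previous)  # restore state so repeated calls match

first = export_nodes(["pSphere1"])
second = export_nodes(["pSphere1"])
print(first == second, get_selection())  # True ['pCube1']
```

Because the previous selection is restored in the `finally` block, calling `export_nodes` twice produces identical results and leaves the scene state untouched.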
changelog.md (new file, 54 lines):

```markdown
# Pype changelog #
Welcome to the pype changelog

## 2.1 ##

A large cleanup release. Most of the changes are under the hood.

**new**:
- _(pype)_ add customisable workflow for creating quicktimes from renders or playblasts
- _(pype)_ added configurable option to add burnins to any generated quicktimes
- _(ftrack)_ action that identifies which machines pype is running on
- _(system)_ unify subprocess calls
- _(maya)_ add audio to review quicktimes
- _(nuke)_ add crop before write node to prevent overscan problems in ffmpeg
- **Nuke Studio** publishing and workfiles support
- **Muster** render manager support
- _(nuke)_ frame range, FPS and resolution are set automatically at startup
- _(maya)_ ability to load published sequences as image planes
- _(system)_ ftrack event that sets asset folder permissions based on task assignees in ftrack
- _(maya)_ pyblish plugin that allows validation of maya attributes
- _(system)_ added better startup logging to tray debug, including basic connection information
- _(avalon)_ option to group published subsets into groups in the loader
- _(avalon)_ loader family filters are working now

**changed**:
- changed multiple key attributes to unify their behaviour across the pipeline
    - `frameRate` to `fps`
    - `startFrame` to `frameStart`
    - `endFrame` to `frameEnd`
    - `fstart` to `frameStart`
    - `fend` to `frameEnd`
    - `handle_start` to `handleStart`
    - `handle_end` to `handleEnd`
    - `resolution_width` to `resolutionWidth`
    - `resolution_height` to `resolutionHeight`
    - `pixel_aspect` to `pixelAspect`

- _(nuke)_ write nodes are now created inside a group, with only some attributes editable by the artist
- rendered frames are now deleted from their temporary location after publishing is finished
- _(ftrack)_ the RV action can now be launched from any entity
- after publishing, only the refresh button is available in the pyblish UI
- added a context instance to pyblish-lite so that the artist knows when a context plugin fails
- _(avalon)_ allow opening selected files using the enter key
- _(avalon)_ core updated to v5.2.9 with our forked changes on top

**fix**:
- faster hierarchy retrieval from the db
- _(nuke)_ a lot of stability enhancements
- _(nuke studio)_ a lot of stability enhancements
- _(nuke)_ now only renders a single write node on the farm
- _(ftrack)_ pype would crash when launching a project-level task
- the work directory was sometimes not created correctly
- major pype.lib cleanup: removing unused functions, merging those that did the same thing, and general housekeeping
- _(avalon)_ subsets in maya 2019 weren't behaving correctly in the outliner
```
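The attribute renames listed under **changed** can be applied mechanically. The mapping below is taken verbatim from the changelog; the helper function itself is a hypothetical illustration, not pype API:

```python
# Legacy-to-new key mapping, mirroring the 2.1 changelog entries.
KEY_RENAMES = {
    "frameRate": "fps",
    "startFrame": "frameStart",
    "endFrame": "frameEnd",
    "fstart": "frameStart",
    "fend": "frameEnd",
    "handle_start": "handleStart",
    "handle_end": "handleEnd",
    "resolution_width": "resolutionWidth",
    "resolution_height": "resolutionHeight",
    "pixel_aspect": "pixelAspect",
}

def migrate_keys(data):
    """Return a copy of `data` with legacy keys renamed; others pass through."""
    return {KEY_RENAMES.get(key, key): value for key, value in data.items()}

print(migrate_keys({"startFrame": 1001, "endFrame": 1100, "frameRate": 25}))
# {'frameStart': 1001, 'frameEnd': 1100, 'fps': 25}
```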
docs/Makefile (new file, 19 lines):

```makefile
# Minimal makefile for Sphinx documentation
#

# You can set these variables from the command line.
SPHINXOPTS    =
SPHINXBUILD   = sphinx-build
SOURCEDIR     = source
BUILDDIR      = build

# Put it first so that "make" without argument is like "make help".
help:
	@$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)

.PHONY: help Makefile

# Catch-all target: route all unknown targets to Sphinx using the new
# "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS).
%: Makefile
	@$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
```
docs/make.bat (new file, 35 lines):

```bat
@ECHO OFF

pushd %~dp0

REM Command file for Sphinx documentation

if "%SPHINXBUILD%" == "" (
	set SPHINXBUILD=sphinx-build
)
set SOURCEDIR=source
set BUILDDIR=build

if "%1" == "" goto help

%SPHINXBUILD% >NUL 2>NUL
if errorlevel 9009 (
	echo.
	echo.The 'sphinx-build' command was not found. Make sure you have Sphinx
	echo.installed, then set the SPHINXBUILD environment variable to point
	echo.to the full path of the 'sphinx-build' executable. Alternatively you
	echo.may add the Sphinx directory to PATH.
	echo.
	echo.If you don't have Sphinx installed, grab it from
	echo.http://sphinx-doc.org/
	exit /b 1
)

%SPHINXBUILD% -M %1 %SOURCEDIR% %BUILDDIR% %SPHINXOPTS%
goto end

:help
%SPHINXBUILD% -M help %SOURCEDIR% %BUILDDIR% %SPHINXOPTS%

:end
popd
```
docs/source/conf.py (new file, 222 lines):

```python
# -*- coding: utf-8 -*-
#
# Configuration file for the Sphinx documentation builder.
#
# This file does only contain a selection of the most common options. For a
# full list see the documentation:
# http://www.sphinx-doc.org/en/master/config

# -- Path setup --------------------------------------------------------------

# If extensions (or modules to document with autodoc) are in another directory,
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
#
# import os
# import sys
# sys.path.insert(0, os.path.abspath('.'))
import sys
import os
from pprint import pprint
from pypeapp.pypeLauncher import PypeLauncher
from pypeapp.storage import Storage
from pypeapp.deployment import Deployment

pype_setup = os.getenv('PYPE_ROOT')
d = Deployment(pype_setup)
launcher = PypeLauncher()

tools, config_path = d.get_environment_data()

os.environ['PYPE_CONFIG'] = config_path
os.environ['TOOL_ENV'] = os.path.normpath(os.path.join(config_path,
                                                       'environments'))
launcher._add_modules()
Storage().update_environment()
launcher._load_default_environments(tools=tools)

# -- Project information -----------------------------------------------------

project = 'pype'
copyright = '2019, Orbi Tools'
author = 'Orbi Tools'

# The short X.Y version
version = ''
# The full version, including alpha/beta/rc tags
release = ''


# -- General configuration ---------------------------------------------------

# If your documentation needs a minimal Sphinx version, state it here.
#
# needs_sphinx = '1.0'

# Add any Sphinx extension module names here, as strings. They can be
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
# ones.
extensions = [
    'sphinx.ext.autodoc',
    'sphinx.ext.napoleon',
    'sphinx.ext.intersphinx',
    'sphinx.ext.todo',
    'sphinx.ext.coverage',
    'sphinx.ext.mathjax',
    'sphinx.ext.viewcode',
    'sphinx.ext.autosummary',
    'recommonmark'
]

# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']

# The suffix(es) of source filenames.
# You can specify multiple suffix as a list of string:
#
# source_suffix = ['.rst', '.md']
source_suffix = '.rst'

# The master toctree document.
master_doc = 'index'

# The language for content autogenerated by Sphinx. Refer to documentation
# for a list of supported languages.
#
# This is also used if you do content translation via gettext catalogs.
# Usually you set "language" from the command line for these cases.
language = None

# List of patterns, relative to source directory, that match files and
# directories to ignore when looking for source files.
# This pattern also affects html_static_path and html_extra_path.
exclude_patterns = []

# The name of the Pygments (syntax highlighting) style to use.
pygments_style = 'friendly'

# -- Options for autodoc -----------------------------------------------------
autodoc_default_flags = ['members']
autosummary_generate = True


# -- Options for HTML output -------------------------------------------------

# The theme to use for HTML and HTML Help pages. See the documentation for
# a list of builtin themes.
#
html_theme = 'sphinx_rtd_theme'

# Theme options are theme-specific and customize the look and feel of a theme
# further. For a list of options available for each theme, see the
# documentation.
#
html_theme_options = {
    'collapse_navigation': False
}

# Add any paths that contain custom static files (such as style sheets) here,
# relative to this directory. They are copied after the builtin static files,
# so a file named "default.css" will overwrite the builtin "default.css".
html_static_path = ['_static']

# Custom sidebar templates, must be a dictionary that maps document names
# to template names.
#
# The default sidebars (for documents that don't match any pattern) are
# defined by theme itself. Builtin themes are using these templates by
# default: ``['localtoc.html', 'relations.html', 'sourcelink.html',
# 'searchbox.html']``.
#
# html_sidebars = {}


# -- Options for HTMLHelp output ---------------------------------------------

# Output file base name for HTML help builder.
htmlhelp_basename = 'pypedoc'


# -- Options for LaTeX output ------------------------------------------------

latex_elements = {
    # The paper size ('letterpaper' or 'a4paper').
    #
    # 'papersize': 'letterpaper',

    # The font size ('10pt', '11pt' or '12pt').
    #
    # 'pointsize': '10pt',

    # Additional stuff for the LaTeX preamble.
    #
    # 'preamble': '',

    # Latex figure (float) alignment
    #
    # 'figure_align': 'htbp',
}

# Grouping the document tree into LaTeX files. List of tuples
# (source start file, target name, title,
#  author, documentclass [howto, manual, or own class]).
latex_documents = [
    (master_doc, 'pype.tex', 'pype Documentation',
     'OrbiTools', 'manual'),
]


# -- Options for manual page output ------------------------------------------

# One entry per manual page. List of tuples
# (source start file, name, description, authors, manual section).
man_pages = [
    (master_doc, 'pype', 'pype Documentation',
     [author], 1)
]


# -- Options for Texinfo output ----------------------------------------------

# Grouping the document tree into Texinfo files. List of tuples
# (source start file, target name, title, author,
#  dir menu entry, description, category)
texinfo_documents = [
    (master_doc, 'pype', 'pype Documentation',
     author, 'pype', 'One line description of project.',
     'Miscellaneous'),
]


# -- Options for Epub output -------------------------------------------------

# Bibliographic Dublin Core info.
epub_title = project

# The unique identifier of the text. This can be a ISBN number
# or the project homepage.
#
# epub_identifier = ''

# A unique identification for the text.
#
# epub_uid = ''

# A list of files that should not be packed into the epub file.
epub_exclude_files = ['search.html']


# -- Extension configuration -------------------------------------------------

# -- Options for intersphinx extension ---------------------------------------

# Example configuration for intersphinx: refer to the Python standard library.
intersphinx_mapping = {'https://docs.python.org/': None}

# -- Options for todo extension ----------------------------------------------

# If true, `todo` and `todoList` produce output, else they produce nothing.
todo_include_todos = True
```
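A side note on the autodoc section above: `autodoc_default_flags` was deprecated in Sphinx 1.8 in favour of a dictionary form. On newer Sphinx releases the equivalent setting would be:

```python
# Dictionary equivalent of `autodoc_default_flags = ['members']`
# on Sphinx >= 1.8, where the list form is deprecated.
autodoc_default_options = {
    "members": True,
}
```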
docs/source/index.rst (new file, 18 lines):

```rst
.. pype documentation master file, created by
   sphinx-quickstart on Mon May 13 17:18:23 2019.
   You can adapt this file completely to your liking, but it should at least
   contain the root `toctree` directive.

Welcome to pype's documentation!
================================

.. toctree::

   readme
   modules

Indices and tables
==================

* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`
```
docs/source/modules.rst (new file, 7 lines):

```rst
pype
====

.. toctree::
   :maxdepth: 6

   pype
```
docs/source/pype.aport.rst (new file, 20 lines):

```rst
pype.aport package
==================

.. automodule:: pype.aport
   :members:
   :undoc-members:
   :show-inheritance:

Submodules
----------

pype.aport.api module
---------------------

.. automodule:: pype.aport.api
   :members:
   :undoc-members:
   :show-inheritance:
```
docs/source/pype.avalon_apps.rst (new file, 20 lines):

```rst
pype.avalon\_apps package
=========================

.. automodule:: pype.avalon_apps
   :members:
   :undoc-members:
   :show-inheritance:

Submodules
----------

pype.avalon\_apps.avalon\_app module
------------------------------------

.. automodule:: pype.avalon_apps.avalon_app
   :members:
   :undoc-members:
   :show-inheritance:
```
docs/source/pype.clockify.rst (new file, 36 lines):

```rst
pype.clockify package
=====================

.. automodule:: pype.clockify
   :members:
   :undoc-members:
   :show-inheritance:

Submodules
----------

pype.clockify.clockify module
-----------------------------

.. automodule:: pype.clockify.clockify
   :members:
   :undoc-members:
   :show-inheritance:

pype.clockify.clockify\_api module
----------------------------------

.. automodule:: pype.clockify.clockify_api
   :members:
   :undoc-members:
   :show-inheritance:

pype.clockify.widget\_settings module
-------------------------------------

.. automodule:: pype.clockify.widget_settings
   :members:
   :undoc-members:
   :show-inheritance:
```
docs/source/pype.ftrack.ftrack_server.rst (new file, 36 lines):

```rst
pype.ftrack.ftrack\_server package
==================================

.. automodule:: pype.ftrack.ftrack_server
   :members:
   :undoc-members:
   :show-inheritance:

Submodules
----------

pype.ftrack.ftrack\_server.event\_server module
-----------------------------------------------

.. automodule:: pype.ftrack.ftrack_server.event_server
   :members:
   :undoc-members:
   :show-inheritance:

pype.ftrack.ftrack\_server.event\_server\_cli module
----------------------------------------------------

.. automodule:: pype.ftrack.ftrack_server.event_server_cli
   :members:
   :undoc-members:
   :show-inheritance:

pype.ftrack.ftrack\_server.ftrack\_server module
------------------------------------------------

.. automodule:: pype.ftrack.ftrack_server.ftrack_server
   :members:
   :undoc-members:
   :show-inheritance:
```
docs/source/pype.ftrack.lib.rst (new file, 52 lines):

```rst
pype.ftrack.lib package
=======================

.. automodule:: pype.ftrack.lib
   :members:
   :undoc-members:
   :show-inheritance:

Submodules
----------

pype.ftrack.lib.avalon\_sync module
-----------------------------------

.. automodule:: pype.ftrack.lib.avalon_sync
   :members:
   :undoc-members:
   :show-inheritance:

pype.ftrack.lib.ftrack\_action\_handler module
----------------------------------------------

.. automodule:: pype.ftrack.lib.ftrack_action_handler
   :members:
   :undoc-members:
   :show-inheritance:

pype.ftrack.lib.ftrack\_app\_handler module
-------------------------------------------

.. automodule:: pype.ftrack.lib.ftrack_app_handler
   :members:
   :undoc-members:
   :show-inheritance:

pype.ftrack.lib.ftrack\_base\_handler module
--------------------------------------------

.. automodule:: pype.ftrack.lib.ftrack_base_handler
   :members:
   :undoc-members:
   :show-inheritance:

pype.ftrack.lib.ftrack\_event\_handler module
---------------------------------------------

.. automodule:: pype.ftrack.lib.ftrack_event_handler
   :members:
   :undoc-members:
   :show-inheritance:
```
docs/source/pype.ftrack.rst (new file, 52 lines):

```rst
pype.ftrack package
===================

.. automodule:: pype.ftrack
   :members:
   :undoc-members:
   :show-inheritance:

Subpackages
-----------

.. toctree::

   pype.ftrack.ftrack_server
   pype.ftrack.lib

Submodules
----------

pype.ftrack.credentials module
------------------------------

.. automodule:: pype.ftrack.credentials
   :members:
   :undoc-members:
   :show-inheritance:

pype.ftrack.ftrack\_module module
---------------------------------

.. automodule:: pype.ftrack.ftrack_module
   :members:
   :undoc-members:
   :show-inheritance:

pype.ftrack.login\_dialog module
--------------------------------

.. automodule:: pype.ftrack.login_dialog
   :members:
   :undoc-members:
   :show-inheritance:

pype.ftrack.login\_tools module
-------------------------------

.. automodule:: pype.ftrack.login_tools
   :members:
   :undoc-members:
   :show-inheritance:
```
docs/source/pype.fusion.rst (new file, 27 lines):

```rst
pype.fusion package
===================

.. automodule:: pype.fusion
   :members:
   :undoc-members:
   :show-inheritance:

Subpackages
-----------

.. toctree::

   pype.fusion.scripts

Submodules
----------

pype.fusion.lib module
----------------------

.. automodule:: pype.fusion.lib
   :members:
   :undoc-members:
   :show-inheritance:
```
docs/source/pype.fusion.scripts.rst (new file, 28 lines):

```rst
pype.fusion.scripts package
===========================

.. automodule:: pype.fusion.scripts
   :members:
   :undoc-members:
   :show-inheritance:

Submodules
----------

pype.fusion.scripts.fusion\_switch\_shot module
-----------------------------------------------

.. automodule:: pype.fusion.scripts.fusion_switch_shot
   :members:
   :undoc-members:
   :show-inheritance:

pype.fusion.scripts.publish\_filesequence module
------------------------------------------------

.. automodule:: pype.fusion.scripts.publish_filesequence
   :members:
   :undoc-members:
   :show-inheritance:
```
docs/source/pype.houdini.rst (new file, 20 lines):

```rst
pype.houdini package
====================

.. automodule:: pype.houdini
   :members:
   :undoc-members:
   :show-inheritance:

Submodules
----------

pype.houdini.lib module
-----------------------

.. automodule:: pype.houdini.lib
   :members:
   :undoc-members:
   :show-inheritance:
```
docs/source/pype.maya.rst (new file, 52 lines):

```rst
pype.maya package
=================

.. automodule:: pype.maya
   :members:
   :undoc-members:
   :show-inheritance:

Submodules
----------

pype.maya.action module
-----------------------

.. automodule:: pype.maya.action
   :members:
   :undoc-members:
   :show-inheritance:

pype.maya.customize module
--------------------------

.. automodule:: pype.maya.customize
   :members:
   :undoc-members:
   :show-inheritance:

pype.maya.lib module
--------------------

.. automodule:: pype.maya.lib
   :members:
   :undoc-members:
   :show-inheritance:

pype.maya.menu module
---------------------

.. automodule:: pype.maya.menu
   :members:
   :undoc-members:
   :show-inheritance:

pype.maya.plugin module
-----------------------

.. automodule:: pype.maya.plugin
   :members:
   :undoc-members:
   :show-inheritance:
```
docs/source/pype.nuke.rst (new file, 44 lines):

```rst
pype.nuke package
=================

.. automodule:: pype.nuke
   :members:
   :undoc-members:
   :show-inheritance:

Submodules
----------

pype.nuke.actions module
------------------------

.. automodule:: pype.nuke.actions
   :members:
   :undoc-members:
   :show-inheritance:

pype.nuke.lib module
--------------------

.. automodule:: pype.nuke.lib
   :members:
   :undoc-members:
   :show-inheritance:

pype.nuke.menu module
---------------------

.. automodule:: pype.nuke.menu
   :members:
   :undoc-members:
   :show-inheritance:

pype.nuke.templates module
--------------------------

.. automodule:: pype.nuke.templates
   :members:
   :undoc-members:
   :show-inheritance:
```
docs/source/pype.premiere.rst (new file, 20 lines):

```rst
pype.premiere package
=====================

.. automodule:: pype.premiere
   :members:
   :undoc-members:
   :show-inheritance:

Submodules
----------

pype.premiere.templates module
------------------------------

.. automodule:: pype.premiere.templates
   :members:
   :undoc-members:
   :show-inheritance:
```
docs/source/pype.rst (new file, 88 lines):

```rst
pype package
============

.. automodule:: pype
   :members:
   :undoc-members:
   :show-inheritance:

Subpackages
-----------

.. toctree::

   pype.aport
   pype.avalon_apps
   pype.clockify
   pype.ftrack
   pype.fusion
   pype.houdini
   pype.maya
   pype.nuke
   pype.premiere
   pype.scripts
   pype.services
   pype.standalonepublish
   pype.tools
   pype.widgets

Submodules
----------

pype.action module
------------------

.. automodule:: pype.action
   :members:
   :undoc-members:
   :show-inheritance:

pype.api module
---------------

.. automodule:: pype.api
   :members:
   :undoc-members:
   :show-inheritance:

pype.launcher\_actions module
-----------------------------

.. automodule:: pype.launcher_actions
   :members:
   :undoc-members:
   :show-inheritance:

pype.lib module
---------------

.. automodule:: pype.lib
   :members:
   :undoc-members:
   :show-inheritance:

pype.plugin module
------------------

.. automodule:: pype.plugin
   :members:
   :undoc-members:
   :show-inheritance:

pype.setdress\_api module
-------------------------

.. automodule:: pype.setdress_api
   :members:
   :undoc-members:
   :show-inheritance:

pype.templates module
---------------------

.. automodule:: pype.templates
   :members:
   :undoc-members:
   :show-inheritance:
```
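All of the `docs/source/*.rst` files above repeat the same `automodule` stub pattern. A small sketch (not part of the repository; the function name is illustrative) that renders one such stub:

```python
def automodule_stub(module):
    """Render a Sphinx automodule stub like the ones in docs/source/*.rst."""
    # Underscores are escaped in titles, as in the generated files above.
    title = module.replace("_", r"\_") + " module"
    return "\n".join([
        title,
        "-" * len(title),  # the rst underline must match the title length
        "",
        ".. automodule:: " + module,
        "   :members:",
        "   :undoc-members:",
        "   :show-inheritance:",
    ])

print(automodule_stub("pype.lib"))
```

This is essentially what `sphinx-apidoc` emits when generating these files from the package tree.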
docs/source/pype.scripts.rst (new file, 28 lines):

```rst
pype.scripts package
====================

.. automodule:: pype.scripts
   :members:
   :undoc-members:
   :show-inheritance:

Submodules
----------

pype.scripts.fusion\_switch\_shot module
----------------------------------------

.. automodule:: pype.scripts.fusion_switch_shot
   :members:
   :undoc-members:
   :show-inheritance:

pype.scripts.publish\_filesequence module
-----------------------------------------

.. automodule:: pype.scripts.publish_filesequence
   :members:
   :undoc-members:
   :show-inheritance:
```
docs/source/pype.services.idle_manager.rst (new file, 20 lines):

```rst
pype.services.idle\_manager package
===================================

.. automodule:: pype.services.idle_manager
   :members:
   :undoc-members:
   :show-inheritance:

Submodules
----------

pype.services.idle\_manager.idle\_manager module
------------------------------------------------

.. automodule:: pype.services.idle_manager.idle_manager
   :members:
   :undoc-members:
   :show-inheritance:
```
17 docs/source/pype.services.rst Normal file
@@ -0,0 +1,17 @@
pype.services package
=====================

.. automodule:: pype.services
    :members:
    :undoc-members:
    :show-inheritance:

Subpackages
-----------

.. toctree::

    pype.services.idle_manager
    pype.services.statics_server
    pype.services.timers_manager
20 docs/source/pype.services.statics_server.rst Normal file
@@ -0,0 +1,20 @@
pype.services.statics\_server package
=====================================

.. automodule:: pype.services.statics_server
    :members:
    :undoc-members:
    :show-inheritance:

Submodules
----------

pype.services.statics\_server.statics\_server module
----------------------------------------------------

.. automodule:: pype.services.statics_server.statics_server
    :members:
    :undoc-members:
    :show-inheritance:
28 docs/source/pype.services.timers_manager.rst Normal file
@@ -0,0 +1,28 @@
pype.services.timers\_manager package
=====================================

.. automodule:: pype.services.timers_manager
    :members:
    :undoc-members:
    :show-inheritance:

Submodules
----------

pype.services.timers\_manager.timers\_manager module
----------------------------------------------------

.. automodule:: pype.services.timers_manager.timers_manager
    :members:
    :undoc-members:
    :show-inheritance:

pype.services.timers\_manager.widget\_user\_idle module
-------------------------------------------------------

.. automodule:: pype.services.timers_manager.widget_user_idle
    :members:
    :undoc-members:
    :show-inheritance:
8 docs/source/pype.standalonepublish.resources.rst Normal file
@@ -0,0 +1,8 @@
pype.standalonepublish.resources package
========================================

.. automodule:: pype.standalonepublish.resources
    :members:
    :undoc-members:
    :show-inheritance:
44 docs/source/pype.standalonepublish.rst Normal file
@@ -0,0 +1,44 @@
pype.standalonepublish package
==============================

.. automodule:: pype.standalonepublish
    :members:
    :undoc-members:
    :show-inheritance:

Subpackages
-----------

.. toctree::

    pype.standalonepublish.resources
    pype.standalonepublish.widgets

Submodules
----------

pype.standalonepublish.app module
---------------------------------

.. automodule:: pype.standalonepublish.app
    :members:
    :undoc-members:
    :show-inheritance:

pype.standalonepublish.publish module
-------------------------------------

.. automodule:: pype.standalonepublish.publish
    :members:
    :undoc-members:
    :show-inheritance:

pype.standalonepublish.standalonepublish\_module module
-------------------------------------------------------

.. automodule:: pype.standalonepublish.standalonepublish_module
    :members:
    :undoc-members:
    :show-inheritance:
156 docs/source/pype.standalonepublish.widgets.rst Normal file
@@ -0,0 +1,156 @@
pype.standalonepublish.widgets package
======================================

.. automodule:: pype.standalonepublish.widgets
    :members:
    :undoc-members:
    :show-inheritance:

Submodules
----------

pype.standalonepublish.widgets.button\_from\_svgs module
--------------------------------------------------------

.. automodule:: pype.standalonepublish.widgets.button_from_svgs
    :members:
    :undoc-members:
    :show-inheritance:

pype.standalonepublish.widgets.model\_asset module
--------------------------------------------------

.. automodule:: pype.standalonepublish.widgets.model_asset
    :members:
    :undoc-members:
    :show-inheritance:

pype.standalonepublish.widgets.model\_filter\_proxy\_exact\_match module
------------------------------------------------------------------------

.. automodule:: pype.standalonepublish.widgets.model_filter_proxy_exact_match
    :members:
    :undoc-members:
    :show-inheritance:

pype.standalonepublish.widgets.model\_filter\_proxy\_recursive\_sort module
---------------------------------------------------------------------------

.. automodule:: pype.standalonepublish.widgets.model_filter_proxy_recursive_sort
    :members:
    :undoc-members:
    :show-inheritance:

pype.standalonepublish.widgets.model\_node module
-------------------------------------------------

.. automodule:: pype.standalonepublish.widgets.model_node
    :members:
    :undoc-members:
    :show-inheritance:

pype.standalonepublish.widgets.model\_tasks\_template module
------------------------------------------------------------

.. automodule:: pype.standalonepublish.widgets.model_tasks_template
    :members:
    :undoc-members:
    :show-inheritance:

pype.standalonepublish.widgets.model\_tree module
-------------------------------------------------

.. automodule:: pype.standalonepublish.widgets.model_tree
    :members:
    :undoc-members:
    :show-inheritance:

pype.standalonepublish.widgets.model\_tree\_view\_deselectable module
---------------------------------------------------------------------

.. automodule:: pype.standalonepublish.widgets.model_tree_view_deselectable
    :members:
    :undoc-members:
    :show-inheritance:

pype.standalonepublish.widgets.widget\_asset module
---------------------------------------------------

.. automodule:: pype.standalonepublish.widgets.widget_asset
    :members:
    :undoc-members:
    :show-inheritance:

pype.standalonepublish.widgets.widget\_asset\_view module
---------------------------------------------------------

.. automodule:: pype.standalonepublish.widgets.widget_asset_view
    :members:
    :undoc-members:
    :show-inheritance:

pype.standalonepublish.widgets.widget\_component\_item module
-------------------------------------------------------------

.. automodule:: pype.standalonepublish.widgets.widget_component_item
    :members:
    :undoc-members:
    :show-inheritance:

pype.standalonepublish.widgets.widget\_components module
--------------------------------------------------------

.. automodule:: pype.standalonepublish.widgets.widget_components
    :members:
    :undoc-members:
    :show-inheritance:

pype.standalonepublish.widgets.widget\_components\_list module
--------------------------------------------------------------

.. automodule:: pype.standalonepublish.widgets.widget_components_list
    :members:
    :undoc-members:
    :show-inheritance:

pype.standalonepublish.widgets.widget\_drop\_empty module
---------------------------------------------------------

.. automodule:: pype.standalonepublish.widgets.widget_drop_empty
    :members:
    :undoc-members:
    :show-inheritance:

pype.standalonepublish.widgets.widget\_drop\_frame module
---------------------------------------------------------

.. automodule:: pype.standalonepublish.widgets.widget_drop_frame
    :members:
    :undoc-members:
    :show-inheritance:

pype.standalonepublish.widgets.widget\_family module
----------------------------------------------------

.. automodule:: pype.standalonepublish.widgets.widget_family
    :members:
    :undoc-members:
    :show-inheritance:

pype.standalonepublish.widgets.widget\_family\_desc module
----------------------------------------------------------

.. automodule:: pype.standalonepublish.widgets.widget_family_desc
    :members:
    :undoc-members:
    :show-inheritance:

pype.standalonepublish.widgets.widget\_shadow module
----------------------------------------------------

.. automodule:: pype.standalonepublish.widgets.widget_shadow
    :members:
    :undoc-members:
    :show-inheritance:
36 docs/source/pype.tools.assetcreator.rst Normal file
@@ -0,0 +1,36 @@
pype.tools.assetcreator package
===============================

.. automodule:: pype.tools.assetcreator
    :members:
    :undoc-members:
    :show-inheritance:

Submodules
----------

pype.tools.assetcreator.app module
----------------------------------

.. automodule:: pype.tools.assetcreator.app
    :members:
    :undoc-members:
    :show-inheritance:

pype.tools.assetcreator.model module
------------------------------------

.. automodule:: pype.tools.assetcreator.model
    :members:
    :undoc-members:
    :show-inheritance:

pype.tools.assetcreator.widget module
-------------------------------------

.. automodule:: pype.tools.assetcreator.widget
    :members:
    :undoc-members:
    :show-inheritance:
15 docs/source/pype.tools.rst Normal file
@@ -0,0 +1,15 @@
pype.tools package
==================

.. automodule:: pype.tools
    :members:
    :undoc-members:
    :show-inheritance:

Subpackages
-----------

.. toctree::

    pype.tools.assetcreator
36 docs/source/pype.widgets.rst Normal file
@@ -0,0 +1,36 @@
pype.widgets package
====================

.. automodule:: pype.widgets
    :members:
    :undoc-members:
    :show-inheritance:

Submodules
----------

pype.widgets.message\_window module
-----------------------------------

.. automodule:: pype.widgets.message_window
    :members:
    :undoc-members:
    :show-inheritance:

pype.widgets.popup module
-------------------------

.. automodule:: pype.widgets.popup
    :members:
    :undoc-members:
    :show-inheritance:

pype.widgets.project\_settings module
-------------------------------------

.. automodule:: pype.widgets.project_settings
    :members:
    :undoc-members:
    :show-inheritance:
1 docs/source/readme.rst Normal file
@@ -0,0 +1 @@
.. include:: ../../README.md
46 make_docs.bat Normal file
@@ -0,0 +1,46 @@
@echo off
echo [92m^>^>^>[0m Generating pype-setup documentation, please wait ...
call "C:\Users\Public\pype_env2\Scripts\activate.bat"

setlocal enableextensions enabledelayedexpansion
set _OLD_PYTHONPATH=%PYTHONPATH%
echo [92m^>^>^>[0m Adding repos path
rem add stuff in repos
call :ResolvePath repodir "..\"

for /d %%d in ( %repodir%*) do (
    echo - adding path %%d
    set PYTHONPATH=%%d;!PYTHONPATH!
)

echo [92m^>^>^>[0m Adding python vendors path
rem add python vendor paths
call :ResolvePath vendordir "..\..\vendor\python\"

for /d %%d in ( %vendordir%*) do (
    echo - adding path %%d
    set PYTHONPATH=%%d;!PYTHONPATH!
)

echo [92m^>^>^>[0m Setting PYPE_CONFIG
call :ResolvePath pypeconfig "..\pype-config"
set PYPE_CONFIG=%pypeconfig%
echo [92m^>^>^>[0m Setting PYPE_ROOT
call :ResolvePath pyperoot "..\..\"
set PYPE_ROOT=%pyperoot%
set PYTHONPATH=%PYPE_ROOT%;%PYTHONPATH%
echo [92m^>^>^>[0m Setting PYPE_ENV
set PYPE_ENV="C:\Users\Public\pype_env2"

call "docs\make.bat" clean
sphinx-apidoc -M -f -d 6 --ext-autodoc --ext-intersphinx --ext-viewcode -o docs\source pype %PYPE_ROOT%\repos\pype\pype\vendor\*
call "docs\make.bat" html
echo [92m^>^>^>[0m Doing cleanup ...
set PYTHONPATH=%_OLD_PYTHONPATH%
set PYPE_CONFIG=
call "C:\Users\Public\pype_env2\Scripts\deactivate.bat"
exit /b

:ResolvePath
    set %1=%~dpfn2
    exit /b
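The `for /d` loops in make_docs.bat enumerate immediate subdirectories and push each onto PYTHONPATH before running sphinx-apidoc. The same bookkeeping can be sketched in cross-platform Python; this is a hypothetical helper for illustration, not part of the commit:

```python
import os
import tempfile


def collect_subdir_paths(root):
    """Immediate subdirectories of `root`, like the batch `for /d` loop."""
    return [
        os.path.join(root, name)
        for name in sorted(os.listdir(root))
        if os.path.isdir(os.path.join(root, name))
    ]


def prepend_to_pythonpath(paths, current=""):
    """Prepend `paths` to an existing os.pathsep-joined value."""
    parts = list(paths) + ([current] if current else [])
    return os.pathsep.join(parts)


# Demonstrate with a throwaway directory tree standing in for the repos dir.
root = tempfile.mkdtemp()
for name in ("repo_a", "repo_b"):
    os.mkdir(os.path.join(root, name))

subdirs = collect_subdir_paths(root)
value = prepend_to_pythonpath(subdirs, current="/existing")
```

As in the batch file, the previous value is kept at the end of the list so the new entries win on import resolution.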
0 pype/.coveragerc Normal file
@@ -2,9 +2,12 @@ import os
 
 from pyblish import api as pyblish
 from avalon import api as avalon
+from .lib import filter_pyblish_plugins
 
 from .launcher_actions import register_launcher_actions
 from .lib import collect_container_metadata
+import logging
+log = logging.getLogger(__name__)
 
 __version__ = "2.1.0"
 
 PACKAGE_DIR = os.path.dirname(__file__)
 PLUGINS_DIR = os.path.join(PACKAGE_DIR, "plugins")

@@ -15,12 +18,15 @@ LOAD_PATH = os.path.join(PLUGINS_DIR, "global", "load")
 
 
 def install():
-    print("Registering global plug-ins..")
+    log.info("Registering global plug-ins..")
     pyblish.register_plugin_path(PUBLISH_PATH)
+    pyblish.register_discovery_filter(filter_pyblish_plugins)
     avalon.register_plugin_path(avalon.Loader, LOAD_PATH)
 
 
 def uninstall():
-    print("Deregistering global plug-ins..")
+    log.info("Deregistering global plug-ins..")
     pyblish.deregister_plugin_path(PUBLISH_PATH)
+    pyblish.deregister_discovery_filter(filter_pyblish_plugins)
     avalon.deregister_plugin_path(avalon.Loader, LOAD_PATH)
+    log.info("Global plug-ins unregistred")
@@ -87,6 +87,4 @@ class RepairContextAction(pyblish.api.Action):
             # Apply pyblish.logic to get the instances for the plug-in
             if plugin in errored_plugins:
                 self.log.info("Attempting fix ...")
-                plugin.repair()
-
-
+                plugin.repair(context)
36 pype/api.py
@@ -5,7 +5,8 @@ from .plugin import (
     ValidatePipelineOrder,
     ValidateContentsOrder,
     ValidateSceneOrder,
-    ValidateMeshOrder
+    ValidateMeshOrder,
+    ValidationException
 )
 
 # temporary fix, might

@@ -15,6 +16,21 @@ from .action import (
     RepairContextAction
 )
 
+from pypeapp import Logger
+
+from .lib import (
+    version_up,
+    get_asset,
+    get_project,
+    get_hierarchy,
+    get_version_from_path,
+    modified_environ,
+    add_tool_to_environment
+)
+
+# Special naming case for subprocess since its a built-in method.
+from .lib import _subprocess as subprocess
+
 __all__ = [
     # plugin classes
     "Extractor",

@@ -25,5 +41,21 @@ __all__ = [
     "ValidateMeshOrder",
     # action
     "get_errored_instances_from_context",
-    "RepairAction"
+    "RepairAction",
+    "RepairContextAction",
+
+    "Logger",
+
+    "ValidationException",
+
+    # get contextual data
+    "version_up",
+    "get_project",
+    "get_hierarchy",
+    "get_asset",
+    "get_version_from_path",
+    "modified_environ",
+    "add_tool_to_environment",
+
+    "subprocess"
 ]
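The last hunk adds a comma after "RepairAction" before appending new names to `__all__`. That comma matters: adjacent string literals in a Python list are implicitly concatenated, so a missing comma would silently merge two exported names into one bogus entry. A quick illustration:

```python
# With the comma: two separate names.
with_comma = [
    "RepairAction",
    "RepairContextAction",
]

# Without the comma: implicit string concatenation yields ONE merged name,
# which `from pype.api import *` would then fail to resolve.
missing_comma = [
    "RepairAction"
    "RepairContextAction",
]
```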
91 pype/aport/__init__.py Normal file
@@ -0,0 +1,91 @@
import os
import sys

from avalon import api as avalon
from pyblish import api as pyblish
from pypeapp import execute, Logger

from .. import api
from .lib import set_avalon_workdir

log = Logger().get_logger(__name__, "aport")

AVALON_CONFIG = os.getenv("AVALON_CONFIG", "pype")

PARENT_DIR = os.path.dirname(__file__)
PACKAGE_DIR = os.path.dirname(PARENT_DIR)
PLUGINS_DIR = os.path.join(PACKAGE_DIR, "plugins")

PUBLISH_PATH = os.path.join(
    PLUGINS_DIR, "aport", "publish"
).replace("\\", "/")

if os.getenv("PUBLISH_PATH", None):
    os.environ["PUBLISH_PATH"] = os.pathsep.join(
        os.environ["PUBLISH_PATH"].split(os.pathsep) +
        [PUBLISH_PATH]
    )
else:
    os.environ["PUBLISH_PATH"] = PUBLISH_PATH

LOAD_PATH = os.path.join(PLUGINS_DIR, "aport", "load")
CREATE_PATH = os.path.join(PLUGINS_DIR, "aport", "create")
INVENTORY_PATH = os.path.join(PLUGINS_DIR, "aport", "inventory")


def install():
    set_avalon_workdir()

    log.info("Registering Aport plug-ins..")
    pyblish.register_plugin_path(PUBLISH_PATH)
    avalon.register_plugin_path(avalon.Loader, LOAD_PATH)
    avalon.register_plugin_path(avalon.Creator, CREATE_PATH)
    avalon.register_plugin_path(avalon.InventoryAction, INVENTORY_PATH)

    # Disable all families except for the ones we explicitly want to see
    family_states = [
        "imagesequence",
        "mov"
    ]
    avalon.data["familiesStateDefault"] = False
    avalon.data["familiesStateToggled"] = family_states

    # launch pico server
    pico_server_launch()


def uninstall():
    log.info("Deregistering Aport plug-ins..")
    pyblish.deregister_plugin_path(PUBLISH_PATH)
    avalon.deregister_plugin_path(avalon.Loader, LOAD_PATH)
    avalon.deregister_plugin_path(avalon.Creator, CREATE_PATH)

    # reset data from templates
    api.reset_data_from_templates()


def pico_server_launch():
    # path = "C:/Users/hubert/CODE/github/pico/examples/everything"
    path = os.path.join(
        os.path.dirname(__file__),
        # "package"
    )

    os.chdir(path)
    print(os.getcwd())
    print(os.listdir(path))
    try:
        args = [sys.executable, "-m", "pico.server",
                # "pipeline",
                "api"
                ]

        execute(
            args,
            cwd=path
        )
    except Exception as e:
        log.error(e)
        log.error(sys.exc_info())
        # sys.exit(returncode)
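The module-level PUBLISH_PATH handling above appends a plug-in directory to an os.pathsep-joined environment variable, creating it when absent. Extracted as a standalone helper for clarity; the function name and the dict-instead-of-os.environ are illustrative assumptions, not repo code:

```python
import os


def append_env_path(env, key, new_path):
    """Append new_path to env[key] with os.pathsep; create the key if absent.

    Backslashes are normalized to forward slashes, matching the module's
    .replace("\\\\", "/") convention.
    """
    new_path = new_path.replace("\\", "/")
    existing = env.get(key)
    if existing:
        env[key] = os.pathsep.join(existing.split(os.pathsep) + [new_path])
    else:
        env[key] = new_path
    return env[key]


env = {}
append_env_path(env, "PUBLISH_PATH", "plugins/aport/publish")
append_env_path(env, "PUBLISH_PATH", "plugins/global/publish")
```

Using a plain dict instead of os.environ keeps the sketch side-effect free; the real module mutates the process environment so child processes (the pico server, pype-start.py) inherit the path.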
150 pype/aport/api.py Normal file
@@ -0,0 +1,150 @@
# api.py
import os
import sys
import tempfile

import pico
from pico import PicoApp
from pico.decorators import request_args, set_cookie, delete_cookie, stream
from pico.decorators import header, cookie

from werkzeug.exceptions import Unauthorized, ImATeapot, BadRequest

from avalon import api as avalon
from avalon import io

import pyblish.api as pyblish

from pypeapp import execute
from pype import api as pype


log = pype.Logger().get_logger(__name__, "aport")


SESSION = avalon.session
if not SESSION:
    io.install()


@pico.expose()
def publish(json_data_path, gui):
    """
    Runs standalone pyblish and adds link to
    data in external json file

    It is necessary to run `register_plugin_path` if particular
    host is needed

    Args:
        json_data_path (string): path to temp json file with
                                 context data
        staging_dir (strign, optional): path to temp directory

    Returns:
        dict: return_json_path

    Raises:
        Exception: description

    """
    cwd = os.getenv('AVALON_WORKDIR').replace("\\", "/")

    staging_dir = tempfile.mkdtemp(prefix="pype_aport_").replace("\\", "/")
    log.info("staging_dir: {}".format(staging_dir))
    return_json_path = os.path.join(staging_dir, "return_data.json").replace("\\", "/")

    log.info("avalon.session is: \n{}".format(SESSION))

    pype_start = os.path.join(os.getenv('PYPE_ROOT'),
                              "app", "pype-start.py")

    publish = "--publish-gui" if gui else "--publish"

    args = [pype_start, publish,
            "-pp", os.environ["PUBLISH_PATH"],
            "-d", "rqst_json_data_path", json_data_path,
            "-d", "post_json_data_path", return_json_path
            ]

    log.debug(args)

    # start standalone pyblish qml
    execute([
        sys.executable, "-u"
    ] + args,
        cwd=cwd
    )

    return {"return_json_path": return_json_path}


@pico.expose()
def context(project_name, asset, task, app):
    # http://localhost:4242/pipeline/context?project=this&asset=shot01&task=comp

    os.environ["AVALON_PROJECT"] = project_name
    io.Session["AVALON_PROJECT"] = project_name

    avalon.update_current_task(task, asset, app)

    project_code = pype.get_project()["data"].get("code", '')

    os.environ["AVALON_PROJECTCODE"] = project_code
    io.Session["AVALON_PROJECTCODE"] = project_code

    hierarchy = pype.get_hierarchy()
    os.environ["AVALON_HIERARCHY"] = hierarchy
    io.Session["AVALON_HIERARCHY"] = hierarchy

    fix_paths = {k: v.replace("\\", "/") for k, v in SESSION.items()
                 if isinstance(v, str)}
    SESSION.update(fix_paths)
    SESSION.update({"AVALON_HIERARCHY": hierarchy,
                    "AVALON_PROJECTCODE": project_code,
                    "current_dir": os.getcwd().replace("\\", "/")
                    })

    return SESSION


@pico.expose()
def deregister_plugin_path():
    if os.getenv("PUBLISH_PATH", None):
        aport_plugin_path = [p.replace("\\", "/") for p in os.environ["PUBLISH_PATH"].split(
            os.pathsep) if "aport" in p][0]
        os.environ["PUBLISH_PATH"] = aport_plugin_path
    else:
        log.warning("deregister_plugin_path(): No PUBLISH_PATH is registred")

    return "Publish path deregistered"


@pico.expose()
def register_plugin_path(publish_path):
    deregister_plugin_path()
    if os.getenv("PUBLISH_PATH", None):
        os.environ["PUBLISH_PATH"] = os.pathsep.join(
            os.environ["PUBLISH_PATH"].split(os.pathsep) +
            [publish_path.replace("\\", "/")]
        )
    else:
        os.environ["PUBLISH_PATH"] = publish_path

    log.info(os.environ["PUBLISH_PATH"].split(os.pathsep))

    return "Publish registered paths: {}".format(
        os.environ["PUBLISH_PATH"].split(os.pathsep)
    )


app = PicoApp()
app.register_module(__name__)

# remove all Handlers created by pico
for name, handler in [(handler.get_name(), handler)
                      for handler in pype.Logger.logging.root.handlers[:]]:
    if "pype" not in str(name).lower():
        print(name)
        print(handler)
        pype.Logger.logging.root.removeHandler(handler)
135 pype/aport/lib.py Normal file
@@ -0,0 +1,135 @@
import os
import re
import sys
from avalon import io, api as avalon, lib as avalonlib
from pype import lib
from pype import api as pype
# from pypeapp.api import (Templates, Logger, format)
from pypeapp import Logger, Anatomy
log = Logger().get_logger(__name__, os.getenv("AVALON_APP", "pype-config"))


def get_asset():
    """
    Obtain Asset string from session or environment variable

    Returns:
        string: asset name

    Raises:
        log: error
    """
    lib.set_io_database()
    asset = io.Session.get("AVALON_ASSET", None) \
        or os.getenv("AVALON_ASSET", None)
    log.info("asset: {}".format(asset))
    assert asset, log.error("missing `AVALON_ASSET`"
                            "in avalon session "
                            "or os.environ!")
    return asset


def get_context_data(
    project_name=None, hierarchy=None, asset=None, task_name=None
):
    """
    Collect all main contextual data

    Args:
        project (string, optional): project name
        hierarchy (string, optional): hierarchy path
        asset (string, optional): asset name
        task (string, optional): task name

    Returns:
        dict: contextual data

    """
    if not task_name:
        lib.set_io_database()
        task_name = io.Session.get("AVALON_TASK", None) \
            or os.getenv("AVALON_TASK", None)
        assert task_name, log.error(
            "missing `AVALON_TASK` in avalon session or os.environ!"
        )

    application = avalonlib.get_application(os.environ["AVALON_APP_NAME"])

    os.environ['AVALON_PROJECT'] = project_name
    io.Session['AVALON_PROJECT'] = project_name

    if not hierarchy:
        hierarchy = pype.get_hierarchy()

    project_doc = io.find_one({"type": "project"})

    data = {
        "task": task_name,
        "asset": asset or get_asset(),
        "project": {
            "name": project_doc["name"],
            "code": project_doc["data"].get("code", '')
        },
        "hierarchy": hierarchy,
        "app": application["application_dir"]
    }
    return data


def set_avalon_workdir(
    project=None, hierarchy=None, asset=None, task=None
):
    """
    Updates os.environ and session with filled workdir

    Args:
        project (string, optional): project name
        hierarchy (string, optional): hierarchy path
        asset (string, optional): asset name
        task (string, optional): task name

    Returns:
        os.environ[AVALON_WORKDIR]: workdir path
        avalon.session[AVALON_WORKDIR]: workdir path

    """

    lib.set_io_database()
    awd = io.Session.get("AVALON_WORKDIR", None) or \
        os.getenv("AVALON_WORKDIR", None)

    data = get_context_data(project, hierarchy, asset, task)

    if (not awd) or ("{" not in awd):
        anatomy_filled = Anatomy(io.Session["AVALON_PROJECT"]).format(data)
        awd = anatomy_filled["work"]["folder"]

    awd_filled = os.path.normpath(format(awd, data))

    io.Session["AVALON_WORKDIR"] = awd_filled
    os.environ["AVALON_WORKDIR"] = awd_filled
    log.info("`AVALON_WORKDIR` fixed to: {}".format(awd_filled))


def get_workdir_template(data=None):
    """
    Obtain workdir templated path from Anatomy()

    Args:
        data (dict, optional): basic contextual data

    Returns:
        string: template path
    """

    anatomy = Anatomy()
    anatomy_filled = anatomy.format(data or get_context_data())

    try:
        work = anatomy_filled["work"]
    except Exception as e:
        log.error(
            "{0} Error in get_workdir_template(): {1}".format(__name__, str(e))
        )

    return work
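set_avalon_workdir() above fills an Anatomy work-folder template with contextual data and normalizes the result. The substitution step itself can be sketched with plain str.format-style fields; the template string and keys here are hypothetical, not the real Anatomy schema:

```python
import os


def fill_workdir(template, data):
    """Fill {field} placeholders and normalize, like the awd_filled step."""
    return os.path.normpath(template.format(**data))


data = {
    "project": "demo",
    "hierarchy": "shots/sq01",
    "asset": "sh010",
    "task": "comp",
}
workdir = fill_workdir("{project}/work/{hierarchy}/{asset}/{task}", data)
```

The `"{" not in awd` guard in the real function exists because the session value may already be a filled path rather than a template; only template strings get re-resolved through Anatomy.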
252 pype/aport/original/api.py Normal file
@@ -0,0 +1,252 @@
# api.py
import os
import sys
import tempfile

import pico
from pico import PicoApp
from pico.decorators import request_args, set_cookie, delete_cookie, stream
from pico.decorators import header, cookie

from werkzeug.exceptions import Unauthorized, ImATeapot, BadRequest

from avalon import api as avalon
from avalon import io

import pyblish.api as pyblish

from pypeapp import execute
from pype import api as pype


log = pype.Logger().get_logger(__name__, "aport")


SESSION = avalon.session
if not SESSION:
    io.install()


@pico.expose()
def publish(json_data_path, staging_dir=None):
    """
    Runs standalone pyblish and adds link to
    data in external json file

    It is necessary to run `register_plugin_path` if particular
    host is needed

    Args:
        json_data_path (string): path to temp json file with
                                 context data
        staging_dir (strign, optional): path to temp directory

    Returns:
        dict: return_json_path

    Raises:
        Exception: description

    """
    cwd = os.getenv('AVALON_WORKDIR').replace("\\", "/")
    os.chdir(cwd)
    log.info(os.getcwd())
    staging_dir = tempfile.mkdtemp(prefix="pype_aport_").replace("\\", "/")
    log.info("staging_dir: {}".format(staging_dir))
    return_json_path = os.path.join(staging_dir, "return_data.json")

    log.info("avalon.session is: \n{}".format(SESSION))
    pype_start = os.path.join(os.getenv('PYPE_ROOT'),
                              "app", "pype-start.py")

    args = [pype_start, "--publish",
            "-pp", os.environ["PUBLISH_PATH"],
            "-d", "rqst_json_data_path", json_data_path,
            "-d", "post_json_data_path", return_json_path
            ]

    log.debug(args)

    # start standalone pyblish qml
    execute([
        sys.executable, "-u"
    ] + args,
        cwd=cwd
    )

    return {"return_json_path": return_json_path}


@pico.expose()
def context(project, asset, task, app):
    # http://localhost:4242/pipeline/context?project=this&asset=shot01&task=comp

    os.environ["AVALON_PROJECT"] = project
    io.Session["AVALON_PROJECT"] = project

    avalon.update_current_task(task, asset, app)

    project_code = pype.get_project()["data"].get("code", '')

    os.environ["AVALON_PROJECTCODE"] = project_code
    io.Session["AVALON_PROJECTCODE"] = project_code

    hierarchy = pype.get_hierarchy()
    os.environ["AVALON_HIERARCHY"] = hierarchy
    io.Session["AVALON_HIERARCHY"] = hierarchy

    fix_paths = {k: v.replace("\\", "/") for k, v in SESSION.items()
                 if isinstance(v, str)}
    SESSION.update(fix_paths)
    SESSION.update({"AVALON_HIERARCHY": hierarchy,
                    "AVALON_PROJECTCODE": project_code,
                    "current_dir": os.getcwd().replace("\\", "/")
                    })

    return SESSION


@pico.expose()
def deregister_plugin_path():
    if os.getenv("PUBLISH_PATH", None):
        aport_plugin_path = [p.replace("\\", "/") for p in os.environ["PUBLISH_PATH"].split(
            os.pathsep) if "aport" in p][0]
        os.environ["PUBLISH_PATH"] = aport_plugin_path
    else:
        log.warning("deregister_plugin_path(): No PUBLISH_PATH is registred")

    return "Publish path deregistered"


@pico.expose()
def register_plugin_path(publish_path):
    deregister_plugin_path()
    if os.getenv("PUBLISH_PATH", None):
        os.environ["PUBLISH_PATH"] = os.pathsep.join(
            os.environ["PUBLISH_PATH"].split(os.pathsep) +
            [publish_path.replace("\\", "/")]
        )
    else:
        os.environ["PUBLISH_PATH"] = publish_path

    log.info(os.environ["PUBLISH_PATH"].split(os.pathsep))

    return "Publish registered paths: {}".format(
        os.environ["PUBLISH_PATH"].split(os.pathsep)
    )


@pico.expose()
def nuke_test():
    import nuke
    n = nuke.createNode("Constant")
    log.info(n)


@pico.expose()
def hello(who='world'):
    return 'Hello %s' % who


@pico.expose()
def multiply(x, y):
    return x * y


@pico.expose()
def fail():
    raise Exception('fail!')


@pico.expose()
def make_coffee():
    raise ImATeapot()


@pico.expose()
def upload(upload, filename):
    if not filename.endswith('.txt'):
        raise BadRequest('Upload must be a .txt file!')
    return upload.read().decode()


@pico.expose()
@request_args(ip='remote_addr')
def my_ip(ip):
    return ip


@pico.expose()
@request_args(ip=lambda req: req.remote_addr)
def my_ip3(ip):
    return ip


@pico.prehandle()
def set_user(request, kwargs):
    if request.authorization:
        if request.authorization.password != 'secret':
            raise Unauthorized('Incorrect username or password')
        request.user = request.authorization.username
    else:
        request.user = None


@pico.expose()
@request_args(username='user')
def current_user(username):
    return username


@pico.expose()
@request_args(session=cookie('session_id'))
|
||||
def session_id(session):
|
||||
return session
|
||||
|
||||
|
||||
@pico.expose()
|
||||
@set_cookie()
|
||||
def start_session():
|
||||
return {'session_id': '42'}
|
||||
|
||||
|
||||
@pico.expose()
|
||||
@delete_cookie('session_id')
|
||||
def end_session():
|
||||
return True
|
||||
|
||||
|
||||
@pico.expose()
|
||||
@request_args(session=header('x-session-id'))
|
||||
def session_id2(session):
|
||||
return session
|
||||
|
||||
|
||||
@pico.expose()
|
||||
@stream()
|
||||
def countdown(n=10):
|
||||
for i in reversed(range(n)):
|
||||
yield '%i' % i
|
||||
time.sleep(0.5)
|
||||
|
||||
|
||||
@pico.expose()
|
||||
def user_description(user):
|
||||
return '{name} is a {occupation} aged {age}'.format(**user)
|
||||
|
||||
|
||||
@pico.expose()
|
||||
def show_source():
|
||||
return open(__file__.replace('.pyc', '.py')).read()
|
||||
|
||||
|
||||
app = PicoApp()
|
||||
app.register_module(__name__)
|
||||
|
||||
# remove all Handlers created by pico
|
||||
for name, handler in [(handler.get_name(), handler)
|
||||
for handler in Logger().logging.root.handlers[:]]:
|
||||
if "pype" not in str(name).lower():
|
||||
print(name)
|
||||
print(handler)
|
||||
Logger().logging.root.removeHandler(handler)
|
||||
196	pype/aport/original/index.html	Normal file
@@ -0,0 +1,196 @@
<!DOCTYPE html>
<html>
<head>
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <title>Pico Example - Everything</title>
  <!-- Load the pico Javascript client, always automatically available at /pico.js -->
  <script src="/pico.js"></script>
  <!-- Or load our module proxy -->
  <script src="/api.js"></script>
  <link rel="stylesheet" href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.6/css/bootstrap.min.css" integrity="sha384-1q8mTJOASx8j1Au+a5WDVnPi2lkFfwwEAa8hDDdjZlpLegxhjVME1fgjWPGmkzs7" crossorigin="anonymous">
  <link rel="stylesheet" href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.6/css/bootstrap-theme.min.css" integrity="sha384-fLW2N01lMqjakBkx3l/M9EahuwpSfeNvV63J5ezn3uZzapT0u7EYsXMjQV+0En5r" crossorigin="anonymous">
  <link rel="stylesheet" href="//cdnjs.cloudflare.com/ajax/libs/highlight.js/9.6.0/styles/default.min.css">
  <script src="//cdnjs.cloudflare.com/ajax/libs/highlight.js/9.6.0/highlight.min.js"></script>
  <script></script>

  <style type="text/css">
    html, body {
      height: 100%;
      margin: 0px;
      padding: 0px;
    }
    div {
      padding: 5px;
    }
    #container {
      height: 100%;
    }
    #header {
      height: 5%;
    }
    #main {
      height: 70%;
    }
    #output {
      background-color: #333;
      color: #aaa;
      min-height: 15%;
      overflow-y: scroll;
      padding: 20px;
      position: fixed;
      bottom: 0px;
      width: 100%;
    }
    .error {
      color: #f00 !important;
    }
    #examples li {
      padding: 10px;
      margin: 10px;
      background-color: silver;
    }
    code {
      border-radius: 0;
      margin: 5px;
      white-space: pre !important;
    }
    #source {
      height: 100%;
    }
    #examples {
      height: 100%;
    }
    #spacer {
      height: 20%;
    }

    .highlight {
      background-color: yellow;
    }
  </style>
</head>
<body>
  <div id="container">
    <div class="row row-eq-height">
      <div class="col-md-12">
        <h1>Pico Examples</h1>
        <p>Here we show some simple examples of using Pico. Click any <code>api.X</code> link to see the corresponding Python source.</p>
      </div>
    </div>
    <div class="row row-eq-height" id="main">
      <div class="col-md-6" id="examples">
        <ol>
          <li id="example1">
            <h4>Hello World</h4>
            <pre><code class="js"></code></pre>
            Name: <input type="text" name="name" value="Bob"/>
            <button class="btn btn-default btn-sm" type="button" onclick="example1()">Submit</button>
          </li>
          <li id="deregister">
            <h4>deregister_plugin_path</h4>
            <pre><code class="js"></code></pre>
            <button class="btn btn-default btn-sm" type="button" onclick="deregister()">Deregister</button>
          </li>
          <li id="register">
            <h4>register_plugin_path</h4>
            <pre><code class="js"></code></pre>
            Path: <input type="text" name="path" value="C:/Users/hubert/CODE/pype-setup/repos/pype-config/pype/plugins/premiere/publish"/>
            <button class="btn btn-default btn-sm" type="button" onclick="register()">Register path</button>
          </li>
          <li id="example2">
            <h4>Numeric Multiplication</h4>
            <pre><code class="js"></code></pre>
            <input type="number" name="x" value="6"/> x <input type="number" name="y" value="7"/>
            <button class="btn btn-default btn-sm" type="button" onclick="example2()">Multiply</button>
          </li>
          <li id="example3">
            <h4>File Upload</h4>
            <pre><code class="js"></code></pre>
            <input type="file" name="upload"/>
            <button class="btn btn-default btn-sm" type="button" onclick="example3()">Upload</button>
          </li>
          <li id="example4">
            <h4>Request parameters (IP address)</h4>
            <pre><code class="js"></code></pre>
            <button class="btn btn-default btn-sm" type="button" onclick="example4()">What's my IP?</button>
          </li>
          <li id="example5">
            <h4>Authentication</h4>
            <pre><code class="js"></code></pre>
            <p class="bg-info">Note: see <a href="#set_user" onclick="jumpTo('set_user')">api.set_user</a> for the authentication handler.</p>
            Username: <input type="text" name="username" value="bob"/>
            Password: <input type="password" name="password" value="secret"/>
            <button class="btn btn-default btn-sm" type="button" onclick="example5()">Sign In</button>
          </li>
          <li id="example6">
            <h4>Sessions (cookies)</h4>
            <pre><code class="js"></code></pre>
            <button class="btn btn-default btn-sm" type="button" onclick="example6()">What's my session id?</button>
          </li>
          <li id="example7">
            <h4>Sessions (header)</h4>
            <pre><code class="js"></code></pre>
            <button class="btn btn-default btn-sm" type="button" onclick="example7()">What's my session id?</button>
          </li>
          <li id="example8">
            <h4>Streaming Response</h4>
            <pre><code class="js"></code></pre>
            <button class="btn btn-default btn-sm" type="button" onclick="example8()">Countdown</button>
          </li>
          <li id="example9">
            <h4>Objects</h4>
            <pre><code class="js"></code></pre>
            <button class="btn btn-default btn-sm" type="button" onclick="example9()">Submit</button>
          </li>
          <li id="example10">
            <h4>Errors</h4>
            <pre><code class="js"></code></pre>
            <button class="btn btn-default btn-sm" type="button" onclick="example10()">Submit</button>
          </li>
          <li id="example11">
            <h4>Errors</h4>
            <pre><code class="js"></code></pre>
            <button class="btn btn-default btn-sm" type="button" onclick="example11()">Submit</button>
          </li>
          <li id="example12">
            <h4>Forms</h4>
            <p>This example submits a form as a whole instead of individual arguments.
            The form input names must match the function argument names.
            </p>
            <pre><code class="html"></code></pre>
            <pre><code class="js"></code></pre>
            <div class="example">
              <form>
                x: <input type="number" name="x" value="6"/><br/>
                y: <input type="number" name="y" value="7"/>
              </form>
              <button class="btn btn-default btn-sm" type="button" onclick="example12()">Multiply</button>
            </div>
          </li>
          <li id="example13">
            <h4>JSON</h4>
            <p>This example submits data as JSON instead of individual arguments.
            The object keys must match the function argument names.
            </p>
            <pre><code class="js"></code></pre>
            <button class="btn btn-default btn-sm" type="button" onclick="example13()">Multiply</button>
          </li>
        </ol>
        <div id="spacer">
        </div>
      </div>
      <div class="col-md-6" id="source">
        <pre><code class="python"></code></pre>
      </div>
    </div>
    <div class="row" id="output">
    </div>
  </div>

  <script src="script.js"></script>
</body>
</html>
146	pype/aport/original/pipeline.py	Normal file
@@ -0,0 +1,146 @@
import os
import sys
import tempfile

import pico
# from pico.decorators import request_args, prehandle
from pico import PicoApp
from pico import client

from avalon import api as avalon
from avalon import io

import pyblish.api as pyblish

from pypeapp import execute
from pype import api as pype

# remove all Handlers created by pico
for name, handler in [(handler.get_name(), handler)
                      for handler in pype.Logger.logging.root.handlers[:]]:
    if "pype" not in str(name).lower():
        pype.Logger.logging.root.removeHandler(handler)

log = pype.Logger().get_logger(__name__, "aport")


SESSION = avalon.session
if not SESSION:
    io.install()


@pico.expose()
def publish(json_data_path, staging_dir=None):
    """
    Runs standalone pyblish and adds link to
    data in external json file

    It is necessary to run `register_plugin_path` if a particular
    host is needed

    Args:
        json_data_path (string): path to temp json file with
                                 context data
        staging_dir (string, optional): path to temp directory

    Returns:
        dict: return_json_path

    Raises:
        Exception: description

    """
    staging_dir = staging_dir \
        or tempfile.mkdtemp(prefix="pype_aport_")

    return_json_path = os.path.join(staging_dir, "return_data.json")

    log.debug("avalon.session is: \n{}".format(SESSION))
    pype_start = os.path.join(os.getenv('PYPE_ROOT'),
                              "app", "pype-start.py")

    args = [pype_start, "--publish",
            "-pp", os.environ["PUBLISH_PATH"],
            "-d", "rqst_json_data_path", json_data_path,
            "-d", "post_json_data_path", return_json_path
            ]

    log.debug(args)

    # start standalone pyblish qml
    execute([
        sys.executable, "-u"
    ] + args,
        cwd=os.getenv('AVALON_WORKDIR').replace("\\", "/")
    )

    return {"return_json_path": return_json_path}


@pico.expose()
def context(project, asset, task, app):
    # http://localhost:4242/pipeline/context?project=this&asset=shot01&task=comp

    os.environ["AVALON_PROJECT"] = project
    io.Session["AVALON_PROJECT"] = project

    avalon.update_current_task(task, asset, app)

    project_code = pype.get_project()["data"].get("code", '')

    os.environ["AVALON_PROJECTCODE"] = project_code
    io.Session["AVALON_PROJECTCODE"] = project_code

    hierarchy = pype.get_hierarchy()
    os.environ["AVALON_HIERARCHY"] = hierarchy
    io.Session["AVALON_HIERARCHY"] = hierarchy

    fix_paths = {k: v.replace("\\", "/") for k, v in SESSION.items()
                 if isinstance(v, str)}
    SESSION.update(fix_paths)
    SESSION.update({"AVALON_HIERARCHY": hierarchy,
                    "AVALON_PROJECTCODE": project_code,
                    "current_dir": os.getcwd().replace("\\", "/")
                    })

    return SESSION


@pico.expose()
def deregister_plugin_path():
    if os.getenv("PUBLISH_PATH", None):
        aport_plugin_path = [
            p.replace("\\", "/")
            for p in os.environ["PUBLISH_PATH"].split(os.pathsep)
            if "aport" in p][0]
        os.environ["PUBLISH_PATH"] = aport_plugin_path
    else:
        log.warning("deregister_plugin_path(): No PUBLISH_PATH is registered")

    return "Publish path deregistered"


@pico.expose()
def register_plugin_path(publish_path):
    deregister_plugin_path()
    if os.getenv("PUBLISH_PATH", None):
        os.environ["PUBLISH_PATH"] = os.pathsep.join(
            os.environ["PUBLISH_PATH"].split(os.pathsep) +
            [publish_path.replace("\\", "/")]
        )
    else:
        os.environ["PUBLISH_PATH"] = publish_path

    log.info(os.environ["PUBLISH_PATH"].split(os.pathsep))

    return "Publish registered paths: {}".format(
        os.environ["PUBLISH_PATH"].split(os.pathsep)
    )


@pico.expose()
def nuke_test():
    import nuke
    n = nuke.createNode("Constant")
    log.info(n)


app = PicoApp()
app.register_module(__name__)
45	pype/aport/original/templates.py	Normal file
@@ -0,0 +1,45 @@
from pype import api as pype
from pypeapp import Anatomy, config


log = pype.Logger().get_logger(__name__, "aport")


def get_anatomy(**kwarg):
    return Anatomy()


def get_dataflow(**kwarg):
    log.info(kwarg)
    host = kwarg.get("host", "aport")
    cls = kwarg.get("class", None)
    preset = kwarg.get("preset", None)
    assert any([host, cls]), log.error("aport.templates.get_dataflow(): "
                                       "Missing mandatory kwargs `host`, `cls`")

    presets = config.get_init_presets()
    aport_dataflow = getattr(presets["dataflow"], str(host), None)
    aport_dataflow_node = getattr(aport_dataflow.nodes, str(cls), None)
    if preset:
        aport_dataflow_node = getattr(aport_dataflow_node, str(preset), None)

    log.info("Dataflow: {}".format(aport_dataflow_node))
    return aport_dataflow_node


def get_colorspace(**kwarg):
    log.info(kwarg)
    host = kwarg.get("host", "aport")
    cls = kwarg.get("class", None)
    preset = kwarg.get("preset", None)
    assert any([host, cls]), log.error("aport.templates.get_colorspace(): "
                                       "Missing mandatory kwargs `host`, `cls`")

    presets = config.get_init_presets()
    aport_colorspace = getattr(presets["colorspace"], str(host), None)
    aport_colorspace_node = getattr(aport_colorspace, str(cls), None)
    if preset:
        aport_colorspace_node = getattr(
            aport_colorspace_node, str(preset), None)

    log.info("Colorspace: {}".format(aport_colorspace_node))
    return aport_colorspace_node
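`get_dataflow` and `get_colorspace` in templates.py both walk a preset tree with chained `getattr(obj, name, None)` calls, so a missing host/class/preset degrades to `None` instead of raising. A minimal sketch of that lookup pattern under assumed names (`resolve` and the sample tree are illustrative, not pype code):

```python
from types import SimpleNamespace


def resolve(root, *names):
    """Walk attribute names, returning None as soon as one is missing."""
    node = root
    for name in names:
        node = getattr(node, str(name), None)
        if node is None:
            break
    return node


# toy preset tree shaped like presets["dataflow"].<host>.nodes.<cls>.<preset>
presets = SimpleNamespace(
    aport=SimpleNamespace(
        nodes=SimpleNamespace(write=SimpleNamespace(exr="16bit"))
    )
)

print(resolve(presets, "aport", "nodes", "write", "exr"))    # 16bit
print(resolve(presets, "aport", "nodes", "missing", "exr"))  # None
```

The trade-off of this style is silence: a typo in a preset name yields `None` rather than an error, which is why the functions above log the resolved node before returning it.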
4862	pype/aport/static/build.js	Normal file
(File diff suppressed because it is too large)

149	pype/aport/static/index.html	Normal file
@@ -0,0 +1,149 @@
<!DOCTYPE html>
<html>

<head>
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <title>Pype extension</title>
  <!-- Load the pico Javascript client, always automatically available at /pico.js -->
  <script src="/pico.js"></script>
  <!-- Or load our module proxy -->
  <script src="/api.js"></script>

  <script>
    if (typeof module === 'object') {
      window.module = module;
      module = undefined;
    }
  </script>

  <script src="./build.js"></script>
  <script>
    if (window.module) module = window.module;
  </script>
  <!-- <link rel="stylesheet" href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.6/css/bootstrap.min.css" integrity="sha384-1q8mTJOASx8j1Au+a5WDVnPi2lkFfwwEAa8hDDdjZlpLegxhjVME1fgjWPGmkzs7" crossorigin="anonymous">

  <link rel="stylesheet" href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.6/css/bootstrap-theme.min.css" integrity="sha384-fLW2N01lMqjakBkx3l/M9EahuwpSfeNvV63J5ezn3uZzapT0u7EYsXMjQV+0En5r" crossorigin="anonymous">

  <link rel="stylesheet" href="//cdnjs.cloudflare.com/ajax/libs/highlight.js/9.6.0/styles/default.min.css">

  <script src="//cdnjs.cloudflare.com/ajax/libs/highlight.js/9.6.0/highlight.min.js"></script>
  <script></script> -->

  <style type="text/css">
    html,
    body {
      height: 100%;
      margin: 0px;
      padding: 0px;
    }

    div {
      padding: 5px;
    }

    #container {
      height: 100%;
    }

    #header {
      height: 5%;
    }

    #main {
      height: 70%;
    }

    #output {
      background-color: #333;
      color: #aaa;
      min-height: 15%;
      overflow-y: scroll;
      padding: 20px;
      position: fixed;
      bottom: 0px;
      width: 100%;
    }

    .error {
      color: #f00 !important;
    }

    #examples li {
      padding: 10px;
      margin: 10px;
      background-color: silver;
    }

    code {
      border-radius: 0;
      margin: 5px;
      white-space: pre !important;
    }

    #source {
      height: 100%;
    }

    #examples {
      height: 100%;
    }

    #spacer {
      height: 20%;
    }

    .highlight {
      background-color: yellow;
    }
  </style>
</head>

<body onresize="resizePanel()">
  <a href="javascript:history.go(0)">Refresh panel</a>
  <div id="container">
    <div class="row row-eq-height" id="main">
      <div class="col-md-6" id="examples">
        <ol>
          <li id="context">
            <h4>Set context here</h4>
            <pre><code class="js"></code></pre>
            Project<input type="text" name="project" value="jakub_projectx" />Asset<input type="text" name="asset" value="shot01" />task<input type="text" name="task" value="compositing" />app<input type="text" name="app" value="premiera" />
            <button class="btn btn-default btn-sm" type="button" onclick="context()">Set context</button>
          </li>
          <li id="deregister">
            <h4>deregister_plugin_path</h4>
            <pre><code class="js"></code></pre>
            <button class="btn btn-default btn-sm" type="button" onclick="deregister()">Deregister</button>
          </li>
          <li id="register">
            <h4>register_plugin_path</h4>
            <pre><code class="js"></code></pre>
            Path: <input type="text" name="path" value="C:/Users/hubertCODE/pype-setup/repos/pype-config/pype/plugins/premiere/publish" />
            <button class="btn btn-default btn-sm" type="button" onclick="register()">Register path</button>
          </li>
          <li id="publish">
            <h4>Publish</h4>
            <pre><code class="js"></code></pre>
            Json path: <input type="text" name="path" value="C:/Users/hubert/CODE/pype-setup/repos/pype-config/pype/premiere/example_publish_reqst.json" />
            Gui<input type="checkbox" name="gui" value="True" checked>
            <button class="btn btn-default btn-sm" type="button" onclick="publish()">Publish</button>
          </li>

        </ol>
        <div id="spacer">
        </div>
      </div>
      <div class="col-md-6" id="source">
        <!-- <pre>
          <code class="python"></code>
        </pre> -->
      </div>
    </div>
    <div class="row" id="output">
    </div>
  </div>

  <script src="script.js"></script>
</body>

</html>
214	pype/aport/static/script.js	Normal file
@@ -0,0 +1,214 @@
var api = pico.importModule('api');

var output = document.getElementById('output');

function querySelector(parent){
    return function(child){
        return document.querySelector(parent).querySelector(child)
    };
}

var defs = {}
function jumpTo(name){
    var e = defs[name];
    document.querySelectorAll('.highlight').forEach(function(el){
        el.classList.remove('highlight');
    });
    e.classList.add('highlight');
    return false;
}

function displayResult(r){
    output.classList.remove("error");
    output.innerText = JSON.stringify(r);
}

function displayError(e){
    output.classList.add("error");
    output.innerText = e.message;
}

function unindent(code){
    var lines = code.split('\n');
    var margin = -1;
    for(var j=0; j < lines.length; j++){
        var l = lines[j];
        for(i=0; i < l.length; i++){
            if(l[i] != " "){
                margin = i;
                break;
            }
        }
        if(margin > -1){
            break;
        }
    }
    lines = lines.slice(j);
    return lines.map(function(s){ return s.substr(margin)}).join('\n');
}

function deregister(){
    var $ = querySelector("#deregister");
    api.deregister_plugin_path().then(displayResult);
}

function register(){
    var $ = querySelector("#register");
    var path = $("input[name=path]").value;
    api.register_plugin_path(path).then(displayResult);
}


function publish(){
    var $ = querySelector("#publish");
    var path = $("input[name=path]").value;
    var gui = $("input[name=gui]").checked;
    api.publish(path, gui).then(displayResult);
}

function context(){
    var $ = querySelector("#context");
    var project = $("input[name=project]").value;
    var asset = $("input[name=asset]").value;
    var task = $("input[name=task]").value;
    var app = $("input[name=app]").value;
    api.context(project, asset, task, app).then(displayResult);
}


//
// function example1(){
//     var $ = querySelector("#example1");
//     var name = $("input[name=name]").value;
//     api.hello(name).then(displayResult);
// }
//
//
// function example2(){
//     var $ = querySelector("#example2");
//     var x = $("input[name=x]").valueAsNumber;
//     var y = $("#example2 input[name=y]").valueAsNumber;
//     api.multiply(x, y).then(displayResult);
// }
//
// function example3(){
//     var $ = querySelector("#example3");
//     var file = $("input[name=upload]").files[0];
//     api.upload(file, file.name).then(displayResult).catch(displayError);
// }
//
// function example4(){
//     var $ = querySelector("#example4");
//     api.my_ip().then(displayResult)
// }
//
// function example5(){
//     var $ = querySelector("#example5");
//     var username = $("input[name=username]").value;
//     var password = $("input[name=password]").value;
//     pico.setAuthentication(api, username, password);
//     api.current_user().then(displayResult).catch(displayError);
//     pico.clearAuthentication(api);
// }
//
// function example6(){
//     var $ = querySelector("#example6");
//     api.start_session().then(function(){
//         api.session_id().then(displayResult).then(function(){
//             api.end_session();
//         })
//     })
// }
//
// function example7(){
//     var $ = querySelector("#example7");
//     var session_id = "4242";
//     pico.setRequestHook(api, 'session', function(req) {
//         req.headers.set('X-SESSION-ID', session_id)
//     })
//     api.session_id2().then(displayResult)
//     pico.clearRequestHook(api, 'session');
// }
//
// function example8(){
//     var $ = querySelector("#example8");
//     api.countdown(10).each(displayResult).then(function(){
//         displayResult("Boom!");
//     });
// }
//
// function example9(){
//     var $ = querySelector("#example9");
//     var user = {
//         name: "Bob",
//         age: 30,
//         occupation: "Software Engineer",
//     }
//     api.user_description(user).then(displayResult);
// }
//
// function example10(){
//     var $ = querySelector("#example10");
//     api.fail().then(displayResult).catch(displayError);
// }
//
// function example11(){
//     var $ = querySelector("#example11");
//     api.make_coffee().then(displayResult).catch(displayError);
// }
//
//
// function example12(){
//     var $ = querySelector("#example12");
//     var form = $("form");
//     api.multiply.submitFormData(new FormData(form)).then(displayResult).catch(displayError);
// }
//
// function example13(){
//     var $ = querySelector("#example13");
//     var data = {
//         x: 6,
//         y: 7,
//     }
//     api.multiply.submitJSON(data).then(displayResult).catch(displayError);
// }


// api.show_source().then(function(s){
//     document.querySelector('#source code').innerText = s;
// }).then(ready);


function ready(){
    // // set the <code> element of each example to the corresponding functions source
    // document.querySelectorAll('li pre code.js').forEach(function(e){
    //     var id = e.parentElement.parentElement.id;
    //     var f = window[id];
    //     var code = f.toString().split('\n').slice(2, -1).join('\n');
    //     e.innerText = unindent(code);
    // })

    document.querySelectorAll('li pre code.html').forEach(function(e){
        var html = e.parentElement.parentElement.querySelector('div.example').innerHTML;
        e.innerText = unindent(html);
    })

    hljs.initHighlighting();

    // // find all the elements representing the function definitions in the python source
    // document.querySelectorAll('.python .hljs-function .hljs-title').forEach(function(e){
    //     var a = document.createElement('a');
    //     a.name = e.innerText;
    //     e.parentElement.insertBefore(a, e)
    //     return defs[e.innerText] = e.parentElement;
    // });

    // convert all 'api.X' strings to hyperlinks to jump to python source
    document.querySelectorAll('.js').forEach(function(e){
        var code = e.innerHTML;
        Object.keys(defs).forEach(function(k){
            code = code.replace('api.' + k + '(', '<a href="#' + k + '" onclick="jumpTo(\'' + k + '\')">api.' + k + '</a>(');
        })
        e.innerHTML = code;
    })
}
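The `unindent` helper in script.js measures the leading-space margin of the first non-blank line and strips that margin from every line from there on. The same algorithm in Python, for reference (a sketch mirroring the JS, not part of the repo):

```python
def unindent(code):
    """Strip the common left margin, measured from the first indented line."""
    lines = code.split("\n")
    margin, start = -1, 0
    for j, line in enumerate(lines):
        for i, ch in enumerate(line):
            if ch != " ":
                margin = i
                break
        if margin > -1:
            start = j  # drop everything before the first non-blank line
            break
    return "\n".join(line[margin:] for line in lines[start:])


print(unindent("    def f():\n        pass"))
```

Like the JS version, this only measures the first non-blank line; later lines that are indented less than the margin get truncated rather than preserved.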
5	pype/avalon_apps/__init__.py	Normal file
@@ -0,0 +1,5 @@
from .avalon_app import AvalonApps


def tray_init(tray_widget, main_widget):
    return AvalonApps(main_widget, tray_widget)
55	pype/avalon_apps/avalon_app.py	Normal file
@@ -0,0 +1,55 @@
import os
import argparse

from Qt import QtGui, QtWidgets
from avalon.tools import libraryloader
from pypeapp import Logger
from avalon import io
from launcher import launcher_widget, lib as launcher_lib


class AvalonApps:
    def __init__(self, main_parent=None, parent=None):
        self.log = Logger().get_logger(__name__)
        self.main_parent = main_parent
        self.parent = parent
        self.app_launcher = None

    # Definition of Tray menu
    def tray_menu(self, parent_menu=None):
        # Actions
        if parent_menu is None:
            if self.parent is None:
                self.log.warning('Parent menu is not set')
                return
            elif hasattr(self.parent, 'menu'):
                parent_menu = self.parent.menu
            else:
                self.log.warning('Parent menu is not set')
                return

        icon = QtGui.QIcon(launcher_lib.resource("icon", "main.png"))
        aShowLauncher = QtWidgets.QAction(icon, "&Launcher", parent_menu)
        aLibraryLoader = QtWidgets.QAction("Library", parent_menu)

        aShowLauncher.triggered.connect(self.show_launcher)
        aLibraryLoader.triggered.connect(self.show_library_loader)

        parent_menu.addAction(aShowLauncher)
        parent_menu.addAction(aLibraryLoader)

    def show_launcher(self):
        # if app_launcher doesn't exist create it, otherwise only show main window
        if self.app_launcher is None:
            root = os.path.realpath(os.environ["AVALON_PROJECTS"])
            io.install()
            APP_PATH = launcher_lib.resource("qml", "main.qml")
            self.app_launcher = launcher_widget.Launcher(root, APP_PATH)
        self.app_launcher.window.show()

    def show_library_loader(self):
        libraryloader.show(
            parent=self.main_parent,
            icon=self.parent.icon,
            show_projects=True,
            show_libraries=True
        )
9	pype/clockify/__init__.py	Normal file
@@ -0,0 +1,9 @@
from .clockify_api import ClockifyAPI
from .widget_settings import ClockifySettings
from .clockify import ClockifyModule

__all__ = [
    'ClockifyAPI',
    'ClockifySettings',
    'ClockifyModule'
]
119	pype/clockify/clockify.py	Normal file
@@ -0,0 +1,119 @@
import os
import threading
import time

from pypeapp import style
from Qt import QtWidgets
from pype.clockify import ClockifySettings, ClockifyAPI


class ClockifyModule:

    def __init__(self, main_parent=None, parent=None):
        self.main_parent = main_parent
        self.parent = parent
        self.clockapi = ClockifyAPI()
        self.widget_settings = ClockifySettings(main_parent, self)
        self.widget_settings_required = None

        self.thread_timer_check = None
        # Bools
        self.bool_thread_check_running = False
        self.bool_api_key_set = False
        self.bool_workspace_set = False
        self.bool_timer_run = False

    def start_up(self):
        self.clockapi.set_master(self)
        self.bool_api_key_set = self.clockapi.set_api()
        if self.bool_api_key_set is False:
            self.show_settings()
            return

        self.bool_workspace_set = self.clockapi.workspace_id is not None
        if self.bool_workspace_set is False:
            return

        self.start_timer_check()

        self.set_menu_visibility()

    def process_modules(self, modules):
        if 'FtrackModule' in modules:
            actions_path = os.path.sep.join([
                os.path.dirname(__file__),
                'ftrack_actions'
            ])
            current = os.environ.get('FTRACK_ACTIONS_PATH', '')
            if current:
                current += os.pathsep
            os.environ['FTRACK_ACTIONS_PATH'] = current + actions_path

        if 'AvalonApps' in modules:
            from launcher import lib
            actions_path = os.path.sep.join([
                os.path.dirname(__file__),
                'launcher_actions'
            ])
            current = os.environ.get('AVALON_ACTIONS', '')
            if current:
                current += os.pathsep
            os.environ['AVALON_ACTIONS'] = current + actions_path

    def start_timer_check(self):
        self.bool_thread_check_running = True
        if self.thread_timer_check is None:
            self.thread_timer_check = threading.Thread(
                target=self.check_running
            )
            self.thread_timer_check.daemon = True
            self.thread_timer_check.start()

    def stop_timer_check(self):
        # Clear the run flag first so the polling loop can exit,
        # otherwise join() would block forever.
        self.bool_thread_check_running = False
        if self.thread_timer_check is not None:
            self.thread_timer_check.join()
            self.thread_timer_check = None

    def check_running(self):
        while self.bool_thread_check_running is True:
            if self.clockapi.get_in_progress() is not None:
                self.bool_timer_run = True
            else:
                self.bool_timer_run = False
            self.set_menu_visibility()
            time.sleep(5)

    def stop_timer(self):
        self.clockapi.finish_time_entry()
        self.bool_timer_run = False

    # Definition of Tray menu
    def tray_menu(self, parent):
        # Menu for Tray App
        self.menu = QtWidgets.QMenu('Clockify', parent)
        self.menu.setProperty('submenu', 'on')
        self.menu.setStyleSheet(style.load_stylesheet())

        # Actions
        self.aShowSettings = QtWidgets.QAction(
            "Settings", self.menu
        )
        self.aStopTimer = QtWidgets.QAction(
            "Stop timer", self.menu
        )

        self.menu.addAction(self.aShowSettings)
        self.menu.addAction(self.aStopTimer)

        self.aShowSettings.triggered.connect(self.show_settings)
        self.aStopTimer.triggered.connect(self.stop_timer)

        self.set_menu_visibility()

        return self.menu

    def show_settings(self):
        self.widget_settings.input_api_key.setText(self.clockapi.get_api_key())
        self.widget_settings.show()

    def set_menu_visibility(self):
        self.aStopTimer.setVisible(self.bool_timer_run)
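The `start_timer_check` / `stop_timer_check` / `check_running` trio above is a plain daemon poll loop guarded by a boolean flag; the stop side must clear the flag before joining, or the join blocks forever. A minimal standalone sketch of the same pattern, with hypothetical names and no Qt or Clockify dependency:

```python
import threading
import time


class PollWorker:
    """Calls `check` every `interval` seconds until stopped."""

    def __init__(self, check, interval=0.01):
        self.check = check
        self.interval = interval
        self.running = False
        self.thread = None
        self.hits = 0

    def start(self):
        self.running = True
        if self.thread is None:
            self.thread = threading.Thread(target=self._loop)
            self.thread.daemon = True
            self.thread.start()

    def stop(self):
        # Clear the flag first, otherwise join() would block forever.
        self.running = False
        if self.thread is not None:
            self.thread.join()
            self.thread = None

    def _loop(self):
        while self.running:
            if self.check():
                self.hits += 1
            time.sleep(self.interval)


worker = PollWorker(lambda: True)
worker.start()
time.sleep(0.05)
worker.stop()
print(worker.hits > 0)  # → True
```

The daemon flag ensures the thread never keeps the process alive if the tray application exits without calling `stop`.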
434  pype/clockify/clockify_api.py  Normal file
@@ -0,0 +1,434 @@
import os
import requests
import json
import datetime
import appdirs


class Singleton(type):
    _instances = {}

    def __call__(cls, *args, **kwargs):
        if cls not in cls._instances:
            cls._instances[cls] = super(
                Singleton, cls
            ).__call__(*args, **kwargs)
        return cls._instances[cls]

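The `Singleton` metaclass above caches one instance per class, so every `ClockifyAPI()` call anywhere in the tray, the launcher actions, and the ftrack actions returns the same object (and therefore shares one set of headers and one workspace id). A quick demonstration of the pattern with an illustrative class name:

```python
class Singleton(type):
    _instances = {}

    def __call__(cls, *args, **kwargs):
        # Create the instance on the first call only; reuse it afterwards.
        if cls not in cls._instances:
            cls._instances[cls] = super(Singleton, cls).__call__(*args, **kwargs)
        return cls._instances[cls]


class Config(metaclass=Singleton):
    def __init__(self, value=None):
        self.value = value


a = Config(value=1)
b = Config(value=2)   # arguments of later calls are ignored
print(a is b)         # → True
print(a.value)        # → 1
```

Note that constructor arguments passed on later calls are silently ignored, which is why `ClockifyAPI` keeps its state in class attributes rather than in `__init__`.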
class ClockifyAPI(metaclass=Singleton):
    endpoint = "https://api.clockify.me/api/"
    headers = {"X-Api-Key": None}
    app_dir = os.path.normpath(appdirs.user_data_dir('pype-app', 'pype'))
    file_name = 'clockify.json'
    fpath = os.path.join(app_dir, file_name)
    master_parent = None
    workspace_id = None

    def set_master(self, master_parent):
        self.master_parent = master_parent

    def verify_api(self):
        for key, value in self.headers.items():
            if value is None or value.strip() == '':
                return False
        return True

    def set_api(self, api_key=None):
        if api_key is None:
            api_key = self.get_api_key()

        if api_key is not None and self.validate_api_key(api_key) is True:
            self.headers["X-Api-Key"] = api_key
            self.set_workspace()
            return True
        return False

    def validate_api_key(self, api_key):
        test_headers = {'X-Api-Key': api_key}
        action_url = 'workspaces/'
        response = requests.get(
            self.endpoint + action_url,
            headers=test_headers
        )
        if response.status_code != 200:
            return False
        return True

    def validate_workspace_perm(self):
        test_project = '__test__'
        action_url = 'workspaces/{}/projects/'.format(self.workspace_id)
        body = {
            "name": test_project, "clientId": "", "isPublic": "false",
            "estimate": {"type": "AUTO"},
            "color": "#f44336", "billable": "true"
        }
        response = requests.post(
            self.endpoint + action_url,
            headers=self.headers, json=body
        )
        if response.status_code == 201:
            self.delete_project(self.get_project_id(test_project))
            return True
        else:
            projects = self.get_projects()
            if test_project in projects:
                try:
                    self.delete_project(self.get_project_id(test_project))
                    return True
                except json.decoder.JSONDecodeError:
                    return False
            return False

    def set_workspace(self, name=None):
        if name is None:
            name = os.environ.get('CLOCKIFY_WORKSPACE', None)
        self.workspace = name
        self.workspace_id = None
        if self.workspace is None:
            return
        try:
            result = self.validate_workspace()
        except Exception:
            result = False
        if result is not False:
            self.workspace_id = result
            if self.master_parent is not None:
                self.master_parent.start_timer_check()
            return True
        return False

    def validate_workspace(self, name=None):
        if name is None:
            name = self.workspace
        all_workspaces = self.get_workspaces()
        if name in all_workspaces:
            return all_workspaces[name]
        return False

    def get_api_key(self):
        api_key = None
        try:
            with open(self.fpath, 'r') as file:
                api_key = json.load(file).get('api_key', None)
            if api_key == '':
                api_key = None
        except Exception:
            # Create an empty file so the next read does not fail on open
            with open(self.fpath, 'w'):
                pass
        return api_key

    def save_api_key(self, api_key):
        data = {'api_key': api_key}
        with open(self.fpath, 'w') as file:
            file.write(json.dumps(data))

    def get_workspaces(self):
        action_url = 'workspaces/'
        response = requests.get(
            self.endpoint + action_url,
            headers=self.headers
        )
        return {
            workspace["name"]: workspace["id"] for workspace in response.json()
        }

    def get_projects(self, workspace_id=None):
        if workspace_id is None:
            workspace_id = self.workspace_id
        action_url = 'workspaces/{}/projects/'.format(workspace_id)
        response = requests.get(
            self.endpoint + action_url,
            headers=self.headers
        )

        return {
            project["name"]: project["id"] for project in response.json()
        }

    def get_tags(self, workspace_id=None):
        if workspace_id is None:
            workspace_id = self.workspace_id
        action_url = 'workspaces/{}/tags/'.format(workspace_id)
        response = requests.get(
            self.endpoint + action_url,
            headers=self.headers
        )

        return {
            tag["name"]: tag["id"] for tag in response.json()
        }

    def get_tasks(self, project_id, workspace_id=None):
        if workspace_id is None:
            workspace_id = self.workspace_id
        action_url = 'workspaces/{}/projects/{}/tasks/'.format(
            workspace_id, project_id
        )
        response = requests.get(
            self.endpoint + action_url,
            headers=self.headers
        )

        return {
            task["name"]: task["id"] for task in response.json()
        }

    def get_workspace_id(self, workspace_name):
        all_workspaces = self.get_workspaces()
        if workspace_name not in all_workspaces:
            return None
        return all_workspaces[workspace_name]

    def get_project_id(self, project_name, workspace_id=None):
        if workspace_id is None:
            workspace_id = self.workspace_id
        all_projects = self.get_projects(workspace_id)
        if project_name not in all_projects:
            return None
        return all_projects[project_name]

    def get_tag_id(self, tag_name, workspace_id=None):
        if workspace_id is None:
            workspace_id = self.workspace_id
        all_tags = self.get_tags(workspace_id)
        if tag_name not in all_tags:
            return None
        return all_tags[tag_name]

    def get_task_id(
        self, task_name, project_id, workspace_id=None
    ):
        if workspace_id is None:
            workspace_id = self.workspace_id
        all_tasks = self.get_tasks(
            project_id, workspace_id
        )
        if task_name not in all_tasks:
            return None
        return all_tasks[task_name]

    def get_current_time(self):
        return datetime.datetime.utcnow().isoformat() + 'Z'

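`get_current_time` above builds an ISO-8601 UTC timestamp with a trailing `Z`, the form this module sends in the `start`/`end` fields of time-entry requests. A standalone sketch of the same helper (hypothetical function name):

```python
import datetime
import re


def current_utc_iso():
    # e.g. '2019-03-05T12:34:56.789012Z'
    return datetime.datetime.utcnow().isoformat() + 'Z'


stamp = current_utc_iso()
print(stamp.endswith('Z'))  # → True
print(re.match(r'\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}', stamp) is not None)  # → True
```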
    def start_time_entry(
        self, description, project_id, task_id=None, tag_ids=None,
        workspace_id=None, billable=True
    ):
        # Avoid a mutable default argument
        if tag_ids is None:
            tag_ids = []
        # Workspace
        if workspace_id is None:
            workspace_id = self.workspace_id

        # Check if a time entry with the same values is already running
        current = self.get_in_progress(workspace_id)
        if current is not None:
            if (
                current.get("description", None) == description and
                current.get("projectId", None) == project_id and
                current.get("taskId", None) == task_id
            ):
                self.bool_timer_run = True
                return self.bool_timer_run
            self.finish_time_entry(workspace_id)

        # Convert billable to a string
        if billable:
            billable = 'true'
        else:
            billable = 'false'
        # REST API action
        action_url = 'workspaces/{}/timeEntries/'.format(workspace_id)
        start = self.get_current_time()
        body = {
            "start": start,
            "billable": billable,
            "description": description,
            "projectId": project_id,
            "taskId": task_id,
            "tagIds": tag_ids
        }
        response = requests.post(
            self.endpoint + action_url,
            headers=self.headers,
            json=body
        )

        success = False
        if response.status_code < 300:
            success = True
        return success

    def get_in_progress(self, workspace_id=None):
        if workspace_id is None:
            workspace_id = self.workspace_id
        action_url = 'workspaces/{}/timeEntries/inProgress'.format(
            workspace_id
        )
        response = requests.get(
            self.endpoint + action_url,
            headers=self.headers
        )
        try:
            output = response.json()
        except json.decoder.JSONDecodeError:
            output = None
        return output

    def finish_time_entry(self, workspace_id=None):
        if workspace_id is None:
            workspace_id = self.workspace_id
        current = self.get_in_progress(workspace_id)
        if current is None:
            # No time entry is in progress, nothing to finish
            return
        current_id = current["id"]
        action_url = 'workspaces/{}/timeEntries/{}'.format(
            workspace_id, current_id
        )
        body = {
            "start": current["timeInterval"]["start"],
            "billable": current["billable"],
            "description": current["description"],
            "projectId": current["projectId"],
            "taskId": current["taskId"],
            "tagIds": current["tagIds"],
            "end": self.get_current_time()
        }
        response = requests.put(
            self.endpoint + action_url,
            headers=self.headers,
            json=body
        )
        return response.json()

    def get_time_entries(
        self, workspace_id=None, quantity=10
    ):
        if workspace_id is None:
            workspace_id = self.workspace_id
        action_url = 'workspaces/{}/timeEntries/'.format(workspace_id)
        response = requests.get(
            self.endpoint + action_url,
            headers=self.headers
        )
        return response.json()[:quantity]

    def remove_time_entry(self, tid, workspace_id=None):
        if workspace_id is None:
            workspace_id = self.workspace_id
        action_url = 'workspaces/{}/timeEntries/{}'.format(
            workspace_id, tid
        )
        response = requests.delete(
            self.endpoint + action_url,
            headers=self.headers
        )
        return response.json()

    def add_project(self, name, workspace_id=None):
        if workspace_id is None:
            workspace_id = self.workspace_id
        action_url = 'workspaces/{}/projects/'.format(workspace_id)
        body = {
            "name": name,
            "clientId": "",
            "isPublic": "false",
            "estimate": {
                # "estimate": "3600",
                "type": "AUTO"
            },
            "color": "#f44336",
            "billable": "true"
        }
        response = requests.post(
            self.endpoint + action_url,
            headers=self.headers,
            json=body
        )
        return response.json()

    def add_workspace(self, name):
        action_url = 'workspaces/'
        body = {"name": name}
        response = requests.post(
            self.endpoint + action_url,
            headers=self.headers,
            json=body
        )
        return response.json()

    def add_task(
        self, name, project_id, workspace_id=None
    ):
        if workspace_id is None:
            workspace_id = self.workspace_id
        action_url = 'workspaces/{}/projects/{}/tasks/'.format(
            workspace_id, project_id
        )
        body = {
            "name": name,
            "projectId": project_id
        }
        response = requests.post(
            self.endpoint + action_url,
            headers=self.headers,
            json=body
        )
        return response.json()

    def add_tag(self, name, workspace_id=None):
        if workspace_id is None:
            workspace_id = self.workspace_id
        action_url = 'workspaces/{}/tags'.format(workspace_id)
        body = {
            "name": name
        }
        response = requests.post(
            self.endpoint + action_url,
            headers=self.headers,
            json=body
        )
        return response.json()

    def delete_project(
        self, project_id, workspace_id=None
    ):
        if workspace_id is None:
            workspace_id = self.workspace_id
        # Keep the URL relative; the endpoint already ends with '/'
        action_url = 'workspaces/{}/projects/{}'.format(
            workspace_id, project_id
        )
        response = requests.delete(
            self.endpoint + action_url,
            headers=self.headers,
        )
        return response.json()

    def convert_input(
        self, entity_id, entity_name, mode='Workspace', project_id=None
    ):
        if entity_id is None:
            error = False
            error_msg = 'Missing information "{}"'
            if mode.lower() == 'workspace':
                if entity_id is None and entity_name is None:
                    if self.workspace_id is not None:
                        entity_id = self.workspace_id
                    else:
                        error = True
                else:
                    entity_id = self.get_workspace_id(entity_name)
            else:
                if entity_id is None and entity_name is None:
                    error = True
                elif mode.lower() == 'project':
                    entity_id = self.get_project_id(entity_name)
                elif mode.lower() == 'task':
                    entity_id = self.get_task_id(
                        task_name=entity_name, project_id=project_id
                    )
                else:
                    raise TypeError('Unknown type')
            # Raise error
            if error:
                raise ValueError(error_msg.format(mode))

        return entity_id
108  pype/clockify/ftrack_actions/action_clockify_start.py  Normal file
@@ -0,0 +1,108 @@
import os
import sys
import argparse
import logging

from pype.vendor import ftrack_api
from pype.ftrack import BaseAction
from pype.clockify import ClockifyAPI


class StartClockify(BaseAction):
    '''Starts timer on Clockify.'''

    #: Action identifier.
    identifier = 'clockify.start.timer'
    #: Action label.
    label = 'Start timer'
    #: Action description.
    description = 'Starts timer on Clockify'
    #: Action icon.
    icon = '{}/app_icons/clockify.png'.format(
        os.environ.get('PYPE_STATICS_SERVER', '')
    )
    #: Clockify API
    clockapi = ClockifyAPI()

    def discover(self, session, entities, event):
        if len(entities) != 1:
            return False
        if entities[0].entity_type.lower() != 'task':
            return False
        if self.clockapi.workspace_id is None:
            return False
        return True

    def launch(self, session, entities, event):
        task = entities[0]
        task_name = task['type']['name']
        project_name = task['project']['full_name']

        def get_parents(entity):
            output = []
            if entity.entity_type.lower() == 'project':
                return output
            output.extend(get_parents(entity['parent']))
            output.append(entity['name'])

            return output

        desc_items = get_parents(task['parent'])
        desc_items.append(task['name'])
        description = '/'.join(desc_items)
        project_id = self.clockapi.get_project_id(project_name)
        tag_ids = []
        tag_ids.append(self.clockapi.get_tag_id(task_name))
        self.clockapi.start_time_entry(
            description, project_id, tag_ids=tag_ids
        )

        return True


def register(session, **kw):
    '''Register plugin. Called when used as a plugin.'''

    if not isinstance(session, ftrack_api.session.Session):
        return

    StartClockify(session).register()


def main(arguments=None):
    '''Set up logging and register action.'''
    if arguments is None:
        arguments = []

    parser = argparse.ArgumentParser()
    # Allow setting of logging level from arguments.
    loggingLevels = {}
    for level in (
        logging.NOTSET, logging.DEBUG, logging.INFO, logging.WARNING,
        logging.ERROR, logging.CRITICAL
    ):
        loggingLevels[logging.getLevelName(level).lower()] = level

    parser.add_argument(
        '-v', '--verbosity',
        help='Set the logging output verbosity.',
        choices=loggingLevels.keys(),
        default='info'
    )
    namespace = parser.parse_args(arguments)

    # Set up basic logging
    logging.basicConfig(level=loggingLevels[namespace.verbosity])

    session = ftrack_api.Session()
    register(session)

    # Wait for events
    logging.info(
        'Registered actions and listening for events. Use Ctrl-C to abort.'
    )
    session.event_hub.wait()


if __name__ == '__main__':
    raise SystemExit(main(sys.argv[1:]))
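The nested `get_parents` helper in the action above walks up the ftrack entity hierarchy recursively, collecting ancestor names (excluding the project) oldest-first, so the time-entry description becomes a `/`-joined path. The same recursion can be sketched with plain dicts standing in for ftrack entities (names and structure here are illustrative only):

```python
def get_parents(entity):
    """Collect ancestor names up to, but excluding, the project, oldest first."""
    if entity['entity_type'].lower() == 'project':
        return []
    return get_parents(entity['parent']) + [entity['name']]


shot = {
    'entity_type': 'Shot', 'name': 'sh010',
    'parent': {
        'entity_type': 'Sequence', 'name': 'sq01',
        'parent': {'entity_type': 'Project', 'name': 'myproj', 'parent': None},
    },
}

print(get_parents(shot))                              # → ['sq01', 'sh010']
print('/'.join(get_parents(shot) + ['animation']))    # → sq01/sh010/animation
```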
151  pype/clockify/ftrack_actions/action_clockify_sync.py  Normal file
@@ -0,0 +1,151 @@
import os
import sys
import argparse
import logging
import json
from pype.vendor import ftrack_api
from pype.ftrack import BaseAction, MissingPermision
from pype.clockify import ClockifyAPI


class SyncClockify(BaseAction):
    '''Synchronise project names and task types.'''

    #: Action identifier.
    identifier = 'clockify.sync'
    #: Action label.
    label = 'Sync To Clockify'
    #: Action description.
    description = 'Synchronise data to Clockify workspace'
    #: Priority.
    priority = 100
    #: Roles that are allowed to register this action.
    role_list = ['Pypeclub', 'Administrator']
    #: Icon.
    icon = '{}/app_icons/clockify-white.png'.format(
        os.environ.get('PYPE_STATICS_SERVER', '')
    )
    #: Clockify API
    clockapi = ClockifyAPI()

    def register(self):
        if self.clockapi.workspace_id is None:
            raise ValueError('Clockify Workspace or API key are not set!')

        if self.clockapi.validate_workspace_perm() is False:
            raise MissingPermision('Clockify')
        super().register()

    def discover(self, session, entities, event):
        ''' Validation '''
        return True

    def launch(self, session, entities, event):
        # JOB SETTINGS
        userId = event['source']['user']['id']
        user = session.query('User where id is ' + userId).one()

        job = session.create('Job', {
            'user': user,
            'status': 'running',
            'data': json.dumps({
                'description': 'Sync Ftrack to Clockify'
            })
        })
        session.commit()
        try:
            entity = entities[0]

            if entity.entity_type.lower() == 'project':
                project = entity
            else:
                project = entity['project']
            project_name = project['full_name']

            task_types = []
            for task_type in project['project_schema']['_task_type_schema'][
                'types'
            ]:
                task_types.append(task_type['name'])

            clockify_projects = self.clockapi.get_projects()

            if project_name not in clockify_projects:
                response = self.clockapi.add_project(project_name)
                if 'id' not in response:
                    self.log.error('Project {} can\'t be created'.format(
                        project_name
                    ))
                    return {
                        'success': False,
                        'message': 'Can\'t create project, unexpected error'
                    }
                project_id = response['id']
            else:
                project_id = clockify_projects[project_name]

            clockify_workspace_tags = self.clockapi.get_tags()
            for task_type in task_types:
                if task_type not in clockify_workspace_tags:
                    response = self.clockapi.add_tag(task_type)
                    if 'id' not in response:
                        self.log.error('Tag {} can\'t be created'.format(
                            task_type
                        ))
                        continue
        except Exception:
            job['status'] = 'failed'
            session.commit()
            return False

        job['status'] = 'done'
        session.commit()
        return True


def register(session, **kw):
    '''Register plugin. Called when used as a plugin.'''

    if not isinstance(session, ftrack_api.session.Session):
        return

    SyncClockify(session).register()


def main(arguments=None):
    '''Set up logging and register action.'''
    if arguments is None:
        arguments = []

    parser = argparse.ArgumentParser()
    # Allow setting of logging level from arguments.
    loggingLevels = {}
    for level in (
        logging.NOTSET, logging.DEBUG, logging.INFO, logging.WARNING,
        logging.ERROR, logging.CRITICAL
    ):
        loggingLevels[logging.getLevelName(level).lower()] = level

    parser.add_argument(
        '-v', '--verbosity',
        help='Set the logging output verbosity.',
        choices=loggingLevels.keys(),
        default='info'
    )
    namespace = parser.parse_args(arguments)

    # Set up basic logging
    logging.basicConfig(level=loggingLevels[namespace.verbosity])

    session = ftrack_api.Session()
    register(session)

    # Wait for events
    logging.info(
        'Registered actions and listening for events. Use Ctrl-C to abort.'
    )
    session.event_hub.wait()


if __name__ == '__main__':
    raise SystemExit(main(sys.argv[1:]))
44  pype/clockify/launcher_actions/ClockifyStart.py  Normal file
@@ -0,0 +1,44 @@
from avalon import api, io
from pype.api import Logger
from pype.clockify import ClockifyAPI


log = Logger().get_logger(__name__, "clockify_start")


class ClockifyStart(api.Action):

    name = "clockify_start_timer"
    label = "Clockify - Start Timer"
    icon = "clockify_icon"
    order = 500
    clockapi = ClockifyAPI()

    def is_compatible(self, session):
        """Return whether the action is compatible with the session"""
        if "AVALON_TASK" in session:
            return True
        return False

    def process(self, session, **kwargs):
        project_name = session['AVALON_PROJECT']
        asset_name = session['AVALON_ASSET']
        task_name = session['AVALON_TASK']

        description = asset_name
        asset = io.find_one({
            'type': 'asset',
            'name': asset_name
        })
        if asset is not None:
            desc_items = asset.get('data', {}).get('parents', [])
            desc_items.append(asset_name)
            desc_items.append(task_name)
            description = '/'.join(desc_items)

        project_id = self.clockapi.get_project_id(project_name)
        tag_ids = []
        tag_ids.append(self.clockapi.get_tag_id(task_name))
        self.clockapi.start_time_entry(
            description, project_id, tag_ids=tag_ids
        )
57  pype/clockify/launcher_actions/ClockifySync.py  Normal file
@@ -0,0 +1,57 @@
from avalon import api, io
from pype.clockify import ClockifyAPI
from pype.api import Logger
log = Logger().get_logger(__name__, "clockify_sync")


class ClockifySync(api.Action):

    name = "sync_to_clockify"
    label = "Sync to Clockify"
    icon = "clockify_white_icon"
    order = 500
    clockapi = ClockifyAPI()
    have_permissions = clockapi.validate_workspace_perm()

    def is_compatible(self, session):
        """Return whether the action is compatible with the session"""
        return self.have_permissions

    def process(self, session, **kwargs):
        project_name = session.get('AVALON_PROJECT', None)

        projects_to_sync = []
        # Check for None first; calling .strip() on None would raise
        if project_name is None or project_name.strip() == '':
            for project in io.projects():
                projects_to_sync.append(project)
        else:
            project = io.find_one({'type': 'project'})
            projects_to_sync.append(project)

        projects_info = {}
        for project in projects_to_sync:
            task_types = [task['name'] for task in project['config']['tasks']]
            projects_info[project['name']] = task_types

        clockify_projects = self.clockapi.get_projects()
        for project_name, task_types in projects_info.items():
            if project_name not in clockify_projects:
                response = self.clockapi.add_project(project_name)
                if 'id' not in response:
                    self.log.error('Project {} can\'t be created'.format(
                        project_name
                    ))
                    continue
                project_id = response['id']
            else:
                project_id = clockify_projects[project_name]

            clockify_workspace_tags = self.clockapi.get_tags()
            for task_type in task_types:
                if task_type not in clockify_workspace_tags:
                    response = self.clockapi.add_tag(task_type)
                    if 'id' not in response:
                        self.log.error('Tag {} can\'t be created'.format(
                            task_type
                        ))
                        continue
155  pype/clockify/widget_settings.py  Normal file
@@ -0,0 +1,155 @@
import os
from Qt import QtCore, QtGui, QtWidgets
from pypeapp import style


class ClockifySettings(QtWidgets.QWidget):

    SIZE_W = 300
    SIZE_H = 130

    loginSignal = QtCore.Signal(object, object, object)

    def __init__(self, main_parent=None, parent=None, optional=True):

        super(ClockifySettings, self).__init__()

        self.parent = parent
        self.main_parent = main_parent
        self.clockapi = parent.clockapi
        self.optional = optional
        self.validated = False

        # Icon
        if hasattr(parent, 'icon'):
            self.setWindowIcon(self.parent.icon)
        elif hasattr(parent, 'parent') and hasattr(parent.parent, 'icon'):
            self.setWindowIcon(self.parent.parent.icon)
        else:
            pype_setup = os.getenv('PYPE_ROOT')
            items = [pype_setup, "app", "resources", "icon.png"]
            fname = os.path.sep.join(items)
            icon = QtGui.QIcon(fname)
            self.setWindowIcon(icon)

        self.setWindowFlags(
            QtCore.Qt.WindowCloseButtonHint |
            QtCore.Qt.WindowMinimizeButtonHint
        )

        self._translate = QtCore.QCoreApplication.translate

        # Font
        self.font = QtGui.QFont()
        self.font.setFamily("DejaVu Sans Condensed")
        self.font.setPointSize(9)
        self.font.setBold(True)
        self.font.setWeight(50)
        self.font.setKerning(True)

        # Size setting
        self.resize(self.SIZE_W, self.SIZE_H)
        self.setMinimumSize(QtCore.QSize(self.SIZE_W, self.SIZE_H))
        self.setMaximumSize(QtCore.QSize(self.SIZE_W + 100, self.SIZE_H + 100))
        self.setStyleSheet(style.load_stylesheet())

        self.setLayout(self._main())
        self.setWindowTitle('Clockify settings')

    def _main(self):
        self.main = QtWidgets.QVBoxLayout()
        self.main.setObjectName("main")

        self.form = QtWidgets.QFormLayout()
        self.form.setContentsMargins(10, 15, 10, 5)
        self.form.setObjectName("form")

        self.label_api_key = QtWidgets.QLabel("Clockify API key:")
        self.label_api_key.setFont(self.font)
        self.label_api_key.setCursor(QtGui.QCursor(QtCore.Qt.ArrowCursor))
        self.label_api_key.setTextFormat(QtCore.Qt.RichText)
        self.label_api_key.setObjectName("label_api_key")

        self.input_api_key = QtWidgets.QLineEdit()
        self.input_api_key.setEnabled(True)
        self.input_api_key.setFrame(True)
        self.input_api_key.setObjectName("input_api_key")
        self.input_api_key.setPlaceholderText(
            self._translate("main", "e.g. XX1XxXX2x3x4xXxx")
        )

        self.error_label = QtWidgets.QLabel("")
        self.error_label.setFont(self.font)
        self.error_label.setTextFormat(QtCore.Qt.RichText)
        self.error_label.setObjectName("error_label")
        self.error_label.setWordWrap(True)
        self.error_label.hide()

        self.form.addRow(self.label_api_key, self.input_api_key)
        self.form.addRow(self.error_label)

        self.btn_group = QtWidgets.QHBoxLayout()
        self.btn_group.addStretch(1)
        self.btn_group.setObjectName("btn_group")

        self.btn_ok = QtWidgets.QPushButton("Ok")
        self.btn_ok.setToolTip(
            'Sets the Clockify API key so timers can be started/stopped'
        )
        self.btn_ok.clicked.connect(self.click_ok)

        self.btn_cancel = QtWidgets.QPushButton("Cancel")
        cancel_tooltip = 'Application won\'t start'
        if self.optional:
            cancel_tooltip = 'Close this window'
        self.btn_cancel.setToolTip(cancel_tooltip)
        self.btn_cancel.clicked.connect(self._close_widget)

        self.btn_group.addWidget(self.btn_ok)
        self.btn_group.addWidget(self.btn_cancel)

        self.main.addLayout(self.form)
        self.main.addLayout(self.btn_group)

        return self.main

    def setError(self, msg):
        self.error_label.setText(msg)
        self.error_label.show()

    def invalid_input(self, entity):
        entity.setStyleSheet("border: 1px solid red;")

    def click_ok(self):
        api_key = self.input_api_key.text().strip()
        if self.optional is True and api_key == '':
            self.clockapi.save_api_key(None)
            self.clockapi.set_api(api_key)
            self.validated = False
            self._close_widget()
            return

        validation = self.clockapi.validate_api_key(api_key)

        if validation:
            self.clockapi.save_api_key(api_key)
            self.clockapi.set_api(api_key)
            self.validated = True
            self._close_widget()
        else:
            self.invalid_input(self.input_api_key)
            self.validated = False
            self.setError(
                "Entered API key is invalid"
            )

    def closeEvent(self, event):
        if self.optional is True:
            event.ignore()
            self._close_widget()
        else:
            self.validated = False

    def _close_widget(self):
        if self.optional is True:
            self.hide()
        else:
            self.close()
||||
|
|
@@ -1 +1,2 @@
from .lib import *
from .ftrack_server import *

@@ -1,43 +0,0 @@
import os
import logging
import toml
import ftrack_api
from ftrack_action_handler import AppAction
from avalon import io, lib


def register(session):

    # TODO AVALON_PROJECT, AVALON_ASSET, AVALON_SILO need to be set or debug from avalon

    # Get all projects from Avalon DB
    io.install()
    projects = sorted(io.projects(), key=lambda x: x['name'])
    io.uninstall()

    apps = []
    actions = []

    # Get all application from all projects
    for project in projects:
        os.environ['AVALON_PROJECT'] = project['name']
        for app in project['config']['apps']:
            if app not in apps:
                apps.append(app)

    for app in apps:
        name = app['name'].split("_")[0]
        variant = app['name'].split("_")[1]
        label = app['label']
        executable = toml.load(lib.which_app(app['name']))['executable']
        icon = None
        # TODO get right icons
        if 'nuke' in app['name']:
            icon = "https://mbtskoudsalg.com/images/nuke-icon-png-2.png"
            label = "Nuke"
        elif 'maya' in app['name']:
            icon = "http://icons.iconarchive.com/icons/froyoshark/enkel/256/Maya-icon.png"
            label = "Autodesk Maya"

        # register action
        AppAction(session, label, name, executable, variant, icon).register()

82 pype/ftrack/actions/action_application_loader.py (Normal file)
@@ -0,0 +1,82 @@
import os
import toml
import time
from pype.ftrack import AppAction
from avalon import lib
from pypeapp import Logger
from pype.lib import get_all_avalon_projects

log = Logger().get_logger(__name__)


def registerApp(app, session):
    name = app['name']
    variant = ""
    try:
        variant = app['name'].split("_")[1]
    except Exception:
        pass

    abspath = lib.which_app(app['name'])
    if abspath is None:
        log.error(
            "'{0}' - App don't have config toml file".format(app['name'])
        )
        return

    apptoml = toml.load(abspath)

    ''' REQUIRED '''
    executable = apptoml['executable']

    ''' OPTIONAL '''
    label = apptoml.get('ftrack_label', app.get('label', name))
    icon = apptoml.get('ftrack_icon', None)
    description = apptoml.get('description', None)
    preactions = apptoml.get('preactions', [])

    if icon:
        icon = icon.format(os.environ.get('PYPE_STATICS_SERVER', ''))

    # register action
    AppAction(
        session, label, name, executable, variant,
        icon, description, preactions
    ).register()

    if not variant:
        log.info('- Variant is not set')


def register(session):
    # WARNING getting projects only helps to check connection to mongo
    # - without will `discover` of ftrack apps actions take ages
    result = get_all_avalon_projects()

    apps = []

    launchers_path = os.path.join(os.environ["PYPE_CONFIG"], "launchers")
    for file in os.listdir(launchers_path):
        filename, ext = os.path.splitext(file)
        if ext.lower() != ".toml":
            continue
        loaded_data = toml.load(os.path.join(launchers_path, file))
        app_data = {
            "name": filename,
            "label": loaded_data.get("label", filename)
        }
        apps.append(app_data)

    apps = sorted(apps, key=lambda x: x['name'])
    app_counter = 0
    for app in apps:
        try:
            registerApp(app, session)
            if app_counter % 5 == 0:
                time.sleep(0.1)
            app_counter += 1
        except Exception as exc:
            log.exception(
                "\"{}\" - not a proper App ({})".format(app['name'], str(exc)),
                exc_info=True
            )

@@ -1,9 +1,8 @@
import sys
import argparse
import logging
import getpass
import ftrack_api
from ftrack_action_handler import BaseAction
from pype.vendor import ftrack_api
from pype.ftrack import BaseAction


class AssetDelete(BaseAction):

@@ -14,17 +13,17 @@ class AssetDelete(BaseAction):
    #: Action label.
    label = 'Asset Delete'

    def discover(self, session, entities, event):
        ''' Validation '''

        if (len(entities) != 1 or entities[0].entity_type
                not in ['Shot', 'Asset Build']):

        if (
            len(entities) != 1 or
            entities[0].entity_type not in ['Shot', 'Asset Build']
        ):
            return False

        return True

    def interface(self, session, entities, event):

        if not event['data'].get('values', {}):

@@ -38,10 +37,10 @@ class AssetDelete(BaseAction):
            label = asset['name']

            items.append({
                'label':label,
                'name':label,
                'value':False,
                'type':'boolean'
                'label': label,
                'name': label,
                'value': False,
                'type': 'boolean'
            })

        if len(items) < 1:

@@ -69,7 +68,7 @@ class AssetDelete(BaseAction):
            session.delete(asset)
            try:
                session.commit()
            except:
            except Exception:
                session.rollback()
                raise

@@ -88,8 +87,7 @@ def register(session, **kw):
    if not isinstance(session, ftrack_api.session.Session):
        return

    action_handler = AssetDelete(session)
    action_handler.register()
    AssetDelete(session).register()


def main(arguments=None):

285 pype/ftrack/actions/action_attributes_remapper.py (Normal file)
@@ -0,0 +1,285 @@
import os

from pype.vendor import ftrack_api
from pype.ftrack import BaseAction
from avalon.tools.libraryloader.io_nonsingleton import DbConnector


class AttributesRemapper(BaseAction):
    '''Edit meta data action.'''

    #: Action identifier.
    identifier = 'attributes.remapper'
    #: Action label.
    label = 'Attributes Remapper'
    #: Action description.
    description = 'Remaps attributes in avalon DB'

    #: roles that are allowed to register this action
    role_list = ["Pypeclub", "Administrator"]
    icon = '{}/ftrack/action_icons/AttributesRemapper.svg'.format(
        os.environ.get('PYPE_STATICS_SERVER', '')
    )

    db_con = DbConnector()
    keys_to_change = {
        "fstart": "frameStart",
        "startFrame": "frameStart",
        "edit_in": "frameStart",

        "fend": "frameEnd",
        "endFrame": "frameEnd",
        "edit_out": "frameEnd",

        "handle_start": "handleStart",
        "handle_end": "handleEnd",
        "handles": ["handleEnd", "handleStart"],

        "frameRate": "fps",
        "framerate": "fps",
        "resolution_width": "resolutionWidth",
        "resolution_height": "resolutionHeight",
        "pixel_aspect": "pixelAspect"
    }

    def discover(self, session, entities, event):
        ''' Validation '''

        return True

    def interface(self, session, entities, event):
        if event['data'].get('values', {}):
            return

        title = 'Select Projects where attributes should be remapped'

        items = []

        selection_enum = {
            'label': 'Process type',
            'type': 'enumerator',
            'name': 'process_type',
            'data': [
                {
                    'label': 'Selection',
                    'value': 'selection'
                }, {
                    'label': 'Inverted selection',
                    'value': 'except'
                }
            ],
            'value': 'selection'
        }
        selection_label = {
            'type': 'label',
            'value': (
                'Selection based variants:<br/>'
                '- `Selection` - '
                'NOTHING is processed when nothing is selected<br/>'
                '- `Inverted selection` - '
                'ALL Projects are processed when nothing is selected'
            )
        }

        items.append(selection_enum)
        items.append(selection_label)

        item_splitter = {'type': 'label', 'value': '---'}

        all_projects = session.query('Project').all()
        for project in all_projects:
            item_label = {
                'type': 'label',
                'value': '{} (<i>{}</i>)'.format(
                    project['full_name'], project['name']
                )
            }
            item = {
                'name': project['id'],
                'type': 'boolean',
                'value': False
            }
            if len(items) > 0:
                items.append(item_splitter)
            items.append(item_label)
            items.append(item)

        if len(items) == 0:
            return {
                'success': False,
                'message': 'Didn\'t find any projects'
            }
        else:
            return {
                'items': items,
                'title': title
            }

    def launch(self, session, entities, event):
        if 'values' not in event['data']:
            return

        values = event['data']['values']
        process_type = values.pop('process_type')

        selection = True
        if process_type == 'except':
            selection = False

        interface_messages = {}

        projects_to_update = []
        for project_id, update_bool in values.items():
            if not update_bool and selection:
                continue

            if update_bool and not selection:
                continue

            project = session.query(
                'Project where id is "{}"'.format(project_id)
            ).one()
            projects_to_update.append(project)

        if not projects_to_update:
            self.log.debug('Nothing to update')
            return {
                'success': True,
                'message': 'Nothing to update'
            }

        self.db_con.install()

        relevant_types = ["project", "asset", "version"]

        for ft_project in projects_to_update:
            self.log.debug(
                "Processing project \"{}\"".format(ft_project["full_name"])
            )

            self.db_con.Session["AVALON_PROJECT"] = ft_project["full_name"]
            project = self.db_con.find_one({'type': 'project'})
            if not project:
                key = "Projects not synchronized to db"
                if key not in interface_messages:
                    interface_messages[key] = []
                interface_messages[key].append(ft_project["full_name"])
                continue

            # Get all entities in project collection from MongoDB
            _entities = self.db_con.find({})
            for _entity in _entities:
                ent_t = _entity.get("type", "*unknown type")
                name = _entity.get("name", "*unknown name")

                self.log.debug(
                    "- {} ({})".format(name, ent_t)
                )

                # Skip types that do not store keys to change
                if ent_t.lower() not in relevant_types:
                    self.log.debug("-- skipping - type is not relevant")
                    continue

                # Get data which will change
                updating_data = {}
                source_data = _entity["data"]

                for key_from, key_to in self.keys_to_change.items():
                    # continue if final key already exists
                    if type(key_to) == list:
                        for key in key_to:
                            # continue if final key was set in update_data
                            if key in updating_data:
                                continue

                            # continue if source key not exist or value is None
                            value = source_data.get(key_from)
                            if value is None:
                                continue

                            self.log.debug(
                                "-- changing key {} to {}".format(
                                    key_from,
                                    key
                                )
                            )

                            updating_data[key] = value
                    else:
                        if key_to in source_data:
                            continue

                        # continue if final key was set in update_data
                        if key_to in updating_data:
                            continue

                        # continue if source key not exist or value is None
                        value = source_data.get(key_from)
                        if value is None:
                            continue

                        self.log.debug(
                            "-- changing key {} to {}".format(key_from, key_to)
                        )
                        updating_data[key_to] = value

                # Pop out old keys from entity
                is_obsolete = False
                for key in self.keys_to_change:
                    if key not in source_data:
                        continue
                    is_obsolete = True
                    source_data.pop(key)

                # continue if there is nothing to change
                if not is_obsolete and not updating_data:
                    self.log.debug("-- nothing to change")
                    continue

                source_data.update(updating_data)

                self.db_con.update_many(
                    {"_id": _entity["_id"]},
                    {"$set": {"data": source_data}}
                )

        self.db_con.uninstall()

        if interface_messages:
            self.show_interface_from_dict(
                messages=interface_messages,
                title="Errors during remapping attributes",
                event=event
            )

        return True

    def show_interface_from_dict(self, event, messages, title=""):
        items = []

        for key, value in messages.items():
            if not value:
                continue
            subtitle = {'type': 'label', 'value': '# {}'.format(key)}
            items.append(subtitle)
            if isinstance(value, list):
                for item in value:
                    message = {
                        'type': 'label', 'value': '<p>{}</p>'.format(item)
                    }
                    items.append(message)
            else:
                message = {'type': 'label', 'value': '<p>{}</p>'.format(value)}
                items.append(message)

        self.show_interface(event, items, title)


def register(session, **kw):
    '''Register plugin. Called when used as a plugin.'''

    if not isinstance(session, ftrack_api.session.Session):
        return

    AttributesRemapper(session).register()

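The core of `AttributesRemapper.launch` is the key-renaming loop driven by `keys_to_change`: legacy keys are renamed, one legacy key may fan out to several new keys (`handles` becomes both `handleStart` and `handleEnd`), and existing new-style values are never overwritten. A simplified, self-contained sketch of that rule on a plain dictionary (the MongoDB round-trip and logging are omitted, and both the scalar and list cases share one code path here):

```python
# Subset of AttributesRemapper.keys_to_change, enough to show both cases
keys_to_change = {
    "fstart": "frameStart",
    "fend": "frameEnd",
    "handles": ["handleEnd", "handleStart"],
}

def remap(source_data):
    """Return a copy of source_data with legacy keys renamed."""
    updating = {}
    for key_from, key_to in keys_to_change.items():
        targets = key_to if isinstance(key_to, list) else [key_to]
        value = source_data.get(key_from)
        if value is None:
            continue
        for key in targets:
            # existing new-style keys win over remapped legacy values
            if key in source_data or key in updating:
                continue
            updating[key] = value
    # drop legacy keys, then merge in the remapped values
    cleaned = {k: v for k, v in source_data.items() if k not in keys_to_change}
    cleaned.update(updating)
    return cleaned

print(remap({"fstart": 1001, "fend": 1100, "handles": 10}))
```

So `{"fstart": 1001, "fend": 1100, "handles": 10}` becomes `{"frameStart": 1001, "frameEnd": 1100, "handleEnd": 10, "handleStart": 10}`, which matches the shape the action writes back with `update_many`.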
@@ -1,11 +1,9 @@
import sys
import argparse
import logging
import os
import getpass

import ftrack_api
from ftrack_action_handler import BaseAction
from pype.vendor import ftrack_api
from pype.ftrack import BaseAction


class ClientReviewSort(BaseAction):

@@ -17,7 +15,6 @@ class ClientReviewSort(BaseAction):
    #: Action label.
    label = 'Sort Review'

    def discover(self, session, entities, event):
        ''' Validation '''

@@ -26,7 +23,6 @@ class ClientReviewSort(BaseAction):

        return True

    def launch(self, session, entities, event):

        entity = entities[0]

@@ -40,7 +36,9 @@ class ClientReviewSort(BaseAction):

        # Sort criteria
        obj_list = sorted(obj_list, key=lambda k: k['version'])
        obj_list = sorted(obj_list, key=lambda k: k['asset_version']['task']['name'])
        obj_list = sorted(
            obj_list, key=lambda k: k['asset_version']['task']['name']
        )
        obj_list = sorted(obj_list, key=lambda k: k['name'])

        # Set 'sort order' to sorted list, so they are sorted in Ftrack also

@@ -1,14 +1,10 @@
# :coding: utf-8
# :copyright: Copyright (c) 2015 Milan Kolar

import os
import sys
import argparse
import logging
import getpass
import subprocess
import os
import ftrack_api
from ftrack_action_handler import BaseAction
from pype.vendor import ftrack_api
from pype.ftrack import BaseAction


class ComponentOpen(BaseAction):

@@ -19,8 +15,9 @@ class ComponentOpen(BaseAction):
    # Action label
    label = 'Open File'
    # Action icon
    icon = 'https://cdn4.iconfinder.com/data/icons/rcons-application/32/application_go_run-256.png',

    icon = '{}/ftrack/action_icons/ComponentOpen.svg'.format(
        os.environ.get('PYPE_STATICS_SERVER', '')
    )

    def discover(self, session, entities, event):
        ''' Validation '''

@@ -29,13 +26,13 @@ class ComponentOpen(BaseAction):

        return True

    def launch(self, session, entities, event):

        entity = entities[0]

        # Return error if component is on ftrack server
        if entity['component_locations'][0]['location']['name'] == 'ftrack.server':
        location_name = entity['component_locations'][0]['location']['name']
        if location_name == 'ftrack.server':
            return {
                'success': False,
                'message': "This component is stored on ftrack server!"

@@ -44,12 +41,10 @@ class ComponentOpen(BaseAction):
        # Get component filepath
        # TODO with locations it will be different???
        fpath = entity['component_locations'][0]['resource_identifier']
        items = fpath.split(os.sep)
        items.pop(-1)
        fpath = os.sep.join(items)
        fpath = os.path.normpath(os.path.dirname(fpath))

        if os.path.isdir(fpath):
            if 'win' in sys.platform: # windows
            if 'win' in sys.platform:  # windows
                subprocess.Popen('explorer "%s"' % fpath)
            elif sys.platform == 'darwin':  # macOS
                subprocess.Popen(['open', fpath])

@@ -79,8 +74,7 @@ def register(session, **kw):
    if not isinstance(session, ftrack_api.session.Session):
        return

    action_handler = ComponentOpen(session)
    action_handler.register()
    ComponentOpen(session).register()


def main(arguments=None):

@@ -1,185 +0,0 @@
# :coding: utf-8
# :copyright: Copyright (c) 2017 ftrack
import sys
import argparse
import logging
import os
import json
import ftrack_api
from ftrack_action_handler import BaseAction

from avalon import io, inventory, lib
from avalon.vendor import toml


class AvalonIdAttribute(BaseAction):
    '''Edit meta data action.'''

    #: Action identifier.
    identifier = 'avalon.id.attribute'
    #: Action label.
    label = 'Create Avalon Attribute'
    #: Action description.
    description = 'Creates Avalon/Mongo ID for double check'

    def discover(self, session, entities, event):
        ''' Validation '''

        # userId = event['source']['user']['id']
        # user = session.query('User where id is ' + userId).one()
        # if user['user_security_roles'][0]['security_role']['name'] != 'Administrator':
        #     return False

        return True

    def launch(self, session, entities, event):
        # JOB SETTINGS

        userId = event['source']['user']['id']
        user = session.query('User where id is ' + userId).one()

        job = session.create('Job', {
            'user': user,
            'status': 'running',
            'data': json.dumps({
                'description': 'Custom Attribute creation.'
            })
        })
        session.commit()
        try:
            # Attribute Name and Label
            custAttrName = 'avalon_mongo_id'
            custAttrLabel = 'Avalon/Mongo Id'
            # Types that don't need object_type_id
            base = {'show'}
            # Don't create custom attribute on these entity types:
            exceptions = ['task','milestone','library']
            exceptions.extend(base)
            # Get all possible object types
            all_obj_types = session.query('ObjectType').all()
            count_types = len(all_obj_types)
            # Filter object types by exceptions
            for index in range(count_types):
                i = count_types - 1 - index
                name = all_obj_types[i]['name'].lower()

                if " " in name:
                    name = name.replace(" ","")

                if name in exceptions:
                    all_obj_types.pop(i)

            # Get IDs of filtered object types
            all_obj_types_id = set()
            for obj in all_obj_types:
                all_obj_types_id.add(obj['id'])

            # Get all custom attributes
            current_cust_attr = session.query('CustomAttributeConfiguration').all()
            # Filter already existing AvalonMongoID attr.
            for attr in current_cust_attr:
                if attr['key'] == custAttrName:
                    if attr['entity_type'] in base:
                        base.remove(attr['entity_type'])
                    if attr['object_type_id'] in all_obj_types_id:
                        all_obj_types_id.remove(attr['object_type_id'])

            # Set session back to begin ("session.query" raises error on commit)
            session.rollback()
            # Set security roles for attribute
            custAttrSecuRole = session.query('SecurityRole').all()
            # Set Text type of Attribute
            custom_attribute_type = session.query(
                'CustomAttributeType where name is "text"'
            ).one()

            for entity_type in base:
                # Create a custom attribute configuration.
                session.create('CustomAttributeConfiguration', {
                    'entity_type': entity_type,
                    'type': custom_attribute_type,
                    'label': custAttrLabel,
                    'key': custAttrName,
                    'default': '',
                    'write_security_roles': custAttrSecuRole,
                    'read_security_roles': custAttrSecuRole,
                    'config': json.dumps({'markdown': False})
                })

            for type in all_obj_types_id:
                # Create a custom attribute configuration.
                session.create('CustomAttributeConfiguration', {
                    'entity_type': 'task',
                    'object_type_id': type,
                    'type': custom_attribute_type,
                    'label': custAttrLabel,
                    'key': custAttrName,
                    'default': '',
                    'write_security_roles': custAttrSecuRole,
                    'read_security_roles': custAttrSecuRole,
                    'config': json.dumps({'markdown': False})
                })

            job['status'] = 'done'
            session.commit()

        except Exception as e:
            job['status'] = 'failed'
            print("Creating custom attributes failed")
            print(e)

        return True


def register(session, **kw):
    '''Register plugin. Called when used as an plugin.'''

    # Validate that session is an instance of ftrack_api.Session. If not,
    # assume that register is being called from an old or incompatible API and
    # return without doing anything.
    if not isinstance(session, ftrack_api.session.Session):
        return

    action_handler = AvalonIdAttribute(session)
    action_handler.register()


def main(arguments=None):
    '''Set up logging and register action.'''
    if arguments is None:
        arguments = []

    parser = argparse.ArgumentParser()
    # Allow setting of logging level from arguments.
    loggingLevels = {}
    for level in (
        logging.NOTSET, logging.DEBUG, logging.INFO, logging.WARNING,
        logging.ERROR, logging.CRITICAL
    ):
        loggingLevels[logging.getLevelName(level).lower()] = level

    parser.add_argument(
        '-v', '--verbosity',
        help='Set the logging output verbosity.',
        choices=loggingLevels.keys(),
        default='info'
    )
    namespace = parser.parse_args(arguments)

    # Set up basic logging
    logging.basicConfig(level=loggingLevels[namespace.verbosity])

    session = ftrack_api.Session()
    register(session)

    # Wait for events
    logging.info(
        'Registered actions and listening for events. Use Ctrl-C to abort.'
    )
    session.event_hub.wait()


if __name__ == '__main__':
    raise SystemExit(main(sys.argv[1:]))

619 pype/ftrack/actions/action_create_cust_attrs.py (Normal file)
@ -0,0 +1,619 @@
|
|||
import os
|
||||
import sys
|
||||
import argparse
|
||||
import json
|
||||
import arrow
|
||||
import logging
|
||||
from pype.vendor import ftrack_api
|
||||
from pype.ftrack import BaseAction, get_ca_mongoid
|
||||
from pypeapp import config
|
||||
from ftrack_api.exception import NoResultFoundError
|
||||
|
||||
"""
|
||||
This action creates/updates custom attributes.
|
||||
- first part take care about avalon_mongo_id attribute
|
||||
- second part is based on json file in templates:
|
||||
~/PYPE-TEMPLATES/presets/ftrack/ftrack_custom_attributes.json
|
||||
- you can add Custom attributes based on these conditions
|
||||
|
||||
*** Required ***************************************************************
|
||||
|
||||
label (string)
|
||||
- label that will show in ftrack
|
||||
|
||||
key (string)
|
||||
- must contain only chars [a-z0-9_]
|
||||
|
||||
type (string)
|
||||
- type of custom attribute
|
||||
- possibilities: text, boolean, date, enumerator, dynamic enumerator, number
|
||||
|
||||
*** Required with conditions ***********************************************
|
||||
|
||||
entity_type (string)
|
||||
- if 'is_hierarchical' is set to False
|
||||
- type of entity
|
||||
- possibilities: task, show, assetversion, user, list, asset
|
||||
|
||||
config (dictionary)
|
||||
- for each entity type different requirements and possibilities:
|
||||
- enumerator: multiSelect = True/False(default: False)
|
||||
data = {key_1:value_1,key_2:value_2,..,key_n:value_n}
|
||||
- 'data' is Required value with enumerator
|
||||
- 'key' must contain only chars [a-z0-9_]
|
||||
|
||||
- number: isdecimal = True/False(default: False)
|
||||
|
||||
- text: markdown = True/False(default: False)
|
||||
|
||||
object_type (string)
|
||||
- IF ENTITY_TYPE is set to 'task'
|
||||
- default possibilities: Folder, Shot, Sequence, Task, Library,
|
||||
Milestone, Episode, Asset Build,...
|
||||
|
||||
*** Optional ***************************************************************
|
||||
|
||||
write_security_roles/read_security_roles (array of strings)
|
||||
- default: ["ALL"]
|
||||
- strings should be role names (e.g.: ["API", "Administrator"])
|
||||
- if set to ["ALL"] - all roles will be availabled
|
||||
- if first is 'except' - roles will be set to all except roles in array
|
||||
- Warning: Be carefull with except - roles can be different by company
|
||||
- example:
|
||||
write_security_roles = ["except", "User"]
|
||||
read_security_roles = ["ALL"]
|
||||
- User is unable to write but can read
|
||||
|
||||
group (string)
|
||||
- default: None
|
||||
- name of group
|
||||
|
||||
default
|
||||
- default: None
|
||||
- sets default value for custom attribute:
|
||||
- text -> string
|
||||
- number -> integer
|
||||
- enumerator -> array with string of key/s
|
||||
- boolean -> bool true/false
|
||||
- date -> string in format: 'YYYY.MM.DD' or 'YYYY.MM.DD HH:mm:ss'
|
||||
- example: "2018.12.24" / "2018.1.1 6:0:0"
|
||||
- dynamic enumerator -> DON'T HAVE DEFAULT VALUE!!!
|
||||
|
||||
is_hierarchical (bool)
|
||||
- default: False
|
||||
- will set hierachical attribute
|
||||
- False by default
|
||||
|
||||
EXAMPLE:
|
||||
{
|
||||
"avalon_auto_sync": {
|
||||
"label": "Avalon auto-sync",
|
||||
"key": "avalon_auto_sync",
|
||||
"type": "boolean",
|
||||
"entity_type": "show",
|
||||
"group": "avalon",
|
||||
"default": false,
|
||||
"write_security_role": ["API","Administrator"],
|
||||
"read_security_role": ["API","Administrator"]
|
||||
}
|
||||
}
|
||||
"""
|
||||
|
||||
|
||||
class CustAttrException(Exception):
|
||||
pass
|
||||
|
||||
|
||||
class CustomAttributes(BaseAction):
|
||||
'''Edit meta data action.'''
|
||||
|
||||
#: Action identifier.
|
||||
identifier = 'create.update.attributes'
|
||||
#: Action label.
|
||||
label = 'Create/Update Avalon Attributes'
|
||||
#: Action description.
|
||||
description = 'Creates Avalon/Mongo ID for double check'
|
||||
#: roles that are allowed to register this action
|
||||
role_list = ['Pypeclub', 'Administrator']
|
||||
icon = '{}/ftrack/action_icons/CustomAttributes.svg'.format(
|
||||
os.environ.get('PYPE_STATICS_SERVER', '')
|
||||
)
|
||||
|
||||
required_keys = ['key', 'label', 'type']
|
||||
type_posibilities = [
|
||||
'text', 'boolean', 'date', 'enumerator',
|
||||
'dynamic enumerator', 'number'
|
||||
]
|
||||
|
||||
def discover(self, session, entities, event):
|
||||
'''
|
||||
Validation
|
||||
- action is only for Administrators
|
||||
'''
|
||||
return True
|
||||
|
||||
def launch(self, session, entities, event):
|
||||
self.types = {}
|
||||
self.object_type_ids = {}
|
||||
self.groups = {}
|
||||
self.security_roles = {}
|
||||
|
||||
# JOB SETTINGS
|
||||
userId = event['source']['user']['id']
|
||||
user = session.query('User where id is ' + userId).one()
|
||||
|
||||
job = session.create('Job', {
|
||||
'user': user,
|
||||
'status': 'running',
|
||||
'data': json.dumps({
|
||||
'description': 'Custom Attribute creation.'
|
||||
})
|
||||
})
|
||||
session.commit()
|
||||
try:
|
||||
self.avalon_mongo_id_attributes(session)
|
||||
self.custom_attributes_from_file(session, event)
|
||||
|
||||
job['status'] = 'done'
|
||||
session.commit()
|
||||
|
||||
except Exception as exc:
|
||||
session.rollback()
|
||||
job['status'] = 'failed'
|
||||
session.commit()
|
||||
self.log.error(
|
||||
'Creating custom attributes failed ({})'.format(exc),
|
||||
exc_info=True
|
||||
)
|
||||
|
||||
return True
|
||||
|
||||
    def avalon_mongo_id_attributes(self, session):
        # Attribute Name and Label
        cust_attr_name = get_ca_mongoid()
        cust_attr_label = 'Avalon/Mongo Id'

        # Types that don't need object_type_id
        base = {'show'}

        # Don't create custom attribute on these entity types:
        exceptions = ['task', 'milestone']
        exceptions.extend(base)

        # Get all possible object types
        all_obj_types = session.query('ObjectType').all()

        # Filter object types by exceptions
        filtered_types_id = set()

        for obj_type in all_obj_types:
            name = obj_type['name']
            if " " in name:
                name = name.replace(' ', '')

            if name not in self.object_type_ids:
                self.object_type_ids[name] = obj_type['id']

            if name.lower() not in exceptions:
                filtered_types_id.add(obj_type['id'])

        # Set security roles for attribute
        role_list = ['API', 'Administrator']
        roles = self.get_security_role(role_list)
        # Set Text type of Attribute
        custom_attribute_type = self.get_type('text')
        # Set group to 'avalon'
        group = self.get_group('avalon')

        data = {}
        data['key'] = cust_attr_name
        data['label'] = cust_attr_label
        data['type'] = custom_attribute_type
        data['default'] = ''
        data['write_security_roles'] = roles
        data['read_security_roles'] = roles
        data['group'] = group
        data['config'] = json.dumps({'markdown': False})

        for entity_type in base:
            data['entity_type'] = entity_type
            self.process_attribute(data)

        data['entity_type'] = 'task'
        for object_type_id in filtered_types_id:
            data['object_type_id'] = str(object_type_id)
            self.process_attribute(data)

    def custom_attributes_from_file(self, session, event):
        presets = config.get_presets()['ftrack']['ftrack_custom_attributes']

        for cust_attr_data in presets:
            cust_attr_name = cust_attr_data.get(
                'label',
                cust_attr_data.get('key')
            )
            try:
                data = {}
                # Get key, label, type
                data.update(self.get_required(cust_attr_data))
                # Get hierarchical / entity_type / object_id
                data.update(self.get_entity_type(cust_attr_data))
                # Get group, default, security roles
                data.update(self.get_optional(cust_attr_data))
                # Process data
                self.process_attribute(data)

            except CustAttrException as cae:
                if cust_attr_name:
                    msg = 'Custom attribute error "{}" - {}'.format(
                        cust_attr_name, str(cae)
                    )
                else:
                    msg = 'Custom attribute error - {}'.format(str(cae))
                self.log.warning(msg, exc_info=True)
                self.show_message(event, msg)

        return True

    def process_attribute(self, data):
        existing_attrs = self.session.query(
            'CustomAttributeConfiguration'
        ).all()
        matching = []
        for attr in existing_attrs:
            if (
                attr['key'] != data['key'] or
                attr['type']['name'] != data['type']['name']
            ):
                continue

            if data.get('is_hierarchical', False) is True:
                if attr['is_hierarchical'] is True:
                    matching.append(attr)
            elif 'object_type_id' in data:
                if (
                    attr['entity_type'] == data['entity_type'] and
                    attr['object_type_id'] == data['object_type_id']
                ):
                    matching.append(attr)
            else:
                if attr['entity_type'] == data['entity_type']:
                    matching.append(attr)

        if len(matching) == 0:
            self.session.create('CustomAttributeConfiguration', data)
            self.session.commit()
            self.log.debug(
                '{}: "{}" created'.format(self.label, data['label'])
            )

        elif len(matching) == 1:
            attr_update = matching[0]
            for key in data:
                if key not in (
                    'is_hierarchical', 'entity_type', 'object_type_id'
                ):
                    attr_update[key] = data[key]

            self.log.debug(
                '{}: "{}" updated'.format(self.label, data['label'])
            )
            self.session.commit()

        else:
            raise CustAttrException('Is duplicated')

    def get_required(self, attr):
        output = {}
        for key in self.required_keys:
            if key not in attr:
                raise CustAttrException(
                    'Key {} is required - please set'.format(key)
                )

        if attr['type'].lower() not in self.type_posibilities:
            raise CustAttrException(
                'Type {} is not valid'.format(attr['type'])
            )

        type_name = attr['type'].lower()

        output['key'] = attr['key']
        output['label'] = attr['label']
        output['type'] = self.get_type(type_name)

        config = None
        if type_name == 'number':
            config = self.get_number_config(attr)
        elif type_name == 'text':
            config = self.get_text_config(attr)
        elif type_name == 'enumerator':
            config = self.get_enumerator_config(attr)

        if config is not None:
            output['config'] = config

        return output

    def get_number_config(self, attr):
        if 'config' in attr and 'isdecimal' in attr['config']:
            isdecimal = attr['config']['isdecimal']
        else:
            isdecimal = False

        config = json.dumps({'isdecimal': isdecimal})

        return config

    def get_text_config(self, attr):
        if 'config' in attr and 'markdown' in attr['config']:
            markdown = attr['config']['markdown']
        else:
            markdown = False
        config = json.dumps({'markdown': markdown})

        return config

    def get_enumerator_config(self, attr):
        if 'config' not in attr:
            raise CustAttrException('Missing config with data')
        if 'data' not in attr['config']:
            raise CustAttrException('Missing data in config')

        data = []
        for item in attr['config']['data']:
            item_data = {}
            for key in item:
                # TODO key check by regex
                item_data['menu'] = item[key]
                item_data['value'] = key
            data.append(item_data)

        multiSelect = False
        for k in attr['config']:
            if k.lower() == 'multiselect':
                if isinstance(attr['config'][k], bool):
                    multiSelect = attr['config'][k]
                else:
                    raise CustAttrException('Multiselect must be boolean')
                break

        config = json.dumps({
            'multiSelect': multiSelect,
            'data': json.dumps(data)
        })

        return config

    def get_group(self, attr):
        if isinstance(attr, str):
            group_name = attr
        else:
            group_name = attr['group'].lower()
        if group_name in self.groups:
            return self.groups[group_name]

        query = 'CustomAttributeGroup where name is "{}"'.format(group_name)
        groups = self.session.query(query).all()

        if len(groups) == 1:
            group = groups[0]
            self.groups[group_name] = group

            return group

        elif len(groups) < 1:
            group = self.session.create('CustomAttributeGroup', {
                'name': group_name,
            })
            self.session.commit()

            return group

        else:
            raise CustAttrException(
                'Found more than one group "{}"'.format(group_name)
            )

    def get_role_ALL(self):
        role_name = 'ALL'
        if role_name in self.security_roles:
            all_roles = self.security_roles[role_name]
        else:
            all_roles = self.session.query('SecurityRole').all()
            self.security_roles[role_name] = all_roles
            for role in all_roles:
                if role['name'] not in self.security_roles:
                    self.security_roles[role['name']] = role
        return all_roles

    def get_security_role(self, security_roles):
        roles = []
        security_roles_lowered = [role.lower() for role in security_roles]
        if len(security_roles) == 0 or 'all' in security_roles_lowered:
            roles = self.get_role_ALL()
        elif security_roles_lowered[0] == 'except':
            excepts = security_roles[1:]
            all_roles = self.get_role_ALL()
            for role in all_roles:
                if role['name'] not in excepts:
                    roles.append(role)
                    if role['name'] not in self.security_roles:
                        self.security_roles[role['name']] = role
        else:
            for role_name in security_roles:
                if role_name in self.security_roles:
                    roles.append(self.security_roles[role_name])
                    continue

                try:
                    query = 'SecurityRole where name is "{}"'.format(role_name)
                    role = self.session.query(query).one()
                    self.security_roles[role_name] = role
                    roles.append(role)
                except NoResultFoundError:
                    raise CustAttrException((
                        'Security role "{}" does not exist'
                    ).format(role_name))

        return roles

    def get_default(self, attr):
        type = attr['type']
        default = attr['default']
        if default is None:
            return default
        err_msg = 'Default value is not'
        if type == 'number':
            if not isinstance(default, (float, int)):
                raise CustAttrException('{} integer'.format(err_msg))
        elif type == 'text':
            if not isinstance(default, str):
                raise CustAttrException('{} string'.format(err_msg))
        elif type == 'boolean':
            if not isinstance(default, bool):
                raise CustAttrException('{} boolean'.format(err_msg))
        elif type == 'enumerator':
            if not isinstance(default, list):
                raise CustAttrException(
                    '{} array with strings'.format(err_msg)
                )
            # TODO check if multiSelect is available
            # and if default is one of data menu
            if not isinstance(default[0], str):
                raise CustAttrException('{} array of strings'.format(err_msg))
        elif type == 'date':
            date_items = default.split(' ')
            try:
                if len(date_items) == 1:
                    default = arrow.get(default, 'YY.M.D')
                elif len(date_items) == 2:
                    default = arrow.get(default, 'YY.M.D H:m:s')
                else:
                    raise Exception
            except Exception:
                raise CustAttrException('Date is not in proper format')
        elif type == 'dynamic enumerator':
            raise CustAttrException('Dynamic enumerator can\'t have default')

        return default

    def get_optional(self, attr):
        output = {}
        if 'group' in attr:
            output['group'] = self.get_group(attr)
        if 'default' in attr:
            output['default'] = self.get_default(attr)

        roles_read = []
        roles_write = []
        if 'read_security_roles' in attr:
            roles_read = attr['read_security_roles']
        if 'write_security_roles' in attr:
            roles_write = attr['write_security_roles']
        output['read_security_roles'] = self.get_security_role(roles_read)
        output['write_security_roles'] = self.get_security_role(roles_write)

        return output

    def get_type(self, type_name):
        if type_name in self.types:
            return self.types[type_name]

        query = 'CustomAttributeType where name is "{}"'.format(type_name)
        type = self.session.query(query).one()
        self.types[type_name] = type

        return type

    def get_entity_type(self, attr):
        if 'is_hierarchical' in attr:
            if attr['is_hierarchical'] is True:
                type = 'show'
                if 'entity_type' in attr:
                    type = attr['entity_type']

                return {
                    'is_hierarchical': True,
                    'entity_type': type
                }

        if 'entity_type' not in attr:
            raise CustAttrException('Missing entity_type')

        if attr['entity_type'].lower() != 'task':
            return {'entity_type': attr['entity_type']}

        if 'object_type' not in attr:
            raise CustAttrException('Missing object_type')

        object_type_name = attr['object_type']
        if object_type_name not in self.object_type_ids:
            try:
                query = 'ObjectType where name is "{}"'.format(
                    object_type_name
                )
                object_type_id = self.session.query(query).one()['id']
            except Exception:
                raise CustAttrException((
                    'Object type with name "{}" does not exist'
                ).format(object_type_name))
            self.object_type_ids[object_type_name] = object_type_id
        else:
            object_type_id = self.object_type_ids[object_type_name]

        return {
            'entity_type': attr['entity_type'],
            'object_type_id': object_type_id
        }


def register(session, **kw):
    '''Register plugin. Called when used as a plugin.'''

    # Validate that session is an instance of ftrack_api.Session. If not,
    # assume that register is being called from an old or incompatible API and
    # return without doing anything.
    if not isinstance(session, ftrack_api.session.Session):
        return

    CustomAttributes(session).register()


def main(arguments=None):
    '''Set up logging and register action.'''
    if arguments is None:
        arguments = []

    parser = argparse.ArgumentParser()
    # Allow setting of logging level from arguments.
    loggingLevels = {}
    for level in (
        logging.NOTSET, logging.DEBUG, logging.INFO, logging.WARNING,
        logging.ERROR, logging.CRITICAL
    ):
        loggingLevels[logging.getLevelName(level).lower()] = level

    parser.add_argument(
        '-v', '--verbosity',
        help='Set the logging output verbosity.',
        choices=loggingLevels.keys(),
        default='info'
    )
    namespace = parser.parse_args(arguments)

    # Set up basic logging
    logging.basicConfig(level=loggingLevels[namespace.verbosity])

    session = ftrack_api.Session()
    register(session)

    # Wait for events
    logging.info(
        'Registered actions and listening for events. Use Ctrl-C to abort.'
    )
    session.event_hub.wait()


if __name__ == '__main__':
    raise SystemExit(main(sys.argv[1:]))
370
pype/ftrack/actions/action_create_folders.py
Normal file
@ -0,0 +1,370 @@
import os
import sys
import logging
import argparse
import re

from pype.vendor import ftrack_api
from pype.ftrack import BaseAction
from avalon import lib as avalonlib
from avalon.tools.libraryloader.io_nonsingleton import DbConnector
from pypeapp import config, Anatomy


class CreateFolders(BaseAction):

    '''Custom action.'''

    #: Action identifier.
    identifier = 'create.folders'

    #: Action label.
    label = 'Create Folders'

    #: Action Icon.
    icon = '{}/ftrack/action_icons/CreateFolders.svg'.format(
        os.environ.get('PYPE_STATICS_SERVER', '')
    )

    db = DbConnector()

    def discover(self, session, entities, event):
        ''' Validation '''
        not_allowed = ['assetversion']
        if len(entities) != 1:
            return False
        if entities[0].entity_type.lower() in not_allowed:
            return False
        return True

    def interface(self, session, entities, event):
        if event['data'].get('values', {}):
            return
        entity = entities[0]
        without_interface = True
        for child in entity['children']:
            if child['object_type']['name'].lower() != 'task':
                without_interface = False
                break
        self.without_interface = without_interface
        if without_interface:
            return
        title = 'Create folders'

        entity_name = entity['name']
        msg = (
            '<h2>Do you want to create folders also'
            ' for all children of "{}"?</h2>'
        )
        if entity.entity_type.lower() == 'project':
            entity_name = entity['full_name']
            msg = msg.replace(' also', '')
            msg += '<h3>(Project root won\'t be created if not checked)</h3>'
        items = []
        item_msg = {
            'type': 'label',
            'value': msg.format(entity_name)
        }
        item_label = {
            'type': 'label',
            'value': 'With all children entities'
        }
        item = {
            'name': 'children_included',
            'type': 'boolean',
            'value': False
        }
        items.append(item_msg)
        items.append(item_label)
        items.append(item)

        if len(items) == 0:
            return {
                'success': False,
                'message': 'Didn\'t find any running jobs'
            }
        else:
            return {
                'items': items,
                'title': title
            }

    def launch(self, session, entities, event):
        '''Callback method for custom action.'''
        with_childrens = True
        if self.without_interface is False:
            if 'values' not in event['data']:
                return
            with_childrens = event['data']['values']['children_included']
        entity = entities[0]
        if entity.entity_type.lower() == 'project':
            proj = entity
        else:
            proj = entity['project']
        project_name = proj['full_name']
        project_code = proj['name']
        if entity.entity_type.lower() == 'project' and with_childrens is False:
            return {
                'success': True,
                'message': 'Nothing was created'
            }
        data = {
            "root": os.environ["AVALON_PROJECTS"],
            "project": {
                "name": project_name,
                "code": project_code
            }
        }
        all_entities = []
        all_entities.append(entity)
        if with_childrens:
            all_entities = self.get_notask_children(entity)

        av_project = None
        try:
            self.db.install()
            self.db.Session['AVALON_PROJECT'] = project_name
            av_project = self.db.find_one({'type': 'project'})
            template_work = av_project['config']['template']['work']
            template_publish = av_project['config']['template']['publish']
            self.db.uninstall()
        except Exception:
            templates = Anatomy().templates
            template_work = templates["avalon"]["work"]
            template_publish = templates["avalon"]["publish"]

        collected_paths = []
        presets = config.get_presets()['tools']['sw_folders']
        for entity in all_entities:
            if entity.entity_type.lower() == 'project':
                continue
            ent_data = data.copy()

            asset_name = entity['name']
            ent_data['asset'] = asset_name

            parents = entity['link']
            hierarchy_names = [p['name'] for p in parents[1:-1]]
            hierarchy = ''
            if hierarchy_names:
                hierarchy = os.path.sep.join(hierarchy_names)
            ent_data['hierarchy'] = hierarchy

            tasks_created = False
            if entity['children']:
                for child in entity['children']:
                    if child['object_type']['name'].lower() != 'task':
                        continue
                    tasks_created = True
                    task_type_name = child['type']['name'].lower()
                    task_data = ent_data.copy()
                    task_data['task'] = child['name']
                    possible_apps = presets.get(task_type_name, [])
                    template_work_created = False
                    template_publish_created = False
                    apps = []
                    for app in possible_apps:
                        try:
                            app_data = avalonlib.get_application(app)
                            app_dir = app_data['application_dir']
                        except ValueError:
                            app_dir = app
                        apps.append(app_dir)

                    # Template work
                    if '{app}' in template_work:
                        for app in apps:
                            template_work_created = True
                            app_data = task_data.copy()
                            app_data['app'] = app
                            collected_paths.append(
                                self.compute_template(
                                    template_work, app_data
                                )
                            )
                    if template_work_created is False:
                        collected_paths.append(
                            self.compute_template(template_work, task_data)
                        )
                    # Template publish
                    if '{app}' in template_publish:
                        for app in apps:
                            template_publish_created = True
                            app_data = task_data.copy()
                            app_data['app'] = app
                            collected_paths.append(
                                self.compute_template(
                                    template_publish, app_data, True
                                )
                            )
                    if template_publish_created is False:
                        collected_paths.append(
                            self.compute_template(
                                template_publish, task_data, True
                            )
                        )

            if not tasks_created:
                # create path for entity
                collected_paths.append(
                    self.compute_template(template_work, ent_data)
                )
                collected_paths.append(
                    self.compute_template(template_publish, ent_data)
                )
        if len(collected_paths) > 0:
            self.log.info('Creating folders:')
        for path in set(collected_paths):
            self.log.info(path)
            if not os.path.exists(path):
                os.makedirs(path)

        return {
            'success': True,
            'message': 'Created Folders Successfully!'
        }

    def get_notask_children(self, entity):
        output = []
        if entity.get('object_type', {}).get(
            'name', entity.entity_type
        ).lower() == 'task':
            return output
        else:
            output.append(entity)
        if entity['children']:
            for child in entity['children']:
                output.extend(self.get_notask_children(child))
        return output

    def template_format(self, template, data):

        partial_data = PartialDict(data)

        # remove subdict items from string (like 'project[name]')
        subdict = PartialDict()
        count = 1
        store_pattern = 5 * '_' + '{:0>3}'
        regex_pattern = r"\{\w*\[[^\}]*\]\}"
        matches = re.findall(regex_pattern, template)

        for match in matches:
            key = store_pattern.format(count)
            subdict[key] = match
            template = template.replace(match, '{' + key + '}')
            count += 1
        # solve filling keys with optional keys
        solved = self._solve_with_optional(template, partial_data)
        # try to solve subdict and replace them back to string
        for k, v in subdict.items():
            try:
                v = v.format_map(data)
            except (KeyError, TypeError):
                pass
            subdict[k] = v

        return solved.format_map(subdict)

    def _solve_with_optional(self, template, data):
        # Remove optional missing keys
        pattern = re.compile(r"(<.*?[^{0]*>)[^0-9]*?")
        invalid_optionals = []
        for group in pattern.findall(template):
            try:
                group.format(**data)
            except KeyError:
                invalid_optionals.append(group)
        for group in invalid_optionals:
            template = template.replace(group, "")

        solved = template.format_map(data)

        # solving after format optional in second round
        for catch in re.compile(r"(<.*?[^{0]*>)[^0-9]*?").findall(solved):
            if "{" in catch:
                # remove all optional
                solved = solved.replace(catch, "")
            else:
                # Remove optional symbols
                solved = solved.replace(catch, catch[1:-1])

        return solved

    def compute_template(self, template, data, task=False):
        first_result = self.template_format(template, data)
        if first_result == first_result.split('{')[0]:
            return os.path.normpath(first_result)
        if task:
            return os.path.normpath(first_result.split('{')[0])

        index = first_result.index('{')

        regex = r'\{\w*[^\}]*\}'
        match = re.findall(regex, first_result[index:])[0]
        without_missing = template.split(match)[0].split('}')
        output_items = []
        for part in without_missing:
            if '{' in part:
                output_items.append(part + '}')
        return os.path.normpath(
            self.template_format(''.join(output_items), data)
        )


class PartialDict(dict):
    def __getitem__(self, item):
        out = super().__getitem__(item)
        if isinstance(out, dict):
            return '{' + item + '}'
        return out

    def __missing__(self, key):
        return '{' + key + '}'


def register(session, **kw):
    '''Register plugin. Called when used as a plugin.'''

    if not isinstance(session, ftrack_api.session.Session):
        return

    CreateFolders(session).register()


def main(arguments=None):
    '''Set up logging and register action.'''
    if arguments is None:
        arguments = []

    parser = argparse.ArgumentParser()
    # Allow setting of logging level from arguments.
    loggingLevels = {}
    for level in (
        logging.NOTSET, logging.DEBUG, logging.INFO, logging.WARNING,
        logging.ERROR, logging.CRITICAL
    ):
        loggingLevels[logging.getLevelName(level).lower()] = level

    parser.add_argument(
        '-v', '--verbosity',
        help='Set the logging output verbosity.',
        choices=loggingLevels.keys(),
        default='info'
    )
    namespace = parser.parse_args(arguments)

    # Set up basic logging
    logging.basicConfig(level=loggingLevels[namespace.verbosity])

    session = ftrack_api.Session()
    register(session)

    # Wait for events
    logging.info(
        'Registered actions and listening for events. Use Ctrl-C to abort.'
    )
    session.event_hub.wait()


if __name__ == '__main__':
    raise SystemExit(main(sys.argv[1:]))
238
pype/ftrack/actions/action_create_project_folders.py
Normal file
@ -0,0 +1,238 @@
import os
import sys
import re
import argparse
import logging

from pype.vendor import ftrack_api
from pype.ftrack import BaseAction
from pypeapp import config


class CreateProjectFolders(BaseAction):
    '''Edit meta data action.'''

    #: Action identifier.
    identifier = 'create.project.folders'
    #: Action label.
    label = 'Create Project Folders'
    #: Action description.
    description = 'Creates folder structure'
    #: roles that are allowed to register this action
    role_list = ['Pypeclub', 'Administrator']
    icon = '{}/ftrack/action_icons/CreateProjectFolders.svg'.format(
        os.environ.get('PYPE_STATICS_SERVER', '')
    )

    pattern_array = re.compile(r'\[.*\]')
    pattern_ftrack = r'.*\[[.]*ftrack[.]*'
    pattern_ent_ftrack = r'ftrack\.[^.,\],\s,]*'
    project_root_key = '__project_root__'

    def discover(self, session, entities, event):
        ''' Validation '''

        return True

    def launch(self, session, entities, event):
        entity = entities[0]
        if entity.entity_type.lower() == 'project':
            project = entity
        else:
            project = entity['project']

        presets = config.get_presets()['tools']['project_folder_structure']
        try:
            # Get paths based on presets
            basic_paths = self.get_path_items(presets)
            self.create_folders(basic_paths, entity)
            self.create_ftrack_entities(basic_paths, project)
        except Exception as e:
            session.rollback()
            return {
                'success': False,
                'message': str(e)
            }

        return True

    def get_ftrack_paths(self, paths_items):
        all_ftrack_paths = []
        for path_items in paths_items:
            ftrack_path_items = []
            is_ftrack = False
            for item in reversed(path_items):
                if item == self.project_root_key:
                    continue
                if is_ftrack:
                    ftrack_path_items.append(item)
                elif re.match(self.pattern_ftrack, item):
                    ftrack_path_items.append(item)
                    is_ftrack = True
            ftrack_path_items = list(reversed(ftrack_path_items))
            if ftrack_path_items:
                all_ftrack_paths.append(ftrack_path_items)
        return all_ftrack_paths

    def compute_ftrack_items(self, in_list, keys):
        if len(keys) == 0:
            return in_list
        key = keys[0]
        exist = None
        for index, subdict in enumerate(in_list):
            if key in subdict:
                exist = index
                break
        if exist is not None:
            in_list[exist][key] = self.compute_ftrack_items(
                in_list[exist][key], keys[1:]
            )
        else:
            in_list.append({key: self.compute_ftrack_items([], keys[1:])})
        return in_list

    def translate_ftrack_items(self, paths_items):
        main = []
        for path_items in paths_items:
            main = self.compute_ftrack_items(main, path_items)
        return main

    def create_ftrack_entities(self, basic_paths, project_ent):
        only_ftrack_items = self.get_ftrack_paths(basic_paths)
        ftrack_paths = self.translate_ftrack_items(only_ftrack_items)

        for separation in ftrack_paths:
            parent = project_ent
            self.trigger_creation(separation, parent)

    def trigger_creation(self, separation, parent):
        for item, subvalues in separation.items():
            matches = re.findall(self.pattern_array, item)
            ent_type = 'Folder'
            if len(matches) == 0:
                name = item
            else:
                match = matches[0]
                name = item.replace(match, '')
                ent_type_match = re.findall(self.pattern_ent_ftrack, match)
                if len(ent_type_match) > 0:
                    ent_type_split = ent_type_match[0].split('.')
                    if len(ent_type_split) == 2:
                        ent_type = ent_type_split[1]
            new_parent = self.create_ftrack_entity(name, ent_type, parent)
            if subvalues:
                for subvalue in subvalues:
                    self.trigger_creation(subvalue, new_parent)

    def create_ftrack_entity(self, name, ent_type, parent):
        for children in parent['children']:
            if children['name'] == name:
                return children
        data = {
            'name': name,
            'parent_id': parent['id']
        }
        if parent.entity_type.lower() == 'project':
            data['project_id'] = parent['id']
        else:
            data['project_id'] = parent['project']['id']

        new_ent = self.session.create(ent_type, data)
        self.session.commit()
        return new_ent

    def get_path_items(self, in_dict):
        output = []
        for key, value in in_dict.items():
            if not value:
                output.append(key)
            else:
                paths = self.get_path_items(value)
                for path in paths:
                    if isinstance(path, str):
                        output.append([key, path])
                    else:
                        p = [key]
                        p.extend(path)
                        output.append(p)
        return output

    def compute_paths(self, basic_paths_items, project_root):
        output = []
        for path_items in basic_paths_items:
            clean_items = []
            for path_item in path_items:
                matches = re.findall(self.pattern_array, path_item)
                if len(matches) > 0:
                    path_item = path_item.replace(matches[0], '')
                if path_item == self.project_root_key:
                    path_item = project_root
                clean_items.append(path_item)
            output.append(os.path.normpath(os.path.sep.join(clean_items)))
        return output

    def create_folders(self, basic_paths, entity):
        # Set project root folder
        if entity.entity_type.lower() == 'project':
            project_name = entity['full_name']
        else:
            project_name = entity['project']['full_name']
        project_root_items = [os.environ['AVALON_PROJECTS'], project_name]
        project_root = os.path.sep.join(project_root_items)

        full_paths = self.compute_paths(basic_paths, project_root)
        # Create folders
        for path in full_paths:
            if os.path.exists(path):
                continue
            os.makedirs(path.format(project_root=project_root))


def register(session, **kw):
    '''Register plugin. Called when used as a plugin.'''

    if not isinstance(session, ftrack_api.session.Session):
        return

    CreateProjectFolders(session).register()


def main(arguments=None):
    '''Set up logging and register action.'''
    if arguments is None:
        arguments = []

    parser = argparse.ArgumentParser()
    # Allow setting of logging level from arguments.
    loggingLevels = {}
    for level in (
        logging.NOTSET, logging.DEBUG, logging.INFO, logging.WARNING,
        logging.ERROR, logging.CRITICAL
    ):
        loggingLevels[logging.getLevelName(level).lower()] = level

    parser.add_argument(
        '-v', '--verbosity',
        help='Set the logging output verbosity.',
        choices=loggingLevels.keys(),
        default='info'
    )
    namespace = parser.parse_args(arguments)

    # Set up basic logging
    logging.basicConfig(level=loggingLevels[namespace.verbosity])

    session = ftrack_api.Session()
    register(session)

    # Wait for events
    logging.info(
        'Registered actions and listening for events. Use Ctrl-C to abort.'
    )
    session.event_hub.wait()


if __name__ == '__main__':
    raise SystemExit(main(sys.argv[1:]))
334
pype/ftrack/actions/action_cust_attr_doctor.py
Normal file
@ -0,0 +1,334 @@
import os
import sys
import json
import argparse
import logging

from pype.vendor import ftrack_api
from pype.ftrack import BaseAction


class CustomAttributeDoctor(BaseAction):
    #: Action identifier.
    identifier = 'custom.attributes.doctor'
    #: Action label.
    label = 'Custom Attributes Doctor'
    #: Action description.
    description = (
        'Fix hierarchical custom attributes mainly handles, fstart'
        ' and fend'
    )

    icon = '{}/ftrack/action_icons/TestAction.svg'.format(
        os.environ.get('PYPE_STATICS_SERVER', '')
    )
    hierarchical_ca = ['handle_start', 'handle_end', 'fstart', 'fend']
    hierarchical_alternatives = {
        'handle_start': 'handles',
        'handle_end': 'handles'
    }

    # Roles for new custom attributes
    read_roles = ['ALL']
    write_roles = ['ALL']

    data_ca = {
        'handle_start': {
            'label': 'Frame handles start',
            'type': 'number',
            'config': json.dumps({'isdecimal': False})
        },
        'handle_end': {
            'label': 'Frame handles end',
            'type': 'number',
            'config': json.dumps({'isdecimal': False})
        },
        'fstart': {
            'label': 'Frame start',
            'type': 'number',
            'config': json.dumps({'isdecimal': False})
        },
        'fend': {
            'label': 'Frame end',
            'type': 'number',
            'config': json.dumps({'isdecimal': False})
        }
    }

    def discover(self, session, entities, event):
        ''' Validation '''

        return True

    def interface(self, session, entities, event):
        if event['data'].get('values', {}):
            return

        title = 'Select Project to fix Custom attributes'

        items = []
        item_splitter = {'type': 'label', 'value': '---'}

        all_projects = session.query('Project').all()
        for project in all_projects:
|
||||
item_label = {
|
||||
'type': 'label',
|
||||
'value': '{} (<i>{}</i>)'.format(
|
||||
project['full_name'], project['name']
|
||||
)
|
||||
}
|
||||
item = {
|
||||
'name': project['id'],
|
||||
'type': 'boolean',
|
||||
'value': False
|
||||
}
|
||||
if len(items) > 0:
|
||||
items.append(item_splitter)
|
||||
items.append(item_label)
|
||||
items.append(item)
|
||||
|
||||
if len(items) == 0:
|
||||
return {
|
||||
'success': False,
|
||||
'message': 'Didn\'t find any projects'
|
||||
}
|
||||
else:
|
||||
return {
|
||||
'items': items,
|
||||
'title': title
|
||||
}
|
||||
|
||||
def launch(self, session, entities, event):
|
||||
if 'values' not in event['data']:
|
||||
return
|
||||
|
||||
values = event['data']['values']
|
||||
projects_to_update = []
|
||||
for project_id, update_bool in values.items():
|
||||
if not update_bool:
|
||||
continue
|
||||
|
||||
project = session.query(
|
||||
'Project where id is "{}"'.format(project_id)
|
||||
).one()
|
||||
projects_to_update.append(project)
|
||||
|
||||
if not projects_to_update:
|
||||
self.log.debug('Nothing to update')
|
||||
return {
|
||||
'success': True,
|
||||
'message': 'Nothing to update'
|
||||
}
|
||||
|
||||
self.security_roles = {}
|
||||
self.to_process = {}
|
||||
# self.curent_default_values = {}
|
||||
existing_attrs = session.query('CustomAttributeConfiguration').all()
|
||||
self.prepare_custom_attributes(existing_attrs)
|
||||
|
||||
self.projects_data = {}
|
||||
for project in projects_to_update:
|
||||
self.process_data(project)
|
||||
|
||||
return True
|
||||
|
||||
def process_data(self, entity):
|
||||
cust_attrs = entity.get('custom_attributes')
|
||||
if not cust_attrs:
|
||||
return
|
||||
for dst_key, src_key in self.to_process.items():
|
||||
if src_key in cust_attrs:
|
||||
value = cust_attrs[src_key]
|
||||
entity['custom_attributes'][dst_key] = value
|
||||
self.session.commit()
|
||||
|
||||
for child in entity.get('children', []):
|
||||
self.process_data(child)
|
||||
|
||||
def prepare_custom_attributes(self, existing_attrs):
|
||||
to_process = {}
|
||||
to_create = []
|
||||
all_keys = {attr['key']: attr for attr in existing_attrs}
|
||||
for key in self.hierarchical_ca:
|
||||
if key not in all_keys:
|
||||
self.log.debug(
|
||||
'Custom attribute "{}" does not exist at all'.format(key)
|
||||
)
|
||||
to_create.append(key)
|
||||
if key in self.hierarchical_alternatives:
|
||||
alt_key = self.hierarchical_alternatives[key]
|
||||
if alt_key in all_keys:
|
||||
self.log.debug((
|
||||
'Custom attribute "{}" will use values from "{}"'
|
||||
).format(key, alt_key))
|
||||
|
||||
to_process[key] = alt_key
|
||||
|
||||
obj = all_keys[alt_key]
|
||||
# if alt_key not in self.curent_default_values:
|
||||
# self.curent_default_values[alt_key] = obj['default']
|
||||
obj['default'] = None
|
||||
self.session.commit()
|
||||
|
||||
else:
|
||||
obj = all_keys[key]
|
||||
new_key = key + '_old'
|
||||
|
||||
if obj['is_hierarchical']:
|
||||
if new_key not in all_keys:
|
||||
self.log.info((
|
||||
'Custom attribute "{}" is already hierarchical'
|
||||
' and the old one was not found'
|
||||
).format(key)
|
||||
)
|
||||
continue
|
||||
|
||||
to_process[key] = new_key
|
||||
continue
|
||||
|
||||
# default_value = obj['default']
|
||||
# if new_key not in self.curent_default_values:
|
||||
# self.curent_default_values[new_key] = default_value
|
||||
|
||||
obj['key'] = new_key
|
||||
obj['label'] = obj['label'] + '(old)'
|
||||
obj['default'] = None
|
||||
|
||||
self.session.commit()
|
||||
|
||||
to_create.append(key)
|
||||
to_process[key] = new_key
|
||||
|
||||
self.to_process = to_process
|
||||
for key in to_create:
|
||||
data = {
|
||||
'key': key,
|
||||
'entity_type': 'show',
|
||||
'is_hierarchical': True,
|
||||
'default': None
|
||||
}
|
||||
for _key, _value in self.data_ca.get(key, {}).items():
|
||||
if _key == 'type':
|
||||
_value = self.session.query((
|
||||
'CustomAttributeType where name is "{}"'
|
||||
).format(_value)).first()
|
||||
|
||||
data[_key] = _value
|
||||
|
||||
avalon_group = self.session.query(
|
||||
'CustomAttributeGroup where name is "avalon"'
|
||||
).first()
|
||||
if avalon_group:
|
||||
data['group'] = avalon_group
|
||||
|
||||
read_roles = self.get_security_role(self.read_roles)
|
||||
write_roles = self.get_security_role(self.write_roles)
|
||||
data['read_security_roles'] = read_roles
|
||||
data['write_security_roles'] = write_roles
|
||||
|
||||
self.session.create('CustomAttributeConfiguration', data)
|
||||
self.session.commit()
|
||||
|
||||
# def return_back_defaults(self):
|
||||
# existing_attrs = self.session.query(
|
||||
# 'CustomAttributeConfiguration'
|
||||
# ).all()
|
||||
#
|
||||
# for attr_key, default in self.curent_default_values.items():
|
||||
# for attr in existing_attrs:
|
||||
# if attr['key'] != attr_key:
|
||||
# continue
|
||||
# attr['default'] = default
|
||||
# self.session.commit()
|
||||
# break
|
||||
|
||||
def get_security_role(self, security_roles):
|
||||
roles = []
|
||||
if len(security_roles) == 0 or security_roles[0] == 'ALL':
|
||||
roles = self.get_role_ALL()
|
||||
elif security_roles[0] == 'except':
|
||||
excepts = security_roles[1:]
|
||||
all_roles = self.get_role_ALL()
|
||||
for role in all_roles:
|
||||
if role['name'] not in excepts:
|
||||
roles.append(role)
|
||||
if role['name'] not in self.security_roles:
|
||||
self.security_roles[role['name']] = role
|
||||
else:
|
||||
for role_name in security_roles:
|
||||
if role_name in self.security_roles:
|
||||
roles.append(self.security_roles[role_name])
|
||||
continue
|
||||
|
||||
try:
|
||||
query = 'SecurityRole where name is "{}"'.format(role_name)
|
||||
role = self.session.query(query).one()
|
||||
self.security_roles[role_name] = role
|
||||
roles.append(role)
|
||||
except Exception:
|
||||
self.log.warning(
|
||||
'Security role "{}" does not exist'.format(role_name)
|
||||
)
|
||||
continue
|
||||
|
||||
return roles
|
||||
|
||||
def get_role_ALL(self):
|
||||
role_name = 'ALL'
|
||||
if role_name in self.security_roles:
|
||||
all_roles = self.security_roles[role_name]
|
||||
else:
|
||||
all_roles = self.session.query('SecurityRole').all()
|
||||
self.security_roles[role_name] = all_roles
|
||||
for role in all_roles:
|
||||
if role['name'] not in self.security_roles:
|
||||
self.security_roles[role['name']] = role
|
||||
return all_roles
|
||||
|
||||
|
||||
def register(session, **kw):
|
||||
'''Register plugin. Called when used as a plugin.'''
|
||||
|
||||
if not isinstance(session, ftrack_api.session.Session):
|
||||
return
|
||||
|
||||
CustomAttributeDoctor(session).register()
|
||||
|
||||
|
||||
def main(arguments=None):
|
||||
'''Set up logging and register action.'''
|
||||
if arguments is None:
|
||||
arguments = []
|
||||
|
||||
parser = argparse.ArgumentParser()
|
||||
# Allow setting of logging level from arguments.
|
||||
loggingLevels = {}
|
||||
for level in (
|
||||
logging.NOTSET, logging.DEBUG, logging.INFO, logging.WARNING,
|
||||
logging.ERROR, logging.CRITICAL
|
||||
):
|
||||
loggingLevels[logging.getLevelName(level).lower()] = level
|
||||
|
||||
parser.add_argument(
|
||||
'-v', '--verbosity',
|
||||
help='Set the logging output verbosity.',
|
||||
choices=loggingLevels.keys(),
|
||||
default='info'
|
||||
)
|
||||
namespace = parser.parse_args(arguments)
|
||||
|
||||
# Set up basic logging
|
||||
logging.basicConfig(level=loggingLevels[namespace.verbosity])
|
||||
|
||||
session = ftrack_api.Session()
|
||||
register(session)
|
||||
|
||||
# Wait for events
|
||||
logging.info(
|
||||
'Registered actions and listening for events. Use Ctrl-C to abort.'
|
||||
)
|
||||
session.event_hub.wait()
|
||||
|
||||
|
||||
if __name__ == '__main__':
|
||||
raise SystemExit(main(sys.argv[1:]))
|
||||
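Every action's `main()` above builds the same name-to-level mapping for the `--verbosity` argument. The pattern, sketched as a standalone helper (the function name is illustrative):

```python
import logging


def build_logging_levels():
    # Map lower-cased level names ("debug", "info", ...) to the
    # logging module constants, exactly as each main() does before
    # passing the keys to argparse as choices.
    levels = {}
    for level in (
        logging.NOTSET, logging.DEBUG, logging.INFO, logging.WARNING,
        logging.ERROR, logging.CRITICAL
    ):
        levels[logging.getLevelName(level).lower()] = level
    return levels
```

Deriving the names from `logging.getLevelName()` keeps the CLI choices in sync with the stdlib constants instead of hard-coding strings.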
363
pype/ftrack/actions/action_delete_asset.py
Normal file
|
|
@ -0,0 +1,363 @@
|
|||
import os
|
||||
import sys
|
||||
import logging
|
||||
from bson.objectid import ObjectId
|
||||
import argparse
|
||||
from pype.vendor import ftrack_api
|
||||
from pype.ftrack import BaseAction
|
||||
from avalon.tools.libraryloader.io_nonsingleton import DbConnector
|
||||
|
||||
|
||||
class DeleteAsset(BaseAction):
|
||||
'''Delete asset/subsets action.'''
|
||||
|
||||
#: Action identifier.
|
||||
identifier = 'delete.asset'
|
||||
#: Action label.
|
||||
label = 'Delete Asset/Subsets'
|
||||
#: Action description.
|
||||
description = 'Removes asset with all children from Avalon and the asset from Ftrack'
|
||||
icon = '{}/ftrack/action_icons/DeleteAsset.svg'.format(
|
||||
os.environ.get('PYPE_STATICS_SERVER', '')
|
||||
)
|
||||
#: roles that are allowed to register this action
|
||||
role_list = ['Pypeclub', 'Administrator']
|
||||
#: Db
|
||||
db = DbConnector()
|
||||
|
||||
value = None
|
||||
|
||||
def discover(self, session, entities, event):
|
||||
''' Validation '''
|
||||
if len(entities) != 1:
|
||||
return False
|
||||
|
||||
valid = ["task"]
|
||||
entityType = event["data"]["selection"][0].get("entityType", "")
|
||||
if entityType.lower() not in valid:
|
||||
return False
|
||||
|
||||
return True
|
||||
|
||||
def _launch(self, event):
|
||||
self.reset_session()
|
||||
try:
|
||||
self.db.install()
|
||||
args = self._translate_event(
|
||||
self.session, event
|
||||
)
|
||||
|
||||
interface = self._interface(
|
||||
self.session, *args
|
||||
)
|
||||
|
||||
confirmation = self.confirm_delete(
|
||||
True, *args
|
||||
)
|
||||
|
||||
if interface:
|
||||
return interface
|
||||
|
||||
if confirmation:
|
||||
return confirmation
|
||||
|
||||
response = self.launch(
|
||||
self.session, *args
|
||||
)
|
||||
finally:
|
||||
self.db.uninstall()
|
||||
|
||||
return self._handle_result(
|
||||
self.session, response, *args
|
||||
)
|
||||
|
||||
def interface(self, session, entities, event):
|
||||
if not event['data'].get('values', {}):
|
||||
self.attempt = 1
|
||||
items = []
|
||||
entity = entities[0]
|
||||
title = 'Choose items to delete from "{}"'.format(entity['name'])
|
||||
project = entity['project']
|
||||
|
||||
self.db.Session['AVALON_PROJECT'] = project["full_name"]
|
||||
|
||||
av_entity = self.db.find_one({
|
||||
'type': 'asset',
|
||||
'name': entity['name']
|
||||
})
|
||||
|
||||
if av_entity is None:
|
||||
return {
|
||||
'success': False,
|
||||
'message': 'Didn\'t find asset in Avalon'
|
||||
}
|
||||
|
||||
asset_label = {
|
||||
'type': 'label',
|
||||
'value': '## Delete whole asset: ##'
|
||||
}
|
||||
asset_item = {
|
||||
'label': av_entity['name'],
|
||||
'name': 'whole_asset',
|
||||
'type': 'boolean',
|
||||
'value': False
|
||||
}
|
||||
splitter = {
|
||||
'type': 'label',
|
||||
'value': '{}'.format(200*"-")
|
||||
}
|
||||
subset_label = {
|
||||
'type': 'label',
|
||||
'value': '## Subsets: ##'
|
||||
}
|
||||
if av_entity is not None:
|
||||
items.append(asset_label)
|
||||
items.append(asset_item)
|
||||
items.append(splitter)
|
||||
|
||||
all_subsets = self.db.find({
|
||||
'type': 'subset',
|
||||
'parent': av_entity['_id']
|
||||
})
|
||||
|
||||
subset_items = []
|
||||
for subset in all_subsets:
|
||||
item = {
|
||||
'label': subset['name'],
|
||||
'name': str(subset['_id']),
|
||||
'type': 'boolean',
|
||||
'value': False
|
||||
}
|
||||
subset_items.append(item)
|
||||
if len(subset_items) > 0:
|
||||
items.append(subset_label)
|
||||
items.extend(subset_items)
|
||||
else:
|
||||
return {
|
||||
'success': False,
|
||||
'message': 'Didn\'t find asset in Avalon'
|
||||
}
|
||||
|
||||
return {
|
||||
'items': items,
|
||||
'title': title
|
||||
}
|
||||
|
||||
def confirm_delete(self, first_attempt, entities, event):
|
||||
if first_attempt is True:
|
||||
if 'values' not in event['data']:
|
||||
return
|
||||
|
||||
values = event['data']['values']
|
||||
|
||||
if len(values) <= 0:
|
||||
return
|
||||
if 'whole_asset' not in values:
|
||||
return
|
||||
else:
|
||||
values = self.values
|
||||
|
||||
title = 'Confirmation of deleting {}'
|
||||
if values['whole_asset'] is True:
|
||||
title = title.format(
|
||||
'whole asset {}'.format(
|
||||
entities[0]['name']
|
||||
)
|
||||
)
|
||||
else:
|
||||
subsets = []
|
||||
for key, value in values.items():
|
||||
if value is True:
|
||||
subsets.append(key)
|
||||
len_subsets = len(subsets)
|
||||
if len_subsets == 0:
|
||||
return {
|
||||
'success': True,
|
||||
'message': 'Nothing was selected to delete'
|
||||
}
|
||||
elif len_subsets == 1:
|
||||
title = title.format(
|
||||
'{} subset'.format(len_subsets)
|
||||
)
|
||||
else:
|
||||
title = title.format(
|
||||
'{} subsets'.format(len_subsets)
|
||||
)
|
||||
|
||||
self.values = values
|
||||
items = []
|
||||
|
||||
delete_label = {
|
||||
'type': 'label',
|
||||
'value': '# Please enter "DELETE" to confirm #'
|
||||
}
|
||||
|
||||
delete_item = {
|
||||
'name': 'delete_key',
|
||||
'type': 'text',
|
||||
'value': '',
|
||||
'empty_text': 'Type Delete here...'
|
||||
}
|
||||
items.append(delete_label)
|
||||
items.append(delete_item)
|
||||
|
||||
return {
|
||||
'items': items,
|
||||
'title': title
|
||||
}
|
||||
|
||||
def launch(self, session, entities, event):
|
||||
if 'values' not in event['data']:
|
||||
return
|
||||
|
||||
values = event['data']['values']
|
||||
if len(values) <= 0:
|
||||
return
|
||||
if 'delete_key' not in values:
|
||||
return
|
||||
|
||||
if values['delete_key'].lower() != 'delete':
|
||||
if values['delete_key'].lower() == '':
|
||||
return {
|
||||
'success': False,
|
||||
'message': 'Deleting cancelled'
|
||||
}
|
||||
if self.attempt < 3:
|
||||
self.attempt += 1
|
||||
return_dict = self.confirm_delete(False, entities, event)
|
||||
return_dict['title'] = '{} ({} attempt)'.format(
|
||||
return_dict['title'], self.attempt
|
||||
)
|
||||
return return_dict
|
||||
return {
|
||||
'success': False,
|
||||
'message': 'You didn\'t enter "DELETE" properly 3 times!'
|
||||
}
|
||||
|
||||
entity = entities[0]
|
||||
project = entity['project']
|
||||
|
||||
self.db.Session['AVALON_PROJECT'] = project["full_name"]
|
||||
|
||||
all_ids = []
|
||||
if self.values.get('whole_asset', False) is True:
|
||||
av_entity = self.db.find_one({
|
||||
'type': 'asset',
|
||||
'name': entity['name']
|
||||
})
|
||||
|
||||
if av_entity is not None:
|
||||
all_ids.append(av_entity['_id'])
|
||||
all_ids.extend(self.find_child(av_entity))
|
||||
|
||||
session.delete(entity)
|
||||
session.commit()
|
||||
else:
|
||||
subset_names = []
|
||||
for key, value in self.values.items():
|
||||
if key == 'delete_key' or value is False:
|
||||
continue
|
||||
|
||||
entity_id = ObjectId(key)
|
||||
av_entity = self.db.find_one({'_id': entity_id})
|
||||
if av_entity is None:
|
||||
continue
|
||||
subset_names.append(av_entity['name'])
|
||||
all_ids.append(entity_id)
|
||||
all_ids.extend(self.find_child(av_entity))
|
||||
|
||||
for ft_asset in entity['assets']:
|
||||
if ft_asset['name'] in subset_names:
|
||||
session.delete(ft_asset)
|
||||
session.commit()
|
||||
|
||||
if len(all_ids) == 0:
|
||||
return {
|
||||
'success': True,
|
||||
'message': 'No entities to delete in avalon'
|
||||
}
|
||||
|
||||
or_subquery = []
|
||||
for id in all_ids:
|
||||
or_subquery.append({'_id': id})
|
||||
delete_query = {'$or': or_subquery}
|
||||
self.db.delete_many(delete_query)
|
||||
|
||||
return {
|
||||
'success': True,
|
||||
'message': 'All assets were deleted!'
|
||||
}
|
||||
|
||||
def find_child(self, entity):
|
||||
output = []
|
||||
id = entity['_id']
|
||||
visuals = [x for x in self.db.find({'data.visualParent': id})]
|
||||
assert len(visuals) == 0, 'This asset has another asset as a child'
|
||||
childs = self.db.find({'parent': id})
|
||||
for child in childs:
|
||||
output.append(child['_id'])
|
||||
output.extend(self.find_child(child))
|
||||
return output
|
||||
|
||||
def find_assets(self, asset_names):
|
||||
assets = []
|
||||
for name in asset_names:
|
||||
entity = self.db.find_one({
|
||||
'type': 'asset',
|
||||
'name': name
|
||||
})
|
||||
if entity is not None and entity not in assets:
|
||||
assets.append(entity)
|
||||
return assets
|
||||
|
||||
|
||||
def register(session, **kw):
|
||||
'''Register plugin. Called when used as a plugin.'''
|
||||
|
||||
# Validate that session is an instance of ftrack_api.Session. If not,
|
||||
# assume that register is being called from an old or incompatible API and
|
||||
# return without doing anything.
|
||||
if not isinstance(session, ftrack_api.session.Session):
|
||||
return
|
||||
|
||||
DeleteAsset(session).register()
|
||||
|
||||
|
||||
def main(arguments=None):
|
||||
'''Set up logging and register action.'''
|
||||
if arguments is None:
|
||||
arguments = []
|
||||
|
||||
parser = argparse.ArgumentParser()
|
||||
# Allow setting of logging level from arguments.
|
||||
loggingLevels = {}
|
||||
for level in (
|
||||
logging.NOTSET, logging.DEBUG, logging.INFO, logging.WARNING,
|
||||
logging.ERROR, logging.CRITICAL
|
||||
):
|
||||
loggingLevels[logging.getLevelName(level).lower()] = level
|
||||
|
||||
parser.add_argument(
|
||||
'-v', '--verbosity',
|
||||
help='Set the logging output verbosity.',
|
||||
choices=loggingLevels.keys(),
|
||||
default='info'
|
||||
)
|
||||
namespace = parser.parse_args(arguments)
|
||||
|
||||
# Set up basic logging
|
||||
logging.basicConfig(level=loggingLevels[namespace.verbosity])
|
||||
|
||||
session = ftrack_api.Session()
|
||||
|
||||
register(session)
|
||||
|
||||
# Wait for events
|
||||
logging.info(
|
||||
'Registered actions and listening for events. Use Ctrl-C to abort.'
|
||||
)
|
||||
session.event_hub.wait()
|
||||
|
||||
|
||||
if __name__ == '__main__':
|
||||
raise SystemExit(main(sys.argv[1:]))
|
||||
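Both delete actions above collect Avalon ids and turn them into a single filter for `delete_many()`. A minimal sketch of that query construction (the helper name is illustrative; a `{'_id': {'$in': all_ids}}` filter would be an equivalent Mongo formulation):

```python
def build_delete_query(all_ids):
    # Build the filter the actions pass to db.delete_many():
    # match any document whose _id is in the collected list.
    return {'$or': [{'_id': _id} for _id in all_ids]}
```

Deleting all documents in one query avoids a round trip per asset or subset.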
183
pype/ftrack/actions/action_delete_asset_byname.py
Normal file
|
|
@ -0,0 +1,183 @@
|
|||
import os
|
||||
import sys
|
||||
import logging
|
||||
import argparse
|
||||
from pype.vendor import ftrack_api
|
||||
from pype.ftrack import BaseAction
|
||||
from avalon.tools.libraryloader.io_nonsingleton import DbConnector
|
||||
|
||||
|
||||
class AssetsRemover(BaseAction):
|
||||
'''Delete assets by name action.'''
|
||||
|
||||
#: Action identifier.
|
||||
identifier = 'remove.assets'
|
||||
#: Action label.
|
||||
label = 'Delete Assets by Name'
|
||||
#: Action description.
|
||||
description = 'Removes assets with all children from Ftrack and the Avalon db'
|
||||
#: roles that are allowed to register this action
|
||||
role_list = ['Pypeclub', 'Administrator']
|
||||
icon = '{}/ftrack/action_icons/AssetsRemover.svg'.format(
|
||||
os.environ.get('PYPE_STATICS_SERVER', '')
|
||||
)
|
||||
#: Db
|
||||
db = DbConnector()
|
||||
|
||||
def discover(self, session, entities, event):
|
||||
''' Validation '''
|
||||
if len(entities) != 1:
|
||||
return False
|
||||
|
||||
valid = ["show", "task"]
|
||||
entityType = event["data"]["selection"][0].get("entityType", "")
|
||||
if entityType.lower() not in valid:
|
||||
return False
|
||||
|
||||
return True
|
||||
|
||||
def interface(self, session, entities, event):
|
||||
if not event['data'].get('values', {}):
|
||||
title = 'Enter Asset names to delete'
|
||||
|
||||
items = []
|
||||
for i in range(15):
|
||||
|
||||
item = {
|
||||
'label': 'Asset {}'.format(i+1),
|
||||
'name': 'asset_{}'.format(i+1),
|
||||
'type': 'text',
|
||||
'value': ''
|
||||
}
|
||||
items.append(item)
|
||||
|
||||
return {
|
||||
'items': items,
|
||||
'title': title
|
||||
}
|
||||
|
||||
def launch(self, session, entities, event):
|
||||
entity = entities[0]
|
||||
if entity.entity_type.lower() != 'project':
|
||||
project = entity['project']
|
||||
else:
|
||||
project = entity
|
||||
|
||||
if 'values' not in event['data']:
|
||||
return
|
||||
|
||||
values = event['data']['values']
|
||||
if len(values) <= 0:
|
||||
return {
|
||||
'success': True,
|
||||
'message': 'No Assets to delete!'
|
||||
}
|
||||
|
||||
asset_names = []
|
||||
|
||||
for k, v in values.items():
|
||||
if v.replace(' ', '') != '':
|
||||
asset_names.append(v)
|
||||
|
||||
self.db.install()
|
||||
self.db.Session['AVALON_PROJECT'] = project["full_name"]
|
||||
|
||||
assets = self.find_assets(asset_names)
|
||||
|
||||
all_ids = []
|
||||
for asset in assets:
|
||||
all_ids.append(asset['_id'])
|
||||
all_ids.extend(self.find_child(asset))
|
||||
|
||||
if len(all_ids) == 0:
|
||||
self.db.uninstall()
|
||||
return {
|
||||
'success': True,
|
||||
'message': 'No matching assets found in Avalon'
|
||||
}
|
||||
|
||||
or_subquery = []
|
||||
for id in all_ids:
|
||||
or_subquery.append({'_id': id})
|
||||
delete_query = {'$or': or_subquery}
|
||||
self.db.delete_many(delete_query)
|
||||
|
||||
self.db.uninstall()
|
||||
return {
|
||||
'success': True,
|
||||
'message': 'All assets were deleted!'
|
||||
}
|
||||
|
||||
def find_child(self, entity):
|
||||
output = []
|
||||
id = entity['_id']
|
||||
visuals = [x for x in self.db.find({'data.visualParent': id})]
|
||||
assert len(visuals) == 0, 'This asset has another asset as a child'
|
||||
childs = self.db.find({'parent': id})
|
||||
for child in childs:
|
||||
output.append(child['_id'])
|
||||
output.extend(self.find_child(child))
|
||||
return output
|
||||
|
||||
def find_assets(self, asset_names):
|
||||
assets = []
|
||||
for name in asset_names:
|
||||
entity = self.db.find_one({
|
||||
'type': 'asset',
|
||||
'name': name
|
||||
})
|
||||
if entity is not None and entity not in assets:
|
||||
assets.append(entity)
|
||||
return assets
|
||||
|
||||
|
||||
def register(session, **kw):
|
||||
'''Register plugin. Called when used as a plugin.'''
|
||||
|
||||
# Validate that session is an instance of ftrack_api.Session. If not,
|
||||
# assume that register is being called from an old or incompatible API and
|
||||
# return without doing anything.
|
||||
if not isinstance(session, ftrack_api.session.Session):
|
||||
return
|
||||
|
||||
AssetsRemover(session).register()
|
||||
|
||||
|
||||
def main(arguments=None):
|
||||
'''Set up logging and register action.'''
|
||||
if arguments is None:
|
||||
arguments = []
|
||||
|
||||
parser = argparse.ArgumentParser()
|
||||
# Allow setting of logging level from arguments.
|
||||
loggingLevels = {}
|
||||
for level in (
|
||||
logging.NOTSET, logging.DEBUG, logging.INFO, logging.WARNING,
|
||||
logging.ERROR, logging.CRITICAL
|
||||
):
|
||||
loggingLevels[logging.getLevelName(level).lower()] = level
|
||||
|
||||
parser.add_argument(
|
||||
'-v', '--verbosity',
|
||||
help='Set the logging output verbosity.',
|
||||
choices=loggingLevels.keys(),
|
||||
default='info'
|
||||
)
|
||||
namespace = parser.parse_args(arguments)
|
||||
|
||||
# Set up basic logging
|
||||
logging.basicConfig(level=loggingLevels[namespace.verbosity])
|
||||
|
||||
session = ftrack_api.Session()
|
||||
|
||||
register(session)
|
||||
|
||||
# Wait for events
|
||||
logging.info(
|
||||
'Registered actions and listening for events. Use Ctrl-C to abort.'
|
||||
)
|
||||
session.event_hub.wait()
|
||||
|
||||
|
||||
if __name__ == '__main__':
|
||||
raise SystemExit(main(sys.argv[1:]))
|
||||
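The `find_child()` method used by both delete actions walks `db.find({'parent': id})` recursively to collect every descendant id. The same depth-first traversal, sketched against a plain in-memory document list instead of a live DbConnector (the helper name is illustrative):

```python
def find_children(docs, parent_id):
    # Depth-first collection of descendant ids, mirroring the
    # actions' find_child(): take every document whose 'parent'
    # points at parent_id, then recurse into each match.
    output = []
    for doc in docs:
        if doc.get('parent') != parent_id:
            continue
        output.append(doc['_id'])
        output.extend(find_children(docs, doc['_id']))
    return output
```

Collecting ids first and deleting them in one query afterwards is what lets the actions remove a whole subtree atomically-ish instead of deleting while iterating.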
|
|
@ -1,9 +1,8 @@
|
|||
import sys
|
||||
import argparse
|
||||
import logging
|
||||
import getpass
|
||||
import ftrack_api
|
||||
from ftrack_action_handler import BaseAction
|
||||
from pype.vendor import ftrack_api
|
||||
from pype.ftrack import BaseAction
|
||||
|
||||
|
||||
class VersionsCleanup(BaseAction):
|
||||
|
|
@ -14,7 +13,6 @@ class VersionsCleanup(BaseAction):
|
|||
# Action label
|
||||
label = 'Versions cleanup'
|
||||
|
||||
|
||||
def discover(self, session, entities, event):
|
||||
''' Validation '''
|
||||
|
||||
|
|
@ -34,13 +32,13 @@ class VersionsCleanup(BaseAction):
|
|||
session.delete(version)
|
||||
try:
|
||||
session.commit()
|
||||
except:
|
||||
except Exception:
|
||||
session.rollback()
|
||||
raise
|
||||
|
||||
return {
|
||||
'success': True,
|
||||
'message': 'removed hidden versions'
|
||||
'message': 'Hidden versions were removed'
|
||||
}
|
||||
|
||||
|
||||
|
|
@ -53,8 +51,7 @@ def register(session, **kw):
|
|||
if not isinstance(session, ftrack_api.session.Session):
|
||||
return
|
||||
|
||||
action_handler = VersionsCleanup(session)
|
||||
action_handler.register()
|
||||
VersionsCleanup(session).register()
|
||||
|
||||
|
||||
def main(arguments=None):
|
||||
|
|
|
|||
266
pype/ftrack/actions/action_djvview.py
Normal file
|
|
@ -0,0 +1,266 @@
|
|||
import os
|
||||
import sys
|
||||
import json
|
||||
import logging
|
||||
import subprocess
|
||||
from operator import itemgetter
|
||||
from pype.vendor import ftrack_api
|
||||
from pype.ftrack import BaseAction
|
||||
from pypeapp import Logger, config
|
||||
|
||||
log = Logger().get_logger(__name__)
|
||||
|
||||
|
||||
class DJVViewAction(BaseAction):
|
||||
"""Launch DJVView action."""
|
||||
identifier = "djvview-launch-action"
|
||||
label = "DJV View"
|
||||
description = "DJV View Launcher"
|
||||
icon = '{}/app_icons/djvView.png'.format(
|
||||
os.environ.get('PYPE_STATICS_SERVER', '')
|
||||
)
|
||||
type = 'Application'
|
||||
|
||||
def __init__(self, session):
|
||||
'''Expects a ftrack_api.Session instance'''
|
||||
super().__init__(session)
|
||||
self.djv_path = None
|
||||
|
||||
self.config_data = config.get_presets()['djv_view']['config']
|
||||
self.set_djv_path()
|
||||
|
||||
if self.djv_path is None:
|
||||
return
|
||||
|
||||
self.allowed_types = self.config_data.get(
|
||||
'file_ext', ["img", "mov", "exr"]
|
||||
)
|
||||
|
||||
def register(self):
|
||||
assert (self.djv_path is not None), (
|
||||
'DJV View is not installed'
|
||||
' or paths in presets are not set correctly'
|
||||
)
|
||||
super().register()
|
||||
|
||||
def discover(self, session, entities, event):
|
||||
"""Return available actions based on *event*. """
|
||||
selection = event["data"].get("selection", [])
|
||||
if len(selection) != 1:
|
||||
return False
|
||||
|
||||
entityType = selection[0].get("entityType", None)
|
||||
if entityType in ["assetversion", "task"]:
|
||||
return True
|
||||
return False
|
||||
|
||||
def set_djv_path(self):
|
||||
for path in self.config_data.get("djv_paths", []):
|
||||
if os.path.exists(path):
|
||||
self.djv_path = path
|
||||
break
|
||||
|
||||
def interface(self, session, entities, event):
|
||||
if event['data'].get('values', {}):
|
||||
return
|
||||
|
||||
entity = entities[0]
|
||||
versions = []
|
||||
|
||||
entity_type = entity.entity_type.lower()
|
||||
if entity_type == "assetversion":
|
||||
if (
|
||||
entity[
|
||||
'components'
|
||||
][0]['file_type'][1:] in self.allowed_types
|
||||
):
|
||||
versions.append(entity)
|
||||
else:
|
||||
master_entity = entity
|
||||
if entity_type == "task":
|
||||
master_entity = entity['parent']
|
||||
|
||||
for asset in master_entity['assets']:
|
||||
for version in asset['versions']:
|
||||
# Get only AssetVersion of selected task
|
||||
if (
|
||||
entity_type == "task" and
|
||||
version['task']['id'] != entity['id']
|
||||
):
|
||||
continue
|
||||
# Get only components with allowed type
|
||||
filetype = version['components'][0]['file_type']
|
||||
if filetype[1:] in self.allowed_types:
|
||||
versions.append(version)
|
||||
|
||||
if len(versions) < 1:
|
||||
return {
|
||||
'success': False,
|
||||
'message': 'There are no Asset Versions to open.'
|
||||
}
|
||||
|
||||
items = []
|
||||
base_label = "v{0} - {1} - {2}"
|
||||
default_component = self.config_data.get(
|
||||
'default_component', None
|
||||
)
|
||||
last_available = None
|
||||
select_value = None
|
||||
for version in versions:
|
||||
for component in version['components']:
|
||||
label = base_label.format(
|
||||
str(version['version']).zfill(3),
|
||||
version['asset']['type']['name'],
|
||||
component['name']
|
||||
)
|
||||
|
||||
try:
|
||||
location = component[
|
||||
'component_locations'
|
||||
][0]['location']
|
||||
                file_path = location.get_filesystem_path(component)
            except Exception:
                file_path = component[
                    'component_locations'
                ][0]['resource_identifier']

            if os.path.isdir(os.path.dirname(file_path)):
                last_available = file_path
                if component['name'] == default_component:
                    select_value = file_path
                items.append(
                    {'label': label, 'value': file_path}
                )

        if len(items) == 0:
            return {
                'success': False,
                'message': (
                    'There are no Asset Versions with accessible path.'
                )
            }

        item = {
            'label': 'Items to view',
            'type': 'enumerator',
            'name': 'path',
            'data': sorted(
                items,
                key=itemgetter('label'),
                reverse=True
            )
        }
        if select_value is not None:
            item['value'] = select_value
        else:
            item['value'] = last_available

        return {'items': [item]}

    def launch(self, session, entities, event):
        """Callback method for DJVView action."""

        # Launching application
        if "values" not in event["data"]:
            return
        filename = event['data']['values']['path']

        fps = entities[0].get('custom_attributes', {}).get('fps', None)

        cmd = []
        # DJV path
        cmd.append(os.path.normpath(self.djv_path))
        # DJV Options Start ##############################################
        # '''layer name'''
        # cmd.append('-file_layer (value)')
        # ''' Proxy scale: 1/2, 1/4, 1/8'''
        # cmd.append('-file_proxy 1/2')
        # ''' Cache: True, False.'''
        # cmd.append('-file_cache True')
        # ''' Start in full screen '''
        # cmd.append('-window_fullscreen')
        # ''' Toolbar controls: False, True.'''
        # cmd.append("-window_toolbar False")
        # ''' Window controls: False, True.'''
        # cmd.append("-window_playbar False")
        # ''' Grid overlay: None, 1x1, 10x10, 100x100.'''
        # cmd.append("-view_grid None")
        # ''' Heads up display: True, False.'''
        # cmd.append("-view_hud True")
        ''' Playback: Stop, Forward, Reverse.'''
        cmd.append("-playback Forward")
        # ''' Frame.'''
        # cmd.append("-playback_frame (value)")
        if fps is not None:
            cmd.append("-playback_speed {}".format(int(fps)))
        # ''' Timer: Sleep, Timeout. Value: Sleep.'''
        # cmd.append("-playback_timer (value)")
        # ''' Timer resolution (seconds): 0.001.'''
        # cmd.append("-playback_timer_resolution (value)")
        ''' Time units: Timecode, Frames.'''
        cmd.append("-time_units Frames")
        # DJV Options End ################################################

        # PATH TO COMPONENT
        cmd.append(os.path.normpath(filename))

        try:
            # Run DJV with these commands
            subprocess.Popen(' '.join(cmd))
        except FileNotFoundError:
            return {
                'success': False,
                'message': 'File "{}" was not found.'.format(
                    os.path.basename(filename)
                )
            }

        return True


def register(session):
    """Register hooks."""
    if not isinstance(session, ftrack_api.session.Session):
        return

    DJVViewAction(session).register()


def main(arguments=None):
    '''Set up logging and register action.'''
    if arguments is None:
        arguments = []

    import argparse
    parser = argparse.ArgumentParser()
    # Allow setting of logging level from arguments.
    loggingLevels = {}
    for level in (
        logging.NOTSET, logging.DEBUG, logging.INFO, logging.WARNING,
        logging.ERROR, logging.CRITICAL
    ):
        loggingLevels[logging.getLevelName(level).lower()] = level

    parser.add_argument(
        '-v', '--verbosity',
        help='Set the logging output verbosity.',
        choices=loggingLevels.keys(),
        default='info'
    )
    namespace = parser.parse_args(arguments)

    # Set up basic logging
    logging.basicConfig(level=loggingLevels[namespace.verbosity])

    session = ftrack_api.Session()
    register(session)

    # Wait for events
    logging.info(
        'Registered actions and listening for events. Use Ctrl-C to abort.'
    )
    session.event_hub.wait()


if __name__ == '__main__':
    raise SystemExit(main(sys.argv[1:]))
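The `launch` callback above assembles the DJV command line flag by flag before joining it into one string for `subprocess.Popen`. The assembly step is pure string handling and can be sketched in isolation; the function name below is illustrative and not part of the commit, and only the non-commented flags from the action are mirrored:

```python
def build_djv_cmd(djv_path, filename, fps=None):
    """Mirror the command-string assembly done in DJVViewAction.launch."""
    cmd = [djv_path, "-playback Forward"]
    if fps is not None:
        # The action casts fps to int before formatting.
        cmd.append("-playback_speed {}".format(int(fps)))
    cmd.append("-time_units Frames")
    cmd.append(filename)
    return " ".join(cmd)


print(build_djv_cmd("djv", "shot.mov", fps=25))
# -> djv -playback Forward -playback_speed 25 -time_units Frames shot.mov
```

Joining with spaces, as the action does, is fragile for paths containing spaces; passing the argument list to `subprocess.Popen` directly would avoid that.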
169
pype/ftrack/actions/action_job_killer.py
Normal file
@ -0,0 +1,169 @@
import os
import sys
import argparse
import logging
import json

from pype.vendor import ftrack_api
from pype.ftrack import BaseAction


class JobKiller(BaseAction):
    '''Kill selected running jobs action.'''

    #: Action identifier.
    identifier = 'job.killer'
    #: Action label.
    label = 'Job Killer'
    #: Action description.
    description = 'Kill selected running jobs'
    #: Roles that are allowed to register this action.
    role_list = ['Pypeclub', 'Administrator']
    icon = '{}/ftrack/action_icons/JobKiller.svg'.format(
        os.environ.get('PYPE_STATICS_SERVER', '')
    )

    def discover(self, session, entities, event):
        ''' Validation '''
        return True

    def interface(self, session, entities, event):
        if not event['data'].get('values', {}):
            title = 'Select jobs to kill'

            jobs = session.query(
                'select id, status from Job'
                ' where status in ("queued", "running")'
            ).all()

            items = []

            item_splitter = {'type': 'label', 'value': '---'}
            for job in jobs:
                try:
                    data = json.loads(job['data'])
                    description = data['description']
                except Exception:
                    description = '*No description*'
                user = job['user']['username']
                created = job['created_at'].strftime('%d.%m.%Y %H:%M:%S')
                label = '{} - {} - {}'.format(
                    description, created, user
                )
                item_label = {
                    'type': 'label',
                    'value': label
                }
                item = {
                    'name': job['id'],
                    'type': 'boolean',
                    'value': False
                }
                if len(items) > 0:
                    items.append(item_splitter)
                items.append(item_label)
                items.append(item)

            if len(items) == 0:
                return {
                    'success': False,
                    'message': 'Didn\'t find any running jobs'
                }
            else:
                return {
                    'items': items,
                    'title': title
                }

    def launch(self, session, entities, event):
        """Set the status of the selected jobs to failed."""
        if 'values' not in event['data']:
            return

        values = event['data']['values']
        if len(values) <= 0:
            return {
                'success': True,
                'message': 'No jobs to kill!'
            }
        jobs = []
        job_ids = []

        for k, v in values.items():
            if v is True:
                job_ids.append(k)

        for job_id in job_ids:
            query = 'Job where id is "{}"'.format(job_id)
            jobs.append(session.query(query).one())
        # Update all the queried jobs, setting the status to failed.
        for job in jobs:
            try:
                origin_status = job['status']
                job['status'] = 'failed'
                session.commit()
                self.log.debug((
                    'Changing Job ({}) status: {} -> failed'
                ).format(job['id'], origin_status))
            except Exception:
                self.log.warning((
                    'Changing Job ({}) status has failed'
                ).format(job['id']))

        self.log.info('All running jobs were killed successfully!')
        return {
            'success': True,
            'message': 'All running jobs were killed successfully!'
        }


def register(session, **kw):
    '''Register plugin. Called when used as a plugin.'''

    # Validate that session is an instance of ftrack_api.Session. If not,
    # assume that register is being called from an old or incompatible API and
    # return without doing anything.
    if not isinstance(session, ftrack_api.session.Session):
        return

    JobKiller(session).register()


def main(arguments=None):
    '''Set up logging and register action.'''
    if arguments is None:
        arguments = []

    parser = argparse.ArgumentParser()
    # Allow setting of logging level from arguments.
    loggingLevels = {}
    for level in (
        logging.NOTSET, logging.DEBUG, logging.INFO, logging.WARNING,
        logging.ERROR, logging.CRITICAL
    ):
        loggingLevels[logging.getLevelName(level).lower()] = level

    parser.add_argument(
        '-v', '--verbosity',
        help='Set the logging output verbosity.',
        choices=loggingLevels.keys(),
        default='info'
    )
    namespace = parser.parse_args(arguments)

    # Set up basic logging
    logging.basicConfig(level=loggingLevels[namespace.verbosity])

    session = ftrack_api.Session()

    register(session)

    # Wait for events
    logging.info(
        'Registered actions and listening for events. Use Ctrl-C to abort.'
    )
    session.event_hub.wait()


if __name__ == '__main__':
    raise SystemExit(main(sys.argv[1:]))
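The `launch` method above turns the submitted form values (a mapping of job id to checkbox state) into the list of job ids to kill. That selection step is pure Python and can be sketched in isolation; the function name is illustrative, not part of the commit:

```python
def selected_job_ids(values):
    """Return the ids whose checkbox was ticked in the action form."""
    return [job_id for job_id, checked in values.items() if checked is True]


# Example form payload as ftrack would deliver it in event['data']['values']
form_values = {"job-1": True, "job-2": False, "job-3": True}
print(selected_job_ids(form_values))  # -> ['job-1', 'job-3']
```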
@ -1,107 +0,0 @@
# :coding: utf-8
# :copyright: Copyright (c) 2017 ftrack
import sys
import argparse
import logging

import datetime
import ftrack_api
from ftrack_action_handler import BaseAction


class JobKiller(BaseAction):
    '''Edit meta data action.'''

    #: Action identifier.
    identifier = 'job.kill'
    #: Action label.
    label = 'Job Killer'
    #: Action description.
    description = 'Killing all running jobs younger than day'

    def discover(self, session, entities, event):
        ''' Validation '''
        return True

    def launch(self, session, entities, event):
        """ GET JOB """

        yesterday = datetime.date.today() - datetime.timedelta(days=1)

        jobs = session.query(
            'select id, status from Job '
            'where status in ("queued", "running") and created_at > {0}'.format(yesterday)
        )

        # Update all the queried jobs, setting the status to failed.
        for job in jobs:
            print(job['created_at'])
            print('Changing Job ({}) status: {} -> failed'.format(job['id'], job['status']))
            job['status'] = 'failed'

            try:
                session.commit()
            except:
                session.rollback()

        print('All running jobs were killed Successfully!')
        return {
            'success': True,
            'message': 'All running jobs were killed Successfully!'
        }


def register(session, **kw):
    '''Register plugin. Called when used as an plugin.'''

    # Validate that session is an instance of ftrack_api.Session. If not,
    # assume that register is being called from an old or incompatible API and
    # return without doing anything.
    if not isinstance(session, ftrack_api.session.Session):
        return

    action_handler = JobKiller(session)
    action_handler.register()


def main(arguments=None):
    '''Set up logging and register action.'''
    if arguments is None:
        arguments = []

    parser = argparse.ArgumentParser()
    # Allow setting of logging level from arguments.
    loggingLevels = {}
    for level in (
        logging.NOTSET, logging.DEBUG, logging.INFO, logging.WARNING,
        logging.ERROR, logging.CRITICAL
    ):
        loggingLevels[logging.getLevelName(level).lower()] = level

    parser.add_argument(
        '-v', '--verbosity',
        help='Set the logging output verbosity.',
        choices=loggingLevels.keys(),
        default='info'
    )
    namespace = parser.parse_args(arguments)

    # Set up basic logging
    logging.basicConfig(level=loggingLevels[namespace.verbosity])

    session = ftrack_api.Session()

    register(session)

    # Wait for events
    logging.info(
        'Registered actions and listening for events. Use Ctrl-C to abort.'
    )
    session.event_hub.wait()


if __name__ == '__main__':
    raise SystemExit(main(sys.argv[1:]))
160
pype/ftrack/actions/action_multiple_notes.py
Normal file
@ -0,0 +1,160 @@
import os
import sys
import argparse
import logging
from pype.vendor import ftrack_api

from pype.ftrack import BaseAction


class MultipleNotes(BaseAction):
    '''Add the same note to multiple Asset Versions.'''

    #: Action identifier.
    identifier = 'multiple.notes'
    #: Action label.
    label = 'Multiple Notes'
    #: Action description.
    description = 'Add same note to multiple Asset Versions'
    icon = '{}/ftrack/action_icons/MultipleNotes.svg'.format(
        os.environ.get('PYPE_STATICS_SERVER', '')
    )

    def discover(self, session, entities, event):
        ''' Validation '''
        valid = True
        for entity in entities:
            if entity.entity_type.lower() != 'assetversion':
                valid = False
                break
        return valid

    def interface(self, session, entities, event):
        if not event['data'].get('values', {}):
            note_label = {
                'type': 'label',
                'value': '# Enter note: #'
            }

            note_value = {
                'name': 'note',
                'type': 'textarea'
            }

            category_label = {
                'type': 'label',
                'value': '## Category: ##'
            }

            category_data = []
            category_data.append({
                'label': '- None -',
                'value': 'none'
            })
            all_categories = session.query('NoteCategory').all()
            for cat in all_categories:
                category_data.append({
                    'label': cat['name'],
                    'value': cat['id']
                })
            category_value = {
                'type': 'enumerator',
                'name': 'category',
                'data': category_data,
                'value': 'none'
            }

            splitter = {
                'type': 'label',
                'value': '{}'.format(200 * "-")
            }

            items = []
            items.append(note_label)
            items.append(note_value)
            items.append(splitter)
            items.append(category_label)
            items.append(category_value)
            return items

    def launch(self, session, entities, event):
        if 'values' not in event['data']:
            return

        values = event['data']['values']
        if len(values) <= 0 or 'note' not in values:
            return False
        # Get note text
        note_value = values['note']
        if note_value.lower().strip() == '':
            return False
        # Get user
        user = session.query(
            'User where username is "{}"'.format(session.api_user)
        ).one()
        # Base note data
        note_data = {
            'content': note_value,
            'author': user
        }
        # Get category
        category_value = values['category']
        if category_value != 'none':
            category = session.query(
                'NoteCategory where id is "{}"'.format(category_value)
            ).one()
            note_data['category'] = category
        # Create notes for entities
        for entity in entities:
            new_note = session.create('Note', note_data)
            entity['notes'].append(new_note)
            session.commit()
        return True


def register(session, **kw):
    '''Register plugin. Called when used as a plugin.'''

    if not isinstance(session, ftrack_api.session.Session):
        return

    MultipleNotes(session).register()


def main(arguments=None):
    '''Set up logging and register action.'''
    if arguments is None:
        arguments = []

    parser = argparse.ArgumentParser()
    # Allow setting of logging level from arguments.
    loggingLevels = {}
    for level in (
        logging.NOTSET, logging.DEBUG, logging.INFO, logging.WARNING,
        logging.ERROR, logging.CRITICAL
    ):
        loggingLevels[logging.getLevelName(level).lower()] = level

    parser.add_argument(
        '-v', '--verbosity',
        help='Set the logging output verbosity.',
        choices=loggingLevels.keys(),
        default='info'
    )
    namespace = parser.parse_args(arguments)

    # Set up basic logging
    logging.basicConfig(level=loggingLevels[namespace.verbosity])

    session = ftrack_api.Session()
    register(session)

    # Wait for events
    logging.info(
        'Registered actions and listening for events. Use Ctrl-C to abort.'
    )
    session.event_hub.wait()


if __name__ == '__main__':
    raise SystemExit(main(sys.argv[1:]))
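The `interface` method above builds the category dropdown by prepending a "- None -" entry to the queried `NoteCategory` entities. That construction is easy to isolate; in the sketch below plain dicts stand in for the ftrack entities (only their `name` and `id` keys are read), and the function name is illustrative:

```python
def build_category_data(categories):
    """Build enumerator items for the note-category dropdown."""
    data = [{"label": "- None -", "value": "none"}]
    for cat in categories:
        data.append({"label": cat["name"], "value": cat["id"]})
    return data


print(build_category_data([{"name": "Client", "id": "c1"}]))
# -> [{'label': '- None -', 'value': 'none'}, {'label': 'Client', 'value': 'c1'}]
```

The sentinel value `'none'` is what `launch` later checks to decide whether to attach a category to the created note.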
374
pype/ftrack/actions/action_rv.py
Normal file
@ -0,0 +1,374 @@
import os
import sys
import subprocess
import logging
import traceback
import json

from pypeapp import Logger, config
from pype.ftrack import BaseAction
from pype.vendor import ftrack_api
from avalon import io, api

log = Logger().get_logger(__name__)


class RVAction(BaseAction):
    """ Launch RV action """
    identifier = "rv.launch.action"
    label = "rv"
    description = "rv Launcher"
    icon = '{}/ftrack/action_icons/RV.png'.format(
        os.environ.get('PYPE_STATICS_SERVER', '')
    )
    type = 'Application'

    def __init__(self, session):
        """ Constructor

        :param session: ftrack Session
        :type session: :class:`ftrack_api.Session`
        """
        super().__init__(session)
        self.rv_path = None
        self.config_data = None

        # RV_HOME should be set if properly installed
        if os.environ.get('RV_HOME'):
            self.rv_path = os.path.join(
                os.environ.get('RV_HOME'),
                'bin',
                'rv'
            )
        else:
            # if not, fall back to the config file location
            self.config_data = config.get_presets()['rv']['config']
            self.set_rv_path()

        if self.rv_path is None:
            return

        # config_data stays None when RV_HOME was used, so guard the lookup.
        self.allowed_types = (self.config_data or {}).get(
            'file_ext', ["img", "mov", "exr"]
        )

    def discover(self, session, entities, event):
        """Return available actions based on *event*. """
        return True

    def set_rv_path(self):
        self.rv_path = self.config_data.get("rv_path")

    def register(self):
        assert (self.rv_path is not None), (
            'RV is not installed'
            ' or paths in presets are not set correctly'
        )
        super().register()

    def get_components_from_entity(self, session, entity, components):
        """Get components from various entity types.

        The components dictionary is modified in place, so nothing is
        returned.

        Args:
            entity (Ftrack entity)
            components (dict)
        """

        if entity.entity_type.lower() == "assetversion":
            for component in entity["components"]:
                if component["file_type"][1:] not in self.allowed_types:
                    continue

                try:
                    components[entity["asset"]["parent"]["name"]].append(
                        component
                    )
                except KeyError:
                    components[entity["asset"]["parent"]["name"]] = [component]

            return

        if entity.entity_type.lower() == "task":
            query = "AssetVersion where task_id is '{0}'".format(entity["id"])
            for assetversion in session.query(query):
                self.get_components_from_entity(
                    session, assetversion, components
                )

            return

        if entity.entity_type.lower() == "shot":
            query = "AssetVersion where asset.parent.id is '{0}'".format(
                entity["id"]
            )
            for assetversion in session.query(query):
                self.get_components_from_entity(
                    session, assetversion, components
                )

            return

        raise NotImplementedError(
            "\"{}\" entity type is not implemented yet.".format(
                entity.entity_type
            )
        )

    def interface(self, session, entities, event):
        if event['data'].get('values', {}):
            return

        user = session.query(
            "User where username is '{0}'".format(
                os.environ["FTRACK_API_USER"]
            )
        ).one()
        job = session.create(
            "Job",
            {
                "user": user,
                "status": "running",
                "data": json.dumps({
                    "description": "RV: Collecting components."
                })
            }
        )
        # Commit to feedback to user.
        session.commit()

        items = []
        try:
            items = self.get_interface_items(session, entities)
        except Exception:
            log.error(traceback.format_exc())
            job["status"] = "failed"
        else:
            job["status"] = "done"

        # Commit to end job.
        session.commit()

        return {"items": items}

    def get_interface_items(self, session, entities):

        components = {}
        for entity in entities:
            self.get_components_from_entity(session, entity, components)

        # Sort by version
        for parent_name, entities in components.items():
            version_mapping = {}
            for entity in entities:
                try:
                    version_mapping[entity["version"]["version"]].append(
                        entity
                    )
                except KeyError:
                    version_mapping[entity["version"]["version"]] = [entity]

            # Sort same versions by date.
            for version, entities in version_mapping.items():
                version_mapping[version] = sorted(
                    entities, key=lambda x: x["version"]["date"], reverse=True
                )

            components[parent_name] = []
            for version in reversed(sorted(version_mapping.keys())):
                components[parent_name].extend(version_mapping[version])

        # Items to present to user.
        items = []
        label = "{} - v{} - {}"
        for parent_name, entities in components.items():
            data = []
            for entity in entities:
                data.append(
                    {
                        "label": label.format(
                            entity["version"]["asset"]["name"],
                            str(entity["version"]["version"]).zfill(3),
                            entity["file_type"][1:]
                        ),
                        "value": entity["id"]
                    }
                )

            items.append(
                {
                    "label": parent_name,
                    "type": "enumerator",
                    "name": parent_name,
                    "data": data,
                    "value": data[0]["value"]
                }
            )

        return items

    def launch(self, session, entities, event):
        """Callback method for RV action."""
        # Launching application
        if "values" not in event["data"]:
            return

        user = session.query(
            "User where username is '{0}'".format(
                os.environ["FTRACK_API_USER"]
            )
        ).one()
        job = session.create(
            "Job",
            {
                "user": user,
                "status": "running",
                "data": json.dumps({
                    "description": "RV: Collecting file paths."
                })
            }
        )
        # Commit to feedback to user.
        session.commit()

        paths = []
        try:
            paths = self.get_file_paths(session, event)
        except Exception:
            log.error(traceback.format_exc())
            job["status"] = "failed"
        else:
            job["status"] = "done"

        # Commit to end job.
        session.commit()

        args = [os.path.normpath(self.rv_path)]

        fps = entities[0].get("custom_attributes", {}).get("fps", None)
        if fps is not None:
            args.extend(["-fps", str(fps)])

        args.extend(paths)

        log.info("Running rv: {}".format(args))

        subprocess.Popen(args)

        return True

    def get_file_paths(self, session, event):
        """Get file paths from selected components."""

        link = session.get(
            "Component", list(event["data"]["values"].values())[0]
        )["version"]["asset"]["parent"]["link"][0]
        project = session.get(link["type"], link["id"])
        os.environ["AVALON_PROJECT"] = project["name"]
        api.Session["AVALON_PROJECT"] = project["name"]
        io.install()

        location = ftrack_api.Session().pick_location()

        paths = []
        for parent_name in sorted(event["data"]["values"].keys()):
            component = session.get(
                "Component", event["data"]["values"][parent_name]
            )

            # Newer publishes have the source referenced in Ftrack.
            online_source = False
            for neighbour_component in component["version"]["components"]:
                if neighbour_component["name"] != "ftrackreview-mp4_src":
                    continue

                paths.append(
                    location.get_filesystem_path(neighbour_component)
                )
                online_source = True

            if online_source:
                continue

            asset = io.find_one({"type": "asset", "name": parent_name})
            subset = io.find_one(
                {
                    "type": "subset",
                    "name": component["version"]["asset"]["name"],
                    "parent": asset["_id"]
                }
            )
            version = io.find_one(
                {
                    "type": "version",
                    "name": component["version"]["version"],
                    "parent": subset["_id"]
                }
            )
            representation = io.find_one(
                {
                    "type": "representation",
                    "parent": version["_id"],
                    "name": component["file_type"][1:]
                }
            )
            if representation is None:
                representation = io.find_one(
                    {
                        "type": "representation",
                        "parent": version["_id"],
                        "name": "preview"
                    }
                )
            paths.append(api.get_representation_path(representation))

        return paths


def register(session):
    """Register hooks."""
    if not isinstance(session, ftrack_api.session.Session):
        return

    RVAction(session).register()


def main(arguments=None):
    '''Set up logging and register action.'''
    if arguments is None:
        arguments = []

    import argparse
    parser = argparse.ArgumentParser()
    # Allow setting of logging level from arguments.
    loggingLevels = {}
    for level in (
        logging.NOTSET, logging.DEBUG, logging.INFO, logging.WARNING,
        logging.ERROR, logging.CRITICAL
    ):
        loggingLevels[logging.getLevelName(level).lower()] = level

    parser.add_argument(
        '-v', '--verbosity',
        help='Set the logging output verbosity.',
        choices=loggingLevels.keys(),
        default='info'
    )
    namespace = parser.parse_args(arguments)

    # Set up basic logging
    logging.basicConfig(level=loggingLevels[namespace.verbosity])

    session = ftrack_api.Session()
    register(session)

    # Wait for events
    logging.info(
        'Registered actions and listening for events. Use Ctrl-C to abort.'
    )
    session.event_hub.wait()


if __name__ == '__main__':
    raise SystemExit(main(sys.argv[1:]))
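`get_interface_items` above orders components newest version first, breaking ties between same-version components by newest publish date. That ordering logic is independent of ftrack and can be sketched with plain dicts standing in for the component entities (only a version number and a date are read; the function name is illustrative):

```python
def order_components(components):
    """Order newest-version first; same versions ordered by newest date."""
    version_mapping = {}
    for comp in components:
        version_mapping.setdefault(comp["version"], []).append(comp)

    ordered = []
    for version in sorted(version_mapping, reverse=True):
        # Within one version, most recent date wins, as in the action.
        ordered.extend(
            sorted(version_mapping[version], key=lambda c: c["date"], reverse=True)
        )
    return ordered


comps = [
    {"id": "a", "version": 1, "date": "2019-01-01"},
    {"id": "b", "version": 2, "date": "2019-01-05"},
    {"id": "c", "version": 2, "date": "2019-01-07"},
]
print([c["id"] for c in order_components(comps)])  # -> ['c', 'b', 'a']
```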
@ -1,9 +1,8 @@
import sys
import argparse
import logging
import getpass
import ftrack_api
from ftrack_action_handler import BaseAction
from pype.vendor import ftrack_api
from pype.ftrack import BaseAction


class SetVersion(BaseAction):
@ -11,11 +10,9 @@ class SetVersion(BaseAction):
    #: Action identifier.
    identifier = 'version.set'

    #: Action label.
    label = 'Version Set'

    def discover(self, session, entities, event):
        ''' Validation '''
@ -49,23 +46,24 @@
        # Do something with the values or return a new form.
        values = event['data'].get('values', {})
        # Default is action True
        scs = True
        msg = 'Version was changed to v{0}'.format(values['version_number'])
        scs = False

        if not values['version_number']:
            scs = False,
            msg = "You didn't enter any version."
            msg = 'You didn\'t enter any version.'
        elif int(values['version_number']) <= 0:
            scs = False
            msg = 'Negative or zero version is not valid.'
        else:
            entity['version'] = values['version_number']

            try:
                session.commit()
            except:
                session.rollback()
                raise
            try:
                entity['version'] = values['version_number']
                session.commit()
                msg = 'Version was changed to v{0}'.format(
                    values['version_number']
                )
                scs = True
            except Exception as e:
                msg = 'Unexpected error occurs during version set ({})'.format(
                    str(e)
                )

        return {
            'success': scs,
@ -82,8 +80,7 @@ def register(session, **kw):
    if not isinstance(session, ftrack_api.session.Session):
        return

    action_handler = SetVersion(session)
    action_handler.register()
    SetVersion(session).register()


def main(arguments=None):
79
pype/ftrack/actions/action_start_timer.py
Normal file
@ -0,0 +1,79 @@
import os

from pype.vendor import ftrack_api
from pype.ftrack import BaseAction


class StartTimer(BaseAction):
    '''Starts timer.'''

    identifier = 'start.timer'
    label = 'Start timer'
    description = 'Starts timer'

    def discover(self, session, entities, event):
        return False

    def _handle_result(*arg):
        return

    def launch(self, session, entities, event):
        entity = entities[0]
        if entity.entity_type.lower() != 'task':
            return
        self.start_ftrack_timer(entity)
        try:
            self.start_clockify_timer(entity)
        except Exception:
            self.log.warning(
                'Failed starting Clockify timer for task: ' + entity['name']
            )
        return

    def start_ftrack_timer(self, task):
        user_query = 'User where username is "{}"'.format(self.session.api_user)
        user = self.session.query(user_query).one()
        self.log.info('Starting Ftrack timer for task: ' + task['name'])
        user.start_timer(task, force=True)
        self.session.commit()

    def start_clockify_timer(self, task):
        # Validate Clockify settings if Clockify is required
        clockify_timer = os.environ.get('CLOCKIFY_WORKSPACE', None)
        if clockify_timer is None:
            return

        from pype.clockify import ClockifyAPI
        clockapi = ClockifyAPI()
        if clockapi.verify_api() is False:
            return
        task_type = task['type']['name']
        project_name = task['project']['full_name']

        def get_parents(entity):
            output = []
            if entity.entity_type.lower() == 'project':
                return output
            output.extend(get_parents(entity['parent']))
            output.append(entity['name'])

            return output

        desc_items = get_parents(task['parent'])
        desc_items.append(task['name'])
        description = '/'.join(desc_items)

        project_id = clockapi.get_project_id(project_name)
        tag_ids = []
        tag_ids.append(clockapi.get_tag_id(task_type))
        clockapi.start_time_entry(
            description, project_id, tag_ids=tag_ids
        )
        self.log.info('Starting Clockify timer for task: ' + task['name'])


def register(session, **kw):
    '''Register plugin. Called when used as a plugin.'''

    if not isinstance(session, ftrack_api.session.Session):
        return

    StartTimer(session).register()
@ -1,355 +0,0 @@
# :coding: utf-8
# :copyright: Copyright (c) 2017 ftrack
import sys
import argparse
import logging
import os
import ftrack_api
import json
import re
from ftrack_action_handler import BaseAction

from avalon import io, inventory, lib
from avalon.vendor import toml


class SyncToAvalon(BaseAction):
    '''Synchronize data from Ftrack to the Avalon database.'''

    #: Action identifier.
    identifier = 'sync.to.avalon'
    #: Action label.
    label = 'SyncToAvalon'
    #: Action description.
    description = 'Send data from Ftrack to Avalon'
    #: Action icon.
    icon = 'https://cdn1.iconfinder.com/data/icons/hawcons/32/699650-icon-92-inbox-download-512.png'

    def discover(self, session, entities, event):
        ''' Validation '''
        discover = False
        for entity in entities:
            if entity.entity_type.lower() not in ['task', 'assetversion']:
                discover = True
                break

        return discover

    def launch(self, session, entities, event):
        message = ""

        # JOB SETTINGS
        userId = event['source']['user']['id']
        user = session.query('User where id is ' + userId).one()

        job = session.create('Job', {
            'user': user,
            'status': 'running',
            'data': json.dumps({
                'description': 'Sync Ftrack to Avalon.'
            })
        })

        try:
            print("action <" + self.__class__.__name__ + "> is running")

            # TODO AVALON_PROJECTS, AVALON_ASSET and AVALON_SILO should be set
            # up, otherwise the console log shows Avalon debug output
            self.setAvalonAttributes(session)
            self.importable = []

            # get all parent entities of the top entity in the hierarchy
            top_entity = entities[0]['link']
            if len(top_entity) > 1:
                for e in top_entity:
                    parent_entity = session.get(e['type'], e['id'])
                    self.importable.append(parent_entity)

            # get all child entities separately/unique
            for entity in entities:
                self.getShotAsset(entity)

            # Check for duplicate names - raise an error if found
            all_names = {}
            duplicates = []

            for e in self.importable:
                name = self.checkName(e['name'])
                if name in all_names:
                    duplicates.append("'{}'-'{}'".format(all_names[name], e['name']))
                else:
                    all_names[name] = e['name']

            if len(duplicates) > 0:
                raise ValueError("Unable to sync: Entity name duplication: {}".format(", ".join(duplicates)))

            # Import all entities to the Avalon DB
            for e in self.importable:
                self.importToAvalon(session, e)

            job['status'] = 'done'
            session.commit()
            print('Synchronization to Avalon was successful!')

        except Exception as e:
            job['status'] = 'failed'
            print('Something went wrong during synchronization to Avalon!')
            print(e)
            message = str(e)

        if len(message) > 0:
            return {
                'success': False,
                'message': message
            }

        return {
            'success': True,
            'message': "Synchronization was successful"
        }

    def setAvalonAttributes(self, session):
        self.custom_attributes = []
        all_avalon_attr = session.query('CustomAttributeGroup where name is "avalon"').one()
        for cust_attr in all_avalon_attr['custom_attribute_configurations']:
            if 'avalon_' not in cust_attr['key']:
                self.custom_attributes.append(cust_attr)

    def getShotAsset(self, entity):
        if not (entity.entity_type in ['Task']):
            if entity not in self.importable:
                self.importable.append(entity)

            if entity['children']:
                childrens = entity['children']
                for child in childrens:
                    self.getShotAsset(child)

    def checkName(self, input_name):
        if input_name.find(" ") == -1:
            name = input_name
        else:
            name = input_name.replace(" ", "-")
            print("Name of {} was changed to {}".format(input_name, name))
        return name

    def getConfig(self, entity):
        apps = []
        for app in entity['custom_attributes']['applications']:
            try:
                label = toml.load(lib.which_app(app))['label']
                apps.append({'name': app, 'label': label})
            except Exception as e:
                print('Error with application {0} - {1}'.format(app, e))

        config = {
            'schema': 'avalon-core:config-1.0',
            'tasks': [{'name': ''}],
            'apps': apps,
            # TODO redo work!!!
            'template': {
                'workfile': '{asset[name]}_{task[name]}_{version:0>3}<_{comment}>',
                'work': '{root}/{project}/{hierarchy}/{asset}/work/{task}',
                'publish': '{root}/{project}/{hierarchy}/{asset}/publish/{family}/{subset}/v{version}/{projectcode}_{asset}_{subset}_v{version}.{representation}'}
        }
        return config

    def importToAvalon(self, session, entity):
        eLinks = []

        ca_mongoid = 'avalon_mongo_id'

        # get needed info of the entity and all parents
        for e in entity['link']:
            tmp = session.get(e['type'], e['id'])
            eLinks.append(tmp)

        entityProj = eLinks[0]

        # set AVALON_PROJECT env
        os.environ["AVALON_PROJECT"] = entityProj["full_name"]
        os.environ["AVALON_ASSET"] = entityProj['full_name']

        # Set project template
        template = {"schema": "avalon-core:inventory-1.0"}

        # --- Begin: PUSH TO Avalon ---
        io.install()
        # ----- PROJECT ------
        # If the project doesn't exist -> <Create project> ELSE <Update Config>
        avalon_project = io.find_one({"type": "project", "name": entityProj["full_name"]})
        entity_type = entity.entity_type

        data = {}
        data['ftrackId'] = entity['id']
        data['entityType'] = entity_type

        for cust_attr in self.custom_attributes:
            if cust_attr['entity_type'].lower() in ['asset']:
                data[cust_attr['key']] = entity['custom_attributes'][cust_attr['key']]

            elif cust_attr['entity_type'].lower() in ['show'] and entity_type.lower() == 'project':
                data[cust_attr['key']] = entity['custom_attributes'][cust_attr['key']]

            elif cust_attr['entity_type'].lower() in ['task'] and entity_type.lower() != 'project':
                # Put space between capitals (e.g. 'AssetBuild' -> 'Asset Build')
                entity_type = re.sub(r"(\w)([A-Z])", r"\1 \2", entity_type)
                # Get object id of entity type
                ent_obj_type_id = session.query('ObjectType where name is "{}"'.format(entity_type)).one()['id']

                if cust_attr['object_type_id'] == ent_obj_type_id:
                    data[cust_attr['key']] = entity['custom_attributes'][cust_attr['key']]

        if entity.entity_type.lower() in ['project']:
            # Set project Config
            config = self.getConfig(entity)

            if avalon_project is None:
                inventory.save(entityProj['full_name'], config, template)
            else:
                io.update_many({'type': 'project', 'name': entityProj['full_name']},
                               {'$set': {'config': config}})

            data['code'] = entity['name']

            # Store info about project (FtrackId)
            io.update_many({
                'type': 'project',
                'name': entity['full_name']},
                {'$set': {'data': data}})

            projectId = io.find_one({"type": "project", "name": entityProj["full_name"]})["_id"]
            if ca_mongoid in entity['custom_attributes']:
                entity['custom_attributes'][ca_mongoid] = str(projectId)
            else:
                print("Custom attribute for <{}> is not created.".format(entity['name']))
            io.uninstall()
            return

        # Store project Id
        projectId = avalon_project["_id"]

        # ----- ASSETS ------
        # Presets:
        # TODO how to check if entity is Asset Library or AssetBuild?
        if entity.entity_type in ['AssetBuild', 'Library']:
            silo = 'Assets'
        else:
            silo = 'Film'

        os.environ['AVALON_SILO'] = silo

        # Get list of parents without the project
        parents = []
        for i in range(1, len(eLinks) - 1):
            parents.append(eLinks[i])

        # Get info for 'Data' in Avalon DB
        tasks = []
        for child in entity['children']:
            if child.entity_type in ['Task']:
                tasks.append(child['name'])

        folderStruct = []
        parentId = None

        for parent in parents:
            name = self.checkName(parent['name'])
            folderStruct.append(name)
            parentId = io.find_one({'type': 'asset', 'name': name})['_id']
            if parent['parent'].entity_type != 'project' and parentId is None:
                self.importToAvalon(session, parent)
                parentId = io.find_one({'type': 'asset', 'name': name})['_id']

        hierarchy = os.path.sep.join(folderStruct)

        data['visualParent'] = parentId
        data['parents'] = folderStruct
        data['tasks'] = tasks
        data['hierarchy'] = hierarchy

        name = self.checkName(entity['name'])
        os.environ['AVALON_ASSET'] = name

        # Try to find the asset in the current database
        avalon_asset = io.find_one({'type': 'asset', 'name': name})
        # Create it if it doesn't exist
        if avalon_asset is None:
            inventory.create_asset(name, silo, data, projectId)
            print("Asset {} - created".format(name))
        # Raise an error if it seems to be a different entity with the same name
        elif (avalon_asset['data']['ftrackId'] != data['ftrackId'] or
                avalon_asset['data']['visualParent'] != data['visualParent'] or
                avalon_asset['data']['parents'] != data['parents']):
            raise ValueError('Entity <{}> is not the same'.format(name))
        # Else update info
        else:
            io.update_many({'type': 'asset', 'name': name},
                           {'$set': {'data': data, 'silo': silo}})
            # TODO check if the asset is in the same folder!!! ???? FEATURE FOR FUTURE
            print("Asset {} - updated".format(name))

        # FTRACK FEATURE - FTRACK MUST HAVE avalon_mongo_id FOR EACH ENTITY TYPE EXCEPT TASK
        # Set custom attribute to the avalon/mongo id of the entity (parentId is last)
        if ca_mongoid in entity['custom_attributes']:
            entity['custom_attributes'][ca_mongoid] = str(parentId)
        else:
            print("Custom attribute for <{}> is not created.".format(entity['name']))

        io.uninstall()
        session.commit()


def register(session, **kw):
    '''Register plugin. Called when used as a plugin.'''

    # Validate that session is an instance of ftrack_api.Session. If not,
    # assume that register is being called from an old or incompatible API and
    # return without doing anything.
    if not isinstance(session, ftrack_api.session.Session):
        return

    action_handler = SyncToAvalon(session)
    action_handler.register()


def main(arguments=None):
    '''Set up logging and register action.'''
    if arguments is None:
        arguments = []

    parser = argparse.ArgumentParser()
    # Allow setting of logging level from arguments.
    loggingLevels = {}
    for level in (
        logging.NOTSET, logging.DEBUG, logging.INFO, logging.WARNING,
        logging.ERROR, logging.CRITICAL
    ):
        loggingLevels[logging.getLevelName(level).lower()] = level

    parser.add_argument(
        '-v', '--verbosity',
        help='Set the logging output verbosity.',
        choices=loggingLevels.keys(),
        default='info'
    )
    namespace = parser.parse_args(arguments)

    # Set up basic logging
    logging.basicConfig(level=loggingLevels[namespace.verbosity])

    session = ftrack_api.Session()
    register(session)

    # Wait for events
    logging.info(
        'Registered actions and listening for events. Use Ctrl-C to abort.'
    )
    session.event_hub.wait()


if __name__ == '__main__':
    raise SystemExit(main(sys.argv[1:]))
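The deleted action above sanitizes entity names with `checkName` and refuses to sync when two ftrack names collapse to the same Avalon name. A minimal standalone sketch of that logic (the shot names are made-up examples):

```python
def check_name(input_name):
    # Mirror of the action's checkName: spaces are not allowed in
    # Avalon asset names, so they are swapped for dashes.
    return input_name.replace(" ", "-")


def find_duplicates(entity_names):
    # Two different ftrack names that sanitize to the same Avalon
    # name would collide in the DB, so the sync refuses to run.
    all_names = {}
    duplicates = []
    for raw_name in entity_names:
        name = check_name(raw_name)
        if name in all_names:
            duplicates.append("'{}'-'{}'".format(all_names[name], raw_name))
        else:
            all_names[name] = raw_name
    return duplicates


print(find_duplicates(["sh 010", "sh-010", "sh020"]))  # ["'sh 010'-'sh-010'"]
```

This is why the action raises `ValueError` before importing anything: a late collision would leave the Avalon DB half-synced.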
353 pype/ftrack/actions/action_sync_hier_attrs_local.py Normal file

@@ -0,0 +1,353 @@
import os
import sys
import json
import argparse
import logging
import collections

from pype.vendor import ftrack_api
from pype.ftrack import BaseAction, lib
from avalon.tools.libraryloader.io_nonsingleton import DbConnector
from bson.objectid import ObjectId


class SyncHierarchicalAttrs(BaseAction):

    db_con = DbConnector()
    ca_mongoid = lib.get_ca_mongoid()

    #: Action identifier.
    identifier = 'sync.hierarchical.attrs.local'
    #: Action label.
    label = 'Sync HierAttrs - Local'
    #: Action description.
    description = 'Synchronize hierarchical attributes'
    #: Icon
    icon = '{}/ftrack/action_icons/SyncHierarchicalAttrsLocal.svg'.format(
        os.environ.get('PYPE_STATICS_SERVER', '')
    )

    #: roles that are allowed to register this action
    role_list = ['Pypeclub', 'Administrator', 'Project Manager']

    def discover(self, session, entities, event):
        ''' Validation '''
        for entity in entities:
            if (
                entity.get('context_type', '').lower() in ('show', 'task') and
                entity.entity_type.lower() != 'task'
            ):
                return True
        return False

    def launch(self, session, entities, event):
        self.interface_messages = {}
        user = session.query(
            'User where id is "{}"'.format(event['source']['user']['id'])
        ).one()

        job = session.create('Job', {
            'user': user,
            'status': 'running',
            'data': json.dumps({
                'description': 'Sync Hierarchical attributes'
            })
        })
        session.commit()
        self.log.debug('Job with id "{}" created'.format(job['id']))

        process_session = ftrack_api.Session(
            server_url=session.server_url,
            api_key=session.api_key,
            api_user=session.api_user,
            auto_connect_event_hub=True
        )

        try:
            # Collect hierarchical attrs
            self.log.debug('Collecting hierarchical custom attributes started')
            custom_attributes = {}
            all_avalon_attr = process_session.query(
                'CustomAttributeGroup where name is "avalon"'
            ).one()

            error_key = (
                'Hierarchical attributes with set "default" value (not allowed)'
            )

            for cust_attr in all_avalon_attr['custom_attribute_configurations']:
                if 'avalon_' in cust_attr['key']:
                    continue

                if not cust_attr['is_hierarchical']:
                    continue

                if cust_attr['default']:
                    if error_key not in self.interface_messages:
                        self.interface_messages[error_key] = []
                    self.interface_messages[error_key].append(
                        cust_attr['label']
                    )

                    self.log.warning((
                        'Custom attribute "{}" has a default value set.'
                        ' This attribute can\'t be synchronized'
                    ).format(cust_attr['label']))
                    continue

                custom_attributes[cust_attr['key']] = cust_attr

            self.log.debug(
                'Collecting hierarchical custom attributes has finished'
            )

            if not custom_attributes:
                msg = 'No hierarchical attributes to sync.'
                self.log.debug(msg)
                return {
                    'success': True,
                    'message': msg
                }

            entity = entities[0]
            if entity.entity_type.lower() == 'project':
                project_name = entity['full_name']
            else:
                project_name = entity['project']['full_name']

            self.db_con.install()
            self.db_con.Session['AVALON_PROJECT'] = project_name

            _entities = self._get_entities(event, process_session)

            for entity in _entities:
                self.log.debug(30 * '-')
                self.log.debug(
                    'Processing entity "{}"'.format(entity.get('name', entity))
                )

                ent_name = entity.get('name', entity)
                if entity.entity_type.lower() == 'project':
                    ent_name = entity['full_name']

                for key in custom_attributes:
                    self.log.debug(30 * '*')
                    self.log.debug(
                        'Processing Custom attribute key "{}"'.format(key)
                    )
                    # check if the entity has that attribute
                    if key not in entity['custom_attributes']:
                        error_key = 'Missing key on entities'
                        if error_key not in self.interface_messages:
                            self.interface_messages[error_key] = []

                        self.interface_messages[error_key].append(
                            '- key: "{}" - entity: "{}"'.format(key, ent_name)
                        )

                        self.log.error((
                            '- key "{}" not found on "{}"'
                        ).format(key, ent_name))
                        continue

                    value = self.get_hierarchical_value(key, entity)
                    if value is None:
                        error_key = (
                            'Missing value for key on entity'
                            ' and its parents (synchronization was skipped)'
                        )
                        if error_key not in self.interface_messages:
                            self.interface_messages[error_key] = []

                        self.interface_messages[error_key].append(
                            '- key: "{}" - entity: "{}"'.format(key, ent_name)
                        )

                        self.log.warning((
                            '- key "{}" not set on "{}" or its parents'
                        ).format(key, ent_name))
                        continue

                    self.update_hierarchical_attribute(entity, key, value)

            job['status'] = 'done'
            session.commit()

        except Exception:
            self.log.error(
                'Action "{}" failed'.format(self.label),
                exc_info=True
            )

        finally:
            self.db_con.uninstall()

            if job['status'] in ('queued', 'running'):
                job['status'] = 'failed'
            session.commit()
            if self.interface_messages:
                title = "Errors during SyncHierarchicalAttrs"
                self.show_interface_from_dict(
                    messages=self.interface_messages, title=title, event=event
                )

        return True

    def get_hierarchical_value(self, key, entity):
        value = entity['custom_attributes'][key]
        if (
            value is not None or
            entity.entity_type.lower() == 'project'
        ):
            return value

        return self.get_hierarchical_value(key, entity['parent'])

    def update_hierarchical_attribute(self, entity, key, value):
        if (
            entity['context_type'].lower() not in ('show', 'task') or
            entity.entity_type.lower() == 'task'
        ):
            return

        ent_name = entity.get('name', entity)
        if entity.entity_type.lower() == 'project':
            ent_name = entity['full_name']

        hierarchy = '/'.join(
            [a['name'] for a in entity.get('ancestors', [])]
        )
        if hierarchy:
            hierarchy = '/'.join(
                [entity['project']['full_name'], hierarchy, entity['name']]
            )
        elif entity.entity_type.lower() == 'project':
            hierarchy = entity['full_name']
        else:
            hierarchy = '/'.join(
                [entity['project']['full_name'], entity['name']]
            )

        self.log.debug('- updating entity "{}"'.format(hierarchy))

        # collect the entity's custom attributes
        custom_attributes = entity.get('custom_attributes')
        if not custom_attributes:
            return

        mongoid = custom_attributes.get(self.ca_mongoid)
        if not mongoid:
            error_key = 'Missing MongoID on entities (try SyncToAvalon first)'
            if error_key not in self.interface_messages:
                self.interface_messages[error_key] = []

            if ent_name not in self.interface_messages[error_key]:
                self.interface_messages[error_key].append(ent_name)

            self.log.warning(
                '-- entity "{}" is not synchronized to avalon. Skipping'.format(
                    ent_name
                )
            )
            return

        try:
            mongoid = ObjectId(mongoid)
        except Exception:
            error_key = 'Invalid MongoID on entities (try SyncToAvalon)'
            if error_key not in self.interface_messages:
                self.interface_messages[error_key] = []

            if ent_name not in self.interface_messages[error_key]:
                self.interface_messages[error_key].append(ent_name)

            self.log.warning(
                '-- entity "{}" has stored invalid MongoID. Skipping'.format(
                    ent_name
                )
            )
            return
        # Find the entity in the Mongo DB
        mongo_entity = self.db_con.find_one({'_id': mongoid})
        if not mongo_entity:
            error_key = 'Entities not found in Avalon DB (try SyncToAvalon)'
            if error_key not in self.interface_messages:
                self.interface_messages[error_key] = []

            if ent_name not in self.interface_messages[error_key]:
                self.interface_messages[error_key].append(ent_name)

            self.log.warning(
                '-- entity "{}" was not found in DB by id "{}". Skipping'.format(
                    ent_name, str(mongoid)
                )
            )
            return

        # Change the value if the entity has its own set
        entity_value = custom_attributes[key]
        if entity_value is not None:
            value = entity_value

        data = mongo_entity.get('data') or {}

        data[key] = value
        self.db_con.update_many(
            {'_id': mongoid},
            {'$set': {'data': data}}
        )

        self.log.debug(
            '-- stored value "{}"'.format(value)
        )

        for child in entity.get('children', []):
            self.update_hierarchical_attribute(child, key, value)


def register(session, **kw):
    '''Register plugin. Called when used as a plugin.'''

    if not isinstance(session, ftrack_api.session.Session):
        return

    SyncHierarchicalAttrs(session).register()


def main(arguments=None):
    '''Set up logging and register action.'''
    if arguments is None:
        arguments = []

    parser = argparse.ArgumentParser()
    # Allow setting of logging level from arguments.
    loggingLevels = {}
    for level in (
        logging.NOTSET, logging.DEBUG, logging.INFO, logging.WARNING,
        logging.ERROR, logging.CRITICAL
    ):
        loggingLevels[logging.getLevelName(level).lower()] = level

    parser.add_argument(
        '-v', '--verbosity',
        help='Set the logging output verbosity.',
        choices=loggingLevels.keys(),
        default='info'
    )
    namespace = parser.parse_args(arguments)

    # Set up basic logging
    logging.basicConfig(level=loggingLevels[namespace.verbosity])

    session = ftrack_api.Session()
    register(session)

    # Wait for events
    logging.info(
        'Registered actions and listening for events. Use Ctrl-C to abort.'
    )
    session.event_hub.wait()


if __name__ == '__main__':
    raise SystemExit(main(sys.argv[1:]))
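The core of this action is `get_hierarchical_value`, which walks up the parent chain until it finds a value that is actually set, stopping at the project root. A minimal sketch with plain dicts standing in for ftrack entities (the `fps` attribute and the entity layout are illustrative):

```python
def get_hierarchical_value(key, entity):
    # Return the entity's own value when set; otherwise fall back to
    # the closest ancestor, stopping at the project root.
    value = entity['custom_attributes'].get(key)
    if value is not None or entity['entity_type'] == 'project':
        return value
    return get_hierarchical_value(key, entity['parent'])


project = {'entity_type': 'project', 'custom_attributes': {'fps': 25}}
sequence = {'entity_type': 'sequence', 'custom_attributes': {'fps': None},
            'parent': project}
shot = {'entity_type': 'shot', 'custom_attributes': {'fps': None},
        'parent': sequence}

print(get_hierarchical_value('fps', shot))  # 25, inherited from the project
```

This mirrors ftrack's hierarchical custom-attribute semantics: an unset value on a shot resolves to the nearest ancestor that defines one, which is exactly the value the action then writes into each Avalon document's `data`.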

263 pype/ftrack/actions/action_sync_to_avalon_local.py Normal file

@@ -0,0 +1,263 @@
import os
import sys
import argparse
import logging
import json

from pype.vendor import ftrack_api
from pype.ftrack import BaseAction, lib as ftracklib
from pype.vendor.ftrack_api import session as fa_session


class SyncToAvalon(BaseAction):
    '''
    Synchronizing data action - from Ftrack to the Avalon DB

    Stores all information about an entity:
    - Name (string) - the most important information = identifier of the entity
    - Parent (ObjectId) - Avalon project id, if the entity is not the project itself
    - Silo (string) - last parent except the project
    - Data (dictionary):
        - VisualParent (ObjectId) - Avalon id of the parent asset
        - Parents (array of string) - all parent names except the project
        - Tasks (array of string) - tasks on the asset
        - FtrackId (string)
        - entityType (string) - the entity's type on Ftrack
        * All custom attributes in group 'Avalon' whose name doesn't start
          with 'avalon_'

    * This information is also stored for all parent and child entities.

    The Avalon id of an asset is stored to Ftrack -> custom attribute
    'avalon_mongo_id'.
    - the action DOES NOT create this custom attribute if it doesn't exist
    - run the 'Create Custom Attributes' action or create it manually
      (not recommended)

    If the Ftrack entity already has the custom attribute 'avalon_mongo_id'
    with a stored id:
    - name, parents and silo are checked -> an error is shown if they are not
      exactly the same
    - after sync it is not allowed to change names or move entities

    If the id in 'avalon_mongo_id' is an empty string or is not found in the DB:
    - tries to find the entity by name
    - found:
        - raises an error if ftrackId/visual parent/parents are not the same
    - not found:
        - creates the asset/project
    '''

    #: Action identifier.
    identifier = 'sync.to.avalon.local'
    #: Action label.
    label = 'SyncToAvalon - Local'
    #: Action description.
    description = 'Send data from Ftrack to Avalon'
    #: Action icon.
    icon = '{}/ftrack/action_icons/SyncToAvalon-local.svg'.format(
        os.environ.get('PYPE_STATICS_SERVER', '')
    )
    #: roles that are allowed to register this action
    role_list = ['Pypeclub']
    #: Action priority
    priority = 200

    def __init__(self, session):
        super(SyncToAvalon, self).__init__(session)
        # reload utils on initialize (in case of server restart)

    def discover(self, session, entities, event):
        ''' Validation '''
        for entity in entities:
            if entity.entity_type.lower() not in ['task', 'assetversion']:
                return True

        return False

    def launch(self, session, entities, event):
        message = ""

        # JOB SETTINGS
        userId = event['source']['user']['id']
        user = session.query('User where id is ' + userId).one()

        job = session.create('Job', {
            'user': user,
            'status': 'running',
            'data': json.dumps({
                'description': 'Sync Ftrack to Avalon.'
            })
        })
        session.commit()
        try:
            self.importable = []

            # get all parent entities of the top entity in the hierarchy
            top_entity = entities[0]['link']
            if len(top_entity) > 1:
                for e in top_entity:
                    parent_entity = session.get(e['type'], e['id'])
                    self.importable.append(parent_entity)

            # get all child entities separately/unique
            for entity in entities:
                self.add_childs_to_importable(entity)

            # Check names: REGEX in schema/duplicates - raise an error if found
            all_names = []
            duplicates = []

            for entity in self.importable:
                ftracklib.avalon_check_name(entity)
                if entity['name'] in all_names:
                    duplicates.append("'{}'".format(entity['name']))
                else:
                    all_names.append(entity['name'])

            if len(duplicates) > 0:
                raise ValueError(
                    "Entity name duplication: {}".format(", ".join(duplicates))
                )

            # ----- PROJECT ------
            # store Ftrack project - self.importable[0] must be the project entity!!
            ft_project = self.importable[0]
            avalon_project = ftracklib.get_avalon_project(ft_project)
            custom_attributes = ftracklib.get_avalon_attr(session)

            # Import all entities to the Avalon DB
            for entity in self.importable:
                result = ftracklib.import_to_avalon(
                    session=session,
                    entity=entity,
                    ft_project=ft_project,
                    av_project=avalon_project,
                    custom_attributes=custom_attributes
                )

                if 'errors' in result and len(result['errors']) > 0:
                    job['status'] = 'failed'
                    session.commit()

                    ftracklib.show_errors(self, event, result['errors'])

                    return {
                        'success': False,
                        'message': "Sync to avalon FAILED"
                    }

                if avalon_project is None:
                    if 'project' in result:
                        avalon_project = result['project']

            job['status'] = 'done'

        except ValueError as ve:
            job['status'] = 'failed'
            message = str(ve)
            self.log.error(
                'Error during syncToAvalon: {}'.format(message),
                exc_info=True
            )

        except Exception as e:
            job['status'] = 'failed'
            exc_type, exc_obj, exc_tb = sys.exc_info()
            fname = os.path.split(exc_tb.tb_frame.f_code.co_filename)[1]
            log_message = "{}/{}/Line: {}".format(
                exc_type, fname, exc_tb.tb_lineno
            )
            self.log.error(
                'Error during syncToAvalon: {}'.format(log_message),
                exc_info=True
            )
            message = (
                'Unexpected Error'
                ' - Please check the log for more information'
            )
        finally:
            if job['status'] in ['queued', 'running']:
                job['status'] = 'failed'
            session.commit()

            event = fa_session.ftrack_api.event.base.Event(
                topic='ftrack.action.launch',
                data=dict(
                    actionIdentifier='sync.hierarchical.attrs.local',
                    selection=event['data']['selection']
                ),
                source=dict(
                    user=event['source']['user']
                )
            )
            session.event_hub.publish(event, on_error='ignore')

        if len(message) > 0:
            message = "Unable to sync: {}".format(message)
            return {
                'success': False,
                'message': message
            }

        return {
            'success': True,
            'message': "Synchronization was successful"
        }

    def add_childs_to_importable(self, entity):
        if not (entity.entity_type in ['Task']):
            if entity not in self.importable:
                self.importable.append(entity)

            if entity['children']:
                childrens = entity['children']
                for child in childrens:
                    self.add_childs_to_importable(child)


def register(session, **kw):
    '''Register plugin. Called when used as a plugin.'''

    # Validate that session is an instance of ftrack_api.Session. If not,
    # assume that register is being called from an old or incompatible API and
    # return without doing anything.
    if not isinstance(session, ftrack_api.session.Session):
        return

    SyncToAvalon(session).register()


def main(arguments=None):
    '''Set up logging and register action.'''
    if arguments is None:
        arguments = []

    parser = argparse.ArgumentParser()
    # Allow setting of logging level from arguments.
    loggingLevels = {}
    for level in (
        logging.NOTSET, logging.DEBUG, logging.INFO, logging.WARNING,
        logging.ERROR, logging.CRITICAL
    ):
        loggingLevels[logging.getLevelName(level).lower()] = level

    parser.add_argument(
        '-v', '--verbosity',
        help='Set the logging output verbosity.',
        choices=loggingLevels.keys(),
        default='info'
    )
    namespace = parser.parse_args(arguments)

    # Set up basic logging
    logging.basicConfig(level=loggingLevels[namespace.verbosity])

    session = ftrack_api.Session()
    register(session)

    # Wait for events
    logging.info(
        'Registered actions and listening for events. Use Ctrl-C to abort.'
    )
    session.event_hub.wait()


if __name__ == '__main__':
    raise SystemExit(main(sys.argv[1:]))
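Several of these actions use the same regex trick to turn a camel-cased entity-type name such as 'AssetBuild' into the spaced ObjectType name stored in ftrack. A self-contained sketch of just that transformation:

```python
import re


def split_camel_case(entity_type):
    # Insert a space before each capital letter that follows another
    # word character, e.g. 'AssetBuild' -> 'Asset Build'.
    return re.sub(r"(\w)([A-Z])", r"\1 \2", entity_type)


print(split_camel_case("AssetBuild"))  # Asset Build
print(split_camel_case("Shot"))        # Shot
```

The spaced form is what the actions feed into the `ObjectType where name is "…"` query when matching a custom attribute's `object_type_id` against the selected entity's type.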

@@ -1,64 +1,41 @@
# :coding: utf-8
# :copyright: Copyright (c) 2017 ftrack
import os
import sys
import argparse
import logging
import collections
import os
import json
import re

import ftrack_api
from ftrack_action_handler import BaseAction
from pype.vendor import ftrack_api
from pype.ftrack import BaseAction
from avalon import io, inventory, schema
from avalon.vendor import toml


class TestAction(BaseAction):
    '''Edit meta data action.'''

    ignore_me = True
    #: Action identifier.
    identifier = 'test.action'
    #: Action label.
    label = 'Test action'
    #: Action description.
    description = 'Test action'

    #: priority
    priority = 10000
    #: roles that are allowed to register this action
    role_list = ['Pypeclub']
    icon = '{}/ftrack/action_icons/TestAction.svg'.format(
        os.environ.get('PYPE_STATICS_SERVER', '')
    )

    def discover(self, session, entities, event):
        ''' Validation '''

        return True

    def launch(self, session, entities, event):
        entity = entities[0]

        entity_type = entity.entity_type
        data = {}
        """
        custom_attributes = []

        all_avalon_attr = session.query('CustomAttributeGroup where name is "avalon"').one()
        for cust_attr in all_avalon_attr['custom_attribute_configurations']:
            if 'avalon_' not in cust_attr['key']:
                custom_attributes.append(cust_attr)
        """
        for cust_attr in custom_attributes:
            if cust_attr['entity_type'].lower() in ['asset']:
                data[cust_attr['key']] = entity['custom_attributes'][cust_attr['key']]

            elif cust_attr['entity_type'].lower() in ['show'] and entity_type.lower() == 'project':
                data[cust_attr['key']] = entity['custom_attributes'][cust_attr['key']]

            elif cust_attr['entity_type'].lower() in ['task'] and entity_type.lower() != 'project':
                # Put space between capitals (e.g. 'AssetBuild' -> 'Asset Build')
                entity_type = re.sub(r"(\w)([A-Z])", r"\1 \2", entity_type)
                # Get object id of entity type
                ent_obj_type_id = session.query('ObjectType where name is "{}"'.format(entity_type)).one()['id']
                if cust_attr['type_id'] == ent_obj_type_id:
                    data[cust_attr['key']] = entity['custom_attributes'][cust_attr['key']]
        self.log.info(event)

        return True
|
|
@@ -69,8 +46,7 @@ def register(session, **kw):
    if not isinstance(session, ftrack_api.session.Session):
        return

    action_handler = TestAction(session)
    action_handler.register()
    TestAction(session).register()


def main(arguments=None):
@@ -1,14 +1,12 @@
# :coding: utf-8
# :copyright: Copyright (c) 2015 Milan Kolar

import os
import sys
import argparse
import logging
import getpass
import json

import ftrack_api
from ftrack_action_handler import BaseAction
from pype.vendor import ftrack_api
from pype.ftrack import BaseAction


class ThumbToChildren(BaseAction):
    '''Custom action.'''

@@ -18,8 +16,9 @@ class ThumbToChildren(BaseAction):
    # Action label
    label = 'Thumbnail to Children'
    # Action icon
    icon = "https://cdn3.iconfinder.com/data/icons/transfers/100/239322-download_transfer-128.png"
    icon = '{}/ftrack/action_icons/thumbToChildren.svg'.format(
        os.environ.get('PYPE_STATICS_SERVER', '')
    )

    def discover(self, session, entities, event):
        ''' Validation '''

@@ -29,7 +28,6 @@ class ThumbToChildren(BaseAction):

        return True

    def launch(self, session, entities, event):
        '''Callback method for action.'''

@@ -53,7 +51,7 @@ class ThumbToChildren(BaseAction):

            # inform the user that the job is done
            job['status'] = 'done'
        except:
        except Exception:
            # fail the job if something goes wrong
            job['status'] = 'failed'
            raise

@@ -66,14 +64,13 @@ class ThumbToChildren(BaseAction):
        }


def register(session, **kw):
    '''Register action. Called when used as an event plugin.'''
    if not isinstance(session, ftrack_api.session.Session):
        return

    action_handler = ThumbToChildren(session)
    action_handler.register()
    ThumbToChildren(session).register()


def main(arguments=None):
    '''Set up logging and register action.'''
@@ -1,13 +1,11 @@
# :coding: utf-8
# :copyright: Copyright (c) 2015 Milan Kolar

import os
import sys
import argparse
import logging
import getpass
import json
import ftrack_api
from ftrack_action_handler import BaseAction
from pype.vendor import ftrack_api
from pype.ftrack import BaseAction


class ThumbToParent(BaseAction):
    '''Custom action.'''

@@ -17,8 +15,9 @@ class ThumbToParent(BaseAction):
    # Action label
    label = 'Thumbnail to Parent'
    # Action icon
    icon = "https://cdn3.iconfinder.com/data/icons/transfers/100/239419-upload_transfer-512.png"
    icon = '{}/ftrack/action_icons/thumbToParent.svg'.format(
        os.environ.get('PYPE_STATICS_SERVER', '')
    )

    def discover(self, session, entities, event):
        '''Return action config if triggered on asset versions.'''

@@ -28,7 +27,6 @@ class ThumbToParent(BaseAction):

        return True

    def launch(self, session, entities, event):
        '''Callback method for action.'''

@@ -50,14 +48,19 @@ class ThumbToParent(BaseAction):
            if entity.entity_type.lower() == 'assetversion':
                try:
                    parent = entity['task']
                except:
                except Exception:
                    par_ent = entity['link'][-2]
                    parent = session.get(par_ent['type'], par_ent['id'])
            else:
                try:
                    parent = entity['parent']
                except:
                    print("Durin Action 'Thumb to Parent' went something wrong")
                except Exception as e:
                    msg = (
                        "During Action 'Thumb to Parent'"
                        " something went wrong"
                    )
                    self.log.error(msg)
                    raise e
            thumbid = entity['thumbnail_id']

            if parent and thumbid:

@@ -69,10 +72,10 @@ class ThumbToParent(BaseAction):
            # inform the user that the job is done
            job['status'] = status or 'done'

        except:
        except Exception as e:
            # fail the job if something goes wrong
            job['status'] = 'failed'
            raise
            raise e

        finally:
            session.commit()

@@ -88,8 +91,8 @@ def register(session, **kw):
    if not isinstance(session, ftrack_api.session.Session):
        return

    action_handler = ThumbToParent(session)
    action_handler.register()
    ThumbToParent(session).register()


def main(arguments=None):
    '''Set up logging and register action.'''
54
pype/ftrack/actions/action_where_run_ask.py
Normal file

@@ -0,0 +1,54 @@
import os
from pype.vendor import ftrack_api
from pype.ftrack import BaseAction
from pype.vendor.ftrack_api import session as fa_session


class ActionAskWhereIRun(BaseAction):
    """ Sometimes a user forgets where the pipeline with their credentials is running.
    - this action triggers `ActionShowWhereIRun`
    """
    # Action is ignored by default
    ignore_me = True
    #: Action identifier.
    identifier = 'ask.where.i.run'
    #: Action label.
    label = 'Ask where I run'
    #: Action description.
    description = 'Triggers PC info where user have running Pype'
    #: Action icon
    icon = '{}/ftrack/action_icons/ActionAskWhereIRun.svg'.format(
        os.environ.get('PYPE_STATICS_SERVER', '')
    )

    def discover(self, session, entities, event):
        """ Hidden by default - should be enabled only if you want to run it.
        - best practice is to create another action that triggers this one
        """

        return True

    def launch(self, session, entities, event):
        event = fa_session.ftrack_api.event.base.Event(
            topic='ftrack.action.launch',
            data=dict(
                actionIdentifier="show.where.i.run",
                selection=event["data"]["selection"],
                event_hub_id=session.event_hub.id
            ),
            source=dict(
                user=dict(username=session.api_user)
            )
        )
        session.event_hub.publish(event, on_error='ignore')

        return True


def register(session, **kw):
    '''Register plugin. Called when used as a plugin.'''

    if not isinstance(session, ftrack_api.session.Session):
        return

    ActionAskWhereIRun(session).register()
86
pype/ftrack/actions/action_where_run_show.py
Normal file

@@ -0,0 +1,86 @@
import platform
import socket
import getpass
from pype.vendor import ftrack_api
from pype.ftrack import BaseAction


class ActionShowWhereIRun(BaseAction):
    """ Sometimes a user forgets where the pipeline with their credentials is running.
    - this action shows on which PC, username and IP it is running
    - requirement: the action MUST be registered where we want to locate the PC
    - can't be used retrospectively...
    """
    #: Action identifier.
    identifier = 'show.where.i.run'
    #: Action label.
    label = 'Show where I run'
    #: Action description.
    description = 'Shows PC info where user have running Pype'

    def discover(self, session, entities, event):
        """ Hidden by default - should be enabled only if you want to run it.
        - best practice is to create another action that triggers this one
        """

        return False

    def launch(self, session, entities, event):
        # Don't show info when it was launched from this session
        if session.event_hub.id == event.get("data", {}).get("event_hub_id"):
            return True

        title = "Where Do I Run?"
        msgs = {}
        all_keys = ["Hostname", "IP", "Username", "System name", "PC name"]
        try:
            host_name = socket.gethostname()
            msgs["Hostname"] = host_name
            host_ip = socket.gethostbyname(host_name)
            msgs["IP"] = host_ip
        except Exception:
            pass

        try:
            system_name, pc_name, *_ = platform.uname()
            msgs["System name"] = system_name
            msgs["PC name"] = pc_name
        except Exception:
            pass

        try:
            msgs["Username"] = getpass.getuser()
        except Exception:
            pass

        for key in all_keys:
            if not msgs.get(key):
                msgs[key] = "-Undefined-"

        items = []
        first = True
        splitter = {'type': 'label', 'value': '---'}
        for key, value in msgs.items():
            if first:
                first = False
            else:
                items.append(splitter)
            self.log.debug("{}: {}".format(key, value))

            subtitle = {'type': 'label', 'value': '<h3>{}</h3>'.format(key)}
            items.append(subtitle)
            message = {'type': 'label', 'value': '<p>{}</p>'.format(value)}
            items.append(message)

        self.show_interface(items, title, event=event)

        return True


def register(session, **kw):
    '''Register plugin. Called when used as a plugin.'''

    if not isinstance(session, ftrack_api.session.Session):
        return

    ActionShowWhereIRun(session).register()
@@ -1,403 +0,0 @@
import logging
import argparse
import subprocess
import sys
import os
import re
from operator import itemgetter
import ftrack_api


class DJVViewAction(object):
    """Launch DJVView action."""
    identifier = "djvview-launch-action"
    # label = "DJV View"
    # icon = "http://a.fsdn.com/allura/p/djv/icon"

    def __init__(self, session):
        '''Expects a ftrack_api.Session instance'''

        self.logger = logging.getLogger(
            '{0}.{1}'.format(__name__, self.__class__.__name__)
        )

        if self.identifier is None:
            raise ValueError(
                'Action missing identifier.'
            )

        self.session = session

    def is_valid_selection(self, event):
        selection = event["data"].get("selection", [])

        if not selection:
            return

        entityType = selection[0]["entityType"]

        if entityType not in ["assetversion", "task"]:
            return False

        return True

    def discover(self, event):
        """Return available actions based on *event*. """

        if not self.is_valid_selection(event):
            return

        items = []
        applications = self.get_applications()
        applications = sorted(
            applications, key=lambda application: application["label"]
        )

        for application in applications:
            self.djv_path = application.get("path", None)
            applicationIdentifier = application["identifier"]
            label = application["label"]
            items.append({
                "actionIdentifier": self.identifier,
                "label": label,
                "variant": application.get("variant", None),
                "description": application.get("description", None),
                "icon": application.get("icon", "default"),
                "applicationIdentifier": applicationIdentifier
            })

        return {
            "items": items
        }

    def register(self):
        '''Registers the action, subscribing the discover and launch topics.'''
        self.session.event_hub.subscribe(
            'topic=ftrack.action.discover and source.user.username={0}'.format(
                self.session.api_user
            ), self.discover
        )

        self.session.event_hub.subscribe(
            'topic=ftrack.action.launch and data.actionIdentifier={0} and source.user.username={1}'.format(
                self.identifier,
                self.session.api_user
            ),
            self.launch
        )
        print("----- action - <" + self.__class__.__name__ + "> - Has been registered -----")

    def get_applications(self):
        applications = []

        label = "DJVView {version}"
        versionExpression = re.compile(r"(?P<version>\d+.\d+.\d+)")
        applicationIdentifier = "djvview"
        description = "DJV View Launcher"
        icon = "http://a.fsdn.com/allura/p/djv/icon"
        expression = []
        if sys.platform == "win32":
            expression = ["C:\\", "Program Files", "djv-\d.+",
                          "bin", "djv_view.exe"]

        elif sys.platform == "darwin":
            expression = ["Application", "DJV.app", "Contents", "MacOS", "DJV"]
        # Linux
        else:
            expression = ["usr", "local", "djv", "djv_view"]

        pieces = expression[:]
        start = pieces.pop(0)

        if sys.platform == 'win32':
            # On Windows C: means current directory so convert roots that look
            # like drive letters to the C:\ format.
            if start and start[-1] == ':':
                start += '\\'

        if not os.path.exists(start):
            raise ValueError(
                'First part "{0}" of expression "{1}" must match exactly to an '
                'existing entry on the filesystem.'
                .format(start, expression)
            )

        expressions = list(map(re.compile, pieces))
        expressionsCount = len(expression) - 1

        for location, folders, files in os.walk(start, topdown=True, followlinks=True):
            level = location.rstrip(os.path.sep).count(os.path.sep)
            expression = expressions[level]

            if level < (expressionsCount - 1):
                # If not yet at final piece then just prune directories.
                folders[:] = [folder for folder in folders
                              if expression.match(folder)]
            else:
                # Match executable. Note that on OSX executable might equate to
                # a folder (.app).
                for entry in folders + files:
                    match = expression.match(entry)
                    if match:
                        # Extract version from full matching path.
                        path = os.path.join(start, location, entry)
                        versionMatch = versionExpression.search(path)
                        if versionMatch:
                            version = versionMatch.group('version')

                            applications.append({
                                'identifier': applicationIdentifier.format(
                                    version=version
                                ),
                                'path': path,
                                'version': version,
                                'label': label.format(version=version),
                                'icon': icon,
                                # 'variant': variant.format(version=version),
                                'description': description
                            })
                        else:
                            self.logger.debug(
                                'Discovered application executable, but it '
                                'does not appear to contain required version '
                                'information: {0}'.format(path)
                            )

                # Don't descend any further as out of patterns to match.
                del folders[:]

        return applications

    def translate_event(self, session, event):
        '''Return *event* translated structure to be used with the API.'''

        selection = event['data'].get('selection', [])

        entities = list()
        for entity in selection:
            entities.append(
                (session.get(self.get_entity_type(entity), entity.get('entityId')))
            )

        return entities

    def get_entity_type(self, entity):
        entity_type = entity.get('entityType').replace('_', '').lower()

        for schema in self.session.schemas:
            alias_for = schema.get('alias_for')

            if (
                alias_for and isinstance(alias_for, str) and
                alias_for.lower() == entity_type
            ):
                return schema['id']

        for schema in self.session.schemas:
            if schema['id'].lower() == entity_type:
                return schema['id']

        raise ValueError(
            'Unable to translate entity type: {0}.'.format(entity_type)
        )

    def launch(self, event):
        """Callback method for DJVView action."""
        session = self.session
        entities = self.translate_event(session, event)

        # Launching application
        if "values" in event["data"]:
            filename = event['data']['values']['path']
            file_type = filename.split(".")[-1]

            # TODO Is this proper way?
            try:
                fps = int(entities[0]['custom_attributes']['fps'])
            except:
                fps = 24

            # TODO issequence is probably already built-in validation in ftrack
            isseq = re.findall('%[0-9]*d', filename)
            if len(isseq) > 0:
                if len(isseq) == 1:
                    frames = []
                    padding = re.findall('%[0-9]*d', filename).pop()
                    index = filename.find(padding)

                    full_file = filename[0:index - 1]
                    file = full_file.split(os.sep)[-1]
                    folder = os.path.dirname(full_file)

                    for fname in os.listdir(path=folder):
                        if fname.endswith(file_type) and file in fname:
                            frames.append(int(fname.split(".")[-2]))

                    if len(frames) > 0:
                        start = min(frames)
                        end = max(frames)

                        range = (padding % start) + '-' + (padding % end)
                        filename = re.sub('%[0-9]*d', range, filename)
                else:
                    print("")
                    return {
                        'success': False,
                        'message': 'DJV View - Filename has more than one sequence identifier.'
                    }

            cmd = []
            # DJV path
            cmd.append(os.path.normpath(self.djv_path))
            # DJV Options Start ##############################################
            # cmd.append('-file_layer (value)')  # layer name
            cmd.append('-file_proxy 1/2')  # Proxy scale: 1/2, 1/4, 1/8
            cmd.append('-file_cache True')  # Cache: True, False.
            # cmd.append('-window_fullscreen')  # Start in full screen
            # cmd.append("-window_toolbar False")  # Toolbar controls: False, True.
            # cmd.append("-window_playbar False")  # Window controls: False, True.
            # cmd.append("-view_grid None")  # Grid overlay: None, 1x1, 10x10, 100x100.
            # cmd.append("-view_hud True")  # Heads up display: True, False.
            cmd.append("-playback Forward")  # Playback: Stop, Forward, Reverse.
            # cmd.append("-playback_frame (value)")  # Frame.
            cmd.append("-playback_speed " + str(fps))
            # cmd.append("-playback_timer (value)")  # Timer: Sleep, Timeout. Value: Sleep.
            # cmd.append("-playback_timer_resolution (value)")  # Timer resolution (seconds): 0.001.
            cmd.append("-time_units Frames")  # Time units: Timecode, Frames.
            # DJV Options End ################################################

            # PATH TO COMPONENT
            cmd.append(os.path.normpath(filename))

            # Run DJV with these commands
            subprocess.Popen(' '.join(cmd))

            return {
                'success': True,
                'message': 'DJV View started.'
            }

        if 'items' not in event["data"]:
            event["data"]['items'] = []

        try:
            for entity in entities:
                versions = []
                allowed_types = ["img", "mov", "exr"]

                if entity.entity_type.lower() == "assetversion":
                    if entity['components'][0]['file_type'] in allowed_types:
                        versions.append(entity)

                if entity.entity_type.lower() == "task":
                    # AssetVersions are obtainable only from shot!
                    shotentity = entity['parent']

                    for asset in shotentity['assets']:
                        for version in asset['versions']:
                            # Get only AssetVersion of selected task
                            if version['task']['id'] != entity['id']:
                                continue
                            # Get only components with allowed type
                            if version['components'][0]['file_type'] in allowed_types:
                                versions.append(version)

                # Raise error if no components were found
                if len(versions) < 1:
                    raise ValueError('There are no Asset Versions to open.')

                for version in versions:
                    for component in version['components']:
                        label = "v{0} - {1} - {2}"

                        label = label.format(
                            str(version['version']).zfill(3),
                            version['asset']['type']['name'],
                            component['name']
                        )

                        try:
                            # TODO This is proper way to get filepath!!!
                            # THIS WON'T WORK RIGHT NOW
                            location = component['component_locations'][0]['location']
                            file_path = location.get_filesystem_path(component)
                            # if component.isSequence():
                            #     if component.getMembers():
                            #         frame = int(component.getMembers()[0].getName())
                            #         file_path = file_path % frame
                        except:
                            # This works but is NOT proper way
                            file_path = component['component_locations'][0]['resource_identifier']

                        event["data"]["items"].append(
                            {"label": label, "value": file_path}
                        )

        except Exception as e:
            return {
                'success': False,
                'message': str(e)
            }

        return {
            "items": [
                {
                    "label": "Items to view",
                    "type": "enumerator",
                    "name": "path",
                    "data": sorted(
                        event["data"]['items'],
                        key=itemgetter("label"),
                        reverse=True
                    )
                }
            ]
        }


def register(session, **kw):
    """Register hooks."""
    if not isinstance(session, ftrack_api.session.Session):
        return

    action = DJVViewAction(session)
    action.register()


def main(arguments=None):
    '''Set up logging and register action.'''
    if arguments is None:
        arguments = []

    parser = argparse.ArgumentParser()
    # Allow setting of logging level from arguments.
    loggingLevels = {}
    for level in (
        logging.NOTSET, logging.DEBUG, logging.INFO, logging.WARNING,
        logging.ERROR, logging.CRITICAL
    ):
        loggingLevels[logging.getLevelName(level).lower()] = level

    parser.add_argument(
        '-v', '--verbosity',
        help='Set the logging output verbosity.',
        choices=loggingLevels.keys(),
        default='info'
    )
    namespace = parser.parse_args(arguments)

    # Set up basic logging
    logging.basicConfig(level=loggingLevels[namespace.verbosity])

    session = ftrack_api.Session()
    register(session)

    # Wait for events
    logging.info(
        'Registered actions and listening for events. Use Ctrl-C to abort.'
    )
    session.event_hub.wait()


if __name__ == '__main__':
    raise SystemExit(main(sys.argv[1:]))
@ -1,428 +0,0 @@
|
|||
import os
|
||||
import operator
|
||||
import ftrack_api
|
||||
import collections
|
||||
import sys
|
||||
import json
|
||||
import base64
|
||||
|
||||
# sys.path.append(os.path.dirname(os.path.dirname(__file__)))
|
||||
# from ftrack_kredenc.lucidity.vendor import yaml
|
||||
# from ftrack_kredenc import lucidity
|
||||
#
|
||||
#
|
||||
# def get_ftrack_connect_path():
|
||||
#
|
||||
# ftrack_connect_root = os.path.abspath(os.getenv('FTRACK_CONNECT_PACKAGE'))
|
||||
#
|
||||
# return ftrack_connect_root
|
||||
#
|
||||
#
|
||||
# def from_yaml(filepath):
|
||||
# ''' Parse a Schema from a YAML file at the given *filepath*.
|
||||
# '''
|
||||
# with open(filepath, 'r') as f:
|
||||
# data = yaml.safe_load(f)
|
||||
# return data
|
||||
#
|
||||
#
|
||||
# def get_task_enviro(entity, environment=None):
|
||||
#
|
||||
# context = get_context(entity)
|
||||
#
|
||||
# if not environment:
|
||||
# environment = {}
|
||||
#
|
||||
# for key in context:
|
||||
# os.environ[key.upper()] = context[key]['name']
|
||||
# environment[key.upper()] = context[key]['name']
|
||||
#
|
||||
# if key == 'Project':
|
||||
# os.putenv('PROJECT_ROOT', context[key]['root'])
|
||||
# os.environ['PROJECT_ROOT'] = context[key]['root']
|
||||
# environment['PROJECT_ROOT'] = context[key]['root']
|
||||
# print('PROJECT_ROOT: ' + context[key]['root'])
|
||||
# print(key + ': ' + context[key]['name'])
|
||||
#
|
||||
# return environment
|
||||
#
|
||||
#
|
||||
# def get_entity():
|
||||
# decodedEventData = json.loads(
|
||||
# base64.b64decode(
|
||||
# os.environ.get('FTRACK_CONNECT_EVENT')
|
||||
# )
|
||||
# )
|
||||
#
|
||||
# entity = decodedEventData.get('selection')[0]
|
||||
#
|
||||
# if entity['entityType'] == 'task':
|
||||
# return ftrack_api.Task(entity['entityId'])
|
||||
# else:
|
||||
# return None
|
||||
#
|
||||
#
|
||||
# def set_env_vars():
|
||||
#
|
||||
# entity = get_entity()
|
||||
#
|
||||
# if entity:
|
||||
# if not os.environ.get('project_root'):
|
||||
# enviro = get_task_enviro(entity)
|
||||
#
|
||||
# print(enviro)
|
||||
#
|
||||
#
|
||||
def get_context(entity):
|
||||
|
||||
entityName = entity['name']
|
||||
entityId = entity['id']
|
||||
entityType = entity.entity_type
|
||||
entityDescription = entity['description']
|
||||
|
||||
print(100*"*")
|
||||
for k in entity['ancestors']:
|
||||
print(k['name'])
|
||||
print(100*"*")
|
||||
hierarchy = entity.getParents()
|
||||
|
||||
ctx = collections.OrderedDict()
|
||||
|
||||
if entity.get('entityType') == 'task' and entityType == 'Task':
|
||||
taskType = entity.getType().getName()
|
||||
entityDic = {
|
||||
'type': taskType,
|
||||
'name': entityName,
|
||||
'id': entityId,
|
||||
'description': entityDescription
|
||||
}
|
||||
elif entity.get('entityType') == 'task':
|
||||
entityDic = {
|
||||
'name': entityName,
|
||||
'id': entityId,
|
||||
'description': entityDescription
|
||||
}
|
||||
|
||||
ctx[entityType] = entityDic
|
||||
|
||||
folder_counter = 0
|
||||
|
||||
for ancestor in hierarchy:
|
||||
tempdic = {}
|
||||
if isinstance(ancestor, ftrack_api.Component):
|
||||
# Ignore intermediate components.
|
||||
continue
|
||||
|
||||
tempdic['name'] = ancestor.getName()
|
||||
tempdic['id'] = ancestor.getId()
|
||||
|
||||
try:
|
||||
objectType = ancestor.getObjectType()
|
||||
tempdic['description'] = ancestor.getDescription()
|
||||
except AttributeError:
|
||||
objectType = 'Project'
|
||||
tempdic['description'] = ''
|
||||
|
||||
if objectType == 'Asset Build':
|
||||
tempdic['type'] = ancestor.getType().get('name')
|
||||
objectType = objectType.replace(' ', '_')
|
||||
elif objectType == 'Project':
|
||||
tempdic['code'] = tempdic['name']
|
||||
tempdic['name'] = ancestor.get('fullname')
|
||||
tempdic['root'] = ancestor.getRoot()
|
||||
|
||||
if objectType == 'Folder':
|
||||
objectType = objectType + str(folder_counter)
|
||||
folder_counter += 1
|
||||
ctx[objectType] = tempdic
|
||||
|
||||
return ctx
|
||||
|
||||
|
||||
def getNewContext(entity):
|
||||
|
||||
parents = []
|
||||
item = entity
|
||||
while True:
|
||||
item = item['parent']
|
||||
if not item:
|
||||
break
|
||||
parents.append(item)
|
||||
|
||||
ctx = collections.OrderedDict()
|
||||
|
||||
entityDic = {
|
||||
'name': entity['name'],
|
||||
'id': entity['id'],
|
||||
}
|
||||
try:
|
||||
entityDic['type'] = entity['type']['name']
|
||||
except:
|
||||
pass
|
||||
|
||||
ctx[entity['object_type']['name']] = entityDic
|
||||
|
||||
print(100*"-")
|
||||
for p in parents:
|
||||
print(p)
|
||||
# add all parents to the context
|
||||
for parent in parents:
|
||||
tempdic = {}
|
||||
if not parent.get('project_schema'):
|
||||
tempdic = {
|
||||
'name': parent['full_name'],
|
||||
'code': parent['name'],
|
||||
'id': parent['id'],
|
||||
}
|
||||
tempdic = {
|
||||
'name': parent['name'],
|
||||
'id': parent['id'],
|
||||
}
|
||||
object_type = parent['object_type']['name']
|
||||
|
||||
ctx[object_type] = tempdic
|
||||
|
||||
# add project to the context
|
||||
project = entity['project']
|
||||
ctx['Project'] = {
|
||||
'name': project['full_name'],
|
||||
'code': project['name'],
|
||||
'id': project['id'],
|
||||
'root': project['root'],
|
||||
},
|
||||
|
||||
return ctx
|
||||
#
|
||||
#
|
||||
# def get_frame_range():
|
||||
#
|
||||
# entity = get_entity()
|
||||
# entityType = entity.getObjectType()
|
||||
# environment = {}
|
||||
#
|
||||
# if entityType == 'Task':
|
||||
# try:
|
||||
# environment['FS'] = str(int(entity.getFrameStart()))
|
||||
# except Exception:
|
||||
# environment['FS'] = '1'
|
||||
# try:
|
||||
# environment['FE'] = str(int(entity.getFrameEnd()))
|
||||
# except Exception:
|
||||
# environment['FE'] = '1'
|
||||
# else:
|
||||
# try:
|
||||
# environment['FS'] = str(int(entity.getFrameStart()))
|
||||
# except Exception:
|
||||
# environment['FS'] = '1'
|
||||
# try:
|
||||
# environment['FE'] = str(int(entity.getFrameEnd()))
|
||||
# except Exception:
|
||||
# environment['FE'] = '1'
|
||||
#
|
||||
#
|
||||
# def get_asset_name_by_id(id):
|
||||
# for t in ftrack_api.getAssetTypes():
|
||||
# try:
|
||||
# if t.get('typeid') == id:
|
||||
# return t.get('name')
|
||||
# except:
|
||||
# return None
|
||||
#
|
||||
#
|
||||
# def get_status_by_name(name):
|
||||
# statuses = ftrack_api.getTaskStatuses()
|
||||
#
|
||||
# result = None
|
||||
# for s in statuses:
|
||||
# if s.get('name').lower() == name.lower():
|
||||
# result = s
|
||||
#
|
||||
# return result
|
||||
#
|
||||
#
|
||||
# def sort_types(types):
|
||||
# data = {}
|
||||
# for t in types:
|
||||
# data[t] = t.get('sort')
|
||||
#
|
||||
# data = sorted(data.items(), key=operator.itemgetter(1))
|
||||
# results = []
|
||||
# for item in data:
|
||||
# results.append(item[0])
|
||||
#
|
||||
# return results
|
||||
#
|
||||
#
|
||||
# def get_next_task(task):
|
||||
# shot = task.getParent()
|
||||
# tasks = shot.getTasks()
|
||||
#
|
||||
# types_sorted = sort_types(ftrack_api.getTaskTypes())
|
||||
#
|
||||
# next_types = None
|
||||
# for t in types_sorted:
|
||||
# if t.get('typeid') == task.get('typeid'):
|
||||
# try:
|
||||
# next_types = types_sorted[(types_sorted.index(t) + 1):]
|
||||
# except:
|
||||
# pass
|
||||
#
|
||||
# for nt in next_types:
|
||||
# for t in tasks:
|
||||
# if nt.get('typeid') == t.get('typeid'):
|
||||
# return t
|
||||
#
|
||||
# return None
|
||||
#
|
||||
#
|
||||
# def get_latest_version(versions):
|
||||
# latestVersion = None
|
||||
# if len(versions) > 0:
|
||||
# versionNumber = 0
|
||||
# for item in versions:
|
||||
# if item.get('version') > versionNumber:
|
||||
# versionNumber = item.getVersion()
|
||||
# latestVersion = item
|
||||
# return latestVersion
|
||||
#
|
||||
#
|
||||
# def get_thumbnail_recursive(task):
|
||||
# if task.get('thumbid'):
|
||||
# thumbid = task.get('thumbid')
|
||||
# return ftrack_api.Attachment(id=thumbid)
|
||||
# if not task.get('thumbid'):
|
||||
# parent = ftrack_api.Task(id=task.get('parent_id'))
|
||||
# return get_thumbnail_recursive(parent)
|
||||
#
|
||||
#
|
||||
# # paths_collected
|
||||
#
|
||||
# def getFolderHierarchy(context):
|
||||
# '''Return structure for *hierarchy*.
|
||||
# '''
|
||||
#
|
||||
# hierarchy = []
|
||||
# for key in reversed(context):
|
||||
# hierarchy.append(context[key]['name'])
|
||||
# print(hierarchy)
|
||||
#
|
||||
# return os.path.join(*hierarchy[1:-1])
|
||||
#
|
||||
#
|
||||
def tweakContext(context, include=False):

    # Iterate over a copy of the keys: the dict is mutated (pop) inside
    # the loop, which would raise RuntimeError in Python 3 otherwise.
    for key in list(context):
        if key == 'Asset Build':
            context['Asset_Build'] = context.pop(key)
            key = 'Asset_Build'
        description = context[key].get('description')
        if description:
            context[key]['description'] = '_' + description

    hierarchy = []
    for key in reversed(list(context)):
        hierarchy.append(context[key]['name'])

    if include:
        hierarchy = os.path.join(*hierarchy[1:])
    else:
        hierarchy = os.path.join(*hierarchy[1:-1])

    context['ft_hierarchy'] = hierarchy

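As a minimal sketch of the hierarchy string `tweakContext` builds, assuming `get_context` returns a mapping ordered from the task up to the project (all names below are invented for illustration):

```python
import os

# Hypothetical context as get_context() is assumed to return it:
# ordered from the task up to the project.
context = {
    'Task': {'name': 'compositing'},
    'Shot': {'name': 'sh010'},
    'Episode': {'name': 'ep01'},
    'Project': {'name': 'myproject'},
}

# reversed() walks project -> task; slicing [1:-1] drops the project
# name and the task name, keeping only the folder hierarchy in between.
names = [context[key]['name'] for key in reversed(list(context))]
hierarchy = os.path.join(*names[1:-1])
```

With `include=True` the slice would be `names[1:]`, so the task name stays in the path.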
def getSchema(entity):

    project = entity['project']
    schema = project['project_schema']['name']

    tools = os.path.abspath(os.environ.get('studio_tools'))

    schema_path = os.path.join(
        tools, 'studio', 'templates',
        (schema + '_' + project['name'] + '.yml')
    )
    if not os.path.exists(schema_path):
        schema_path = os.path.join(
            tools, 'studio', 'templates', (schema + '.yml')
        )
    if not os.path.exists(schema_path):
        schema_path = os.path.join(
            tools, 'studio', 'templates', 'default.yml'
        )

    schema = lucidity.Schema.from_yaml(schema_path)

    print(schema_path)
    return schema

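The template lookup above falls back from the most specific file to a default. A standalone sketch of that resolution order (the helper name and file names are illustrative, not tied to a real schema):

```python
import os
import tempfile

def resolve_template(root, schema, project):
    # Most specific first: "<schema>_<project>.yml", then "<schema>.yml",
    # and finally "default.yml" as the last resort.
    candidates = [
        os.path.join(root, schema + '_' + project + '.yml'),
        os.path.join(root, schema + '.yml'),
        os.path.join(root, 'default.yml'),
    ]
    for path in candidates:
        if os.path.exists(path):
            return path
    return candidates[-1]

with tempfile.TemporaryDirectory() as root:
    # Only the schema-level template exists here.
    open(os.path.join(root, 'vfx.yml'), 'w').close()
    picked = resolve_template(root, 'vfx', 'myproject')
    fallback = resolve_template(root, 'other', 'myproject')
```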
# def getAllPathsYaml(entity, root=''):
#
#     if isinstance(entity, str) or isinstance(entity, unicode):
#         entity = ftrack_api.Task(entity)
#
#     context = get_context(entity)
#
#     tweakContext(context)
#
#     schema = getSchema(entity)
#
#     paths = schema.format_all(context)
#     paths_collected = []
#
#     for path in paths:
#         tweak_path = path[0].replace(" ", '_').replace('\'', '').replace('\\', '/')
#
#         tempPath = os.path.join(root, tweak_path)
#         path = list(path)
#         path[0] = tempPath
#         paths_collected.append(path)
#
#     return paths_collected
#

def getPathsYaml(entity, templateList=None, root=None, **kwargs):
    '''Collect paths for *entity* from the yaml schema templates.

    Optional keyword arguments passed through to the context:
    version, ext, item, family, subset.
    '''

    context = get_context(entity)

    if entity.entity_type != 'Task':
        tweakContext(context, include=True)
    else:
        tweakContext(context)

    context.update(kwargs)

    host = sys.executable.lower()

    ext = None
    if not context.get('ext'):
        if "nuke" in host:
            ext = 'nk'
        elif "maya" in host:
            ext = 'ma'
        elif "houdini" in host:
            ext = 'hip'
        if ext:
            context['ext'] = ext

    if not context.get('subset'):
        context['subset'] = ''
    else:
        context['subset'] = '_' + context['subset']

    schema = getSchema(entity)
    paths = schema.format_all(context)
    paths_collected = set()
    # Guard against the default templateList=None
    for temp_mask in (templateList or []):
        for path in paths:
            if temp_mask in path[1].name:
                path = path[0].lower().replace(" ", '_').replace('\'', '').replace('\\', '/')
                path_list = path.split('/')
                # Re-append the separator to a bare drive letter ("c:")
                # so os.path.join treats it as an absolute root.
                if path_list[0].endswith(':'):
                    path_list[0] = path_list[0] + os.path.sep
                path = os.path.join(*path_list)
                temppath = os.path.join(root, path)
                paths_collected.add(temppath)

    return list(paths_collected)
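The path cleanup in `getPathsYaml` can be shown in isolation (the input value is made up; the drive-letter fix matters because joining a bare `c:` would otherwise be treated as a relative segment on Windows):

```python
import os

raw = "C:\\Projects\\My Shot\\comp's"
# Lowercase, replace spaces, strip quotes, normalize slashes.
clean = raw.lower().replace(" ", '_').replace("'", '').replace('\\', '/')
parts = clean.split('/')
# Re-append the separator to a bare drive letter so the join is absolute.
if parts[0].endswith(':'):
    parts[0] = parts[0] + os.path.sep
joined = os.path.join(*parts)
```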
@@ -1,617 +0,0 @@
# :coding: utf-8
# :copyright: Copyright (c) 2017 ftrack
import os
import logging
import getpass
import platform
import ftrack_api
import toml
from avalon import io, lib, pipeline
from avalon import session as sess
import acre

from app.api import (
    Templates,
    Logger
)

t = Templates(
    type=["anatomy"]
)


class AppAction(object):
    '''Custom Action base class

    <label> - a descriptive string identifying your action.
    <variant> - To group actions together, give them the same
                label and specify a unique variant per action.
    <identifier> - a unique identifier for the app.
    <description> - a verbose descriptive text for your action
    <icon> - icon in ftrack
    '''

    def __init__(self, session, label, name, executable, variant=None,
                 icon=None, description=None):
        '''Expects a ftrack_api.Session instance'''

        self.logger = logging.getLogger(
            '{0}.{1}'.format(__name__, self.__class__.__name__)
        )

        # self.logger = Logger.getLogger(__name__)

        if label is None:
            raise ValueError('Action missing label.')
        elif name is None:
            raise ValueError('Action missing identifier.')
        elif executable is None:
            raise ValueError('Action missing executable.')

        self._session = session
        self.label = label
        self.identifier = name
        self.executable = executable
        self.variant = variant
        self.icon = icon
        self.description = description

    @property
    def session(self):
        '''Return current session.'''
        return self._session

    def register(self):
        '''Register the action, subscribing to the discover and launch topics.'''
        self.session.event_hub.subscribe(
            'topic=ftrack.action.discover and source.user.username={0}'.format(
                self.session.api_user
            ), self._discover
        )

        self.session.event_hub.subscribe(
            'topic=ftrack.action.launch and data.actionIdentifier={0} and source.user.username={1}'.format(
                self.identifier,
                self.session.api_user
            ),
            self._launch
        )

    def _discover(self, event):
        args = self._translate_event(
            self.session, event
        )

        accepts = self.discover(
            self.session, *args
        )

        if accepts:
            self.logger.info('Selection is valid')
            return {
                'items': [{
                    'label': self.label,
                    'variant': self.variant,
                    'description': self.description,
                    'actionIdentifier': self.identifier,
                    'icon': self.icon,
                }]
            }
        else:
            self.logger.info('Selection is _not_ valid')

    def discover(self, session, entities, event):
        '''Return True if we can handle the selected entities.

        *session* is a `ftrack_api.Session` instance

        *entities* is a list of tuples each containing the entity type and
        the entity id. If the entity is hierarchical you will always get
        the entity type TypedContext; once retrieved through a get operation
        you will have the "real" entity type, e.g. Shot, Sequence
        or Asset Build.

        *event* the unmodified original event

        '''

        entity_type, entity_id = entities[0]
        entity = session.get(entity_type, entity_id)

        # TODO Should return False if not TASK ?!!!
        if entity.entity_type != 'Task':
            return False

        # TODO Should return False if more than one entity is selected ?!!!
        if len(entities) > 1:
            return False

        ft_project = entity['project'] if (entity.entity_type != 'Project') else entity

        os.environ['AVALON_PROJECT'] = ft_project['full_name']
        io.install()
        project = io.find_one({"type": "project", "name": ft_project['full_name']})
        io.uninstall()

        if project is None:
            return False
        else:
            apps = []
            for app in project['config']['apps']:
                apps.append(app['name'].split("_")[0])

            if self.identifier not in apps:
                return False

        return True

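The app check at the end of `discover` compares only the part of the configured app name before the underscore against the action identifier. A standalone sketch (config values below are illustrative):

```python
# Hypothetical project config: app names carry a version suffix such
# as "maya_2019"; only the prefix is compared to the action identifier.
project_apps = [{'name': 'maya_2019'}, {'name': 'nuke_11.3'}]
identifiers = [app['name'].split('_')[0] for app in project_apps]
```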
    def _translate_event(self, session, event):
        '''Return *event* translated structure to be used with the API.'''

        _selection = event['data'].get('selection', [])

        _entities = list()
        for entity in _selection:
            _entities.append(
                (
                    self._get_entity_type(entity), entity.get('entityId')
                )
            )

        return [
            _entities,
            event
        ]

    def _get_entity_type(self, entity):
        '''Return translated entity type that can be used with API.'''
        # Get entity type and make sure it is lower cased. Most places except
        # the component tab in the Sidebar will use lower case notation.
        entity_type = entity.get('entityType').replace('_', '').lower()

        for schema in self.session.schemas:
            alias_for = schema.get('alias_for')

            if (
                alias_for and isinstance(alias_for, str) and
                alias_for.lower() == entity_type
            ):
                return schema['id']

        for schema in self.session.schemas:
            if schema['id'].lower() == entity_type:
                return schema['id']

        raise ValueError(
            'Unable to translate entity type: {0}.'.format(entity_type)
        )

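The normalization in `_get_entity_type` can be sketched without a live session: event payloads carry lowercase, underscore-separated names while schema ids are CamelCase, so both sides are folded before comparing (the schema ids below are illustrative):

```python
# Hypothetical schema ids as they might appear on a session.
schema_ids = ['Task', 'AssetVersion', 'TypedContext']

raw = 'asset_version'
# Drop underscores and lowercase, matching the event-payload notation.
entity_type = raw.replace('_', '').lower()
match = next(s for s in schema_ids if s.lower() == entity_type)
```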
    def _launch(self, event):
        args = self._translate_event(
            self.session, event
        )

        interface = self._interface(
            self.session, *args
        )

        if interface:
            return interface

        response = self.launch(
            self.session, *args
        )

        return self._handle_result(
            self.session, response, *args
        )

    def launch(self, session, entities, event):
        '''Callback method for the custom action.

        Return either a bool (True if successful or False if the action
        failed) or a dictionary with the keys `message` and `success`;
        the message should be a string and will be displayed as feedback
        to the user, success should be a bool, True if successful or
        False if the action failed.

        *session* is a `ftrack_api.Session` instance

        *entities* is a list of tuples each containing the entity type and
        the entity id. If the entity is hierarchical you will always get
        the entity type TypedContext; once retrieved through a get operation
        you will have the "real" entity type, e.g. Shot, Sequence
        or Asset Build.

        *event* the unmodified original event

        '''

        # TODO Delete this line
        print("Action - {0} ({1}) - just started".format(self.label, self.identifier))

        entity, id = entities[0]
        entity = session.get(entity, id)

        silo = "Film"
        if entity.entity_type == "AssetBuild":
            silo = "Asset"

        # set environments for Avalon
        os.environ["AVALON_PROJECT"] = entity['project']['full_name']
        os.environ["AVALON_SILO"] = silo
        os.environ["AVALON_ASSET"] = entity['parent']['name']
        os.environ["AVALON_TASK"] = entity['name']
        os.environ["AVALON_APP"] = self.identifier
        os.environ["AVALON_APP_NAME"] = self.identifier + "_" + self.variant

        anatomy = t.anatomy
        io.install()
        hierarchy = io.find_one({
            "type": 'asset', "name": entity['parent']['name']
        })['data']['parents']
        io.uninstall()
        if hierarchy:
            # hierarchy = os.path.sep.join(hierarchy)
            hierarchy = os.path.join(*hierarchy)

        data = {"project": {"name": entity['project']['full_name'],
                            "code": entity['project']['name']},
                "task": entity['name'],
                "asset": entity['parent']['name'],
                "hierarchy": hierarchy}

        anatomy = anatomy.format(data)

        os.environ["AVALON_WORKDIR"] = os.path.join(anatomy.work.root, anatomy.work.folder)

        # TODO Add paths to avalon setup from tomls
        if self.identifier == 'maya':
            os.environ['PYTHONPATH'] += os.pathsep + \
                os.path.join(os.getenv("AVALON_CORE"), 'setup', 'maya')
        elif self.identifier == 'nuke':
            os.environ['NUKE_PATH'] = os.pathsep + \
                os.path.join(os.getenv("AVALON_CORE"), 'setup', 'nuke')
        # config = toml.load(lib.which_app(self.identifier + "_" + self.variant))

        env = os.environ

        # collect all parents from the task
        parents = []
        for item in entity['link']:
            parents.append(session.get(item['type'], item['id']))

        # collect all the 'environment' attributes from parents
        tools_attr = [os.environ["AVALON_APP_NAME"]]
        for parent in reversed(parents):
            # check if the attribute is empty, if not use it
            if parent['custom_attributes']['tools_env']:
                tools_attr.extend(parent['custom_attributes']['tools_env'])
                break

        tools_env = acre.get_tools(tools_attr)
        env = acre.compute(tools_env)
        env = acre.merge(env, current_env=dict(os.environ))

        # Get path to execute
        st_temp_path = os.environ['PYPE_STUDIO_TEMPLATES']
        os_plat = platform.system().lower()

        # Path to folder with launchers
        path = os.path.join(st_temp_path, 'bin', os_plat)
        # Full path to executable launcher
        execfile = None

        for ext in os.environ["PATHEXT"].split(os.pathsep):
            fpath = os.path.join(path.strip('"'), self.executable + ext)
            if os.path.isfile(fpath) and os.access(fpath, os.X_OK):
                execfile = fpath
                break

        # Run the application if an executable launcher was found
        if execfile is not None:
            lib.launch(executable=execfile, args=[], environment=env)
        else:
            return {
                'success': False,
                'message': "No launcher was found for {0}".format(self.label)
            }

        # RUN TIMER IN FTRACK
        username = event['source']['user']['username']
        user = session.query('User where username is "{}"'.format(username)).one()
        task = session.query('Task where id is {}'.format(entity['id'])).one()
        print('Starting timer for task: ' + task['name'])
        user.start_timer(task, force=True)

        return {
            'success': True,
            'message': "Launching {0}".format(self.label)
        }

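The launcher search above walks the `PATHEXT`-style extension list and returns the first executable file it finds. A self-contained sketch of that loop (helper name and extension list are illustrative):

```python
import os
import tempfile

def find_launcher(folder, name, pathext):
    # Try each extension from a PATHEXT-style list until an executable
    # file is found; return None if nothing matches.
    for ext in pathext.split(os.pathsep):
        fpath = os.path.join(folder, name + ext)
        if os.path.isfile(fpath) and os.access(fpath, os.X_OK):
            return fpath
    return None

with tempfile.TemporaryDirectory() as folder:
    launcher = os.path.join(folder, 'maya.sh')
    with open(launcher, 'w') as f:
        f.write('#!/bin/sh\n')
    os.chmod(launcher, 0o755)
    found = find_launcher(folder, 'maya', os.pathsep.join(['.bat', '.sh']))
```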
    def _interface(self, *args):
        interface = self.interface(*args)

        if interface:
            return {
                'items': interface
            }

    def interface(self, session, entities, event):
        '''Return an interface if applicable, otherwise None.

        *session* is a `ftrack_api.Session` instance

        *entities* is a list of tuples each containing the entity type and
        the entity id. If the entity is hierarchical you will always get
        the entity type TypedContext; once retrieved through a get operation
        you will have the "real" entity type, e.g. Shot, Sequence
        or Asset Build.

        *event* the unmodified original event
        '''
        return None

    def _handle_result(self, session, result, entities, event):
        '''Validate the returned result from the action callback'''
        if isinstance(result, bool):
            result = {
                'success': result,
                'message': (
                    '{0} launched successfully.'.format(
                        self.label
                    )
                )
            }

        elif isinstance(result, dict):
            for key in ('success', 'message'):
                if key in result:
                    continue

                raise KeyError(
                    'Missing required key: {0}.'.format(key)
                )

        else:
            self.logger.error(
                'Invalid result type, must be bool or dictionary!'
            )

        return result


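The bool-to-dict normalization in `_handle_result` can be exercised standalone (the helper name is invented for illustration):

```python
def normalize_result(result, label):
    # A bare bool is expanded into the dict form; a dict must already
    # carry both required keys.
    if isinstance(result, bool):
        return {
            'success': result,
            'message': '{0} launched successfully.'.format(label),
        }
    if isinstance(result, dict):
        for key in ('success', 'message'):
            if key not in result:
                raise KeyError('Missing required key: {0}.'.format(key))
        return result
    raise TypeError('Invalid result type, must be bool or dictionary!')

ok = normalize_result(True, 'Maya')
```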
class BaseAction(object):
    '''Custom Action base class

    `label` a descriptive string identifying your action.

    `variant` To group actions together, give them the same
              label and specify a unique variant per action.

    `identifier` a unique identifier for your action.

    `description` a verbose descriptive text for your action

    '''
    label = None
    variant = None
    identifier = None
    description = None
    icon = None

    def __init__(self, session):
        '''Expects a ftrack_api.Session instance'''

        self.logger = logging.getLogger(
            '{0}.{1}'.format(__name__, self.__class__.__name__)
        )

        if self.label is None:
            raise ValueError(
                'Action missing label.'
            )

        elif self.identifier is None:
            raise ValueError(
                'Action missing identifier.'
            )

        self._session = session

    @property
    def session(self):
        '''Return current session.'''
        return self._session

    def reset_session(self):
        self.session.reset()

    def register(self):
        '''Register the action, subscribing to the discover and launch topics.'''
        self.session.event_hub.subscribe(
            'topic=ftrack.action.discover and source.user.username={0}'.format(
                self.session.api_user
            ), self._discover
        )

        self.session.event_hub.subscribe(
            'topic=ftrack.action.launch and data.actionIdentifier={0} and source.user.username={1}'.format(
                self.identifier,
                self.session.api_user
            ),
            self._launch
        )
        print("----- action - <" + self.__class__.__name__ + "> - Has been registered -----")

    def _discover(self, event):
        args = self._translate_event(
            self.session, event
        )

        accepts = self.discover(
            self.session, *args
        )

        if accepts:
            self.logger.info(u'Discovering action with selection: {0}'.format(
                args[1]['data'].get('selection', [])))
            return {
                'items': [{
                    'label': self.label,
                    'variant': self.variant,
                    'description': self.description,
                    'actionIdentifier': self.identifier,
                    'icon': self.icon,
                }]
            }

    def discover(self, session, entities, event):
        '''Return True if we can handle the selected entities.

        *session* is a `ftrack_api.Session` instance

        *entities* is a list of tuples each containing the entity type and
        the entity id. If the entity is hierarchical you will always get
        the entity type TypedContext; once retrieved through a get operation
        you will have the "real" entity type, e.g. Shot, Sequence
        or Asset Build.

        *event* the unmodified original event

        '''

        return False

    def _translate_event(self, session, event):
        '''Return *event* translated structure to be used with the API.'''

        _selection = event['data'].get('selection', [])

        _entities = list()
        for entity in _selection:
            _entities.append(
                (
                    session.get(self._get_entity_type(entity), entity.get('entityId'))
                    # self._get_entity_type(entity), entity.get('entityId')
                )
            )

        return [
            _entities,
            event
        ]

    def _get_entity_type(self, entity):
        '''Return translated entity type that can be used with API.'''
        # Get entity type and make sure it is lower cased. Most places except
        # the component tab in the Sidebar will use lower case notation.
        entity_type = entity.get('entityType').replace('_', '').lower()

        for schema in self.session.schemas:
            alias_for = schema.get('alias_for')

            if (
                alias_for and isinstance(alias_for, str) and
                alias_for.lower() == entity_type
            ):
                return schema['id']

        for schema in self.session.schemas:
            if schema['id'].lower() == entity_type:
                return schema['id']

        raise ValueError(
            'Unable to translate entity type: {0}.'.format(entity_type)
        )

    def _launch(self, event):
        self.reset_session()
        args = self._translate_event(
            self.session, event
        )

        interface = self._interface(
            self.session, *args
        )

        if interface:
            return interface

        response = self.launch(
            self.session, *args
        )

        return self._handle_result(
            self.session, response, *args
        )

    def launch(self, session, entities, event):
        '''Callback method for the custom action.

        Return either a bool (True if successful or False if the action
        failed) or a dictionary with the keys `message` and `success`;
        the message should be a string and will be displayed as feedback
        to the user, success should be a bool, True if successful or
        False if the action failed.

        *session* is a `ftrack_api.Session` instance

        *entities* is a list of tuples each containing the entity type and
        the entity id. If the entity is hierarchical you will always get
        the entity type TypedContext; once retrieved through a get operation
        you will have the "real" entity type, e.g. Shot, Sequence
        or Asset Build.

        *event* the unmodified original event

        '''
        raise NotImplementedError()

    def _interface(self, *args):
        interface = self.interface(*args)

        if interface:
            return {
                'items': interface
            }

    def interface(self, session, entities, event):
        '''Return an interface if applicable, otherwise None.

        *session* is a `ftrack_api.Session` instance

        *entities* is a list of tuples each containing the entity type and
        the entity id. If the entity is hierarchical you will always get
        the entity type TypedContext; once retrieved through a get operation
        you will have the "real" entity type, e.g. Shot, Sequence
        or Asset Build.

        *event* the unmodified original event
        '''
        return None

    def _handle_result(self, session, result, entities, event):
        '''Validate the returned result from the action callback'''
        if isinstance(result, bool):
            result = {
                'success': result,
                'message': (
                    '{0} launched successfully.'.format(
                        self.label
                    )
                )
            }

        elif isinstance(result, dict):
            for key in ('success', 'message'):
                if key in result:
                    continue

                raise KeyError(
                    'Missing required key: {0}.'.format(key)
                )

        else:
            self.logger.error(
                'Invalid result type, must be bool or dictionary!'
            )

        return result

@@ -1,65 +0,0 @@
import os
import toml

import ftrack_api
import appdirs

config_path = os.path.normpath(appdirs.user_data_dir('pype-app', 'pype'))
config_name = 'ftrack_cred.toml'
fpath = os.path.join(config_path, config_name)
folder = os.path.dirname(fpath)

if not os.path.isdir(folder):
    os.makedirs(folder)


def _get_credentials():

    folder = os.path.dirname(fpath)

    if not os.path.isdir(folder):
        os.makedirs(folder)

    # Create the file first if it does not exist yet
    if not os.path.isfile(fpath):
        open(fpath, 'w').close()

    with open(fpath, 'r') as file:
        credentials = toml.load(file)

    return credentials


def _save_credentials(username, apiKey):
    data = {
        'username': username,
        'apiKey': apiKey
    }

    with open(fpath, 'w') as file:
        file.write(toml.dumps(data))


def _clear_credentials():
    open(fpath, 'w').close()


def _set_env(username, apiKey):
    os.environ['FTRACK_API_USER'] = username
    os.environ['FTRACK_API_KEY'] = apiKey


def _check_credentials(username=None, apiKey=None):

    if username and apiKey:
        _set_env(username, apiKey)

    try:
        session = ftrack_api.Session()
        session.close()
    except Exception as e:
        print(e)
        return False

    return True
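The save/load round trip above can be sketched standalone; json stands in for toml here purely so the example stays standard-library only (the helper names are invented for illustration):

```python
import json
import os
import tempfile

def save_credentials(path, username, api_key):
    # Persist the credential pair, mirroring _save_credentials above.
    with open(path, 'w') as f:
        json.dump({'username': username, 'apiKey': api_key}, f)

def load_credentials(path):
    # Missing file means no stored credentials yet.
    if not os.path.isfile(path):
        return {}
    with open(path) as f:
        return json.load(f)

with tempfile.TemporaryDirectory() as folder:
    path = os.path.join(folder, 'ftrack_cred.json')
    save_credentials(path, 'jane.doe', 'secret-key')
    creds = load_credentials(path)
```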
381 pype/ftrack/events/action_sync_hier_attrs.py Normal file

@@ -0,0 +1,381 @@
import os
import sys
import json
import argparse
import logging
import collections

from pypeapp import config
from pype.vendor import ftrack_api
from pype.ftrack import BaseAction, lib
from avalon.tools.libraryloader.io_nonsingleton import DbConnector
from bson.objectid import ObjectId


class SyncHierarchicalAttrs(BaseAction):

    db_con = DbConnector()
    ca_mongoid = lib.get_ca_mongoid()

    #: Action identifier.
    identifier = 'sync.hierarchical.attrs'
    #: Action label.
    label = 'Sync HierAttrs'
    #: Action description.
    description = 'Synchronize hierarchical attributes'
    #: Icon
    icon = '{}/ftrack/action_icons/SyncHierarchicalAttrs.svg'.format(
        os.environ.get(
            'PYPE_STATICS_SERVER',
            'http://localhost:{}'.format(
                config.get_presets().get('services', {}).get(
                    'statics_server', {}
                ).get('default_port', 8021)
            )
        )
    )

    def register(self):
        self.session.event_hub.subscribe(
            'topic=ftrack.action.discover',
            self._discover
        )

        self.session.event_hub.subscribe(
            'topic=ftrack.action.launch and data.actionIdentifier={}'.format(
                self.identifier
            ),
            self._launch
        )

    def discover(self, session, entities, event):
        ''' Validation '''
        role_check = False
        discover = False
        role_list = ['Pypeclub', 'Administrator', 'Project Manager']
        user = session.query(
            'User where id is "{}"'.format(event['source']['user']['id'])
        ).one()

        for role in user['user_security_roles']:
            if role['security_role']['name'] in role_list:
                role_check = True
                break

        if role_check is True:
            for entity in entities:
                context_type = entity.get('context_type', '').lower()
                if (
                    context_type in ('show', 'task') and
                    entity.entity_type.lower() != 'task'
                ):
                    discover = True
                    break

        return discover

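The role gate at the top of `discover` only shows the action to users holding one of the listed security roles. A standalone sketch (the user record below is illustrative):

```python
role_list = ['Pypeclub', 'Administrator', 'Project Manager']

# Hypothetical user_security_roles payload shape.
user_roles = [
    {'security_role': {'name': 'User'}},
    {'security_role': {'name': 'Project Manager'}},
]

# True as soon as any of the user's roles matches the allow-list.
role_check = any(
    role['security_role']['name'] in role_list for role in user_roles
)
```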
    def launch(self, session, entities, event):
        self.interface_messages = {}

        user = session.query(
            'User where id is "{}"'.format(event['source']['user']['id'])
        ).one()

        job = session.create('Job', {
            'user': user,
            'status': 'running',
            'data': json.dumps({
                'description': 'Sync Hierarchical attributes'
            })
        })
        session.commit()
        self.log.debug('Job with id "{}" created'.format(job['id']))

        process_session = ftrack_api.Session(
            server_url=session.server_url,
            api_key=session.api_key,
            api_user=session.api_user,
            auto_connect_event_hub=True
        )
        try:
            # Collect hierarchical attrs
            self.log.debug('Collecting Hierarchical custom attributes started')
            custom_attributes = {}
            all_avalon_attr = process_session.query(
                'CustomAttributeGroup where name is "avalon"'
            ).one()

            error_key = (
                'Hierarchical attributes with set "default" value (not allowed)'
            )

            for cust_attr in all_avalon_attr['custom_attribute_configurations']:
                if 'avalon_' in cust_attr['key']:
                    continue

                if not cust_attr['is_hierarchical']:
                    continue

                if cust_attr['default']:
                    if error_key not in self.interface_messages:
                        self.interface_messages[error_key] = []
                    self.interface_messages[error_key].append(
                        cust_attr['label']
                    )

                    self.log.warning((
                        'Custom attribute "{}" has set default value.'
                        ' This attribute can\'t be synchronized'
                    ).format(cust_attr['label']))
                    continue

                custom_attributes[cust_attr['key']] = cust_attr

            self.log.debug(
                'Collecting Hierarchical custom attributes has finished'
            )

            if not custom_attributes:
                msg = 'No hierarchical attributes to sync.'
                self.log.debug(msg)
                return {
                    'success': True,
                    'message': msg
                }

            entity = entities[0]
            if entity.entity_type.lower() == 'project':
                project_name = entity['full_name']
            else:
                project_name = entity['project']['full_name']

            self.db_con.install()
            self.db_con.Session['AVALON_PROJECT'] = project_name

            _entities = self._get_entities(event, process_session)

            for entity in _entities:
                self.log.debug(30 * '-')
                self.log.debug(
                    'Processing entity "{}"'.format(entity.get('name', entity))
                )

                ent_name = entity.get('name', entity)
                if entity.entity_type.lower() == 'project':
                    ent_name = entity['full_name']

                for key in custom_attributes:
                    self.log.debug(30 * '*')
                    self.log.debug(
                        'Processing Custom attribute key "{}"'.format(key)
                    )
                    # check if entity has that attribute
                    if key not in entity['custom_attributes']:
                        error_key = 'Missing key on entities'
                        if error_key not in self.interface_messages:
                            self.interface_messages[error_key] = []

                        self.interface_messages[error_key].append(
                            '- key: "{}" - entity: "{}"'.format(key, ent_name)
                        )

                        self.log.error((
                            '- key "{}" not found on "{}"'
                        ).format(key, entity.get('name', entity)))
                        continue

                    value = self.get_hierarchical_value(key, entity)
                    if value is None:
                        error_key = (
                            'Missing value for key on entity'
                            ' and its parents (synchronization was skipped)'
                        )
                        if error_key not in self.interface_messages:
                            self.interface_messages[error_key] = []

                        self.interface_messages[error_key].append(
                            '- key: "{}" - entity: "{}"'.format(key, ent_name)
                        )

                        self.log.warning((
                            '- key "{}" not set on "{}" or its parents'
                        ).format(key, ent_name))
                        continue

                    self.update_hierarchical_attribute(entity, key, value)

            job['status'] = 'done'
            session.commit()

        except Exception:
            self.log.error(
                'Action "{}" failed'.format(self.label),
                exc_info=True
            )

        finally:
            self.db_con.uninstall()

            if job['status'] in ('queued', 'running'):
                job['status'] = 'failed'
            session.commit()

            if self.interface_messages:
                self.show_interface_from_dict(self.interface_messages, event)

        return True

    def get_hierarchical_value(self, key, entity):
        value = entity['custom_attributes'][key]
        if (
            value is not None or
            entity.entity_type.lower() == 'project'
        ):
            return value

        return self.get_hierarchical_value(key, entity['parent'])

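A minimal stand-in for `get_hierarchical_value` using plain dicts: walk up the parent chain until a non-None value is found or the chain root (the project) is reached. The entity shapes below are invented for illustration:

```python
def hierarchical_value(key, entity):
    # Return the entity's own value if set; otherwise recurse upward.
    value = entity['custom_attributes'].get(key)
    if value is not None or entity['parent'] is None:
        return value
    return hierarchical_value(key, entity['parent'])

project = {'custom_attributes': {'fps': 25}, 'parent': None}
shot = {'custom_attributes': {'fps': None}, 'parent': project}
```

The shot has no `fps` of its own, so the lookup falls through to the project value.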
    def update_hierarchical_attribute(self, entity, key, value):
        if (
            entity['context_type'].lower() not in ('show', 'task') or
            entity.entity_type.lower() == 'task'
        ):
            return

        ent_name = entity.get('name', entity)
        if entity.entity_type.lower() == 'project':
            ent_name = entity['full_name']

        hierarchy = '/'.join(
            [a['name'] for a in entity.get('ancestors', [])]
        )
        if hierarchy:
            hierarchy = '/'.join(
                [entity['project']['full_name'], hierarchy, entity['name']]
            )
        elif entity.entity_type.lower() == 'project':
            hierarchy = entity['full_name']
        else:
            hierarchy = '/'.join(
                [entity['project']['full_name'], entity['name']]
            )

        self.log.debug('- updating entity "{}"'.format(hierarchy))

        # collect entity's custom attributes
        custom_attributes = entity.get('custom_attributes')
        if not custom_attributes:
            return

        mongoid = custom_attributes.get(self.ca_mongoid)
        if not mongoid:
            error_key = 'Missing MongoID on entities (try SyncToAvalon first)'
            if error_key not in self.interface_messages:
                self.interface_messages[error_key] = []

            if ent_name not in self.interface_messages[error_key]:
                self.interface_messages[error_key].append(ent_name)

            self.log.warning(
                '-- entity "{}" is not synchronized to avalon. Skipping'.format(
                    ent_name
                )
            )
            return

        try:
            mongoid = ObjectId(mongoid)
        except Exception:
            error_key = 'Invalid MongoID on entities (try SyncToAvalon)'
            if error_key not in self.interface_messages:
                self.interface_messages[error_key] = []

            if ent_name not in self.interface_messages[error_key]:
                self.interface_messages[error_key].append(ent_name)

            self.log.warning(
                '-- entity "{}" has stored invalid MongoID. Skipping'.format(
                    ent_name
                )
            )
            return

        # Find entity in Mongo DB
        mongo_entity = self.db_con.find_one({'_id': mongoid})
        if not mongo_entity:
            error_key = 'Entities not found in Avalon DB (try SyncToAvalon)'
            if error_key not in self.interface_messages:
                self.interface_messages[error_key] = []

            if ent_name not in self.interface_messages[error_key]:
                self.interface_messages[error_key].append(ent_name)

            self.log.warning(
                '-- entity "{}" was not found in DB by id "{}". Skipping'.format(
                    ent_name, str(mongoid)
                )
            )
            return

        # Change value if entity has set its own
        entity_value = custom_attributes[key]
        if entity_value is not None:
            value = entity_value

        data = mongo_entity.get('data') or {}

        data[key] = value
        self.db_con.update_many(
            {'_id': mongoid},
            {'$set': {'data': data}}
        )

        for child in entity.get('children', []):
            self.update_hierarchical_attribute(child, key, value)
|
||||
|
||||
|
||||
def register(session, **kw):
|
||||
'''Register plugin. Called when used as an plugin.'''
|
||||
|
||||
if not isinstance(session, ftrack_api.session.Session):
|
||||
return
|
||||
|
||||
SyncHierarchicalAttrs(session).register()
|
||||
|
||||
|
||||
def main(arguments=None):
|
||||
'''Set up logging and register action.'''
|
||||
if arguments is None:
|
||||
arguments = []
|
||||
|
||||
parser = argparse.ArgumentParser()
|
||||
# Allow setting of logging level from arguments.
|
||||
loggingLevels = {}
|
||||
for level in (
|
||||
logging.NOTSET, logging.DEBUG, logging.INFO, logging.WARNING,
|
||||
logging.ERROR, logging.CRITICAL
|
||||
):
|
||||
loggingLevels[logging.getLevelName(level).lower()] = level
|
||||
|
||||
parser.add_argument(
|
||||
'-v', '--verbosity',
|
||||
help='Set the logging output verbosity.',
|
||||
choices=loggingLevels.keys(),
|
||||
default='info'
|
||||
)
|
||||
namespace = parser.parse_args(arguments)
|
||||
|
||||
# Set up basic logging
|
||||
logging.basicConfig(level=loggingLevels[namespace.verbosity])
|
||||
|
||||
session = ftrack_api.Session()
|
||||
register(session)
|
||||
|
||||
# Wait for events
|
||||
logging.info(
|
||||
'Registered actions and listening for events. Use Ctrl-C to abort.'
|
||||
)
|
||||
session.event_hub.wait()
|
||||
|
||||
|
||||
if __name__ == '__main__':
|
||||
raise SystemExit(main(sys.argv[1:]))
|
||||
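The parent-fallback lookup in `get_hierarchical_value` above can be sketched with plain dictionaries; the data shapes below are hypothetical stand-ins for the real ftrack entities:

```python
def get_hierarchical_value(key, entity):
    # Return the entity's own value, or walk up the parent chain
    # until a value is found or the project root is reached.
    value = entity['custom_attributes'].get(key)
    if value is not None or entity['entity_type'] == 'project':
        return value
    return get_hierarchical_value(key, entity['parent'])


project = {'entity_type': 'project', 'custom_attributes': {'fps': 25}}
shot = {
    'entity_type': 'shot',
    'custom_attributes': {'fps': None},
    'parent': project,
}

print(get_hierarchical_value('fps', shot))  # falls back to the project: 25
```

The recursion stops either at the first entity that carries its own value or at the project, which mirrors how hierarchical custom attributes cascade in ftrack.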
293
pype/ftrack/events/action_sync_to_avalon.py
Normal file
@@ -0,0 +1,293 @@
import os
import sys
import argparse
import logging
import json

from pypeapp import config
from pype.vendor import ftrack_api
from pype.ftrack import BaseAction, lib
from pype.vendor.ftrack_api import session as fa_session


class Sync_To_Avalon(BaseAction):
    '''
    Synchronizing data action - from Ftrack to Avalon DB

    Stores all information about an entity.
    - Name(string) - Most important information = identifier of entity
    - Parent(ObjectId) - Avalon Project Id, if entity is not project itself
    - Silo(string) - Last parent except project
    - Data(dictionary):
        - VisualParent(ObjectId) - Avalon Id of parent asset
        - Parents(array of string) - All parent names except project
        - Tasks(array of string) - Tasks on asset
        - FtrackId(string)
        - entityType(string) - entity's type on Ftrack
        * All Custom attributes in group 'Avalon' whose name doesn't start
          with 'avalon_'

    * This information is also stored for all parent and child entities.

    Avalon ID of asset is stored to Ftrack -> Custom attribute 'avalon_mongo_id'.
    - the action DOES NOT create this Custom attribute if it doesn't exist
    - run 'Create Custom Attributes' action or do it manually (Not recommended)

    If Ftrack entity already has Custom Attribute 'avalon_mongo_id' that stores ID:
    - name, parents and silo are checked -> shows an error if they are not
      exactly the same
    - after sync it is not allowed to change names or move entities

    If ID in 'avalon_mongo_id' is an empty string or is not found in DB:
    - tries to find entity by name
    - found:
        - raises an error if ftrackId/visual parent/parents are not the same
    - not found:
        - creates asset/project

    '''

    #: Action identifier.
    identifier = 'sync.to.avalon'
    #: Action label.
    label = 'SyncToAvalon'
    #: Action description.
    description = 'Send data from Ftrack to Avalon'
    #: Action icon.
    icon = '{}/ftrack/action_icons/SyncToAvalon.svg'.format(
        os.environ.get(
            'PYPE_STATICS_SERVER',
            'http://localhost:{}'.format(
                config.get_presets().get('services', {}).get(
                    'statics_server', {}
                ).get('default_port', 8021)
            )
        )
    )

    def register(self):
        self.session.event_hub.subscribe(
            'topic=ftrack.action.discover',
            self._discover
        )

        self.session.event_hub.subscribe(
            'topic=ftrack.action.launch and data.actionIdentifier={0}'.format(
                self.identifier
            ),
            self._launch
        )

    def discover(self, session, entities, event):
        ''' Validation '''
        roleCheck = False
        discover = False
        roleList = ['Pypeclub', 'Administrator', 'Project Manager']
        userId = event['source']['user']['id']
        user = session.query('User where id is ' + userId).one()

        for role in user['user_security_roles']:
            if role['security_role']['name'] in roleList:
                roleCheck = True
                break
        if roleCheck is True:
            for entity in entities:
                if entity.entity_type.lower() not in ['task', 'assetversion']:
                    discover = True
                    break

        return discover

    def launch(self, session, entities, event):
        message = ""

        # JOB SETTINGS
        userId = event['source']['user']['id']
        user = session.query('User where id is ' + userId).one()

        job = session.create('Job', {
            'user': user,
            'status': 'running',
            'data': json.dumps({
                'description': 'Sync Ftrack to Avalon.'
            })
        })
        session.commit()
        try:
            self.importable = []

            # get from top entity in hierarchy all parent entities
            top_entity = entities[0]['link']
            if len(top_entity) > 1:
                for e in top_entity:
                    parent_entity = session.get(e['type'], e['id'])
                    self.importable.append(parent_entity)

            # get all child entities separately/unique
            for entity in entities:
                self.add_childs_to_importable(entity)

            # Check names: REGEX in schema/duplicates - raise error if found
            all_names = []
            duplicates = []

            for e in self.importable:
                lib.avalon_check_name(e)
                if e['name'] in all_names:
                    duplicates.append("'{}'".format(e['name']))
                else:
                    all_names.append(e['name'])

            if len(duplicates) > 0:
                raise ValueError(
                    "Entity name duplication: {}".format(", ".join(duplicates))
                )

            # ----- PROJECT ------
            # store Ftrack project - self.importable[0] must be project entity!
            ft_project = self.importable[0]
            avalon_project = lib.get_avalon_project(ft_project)
            custom_attributes = lib.get_avalon_attr(session)

            # Import all entities to Avalon DB
            for entity in self.importable:
                result = lib.import_to_avalon(
                    session=session,
                    entity=entity,
                    ft_project=ft_project,
                    av_project=avalon_project,
                    custom_attributes=custom_attributes
                )

                if 'errors' in result and len(result['errors']) > 0:
                    job['status'] = 'failed'
                    session.commit()

                    lib.show_errors(self, event, result['errors'])

                    return {
                        'success': False,
                        'message': "Sync to avalon FAILED"
                    }

                if avalon_project is None:
                    if 'project' in result:
                        avalon_project = result['project']

            job['status'] = 'done'
            session.commit()

        except ValueError as ve:
            job['status'] = 'failed'
            session.commit()
            message = str(ve)
            self.log.error(
                'Error during syncToAvalon: {}'.format(message),
                exc_info=True
            )

        except Exception as e:
            job['status'] = 'failed'
            session.commit()
            exc_type, exc_obj, exc_tb = sys.exc_info()
            fname = os.path.split(exc_tb.tb_frame.f_code.co_filename)[1]
            log_message = "{}/{}/Line: {}".format(
                exc_type, fname, exc_tb.tb_lineno
            )
            self.log.error(
                'Error during syncToAvalon: {}'.format(log_message),
                exc_info=True
            )
            message = (
                'Unexpected Error'
                ' - Please check Log for more information'
            )

        finally:
            if job['status'] in ['queued', 'running']:
                job['status'] = 'failed'

            session.commit()

            event = fa_session.ftrack_api.event.base.Event(
                topic='ftrack.action.launch',
                data=dict(
                    actionIdentifier='sync.hierarchical.attrs',
                    selection=event['data']['selection']
                ),
                source=dict(
                    user=event['source']['user']
                )
            )
            session.event_hub.publish(event, on_error='ignore')

        if len(message) > 0:
            message = "Unable to sync: {}".format(message)
            return {
                'success': False,
                'message': message
            }

        return {
            'success': True,
            'message': "Synchronization was successful"
        }

    def add_childs_to_importable(self, entity):
        if not (entity.entity_type in ['Task']):
            if entity not in self.importable:
                self.importable.append(entity)

            if entity['children']:
                childrens = entity['children']
                for child in childrens:
                    self.add_childs_to_importable(child)


def register(session, **kw):
    '''Register plugin. Called when used as a plugin.'''

    # Validate that session is an instance of ftrack_api.Session. If not,
    # assume that register is being called from an old or incompatible API and
    # return without doing anything.
    if not isinstance(session, ftrack_api.session.Session):
        return

    Sync_To_Avalon(session).register()


def main(arguments=None):
    '''Set up logging and register action.'''
    if arguments is None:
        arguments = []

    parser = argparse.ArgumentParser()
    # Allow setting of logging level from arguments.
    loggingLevels = {}
    for level in (
        logging.NOTSET, logging.DEBUG, logging.INFO, logging.WARNING,
        logging.ERROR, logging.CRITICAL
    ):
        loggingLevels[logging.getLevelName(level).lower()] = level

    parser.add_argument(
        '-v', '--verbosity',
        help='Set the logging output verbosity.',
        choices=loggingLevels.keys(),
        default='info'
    )
    namespace = parser.parse_args(arguments)

    # Set up basic logging
    logging.basicConfig(level=loggingLevels[namespace.verbosity])

    session = ftrack_api.Session()
    register(session)

    # Wait for events
    logging.info(
        'Registered actions and listening for events. Use Ctrl-C to abort.'
    )
    session.event_hub.wait()


if __name__ == '__main__':
    raise SystemExit(main(sys.argv[1:]))
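The duplicate-name guard in `launch()` above can be isolated as a small helper; this is a sketch with hypothetical entity names, not the real ftrack entities:

```python
def find_duplicates(names):
    # Collect names that appear more than once, mirroring the
    # all_names/duplicates bookkeeping in the launch() loop above.
    seen = []
    duplicates = []
    for name in names:
        if name in seen:
            duplicates.append("'{}'".format(name))
        else:
            seen.append(name)
    return duplicates


dups = find_duplicates(['sh010', 'sh020', 'sh010'])
print(dups)  # ["'sh010'"]
```

When the returned list is non-empty, the action raises a `ValueError` listing the offending names, which marks the job as failed and aborts the sync.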
59
pype/ftrack/events/event_del_avalon_id_from_new.py
Normal file
@@ -0,0 +1,59 @@
from pype.vendor import ftrack_api
from pype.ftrack import BaseEvent, get_ca_mongoid
from pype.ftrack.events.event_sync_to_avalon import Sync_to_Avalon


class DelAvalonIdFromNew(BaseEvent):
    '''
    This event removes AvalonId from custom attributes of new entities
    Result:
    - 'Copy->Pasted' entities won't have the same AvalonID as source entity

    Priority of this event must be less than SyncToAvalon event
    '''
    priority = Sync_to_Avalon.priority - 1

    def launch(self, session, event):
        created = []
        entities = event['data']['entities']
        for entity in entities:
            try:
                entity_id = entity['entityId']

                if entity.get('action', None) == 'add':
                    id_dict = entity['changes']['id']

                    if id_dict['new'] is not None and id_dict['old'] is None:
                        created.append(id_dict['new'])

                elif (
                    entity.get('action', None) == 'update' and
                    get_ca_mongoid() in entity['keys'] and
                    entity_id in created
                ):
                    ftrack_entity = session.get(
                        self._get_entity_type(entity),
                        entity_id
                    )

                    cust_attr = ftrack_entity['custom_attributes'][
                        get_ca_mongoid()
                    ]

                    if cust_attr != '':
                        ftrack_entity['custom_attributes'][
                            get_ca_mongoid()
                        ] = ''
                        session.commit()

            except Exception:
                session.rollback()
                continue


def register(session, **kw):
    '''Register plugin. Called when used as a plugin.'''
    if not isinstance(session, ftrack_api.session.Session):
        return

    DelAvalonIdFromNew(session).register()
94
pype/ftrack/events/event_next_task_update.py
Normal file
@@ -0,0 +1,94 @@
from pype.vendor import ftrack_api
from pype.ftrack import BaseEvent
import operator


class NextTaskUpdate(BaseEvent):

    def get_next_task(self, task, session):
        parent = task['parent']
        # tasks = parent['tasks']
        tasks = parent['children']

        def sort_types(types):
            data = {}
            for t in types:
                data[t] = t.get('sort')

            data = sorted(data.items(), key=operator.itemgetter(1))
            results = []
            for item in data:
                results.append(item[0])
            return results

        types_sorted = sort_types(session.query('Type'))
        next_types = None
        for t in types_sorted:
            if t['id'] == task['type_id']:
                next_types = types_sorted[(types_sorted.index(t) + 1):]

        for nt in next_types:
            for t in tasks:
                if nt['id'] == t['type_id']:
                    return t

        return None

    def launch(self, session, event):
        '''Propagates status from version to task when changed'''

        # self.log.info(event)
        # start of event procedure ----------------------------------

        for entity in event['data'].get('entities', []):
            changes = entity.get('changes', None)
            if changes is None:
                continue
            statusid_changes = changes.get('statusid', {})
            if (
                entity['entityType'] != 'task' or
                'statusid' not in entity['keys'] or
                statusid_changes.get('new', None) is None or
                statusid_changes.get('old', None) is None
            ):
                continue

            task = session.get('Task', entity['entityId'])

            status = session.get('Status',
                                 entity['changes']['statusid']['new'])
            state = status['state']['name']

            next_task = self.get_next_task(task, session)

            # Set next task to Ready, if it is NOT READY
            if next_task and state == 'Done':
                if next_task['status']['name'].lower() == 'not ready':

                    # Get path to task
                    path = task['name']
                    for p in task['ancestors']:
                        path = p['name'] + '/' + path

                    # Set next task status
                    try:
                        query = 'Status where name is "{}"'.format('Ready')
                        status_to_set = session.query(query).one()
                        next_task['status'] = status_to_set
                        session.commit()
                        self.log.info((
                            '>>> [ {} ] updated to [ Ready ]'
                        ).format(path))
                    except Exception as e:
                        self.log.warning((
                            "!!! [ {} ] status couldn't be set: [ {} ]"
                        ).format(path, e))
                        session.rollback()


def register(session, **kw):
    '''Register plugin. Called when used as a plugin.'''
    if not isinstance(session, ftrack_api.session.Session):
        return

    NextTaskUpdate(session).register()
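The ordering logic in `get_next_task` above - sort task types by their `sort` attribute, then take everything after the current type as candidates - can be sketched with plain dictionaries (hypothetical type data; the real code sorts ftrack `Type` entities the same way):

```python
import operator

# Hypothetical task types carrying the 'sort' attribute ftrack exposes.
types = [
    {'name': 'Compositing', 'sort': 3},
    {'name': 'Modeling', 'sort': 1},
    {'name': 'Animation', 'sort': 2},
]

# Order types by their 'sort' value, as sort_types() does.
ordered = sorted(types, key=operator.itemgetter('sort'))

# Everything after the current type is a candidate "next" type.
current = types[2]  # 'Animation'
next_types = ordered[ordered.index(current) + 1:]

print([t['name'] for t in next_types])  # ['Compositing']
```

The event then picks the first candidate type that actually has a task under the same parent, so gaps in the pipeline (a type with no task) are skipped.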
42
pype/ftrack/events/event_radio_buttons.py
Normal file
@@ -0,0 +1,42 @@
from pype.vendor import ftrack_api
from pype.ftrack import BaseEvent


class Radio_buttons(BaseEvent):

    ignore_me = True

    def launch(self, session, event):
        '''Provides radio button behaviour for any boolean attribute in
        the radio_button group.'''

        # start of event procedure ----------------------------------
        for entity in event['data'].get('entities', []):

            if entity['entityType'] == 'assetversion':

                query = 'CustomAttributeGroup where name is "radio_button"'
                group = session.query(query).one()
                radio_buttons = []
                for g in group['custom_attribute_configurations']:
                    radio_buttons.append(g['key'])

                for key in entity['keys']:
                    if (key in radio_buttons and entity['changes'] is not None):
                        if entity['changes'][key]['new'] == '1':
                            version = session.get('AssetVersion',
                                                  entity['entityId'])
                            asset = session.get('Asset', entity['parentId'])
                            for v in asset['versions']:
                                if version is not v:
                                    v['custom_attributes'][key] = 0

                            session.commit()


def register(session):
    '''Register plugin. Called when used as a plugin.'''
    if not isinstance(session, ftrack_api.session.Session):
        return

    Radio_buttons(session).register()
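The exclusivity rule the event enforces - turning an attribute on for one version turns it off for every sibling version - reduces to a short helper; this sketch uses plain dictionaries in place of the real `AssetVersion` entities:

```python
def enforce_radio_button(versions, changed_version, key):
    # When one version's boolean attribute turns on, switch it off
    # on every sibling version, as the event handler above does.
    for v in versions:
        if v is not changed_version:
            v['custom_attributes'][key] = 0


v1 = {'custom_attributes': {'hero': 1}}
v2 = {'custom_attributes': {'hero': 1}}
enforce_radio_button([v1, v2], v2, 'hero')
print(v1['custom_attributes']['hero'], v2['custom_attributes']['hero'])  # 0 1
```

Note the identity check (`is not`) rather than equality: the handler must skip the exact version that was just changed, not any version that happens to look the same.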
123
pype/ftrack/events/event_sync_hier_attr.py
Normal file
@@ -0,0 +1,123 @@
import os
import sys

from avalon.tools.libraryloader.io_nonsingleton import DbConnector

from pype.vendor import ftrack_api
from pype.ftrack import BaseEvent, lib
from bson.objectid import ObjectId


class SyncHierarchicalAttrs(BaseEvent):
    # After sync to avalon event!
    priority = 101
    db_con = DbConnector()
    ca_mongoid = lib.get_ca_mongoid()

    def launch(self, session, event):
        # Filter entities and changed values if it makes sense to run script
        processable = []
        processable_ent = {}
        for ent in event['data']['entities']:
            keys = ent.get('keys')
            if not keys:
                continue

            entity = session.get(ent['entity_type'], ent['entityId'])
            processable.append(ent)
            processable_ent[ent['entityId']] = entity

        if not processable:
            return True

        ft_project = None
        for entity in processable_ent.values():
            try:
                base_proj = entity['link'][0]
            except Exception:
                continue
            ft_project = session.get(base_proj['type'], base_proj['id'])
            break

        # check if project is set to auto-sync
        if (
            ft_project is None or
            'avalon_auto_sync' not in ft_project['custom_attributes'] or
            ft_project['custom_attributes']['avalon_auto_sync'] is False
        ):
            return True

        custom_attributes = {}
        query = 'CustomAttributeGroup where name is "avalon"'
        all_avalon_attr = session.query(query).one()
        for cust_attr in all_avalon_attr['custom_attribute_configurations']:
            if 'avalon_' in cust_attr['key']:
                continue
            if not cust_attr['is_hierarchical']:
                continue
            custom_attributes[cust_attr['key']] = cust_attr

        if not custom_attributes:
            return True

        self.db_con.install()
        self.db_con.Session['AVALON_PROJECT'] = ft_project['full_name']

        for ent in processable:
            for key in ent['keys']:
                if key not in custom_attributes:
                    continue

                entity = processable_ent[ent['entityId']]
                attr_value = entity['custom_attributes'][key]
                self.update_hierarchical_attribute(entity, key, attr_value)

        self.db_con.uninstall()

        return True

    def update_hierarchical_attribute(self, entity, key, value):
        custom_attributes = entity.get('custom_attributes')
        if not custom_attributes:
            return

        mongoid = custom_attributes.get(self.ca_mongoid)
        if not mongoid:
            return

        try:
            mongoid = ObjectId(mongoid)
        except Exception:
            return

        mongo_entity = self.db_con.find_one({'_id': mongoid})
        if not mongo_entity:
            return

        data = mongo_entity.get('data') or {}
        cur_value = data.get(key)
        if cur_value:
            if cur_value == value:
                return

        data[key] = value
        self.db_con.update_many(
            {'_id': mongoid},
            {'$set': {'data': data}}
        )

        for child in entity.get('children', []):
            if key not in child.get('custom_attributes', {}):
                continue
            child_value = child['custom_attributes'][key]
            if child_value is not None:
                continue
            self.update_hierarchical_attribute(child, key, value)


def register(session, **kw):
    '''Register plugin. Called when used as a plugin.'''
    if not isinstance(session, ftrack_api.session.Session):
        return

    SyncHierarchicalAttrs(session).register()
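The downward cascade in `update_hierarchical_attribute` above - write the value, then recurse only into children that do not override the attribute themselves - can be sketched with nested dictionaries (hypothetical shapes, not the Avalon documents or ftrack entities):

```python
def propagate(entity, key, value):
    # Write the value, then recurse only into children that do not
    # override the attribute themselves, mirroring the handler above.
    entity['data'][key] = value
    for child in entity.get('children', []):
        if child['custom_attributes'].get(key) is not None:
            continue  # child has its own value; stop the cascade here
        propagate(child, key, value)


shot = {'data': {}, 'custom_attributes': {'fps': None}, 'children': []}
seq = {'data': {}, 'custom_attributes': {'fps': None}, 'children': [shot]}
propagate(seq, 'fps', 24)
print(seq['data']['fps'], shot['data']['fps'])  # 24 24
```

Stopping at the first child with its own value keeps locally overridden attributes intact while still filling in every descendant that inherits from above.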
127
pype/ftrack/events/event_sync_to_avalon.py
Normal file
@@ -0,0 +1,127 @@
from pype.vendor import ftrack_api
from pype.ftrack import BaseEvent, lib


class Sync_to_Avalon(BaseEvent):

    priority = 100

    ignore_entityType = [
        'assetversion', 'job', 'user', 'reviewsessionobject', 'timer',
        'socialfeed', 'socialnotification', 'timelog'
    ]

    def launch(self, session, event):
        ca_mongoid = lib.get_ca_mongoid()
        # If mongo_id textfield has changed: RETURN!
        # - infinite loop
        for ent in event['data']['entities']:
            if ent.get('keys') is not None:
                if ca_mongoid in ent['keys']:
                    return

        entities = self._get_entities(session, event, self.ignore_entityType)
        ft_project = None
        # get project
        for entity in entities:
            try:
                base_proj = entity['link'][0]
            except Exception:
                continue
            ft_project = session.get(base_proj['type'], base_proj['id'])
            break

        # check if project is set to auto-sync
        if (
            ft_project is None or
            'avalon_auto_sync' not in ft_project['custom_attributes'] or
            ft_project['custom_attributes']['avalon_auto_sync'] is False
        ):
            return

        # check if project has Custom Attribute 'avalon_mongo_id'
        if ca_mongoid not in ft_project['custom_attributes']:
            message = (
                "Custom attribute '{}' for 'Project' is not created"
                " or doesn't have set permissions for API"
            ).format(ca_mongoid)
            self.log.warning(message)
            self.show_message(event, message, False)
            return

        # get avalon project if possible
        import_entities = []

        custom_attributes = lib.get_avalon_attr(session)

        avalon_project = lib.get_avalon_project(ft_project)
        if avalon_project is None:
            import_entities.append(ft_project)

        for entity in entities:
            if entity.entity_type.lower() in ['task']:
                entity = entity['parent']

            if 'custom_attributes' not in entity:
                continue
            if ca_mongoid not in entity['custom_attributes']:

                message = (
                    "Custom attribute '{}' for '{}' is not created"
                    " or doesn't have set permissions for API"
                ).format(ca_mongoid, entity.entity_type)

                self.log.warning(message)
                self.show_message(event, message, False)
                return

            if entity not in import_entities:
                import_entities.append(entity)

        if len(import_entities) < 1:
            return

        try:
            for entity in import_entities:
                result = lib.import_to_avalon(
                    session=session,
                    entity=entity,
                    ft_project=ft_project,
                    av_project=avalon_project,
                    custom_attributes=custom_attributes
                )
                if 'errors' in result and len(result['errors']) > 0:
                    session.commit()
                    lib.show_errors(self, event, result['errors'])

                    return

                if avalon_project is None:
                    if 'project' in result:
                        avalon_project = result['project']

        except Exception as e:
            message = str(e)
            title = 'Hey You! Unknown Error has been raised! (*look below*)'
            ftrack_message = (
                'SyncToAvalon event ended with an unexpected error,'
                ' please check the log file or contact an Administrator'
                ' for more information.'
            )
            items = [
                {'type': 'label', 'value': '# Fatal Error'},
                {'type': 'label', 'value': '<p>{}</p>'.format(ftrack_message)}
            ]
            self.show_interface(items, title, event=event)
            self.log.error('Fatal error during sync: {}'.format(message))

        return


def register(session, **kw):
    '''Register plugin. Called when used as a plugin.'''

    if not isinstance(session, ftrack_api.session.Session):
        return

    Sync_to_Avalon(session).register()
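Both sync events gate on the same project-level opt-in check before doing any work; a minimal sketch of that guard, assuming the dictionary shapes below stand in for a real ftrack project entity:

```python
def should_sync(ft_project):
    # Skip the whole event unless the project opted in via the
    # 'avalon_auto_sync' custom attribute (hypothetical data shape).
    if ft_project is None:
        return False
    attrs = ft_project.get('custom_attributes', {})
    return attrs.get('avalon_auto_sync') is True


print(should_sync({'custom_attributes': {'avalon_auto_sync': True}}))  # True
print(should_sync({'custom_attributes': {}}))  # False
```

Centralising the guard this way makes the early-return branches in both `launch()` methods easy to reason about: a missing project, a missing attribute, or an explicit `False` all mean "do nothing".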
Some files were not shown because too many files have changed in this diff.