Squashed commit of the following:

commit 2312a5432c
Merge: af641556 2b309565
Author: Milan Kolar <milan@orbi.tools>
Date:   Wed Nov 27 11:13:41 2019 +0000

    Merged in hotfix/integrate_ftrack_hierarchy_exception (pull request #385)

    fix(ftrack): removing gibberish code - just a quick patch for a case

    Approved-by: Milan Kolar <milan@orbi.tools>

commit af64155661
Merge: f7551605 ea379739
Author: Jakub Trllo <jakub.trllo@gmail.com>
Date:   Tue Nov 26 11:25:44 2019 +0000

    Merged in feature/PYPE-504_run_action_on_auto_sync (pull request #380)

    feature/PYPE-504_run_action_on_auto_sync

    Approved-by: Milan Kolar <milan@orbi.tools>

commit ea3797391b
Merge: aeaef2fa f7551605
Author: Milan Kolar <milan@orbi.tools>
Date:   Tue Nov 26 12:24:07 2019 +0100

    Merge branch 'refs/heads/develop' into feature/PYPE-504_run_action_on_auto_sync

    # Conflicts:
    #	pype/ftrack/events/event_sync_to_avalon.py

commit f755160591
Merge: be161817 7dcf0d5f
Author: Milan Kolar <milan@orbi.tools>
Date:   Tue Nov 26 12:15:28 2019 +0100

    Merge branch 'develop' of bitbucket.org:pypeclub/pype into develop

commit 2b30956579
Author: Jakub Jezek <jakub@pype.club>
Date:   Tue Nov 26 12:12:12 2019 +0100

    fix(ftrack): removing gibberish code - just a quick patch for a case

commit be161817f7
Author: Milan Kolar <milan@orbi.tools>
Date:   Tue Nov 26 12:00:24 2019 +0100

    hotfix wrong imports

commit 7dcf0d5ffa
Merge: 474e241e ecc3716d
Author: Jakub Ježek <jakub@pype.club>
Date:   Tue Nov 26 10:43:21 2019 +0000

    Merged in hotfix/publish-plugin-improvements (pull request #377)

    Publishing plugin improvements

    Approved-by: Milan Kolar <milan@orbi.tools>

commit 474e241e16
Author: Milan Kolar <milan@orbi.tools>
Date:   Tue Nov 26 11:38:12 2019 +0100

    fix add task to assetversion only if needed

commit e231f4f5d2
Author: Milan Kolar <milan@orbi.tools>
Date:   Tue Nov 26 11:37:50 2019 +0100

    bugfix: sync to avalon to use latest ftrack api

commit 799991d77f
Merge: f63dd1c0 9a167bec
Author: Jakub Trllo <jakub.trllo@gmail.com>
Date:   Tue Nov 26 10:30:36 2019 +0000

    Merged in hotfix/rest_api_minor_fixes (pull request #382)

    Hotfix/rest api minor fixes

    Approved-by: Milan Kolar <milan@orbi.tools>

commit 9a167bec7f
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Mon Nov 25 10:50:46 2019 +0100

    CustomNone moved to pype's lib

commit d30cc41e8b
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Mon Nov 25 10:50:22 2019 +0100

    import fixes

commit 343cdf55c1
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Mon Nov 25 10:49:52 2019 +0100

    formatting fixes

commit f63dd1c048
Author: Milan Kolar <milan@orbi.tools>
Date:   Fri Nov 22 18:15:57 2019 +0100

    return capture to vendors

commit 6e2fdb880c
Author: Milan Kolar <milan@orbi.tools>
Date:   Fri Nov 22 17:46:19 2019 +0100

    remove all but ftrack old api packages and change imports

commit 593a98a385
Merge: b8603db6 73886c50
Author: Toke Jepsen <tokejepsen@bumpybox.com>
Date:   Thu Nov 21 20:12:32 2019 +0000

    Merged in tokejepsen/pype/feature/nuke_validate_knobs (pull request #280)

    Validate knobs to studio presets.

    Approved-by: Milan Kolar <milan@orbi.tools>

commit b8603db6fa
Merge: e561384e 562c5a8e
Author: Jakub Trllo <jakub.trllo@gmail.com>
Date:   Thu Nov 21 20:10:30 2019 +0000

    Merged in feature/trigger_event_method (pull request #372)

    added trigger event method to base handler

    Approved-by: Milan Kolar <milan@orbi.tools>

commit e561384eae
Merge: 3ed1b4e7 38198124
Author: Jakub Trllo <jakub.trllo@gmail.com>
Date:   Thu Nov 21 20:09:45 2019 +0000

    Merged in bugfix/fix_status_changes_on_app (pull request #369)

    Bugfix/fix status changes on app

    Approved-by: Milan Kolar <milan@orbi.tools>

commit 3ed1b4e7be
Merge: 1b8b9760 413dfc6a
Author: Jakub Trllo <jakub.trllo@gmail.com>
Date:   Thu Nov 21 20:03:52 2019 +0000

    Merged in feature/seed_ftrack_action (pull request #375)

    feature/seed ftrack action

    Approved-by: Milan Kolar <milan@orbi.tools>

commit 1b8b976088
Merge: c6e7e9fd 3f3932cf
Author: Toke Jepsen <tokejepsen@bumpybox.com>
Date:   Thu Nov 21 20:00:39 2019 +0000

    Merged in tokejepsen/pype/feature/maya_look_force_copy (pull request #275)

    Enable "Force Copy" option.

    Approved-by: Milan Kolar <milan@orbi.tools>

commit c6e7e9fd2b
Merge: 0351c05d 246072f3
Author: Jakub Trllo <jakub.trllo@gmail.com>
Date:   Thu Nov 21 19:55:59 2019 +0000

    Merged in bugfix/event_server_traceback (pull request #371)

    use logging traceback print instead of traceback module

    Approved-by: Milan Kolar <milan@orbi.tools>

commit 0351c05d2e
Merge: 61b8ebdc 40ff5309
Author: Jakub Ježek <jakub@pype.club>
Date:   Thu Nov 21 17:39:52 2019 +0000

    Merged in feature/nuke-validate-write-legacy (pull request #374)

    Nuke Validate Write Node Legacy

    Approved-by: Milan Kolar <milan@orbi.tools>

commit 40ff53098f
Author: Jakub Jezek <jakub@pype.club>
Date:   Thu Nov 21 18:33:07 2019 +0100

    typo(nuke): fixing add_family

commit 61b8ebdcf7
Merge: 4201ba17 256377c3
Author: Jakub Trllo <jakub.trllo@gmail.com>
Date:   Thu Nov 21 13:45:25 2019 +0000

    Merged in bugfix/PYPE-591_hier_attrs_fix (pull request #379)

    bugfix/PYPE-591_hier_attrs_fix

    Approved-by: Milan Kolar <milan@orbi.tools>

commit aeaef2fa02
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Thu Nov 21 11:50:42 2019 +0100

    current sync to avalon will trigger the sync to avalon action when auto-sync is checked

commit 256377c36f
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Wed Nov 20 19:49:52 2019 +0100

    replaced CustomAttributeValue with ContextCustomAttributeValue in queries to get all attributes

commit ecc3716d1b
Merge: ba5796eb 4201ba17
Author: Jakub Jezek <jakub@pype.club>
Date:   Wed Nov 20 09:26:00 2019 +0100

    Merge branch 'develop' into hotfix/publish-plugin-improvements

commit ba5796ebd8
Author: Jakub Jezek <jakub@pype.club>
Date:   Wed Nov 20 09:25:02 2019 +0100

    doc(nuke): improving assert message

commit 20270715cc
Author: Jakub Jezek <jakub@pype.club>
Date:   Wed Nov 20 09:24:19 2019 +0100

    fix(global): removing duplicates in families

commit 5c321a6747
Author: Jakub Jezek <jakub@pype.club>
Date:   Wed Nov 20 09:23:42 2019 +0100

    feat(global): cleanup is now turned on only after successful publishing

commit 4201ba173d
Merge: 6f228260 0af320f9
Author: Jakub Trllo <jakub.trllo@gmail.com>
Date:   Tue Nov 19 22:26:41 2019 +0000

    Merged in feature/mongo_or_query (pull request #356)

    feature/mongo_or_query

    Approved-by: Milan Kolar <milan@orbi.tools>

commit 6f22826041
Merge: dc0bd2dd 06e55d53
Author: Ondřej Samohel <annatar@annatar.net>
Date:   Tue Nov 19 22:25:51 2019 +0000

    Merged in sync-with-3de (pull request #367)

    added plugins from 3de

    Approved-by: Milan Kolar <milan@orbi.tools>

commit dc0bd2dd39
Merge: eebb2511 e2ede5d0
Author: Ondřej Samohel <annatar@annatar.net>
Date:   Tue Nov 19 22:24:26 2019 +0000

    Merged in bugfix/3de-sync-fixes (pull request #368)

    small bugfixes arising from syncing with 3de

    Approved-by: Milan Kolar <milan@orbi.tools>

commit eebb251112
Merge: f8d389a5 0b1caf5c
Author: Jakub Trllo <jakub.trllo@gmail.com>
Date:   Tue Nov 19 22:23:12 2019 +0000

    Merged in bugfix/ftrack_session_rollback (pull request #376)

    Bugfix/ftrack session rollback

    Approved-by: Milan Kolar <milan@orbi.tools>

commit f8d389a5cb
Merge: 3cf76abf f25aac63
Author: Jakub Trllo <jakub.trllo@gmail.com>
Date:   Tue Nov 19 22:21:24 2019 +0000

    Merged in bugfix/auto_reconnect_actions (pull request #370)

    Bugfix/auto reconnect actions

    Approved-by: Milan Kolar <milan@orbi.tools>

commit 3cf76abf9d
Merge: 061e0056 b5267574
Author: Jakub Trllo <jakub.trllo@gmail.com>
Date:   Tue Nov 19 22:18:45 2019 +0000

    Merged in feature/PYPE-506_sync_to_avalon_action (pull request #373)

    feature/PYPE-506_sync_to_avalon_action

    Approved-by: Milan Kolar <milan@orbi.tools>

commit b526757405
Author: Milan Kolar <milan@pype.club>
Date:   Tue Nov 19 23:17:59 2019 +0100

    fix grammar and tone of the user messages

commit 0b1caf5c0f
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Tue Nov 19 19:25:28 2019 +0100

    removed underscores from class names

commit 4326e2b91e
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Tue Nov 19 19:24:24 2019 +0100

    added a few rollbacks to ftrack actions and events to prevent an uncommittable session

commit 9924231834
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Tue Nov 19 19:00:50 2019 +0100

    added session rollbacks to ftrack plugins

commit 413dfc6a03
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Tue Nov 19 18:33:51 2019 +0100

    added current project to project list

commit 81dbc9a17e
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Tue Nov 19 18:33:38 2019 +0100

    user can change number of created instances

commit 9e74d08670
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Tue Nov 19 18:14:19 2019 +0100

    added icon

commit 6acbd7c799
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Tue Nov 19 18:09:46 2019 +0100

    first version of seed action for project entities seeding

commit 1e67a5bc5b
Author: Jakub Jezek <jakub@pype.club>
Date:   Tue Nov 19 17:47:16 2019 +0100

    feat(nuke): improvements on fix method

commit f0618af38e
Author: Jakub Jezek <jakub@pype.club>
Date:   Tue Nov 19 17:46:52 2019 +0100

    feat(nuke): upgrading collect legacy write to see new version of write

commit 30f854e872
Author: Jakub Jezek <jakub@pype.club>
Date:   Tue Nov 19 17:46:16 2019 +0100

    fix(nuke): family and families definition improvement

    collect instances

commit 061e0056e5
Merge: d15c57b5 3833adb3
Author: Jakub Trllo <jakub.trllo@gmail.com>
Date:   Tue Nov 19 15:21:47 2019 +0000

    Merged in feature/action_preregister_implementation (pull request #354)

    Feature/action preregister implementation

    Approved-by: Milan Kolar <milan@orbi.tools>

commit d15c57b57f
Merge: b567af16 312d3c96
Author: Jakub Trllo <jakub.trllo@gmail.com>
Date:   Tue Nov 19 14:52:30 2019 +0000

    Merged in feature/advanced_rest_api (pull request #334)

    Feature/advanced rest api

    Approved-by: Milan Kolar <milan@orbi.tools>

commit f25aac63b2
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Fri Nov 15 18:38:36 2019 +0100

    removed unnecessary bool changes

commit edd3ad2e57
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Fri Nov 15 18:35:08 2019 +0100

    action server is not in daemon thread anymore

commit 0e5cb8652c
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Fri Nov 15 18:15:41 2019 +0100

    added one more check on start action server

commit db77c06555
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Fri Nov 15 18:15:25 2019 +0100

    log fix

commit 1750d565b4
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Fri Nov 15 17:44:46 2019 +0100

    close connections to mongo and ftrack after synchronization

commit ccc99c50e0
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Fri Nov 15 17:42:12 2019 +0100

    fixed discovery of sync to avalon action

commit 78a1172ad9
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Fri Nov 15 17:41:44 2019 +0100

    enhanced mongo db connector

commit b609e7e892
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Fri Nov 15 17:07:34 2019 +0100

    sync to avalon and hierarchical attrs actions were removed and replaced with new sync to avalon

commit 562c5a8e69
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Fri Nov 15 16:30:42 2019 +0100

    added trigger event method to base handler

commit 246072f39f
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Fri Nov 15 16:28:57 2019 +0100

    use logging traceback print instead of traceback module

commit 0f7c320ad4
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Fri Nov 15 16:13:29 2019 +0100

    changed the visibility bool of the reset action server so users are not confused that reset is missing while stop is present

commit 2f8f0c5bca
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Fri Nov 15 16:11:11 2019 +0100

    changed logic of action server processing in ftrack module; it now checks in a while loop whether it is possible to connect to ftrack
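
A minimal sketch of the reconnect loop described above, assuming only the
`check_ftrack_url` helper that a later commit makes importable from
pype.ftrack; the other names are hypothetical:

    import time

    from pype.ftrack import check_ftrack_url

    def run_action_server(ftrack_url, start_server):
        # Probe ftrack in a loop; only (re)start the action server once
        # the URL actually answers, and fall back to probing if it drops.
        while True:
            if not check_ftrack_url(ftrack_url):
                time.sleep(10)
                continue
            try:
                start_server()  # blocks until the connection is lost
            except Exception:
                pass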

commit 2c3ebb0c13
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Fri Nov 15 16:08:06 2019 +0100

    added exception message when there are no events to register

commit daee320863
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Fri Nov 15 16:07:36 2019 +0100

    check_ftrack_url has moved from event_server_cli to lib to be importable

commit 38198124e4
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Fri Nov 15 14:36:37 2019 +0100

    removed unnecessary commit

commit e66a3e1b38
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Fri Nov 15 14:25:13 2019 +0100

    added more necessary rollbacks

commit 67e9089638
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Fri Nov 15 14:19:00 2019 +0100

    added while loop so more than one status may be tried, and rollback is called on the session if the change failed

commit b567af166d
Author: Milan Kolar <milan@orbi.tools>
Date:   Thu Nov 14 11:21:55 2019 +0100

    bugfix/environment_crashing_on_floats

commit e2ede5d047
Author: Ondrej Samohel <ondrej@samohel.cz>
Date:   Wed Nov 13 22:39:37 2019 +0100

    small bugfixes arising from syncing with 3de

commit f263eabe82
Merge: 13c46a8c a03e4862
Author: Jakub Ježek <jakub@pype.club>
Date:   Wed Nov 13 14:12:18 2019 +0000

    Merged in feature/nuke-publish-grouping (pull request #366)

    Nuke Subset Grouping

commit 13c46a8cd8
Merge: df29a22e 752af5e5
Author: Toke Jepsen <tokejepsen@bumpybox.com>
Date:   Wed Nov 13 13:45:47 2019 +0000

    Merged in tokejepsen/pype/feature/nukestudio_comments (pull request #279)

    Publish comments from NukeStudio.

commit df29a22e87
Merge: 1e523caa 963d6fb1
Author: Toke Jepsen <tokejepsen@bumpybox.com>
Date:   Wed Nov 13 13:20:58 2019 +0000

    Merged in tokejepsen/pype/validate_containers (pull request #348)

    Validate containers.

commit 06e55d533a
Author: Ondrej Samohel <ondrej@samohel.cz>
Date:   Wed Nov 13 11:59:22 2019 +0100

    added plugins from 3de

commit 1e523caa52
Merge: ade6a021 90da0921
Author: Jakub Ježek <jakub@pype.club>
Date:   Wed Nov 13 09:17:59 2019 +0000

    Merged in feature/nuke-create-backdrop (pull request #360)

    Nuke: Create plugin for backdrop creation

commit a03e486265
Author: Jakub Jezek <jakub@pype.club>
Date:   Wed Nov 13 10:12:29 2019 +0100

    fix(nuke): validate render frames correct family if `no` render

commit 561c54cab2
Author: Jakub Jezek <jakub@pype.club>
Date:   Wed Nov 13 10:11:18 2019 +0100

    fix(nuke): validators change to correct `render` family

commit 256c23d8ec
Author: Jakub Jezek <jakub@pype.club>
Date:   Wed Nov 13 10:10:06 2019 +0100

    feat(nuke): collectors now add subset group attr to instance

commit 330b9bad87
Author: Jakub Jezek <jakub@pype.club>
Date:   Wed Nov 13 10:09:20 2019 +0100

    fix(nuke): collect instances getting correctly writes family

commit cc866903c8
Author: Jakub Jezek <jakub@pype.club>
Date:   Wed Nov 13 10:08:36 2019 +0100

    fix(nuke): create write family and families mismatch

commit cb8aff44a4
Author: Jakub Jezek <jakub@pype.club>
Date:   Wed Nov 13 10:07:48 2019 +0100

    feat(global): integrate new added support for subset group attribute

commit ade6a02115
Merge: 2100a3e6 378d363b
Author: Ondřej Samohel <annatar@annatar.net>
Date:   Wed Nov 13 08:33:58 2019 +0000

    Merged in fix/maya-fps-mapping (pull request #365)

    fixed casting to integer for fps

    Approved-by: Milan Kolar <milan@orbi.tools>

commit 48c4bd2103
Author: Jakub Jezek <jakub@pype.club>
Date:   Tue Nov 12 17:50:08 2019 +0100

    feat(nuke): adding grouping subset to write instances

commit c3452d3594
Author: Jakub Jezek <jakub@pype.club>
Date:   Tue Nov 12 17:49:44 2019 +0100

    feat(global): adding subset grouping ability to integrate new

commit 378d363ba8
Author: Ondrej Samohel <ondrej@samohel.cz>
Date:   Tue Nov 12 13:57:19 2019 +0100

    fixed casting to integer for fps

commit 2100a3e666
Merge: 69639b30 ca922280
Author: Jakub Jezek <jakub@pype.club>
Date:   Tue Nov 12 12:53:41 2019 +0100

    Merge remote-tracking branch 'origin/develop' into develop

commit 14d4f4ff8a
Merge: 43d916af 509578c4
Author: Jakub Jezek <jakub@pype.club>
Date:   Tue Nov 12 12:53:15 2019 +0100

    Merge branch 'hotfix/nuke-workfile-publish'

commit 69639b30a0
Merge: 68bad839 509578c4
Author: Jakub Jezek <jakub@pype.club>
Date:   Tue Nov 12 12:53:14 2019 +0100

    Merge branch 'hotfix/nuke-workfile-publish' into develop

commit 509578c4ef
Author: Jakub Jezek <jakub@pype.club>
Date:   Tue Nov 12 12:52:57 2019 +0100

    fix(nuke): nuke little fixes

commit ca9222804b
Author: Milan Kolar <milan@orbi.tools>
Date:   Mon Nov 11 13:30:31 2019 +0100

    missing variable when mongo is unavailable

commit 1559b8db91
Merge: 8c07ebf1 43d916af
Author: Milan Kolar <milan@orbi.tools>
Date:   Mon Nov 11 12:20:01 2019 +0100

    Merge tag 'sigkill_arguments' into develop

commit 43d916af6a
Merge: b37586b6 129c264f
Author: Milan Kolar <milan@orbi.tools>
Date:   Mon Nov 11 12:20:00 2019 +0100

    Merge branch 'hotfix/sigkill_arguments'

commit 129c264f49
Author: Milan Kolar <milan@orbi.tools>
Date:   Mon Nov 11 12:19:37 2019 +0100

    fixing signal errors

commit b37586b6b1
Merge: c5aa1758 94e302ee
Author: Milan Kolar <milan@orbi.tools>
Date:   Mon Nov 11 11:49:45 2019 +0100

    Merge branch 'master' of bitbucket.org:pypeclub/pype

commit 8c07ebf104
Merge: ecc7ac53 68bad839
Author: Milan Kolar <milan@orbi.tools>
Date:   Mon Nov 11 11:43:51 2019 +0100

    Merge branch 'develop' of bitbucket.org:pypeclub/pype into develop

commit 68bad8394f
Merge: 5ecb7476 61c9de58
Author: Jakub Jezek <jakub@pype.club>
Date:   Thu Nov 7 19:26:35 2019 +0100

    Merge remote-tracking branch 'origin/develop' into develop

commit 94e302eed6
Merge: 6d2a481a f3549af4
Author: Jakub Jezek <jakub@pype.club>
Date:   Thu Nov 7 19:26:22 2019 +0100

    Merge remote-tracking branch 'origin/master'

commit 5ecb74760a
Merge: 1fca6f1d 6ae508d3
Author: Jakub Jezek <jakub@pype.club>
Date:   Thu Nov 7 19:25:46 2019 +0100

    Merge branch 'hotfix/nuke-publish-fixes' into develop

commit 6d2a481a59
Merge: 71413190 6ae508d3
Author: Jakub Jezek <jakub@pype.club>
Date:   Thu Nov 7 19:25:46 2019 +0100

    Merge branch 'hotfix/nuke-publish-fixes'

commit 6ae508d383
Author: Jakub Jezek <jakub@pype.club>
Date:   Thu Nov 7 19:24:43 2019 +0100

    typo(nuke): removing prints

commit ecc7ac5335
Merge: 61c9de58 c5aa1758
Author: Milan Kolar <milan@orbi.tools>
Date:   Thu Nov 7 18:40:45 2019 +0100

    Merge tag 'mixed_unc_and_mapped_paths' into develop

commit c5aa17583e
Merge: f3549af4 e99df15b
Author: Milan Kolar <milan@orbi.tools>
Date:   Thu Nov 7 18:40:44 2019 +0100

    Merge branch 'hotfix/mixed_unc_and_mapped_paths'

commit e99df15b1a
Author: Milan Kolar <milan@orbi.tools>
Date:   Thu Nov 7 18:39:55 2019 +0100

    make sure we're not comparing mounted to UNC path in integrator.

commit 59b4178fda
Author: Jakub Jezek <jakub@pype.club>
Date:   Thu Nov 7 18:10:49 2019 +0100

    fix(nuke): create read from write

    - wasn't returning correctly if nothing found in file path

commit f3549af450
Merge: 71413190 d64bc838
Author: Ondřej Samohel <annatar@annatar.net>
Date:   Thu Nov 7 16:24:05 2019 +0000

    Merged in hotfix/farm_rendering (pull request #363)

    hotfix for various aspects of farm rendering

    Approved-by: Milan Kolar <milan@orbi.tools>

commit 61c9de587f
Merge: 1fca6f1d d64bc838
Author: Ondřej Samohel <annatar@annatar.net>
Date:   Thu Nov 7 16:23:39 2019 +0000

    Merged in hotfix/farm_rendering (pull request #364)

    hotfix for various aspects of farm rendering

    Approved-by: Milan Kolar <milan@orbi.tools>

commit 6f5f823e44
Author: Jakub Jezek <jakub@pype.club>
Date:   Thu Nov 7 17:23:15 2019 +0100

    fix(nuke): create_write plugin improvements

    - selected node operation
    - what if the node exists already

commit d64bc8388a
Author: Ondrej Samohel <ondrej@samohel.cz>
Date:   Thu Nov 7 16:22:26 2019 +0100

    fixed python2/3 compatibility

commit 86ddb924b5
Author: Jakub Jezek <jakub@pype.club>
Date:   Thu Nov 7 15:59:18 2019 +0100

    feat(nuke): rewriting create write plugin

    - self.families and self.nCass for easier applications

commit 803936950a
Author: Jakub Jezek <jakub@pype.club>
Date:   Thu Nov 7 15:56:25 2019 +0100

    fix(nuke): colorspace presets didn't work with families

commit 2ef74e33cb
Author: Jakub Jezek <jakub@pype.club>
Date:   Thu Nov 7 15:55:43 2019 +0100

    fix(nuke): overwriting create write function

    - adding input node to connect to selected node in Create
    - fixing prenode to be functional if more nodes before write need to be added [prep for mask/prerender]
    - `preset` data rewritten to be `families`

commit ba32b5f4a1
Author: Jakub Jezek <jakub@pype.club>
Date:   Wed Nov 6 23:44:48 2019 +0100

    typo(nuke): removing todo

commit 7bb247528c
Author: Jakub Jezek <jakub@pype.club>
Date:   Wed Nov 6 23:44:17 2019 +0100

    fix(nuke): didn't reload modules correctly

    python 2.7 and 3 supported

commit fd2e0f8c27
Author: Jakub Jezek <jakub@pype.club>
Date:   Wed Nov 6 15:52:19 2019 +0100

    fix(nuke): nuke.templates renamed to nuke.presets

    - it was still remaining as templates in some modules

commit 637862f22d
Author: Jakub Jezek <jakub@pype.club>
Date:   Wed Nov 6 15:51:32 2019 +0100

    fix(nuke): create plugin correct way of working with families

commit fe918b1bd6
Merge: 2d6e11c1 1fca6f1d
Author: Jakub Jezek <jakub@pype.club>
Date:   Wed Nov 6 14:36:24 2019 +0100

    Merge branch 'develop' into hotfix/nuke-publish-fixes

commit 1fca6f1d36
Merge: c83b4ee5 a6daff37
Author: Jakub Jezek <jakub@pype.club>
Date:   Wed Nov 6 14:21:08 2019 +0100

    Merge branch 'hotfix/ftrack-event-thumbnail-sintax_error' into develop

commit 714131908b
Merge: e2669dde a6daff37
Author: Jakub Jezek <jakub@pype.club>
Date:   Wed Nov 6 14:21:08 2019 +0100

    Merge branch 'hotfix/ftrack-event-thumbnail-sintax_error'

commit a6daff37ee
Author: Jakub Jezek <jakub@pype.club>
Date:   Wed Nov 6 14:20:39 2019 +0100

    fix(ftrack): thumbnail event syntax error

commit c83b4ee58c
Merge: cb6ffb0f dd967d87
Author: Jakub Jezek <jakub@pype.club>
Date:   Wed Nov 6 14:06:36 2019 +0100

    Merge branch 'hotfix/ftrack-signals' into develop

commit e2669dde7d
Merge: 85c9422a dd967d87
Author: Jakub Jezek <jakub@pype.club>
Date:   Wed Nov 6 14:06:36 2019 +0100

    Merge branch 'hotfix/ftrack-signals'

commit 50438283cd
Author: Ondrej Samohel <ondrej@samohel.cz>
Date:   Wed Nov 6 14:06:24 2019 +0100

    hotfix for various aspects of farm rendering

commit dd967d87aa
Author: Jakub Jezek <jakub@pype.club>
Date:   Wed Nov 6 14:06:06 2019 +0100

    fix(ftrack): signals for linux

commit 85c9422a25
Merge: bb510008 9d32095c
Author: Milan Kolar <milan@orbi.tools>
Date:   Wed Nov 6 10:34:12 2019 +0000

    Merged in release/2.3.0 (pull request #361)

    Release 2.3.0

    Approved-by: Milan Kolar <milan@orbi.tools>

commit 9d32095c3a
Author: Milan Kolar <milan@orbi.tools>
Date:   Wed Nov 6 10:38:18 2019 +0100

    update changelog

commit 90da0921d9
Merge: 61978b38 cb6ffb0f
Author: Jakub Jezek <jakub@pype.club>
Date:   Wed Nov 6 10:27:21 2019 +0100

    Merge branch 'develop' into feature/nuke-create-backdrop

commit 45b2ec6020
Merge: e0b4bec2 cb6ffb0f
Author: Milan Kolar <milan@orbi.tools>
Date:   Wed Nov 6 10:23:07 2019 +0100

    Merge branch 'refs/heads/develop' into release/2.3.0

commit cb6ffb0f63
Merge: 321028ea b22f7536
Author: Jakub Trllo <jakub.trllo@gmail.com>
Date:   Wed Nov 6 09:21:41 2019 +0000

    Merged in feature/faster_loader_actions (pull request #357)

    feature/ faster loader actions menu

    Approved-by: Milan Kolar <milan@orbi.tools>

commit 321028ead8
Merge: 16463867 275bd6da
Author: Jakub Ježek <jakub@pype.club>
Date:   Wed Nov 6 09:20:57 2019 +0000

    Merged in hotfix/eallin_fixes_nks_nk (pull request #359)

    fix(nks): thumbnails, build workfile with preview mov

    Approved-by: Milan Kolar <milan@orbi.tools>

commit 16463867a7
Merge: 12d8c520 2d6e11c1
Author: Jakub Ježek <jakub@pype.club>
Date:   Wed Nov 6 09:20:30 2019 +0000

    Merged in hotfix/nuke-publish-fixes (pull request #358)

    Nuke publishing fixes related to avalon.nuke

    Approved-by: Milan Kolar <milan@orbi.tools>

commit 12d8c520e2
Merge: e6ac4fcb 95655d00
Author: Ondřej Samohel <annatar@annatar.net>
Date:   Wed Nov 6 09:19:36 2019 +0000

    Merged in feature/PYPE-580-yeti-connect-mesh-when-loading (pull request #346)

    connect shapes to loaded Yeti Rig

    Approved-by: Milan Kolar <milan@orbi.tools>

commit e6ac4fcb75
Merge: 4b8aa14c b3dbf0f2
Author: Jakub Trllo <jakub.trllo@gmail.com>
Date:   Wed Nov 6 09:18:52 2019 +0000

    Merged in bugfix/thumbnail_event_on_not_task (pull request #355)

    fixed event thumbnail updates so they do not crash when the thumbnail is not on a task

    Approved-by: Milan Kolar <milan@orbi.tools>

commit 275bd6da64
Merge: c009f661 4b8aa14c
Author: Jakub Jezek <jakub@pype.club>
Date:   Wed Nov 6 10:09:09 2019 +0100

    Merge branch 'develop' into hotfix/eallin_fixes_nks_nk

commit 2d6e11c19a
Author: Jakub Jezek <jakub@pype.club>
Date:   Wed Nov 6 01:17:42 2019 +0100

    feat(nuke): node avalon knob prefix as list for backward compatibility

    - `avalon:` or `ak:`
    - omit tags for later clearing in pype.nuke.presets

commit f35f4c1e93
Author: Jakub Jezek <jakub@pype.club>
Date:   Wed Nov 6 01:14:09 2019 +0100

    feat(nuke): templates.py to presets.py

    - rename key `preset` to `families`
    - `families`: looping family as preset
    - omit tags for later clearing

commit b22f7536ca
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Tue Nov 5 18:27:09 2019 +0100

    moved pymel.core imports into the process part of actions so they do not slow down discovery

commit 4cfb4d2558
Merge: 4b8aa14c a2574043
Author: Jakub Jezek <jakub@pype.club>
Date:   Tue Nov 5 14:40:58 2019 +0100

    Merge branch 'hotfix/nuke_publish_load_workflow_improvment' into hotfix/nuke-publish-fixes

commit a2574043c8
Merge: 1df729c2 4b8aa14c
Author: Jakub Jezek <jakub@pype.club>
Date:   Tue Nov 5 14:06:38 2019 +0100

    Merge branch 'hotfix/nuke_publish_fixes' into hotfix/nuke_publish_load_workflow_improvment

commit 4b8aa14c55
Merge: 2cb1c7ef 1df729c2
Author: Jakub Ježek <jakub@pype.club>
Date:   Tue Nov 5 10:13:08 2019 +0000

    Merged in hotfix/nuke_publish_load_workflow_improvment (pull request #352)

    feat(nuke): improving nuke features

    Approved-by: Milan Kolar <milan@orbi.tools>

commit 2cb1c7efe3
Merge: 1cbf9c5e a188d37d
Author: Jakub Ježek <jakub@pype.club>
Date:   Tue Nov 5 10:12:32 2019 +0000

    Merged in hotfix/validate_ftrack_attributes (pull request #353)

    Hotfix/validate ftrack attributes

    Approved-by: Milan Kolar <milan@orbi.tools>

commit 1cbf9c5eef
Merge: 4dee025c b170fe1a
Author: Jakub Ježek <jakub@pype.club>
Date:   Tue Nov 5 10:11:53 2019 +0000

    Merged in hotfix/integrate_new_image_sequence (pull request #351)

    removing hashes from path in image sequence

    Approved-by: Milan Kolar <milan@orbi.tools>

commit 0af320f9f7
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Tue Nov 5 10:33:17 2019 +0100

    $or queries replaced with $in, which is much easier to read in code and to prepare
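
For illustration, the rewrite this message describes, as a minimal sketch
(only `bson` is assumed; the ids are stand-ins):

    from bson.objectid import ObjectId

    all_ids = [ObjectId() for _ in range(3)]

    # before: one {'_id': ...} clause per id, joined with $or
    or_query = {"$or": [{"_id": _id} for _id in all_ids]}

    # after: a single $in clause over the same ids -- same matches,
    # shorter to build and to read
    in_query = {"_id": {"$in": all_ids}}

Either form can be passed to e.g. delete_many(); the $in form is the one the
DeleteAsset and AssetsRemover diffs at the end of this log switch to.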

commit b3dbf0f228
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Tue Nov 5 10:29:02 2019 +0100

    variable name fix

commit 26454a752b
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Tue Nov 5 10:23:49 2019 +0100

    fixed event thumbnail updates so they do not crash when the thumbnail is not on a task

commit 3833adb30a
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Tue Nov 5 10:19:50 2019 +0100

    use warning log instead of info

commit 0957cffbfd
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Tue Nov 5 10:09:21 2019 +0100

    created PreregisterException to recognize that preregistration failed and not print a traceback

commit 91832e50ed
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Tue Nov 5 10:08:37 2019 +0100

    actions rv and djv view use preregister instead of overriding register method

commit 1df729c206
Author: Jakub Jezek <jakub@pype.club>
Date:   Mon Nov 4 20:31:21 2019 +0100

    fix(nuke): version data frameStart/end excluded handles

commit a188d37d62
Author: Jakub Jezek <jakub@pype.club>
Date:   Mon Nov 4 19:37:33 2019 +0100

    fix(global): validator didn't work properly

commit 66d0b1b253
Author: Jakub Jezek <jakub@pype.club>
Date:   Mon Nov 4 19:28:39 2019 +0100

    feat(nuke): improving nuke features

    - loader reads fpath and converts it to hashes in the path
    - adding `review` knob to write
    - moving extract_frames.py to _unused plugins

commit b170fe1a6f
Author: Jakub Jezek <jakub@pype.club>
Date:   Mon Nov 4 19:23:29 2019 +0100

    fix(global): data.path without hashes

    - removing hashes from data.path

commit 95655d0056
Author: Ondrej Samohel <ondrej@samohel.cz>
Date:   Mon Nov 4 16:04:54 2019 +0100

    added texture extraction to yeti rig extractor

commit 4dee025cfe
Merge: f7dae6aa d698a577
Author: Jakub Ježek <jakub@pype.club>
Date:   Mon Nov 4 15:04:03 2019 +0000

    Merged in hotfix/nuke-custom-ocio-path (pull request #350)

    Nuke Support for OCIO custom path

commit d698a57733
Author: Jakub Jezek <jakub@pype.club>
Date:   Mon Nov 4 15:59:35 2019 +0100

    fix(nuke): adding custom ocio path support

commit f7dae6aa5d
Merge: bce9abe1 fdccabee
Author: Toke Jepsen <tokejepsen@bumpybox.com>
Date:   Mon Nov 4 13:56:57 2019 +0000

    Merged in tokejepsen/pype/deadline_priority_nuke (pull request #349)

    Support Deadline priority in Nuke.

    Approved-by: Jakub Ježek <jakub@pype.club>

commit bce9abe189
Merge: d40ba5eb f217899e
Author: Ondřej Samohel <annatar@annatar.net>
Date:   Mon Nov 4 13:42:06 2019 +0000

    Merged in hofix/rv-action-preset-error (pull request #344)

    fixed rv action not to load when preset is missing

commit fdccabeeb9
Author: Toke Jepsen <tokejepsen@gmail.com>
Date:   Sun Nov 3 18:45:30 2019 +0000

    Support Deadline priority in Nuke.

commit 963d6fb18f
Author: Toke Jepsen <tokejepsen@gmail.com>
Date:   Sun Nov 3 18:41:23 2019 +0000

    Validate containers.

commit 73886c5079
Author: Toke Jepsen <tokejepsen@gmail.com>
Date:   Sun Nov 3 18:43:57 2019 +0000

    Bugfix for string based knobs.

commit 5dcffac491
Author: Toke Jepsen <tokejepsen@gmail.com>
Date:   Sun Nov 3 18:43:22 2019 +0000

    Validate knobs inside write group.

commit 66544273fd
Author: Ondrej Samohel <ondrej@samohel.cz>
Date:   Fri Nov 1 19:06:00 2019 +0100

    connect shapes to loaded Yeti Rig

commit d40ba5ebbd
Merge: 9d994dd7 2d65ab83
Author: Jakub Ježek <jakub@pype.club>
Date:   Fri Nov 1 17:06:36 2019 +0000

    Merged in hotfix/assum-dest-ftrack-hierarchy (pull request #345)

    Hotfix/assum dest ftrack hierarchy

    Approved-by: Milan Kolar <milan@orbi.tools>

commit c009f661e6
Author: Jana Mizikova <mizikova.jana@gmail.com>
Date:   Fri Nov 1 17:06:35 2019 +0100

    fix(nks): thumbnails, build workfile with preview mov

     - thumbnail for clip is taken from middle of duration

commit e0b4bec245
Merge: 00b497cc 2d65ab83
Author: Milan Kolar <milan@orbi.tools>
Date:   Fri Nov 1 15:34:56 2019 +0100

    Merge remote-tracking branch 'refs/remotes/origin/hotfix/assum-dest-ftrack-hierarchy' into release/2.3.0

commit 2d65ab83f1
Author: Jana Mizikova <mizikova.jana@gmail.com>
Date:   Fri Nov 1 11:27:46 2019 +0100

    fix(plugins): changing the way ftrack is querying entity_type

    this will remove the duplicate server entity error on mysql

commit d2c8810470
Author: Jana Mizikova <mizikova.jana@gmail.com>
Date:   Fri Nov 1 11:25:48 2019 +0100

    fix(plugins): ditching `silo` from assuming destination

commit f217899ea1
Author: Ondřej Samohel <annatar@annatar.net>
Date:   Fri Nov 1 09:09:16 2019 +0000

    fixed rv action not to load when preset is missing

commit 312d3c96ab
Merge: 58819982 9d994dd7
Author: Jakub Trllo <jakub.trllo@gmail.com>
Date:   Thu Oct 31 23:35:28 2019 +0100

    Merge branch 'develop' into feature/advanced_rest_api

commit 00b497cc21
Author: Milan Kolar <milan@orbi.tools>
Date:   Thu Oct 31 20:16:16 2019 +0100

    update changelog

commit 2c6b451485
Author: Milan Kolar <milan@orbi.tools>
Date:   Thu Oct 31 20:15:13 2019 +0100

    update changelog

commit 8f1ff9e31b
Author: Milan Kolar <milan@orbi.tools>
Date:   Thu Oct 31 20:07:03 2019 +0100

    update changelog and version

commit 9d994dd7fc
Merge: 92ca7d90 0de66c1c
Author: Jakub Trllo <jakub.trllo@gmail.com>
Date:   Thu Oct 31 18:30:59 2019 +0000

    Merged in hotfix/job_killer_log_fix (pull request #341)

    fixed logging of job killer action

    Approved-by: Milan Kolar <milan@orbi.tools>

commit 92ca7d9041
Merge: 78519aa0 384843a8
Author: Milan Kolar <milan@orbi.tools>
Date:   Thu Oct 31 18:11:42 2019 +0000

    Merged in feature/PYPE-349_parallel_event_server (pull request #343)

    Feature/PYPE-349 parallel event server

commit 78519aa066
Merge: ad41ffaf 6f93743c
Author: Jakub Trllo <jakub.trllo@gmail.com>
Date:   Thu Oct 31 18:06:54 2019 +0000

    Merged in feature/PYPE-349_parallel_event_server (pull request #330)

    Feature/PYPE-349 parallel event server

    Approved-by: Milan Kolar <milan@orbi.tools>

commit 384843a8b4
Author: Milan Kolar <milan@orbi.tools>
Date:   Thu Oct 31 18:52:35 2019 +0100

    change `old way` to `legacy`

commit 8df857bb89
Merge: c3cbab14 ad41ffaf
Author: Milan Kolar <milan@orbi.tools>
Date:   Thu Oct 31 18:42:26 2019 +0100

    Merge branch 'develop' into feature/PYPE-349_parallel_event_server

commit c3cbab14b9
Author: Milan Kolar <milan@orbi.tools>
Date:   Thu Oct 31 18:42:15 2019 +0100

    missing self

commit ad41ffafe5
Author: Milan Kolar <milan@orbi.tools>
Date:   Thu Oct 31 18:15:30 2019 +0100

    hotfix: frames in representation

commit 3c4083fb99
Author: Milan Kolar <milan@orbi.tools>
Date:   Thu Oct 31 17:07:54 2019 +0100

    hotfix: yeti frames in integrator

commit b02ca38674
Merge: 38981581 dbe79c77
Author: Milan Kolar <milan@orbi.tools>
Date:   Thu Oct 31 08:41:20 2019 +0000

    Merged in hotfix/yeti_cache (pull request #342)

    add frames to published representations

commit dbe79c7741
Author: Jana Mizikova <mizikova.jana@gmail.com>
Date:   Wed Oct 30 19:06:18 2019 +0100

    add frames to published representations

commit 3898158144
Merge: 48efc6bc 62851494
Author: Jakub Trllo <jakub.trllo@gmail.com>
Date:   Wed Oct 30 11:56:18 2019 +0000

    Merged in bugfix/PYPE-578_idle_manager_bug (pull request #340)

    Bugfix/PYPE-578 idle manager bug

    Approved-by: Milan Kolar <milan@orbi.tools>

commit 48efc6bcee
Merge: 6195e16b 64e86e39
Author: Ondřej Samohel <annatar@annatar.net>
Date:   Wed Oct 30 11:55:46 2019 +0000

    Merged in hotfix/maya-set-fps (pull request #339)

    fixed Maya set fps

    Approved-by: Milan Kolar <milan@orbi.tools>

commit 6195e16b9e
Merge: ff44a003 83367047
Author: Ondřej Samohel <annatar@annatar.net>
Date:   Wed Oct 30 11:55:13 2019 +0000

    Merged in feature/PYPE-523-validator-for-comparing-arbitrary-attributes (pull request #335)

    Global validator for ftrack attributes

    Approved-by: Milan Kolar <milan@orbi.tools>

commit 0de66c1c45
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Wed Oct 30 10:48:25 2019 +0100

    fixed logging of job killer action

commit 6285149489
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Wed Oct 30 10:44:47 2019 +0100

    enhanced signal dictionary creating

commit 744a606d69
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Wed Oct 30 10:44:19 2019 +0100

    added try/except on AttributeError to catch non-existing thread error

commit ff44a00354
Author: Milan Kolar <milan@orbi.tools>
Date:   Tue Oct 29 20:57:41 2019 +0100

    hotfix: ignore shapes when referencing and preserve references when importing

commit 6f93743c8d
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Tue Oct 29 18:01:02 2019 +0100

    added old_way_server which handles things the same way as before and restarts the event server when the ftrack connection fails

commit 92ccd406c5
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Tue Oct 29 18:00:20 2019 +0100

    created exception for mongo permission error

commit 5b70ddf32f
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Tue Oct 29 17:59:36 2019 +0100

    added `oldway` argument to event server cli

commit 75ce8f1196
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Tue Oct 29 17:58:29 2019 +0100

    added subprocess file for oldway event server

commit 61978b38cf
Merge: 317b13b5 160d285d
Author: Jakub Jezek <jakub@pype.club>
Date:   Tue Oct 29 16:58:00 2019 +0100

    Merge branch 'develop' into feature/nuke-create-backdrop

commit 64e86e3988
Author: Ondřej Samohel <annatar@annatar.net>
Date:   Tue Oct 29 15:41:45 2019 +0000

    fixed Maya set fps

commit 160d285d9e
Merge: ed48b852 1b936d6e
Author: Milan Kolar <milan@orbi.tools>
Date:   Tue Oct 29 15:54:18 2019 +0100

    Merge branch 'develop' of bitbucket.org:pypeclub/pype into develop

commit 1b936d6e81
Merge: 749852fc 5ccc34ab
Author: Jana Mizikova <mizikova.jana@gmail.com>
Date:   Tue Oct 29 14:53:39 2019 +0000

    Merged in bugfix/validate-unicode-strings (pull request #326)

    validate unicode strings

    Approved-by: Milan Kolar <milan@orbi.tools>

commit ed48b85233
Author: Milan Kolar <milan@orbi.tools>
Date:   Tue Oct 29 15:50:22 2019 +0100

    missing families

commit 749852fc8b
Merge: e2446100 b3dca2d0
Author: Jakub Trllo <jakub.trllo@gmail.com>
Date:   Tue Oct 29 14:18:54 2019 +0000

    Merged in hotfix/review_integrate_fix (pull request #337)

    Hotfix/review integrate fix

commit e24461007d
Merge: e35e4f18 e52e6241
Author: Jakub Trllo <jakub.trllo@gmail.com>
Date:   Tue Oct 29 14:17:22 2019 +0000

    Merged in hotfix/standalone_model_fix (pull request #338)

    Hotfix/standalone model fix

commit b3dca2d0ce
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Tue Oct 29 15:12:37 2019 +0100

    fixed ftrack integration

commit e35e4f1899
Author: Milan Kolar <milan@orbi.tools>
Date:   Tue Oct 29 14:42:45 2019 +0100

    fix families in subset publishing

commit 317b13b5f9
Author: Jakub Jezek <jakub@pype.club>
Date:   Tue Oct 29 14:22:05 2019 +0100

    feat(nuke): add create plugin for backdrop creation

commit e52e6241fb
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Tue Oct 29 11:05:41 2019 +0100

    version is refreshed on pyblish close

commit 1b27cc59f8
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Tue Oct 29 11:05:31 2019 +0100

    get_parents gets parents from the entity instead of going through all parents, if possible

commit 918d23c9ab
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Tue Oct 29 10:56:06 2019 +0100

    fixed role variable naming and way of getting asset

commit dede49f2c6
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Tue Oct 29 10:47:28 2019 +0100

    extract review creates mov in temp instead of staging dir

commit b3c693fc5a
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Tue Oct 29 10:47:02 2019 +0100

    removed delete tag from plates when collecting

commit 10d6956037
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Fri Oct 25 18:27:31 2019 +0200

    event server will wait some time if subprocess crashed many times in a row

commit 38a6c84fe5
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Fri Oct 25 18:26:38 2019 +0200

    lib is now used to get mongo port and host in event server cli

commit 11f0c41e90
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Fri Oct 25 18:25:44 2019 +0200

    crash all servers if a mongo error happened

commit 5d6f91d614
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Fri Oct 25 18:24:36 2019 +0200

    socket thread stores info if mongo error has happened during subprocess

commit e0f7752205
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Fri Oct 25 18:24:12 2019 +0200

    sub event processor prints traceback

commit 4313cc5667
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Fri Oct 25 18:23:52 2019 +0200

    mongo permissions error is sent with socket message

commit 24175a2587
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Fri Oct 25 18:22:28 2019 +0200

    atexit register has args now

commit e7c9a61b2e
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Fri Oct 25 18:21:44 2019 +0200

    added lib to ftrack_server with functions able to handle mongo url parts

commit ebbc5f6fa9
Author: Milan Kolar <milan@orbi.tools>
Date:   Fri Oct 25 16:20:23 2019 +0200

    hotfix: forgotten instance of silos

commit 83367047ab
Author: Ondrej Samohel <ondrej@samohel.cz>
Date:   Fri Oct 25 13:51:11 2019 +0200

    added global validator for ftrack attributes

commit 588199821a
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Thu Oct 24 15:25:45 2019 +0200

    removed statics server since rest api server can handle statics too

commit 3a5525edf6
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Wed Oct 23 18:45:41 2019 +0200

    processed events older than 3 days are removed from mongo db

commit 7c59df5ec5
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Wed Oct 23 18:03:29 2019 +0200

    max timeout of heartbeat increased to 35

commit 11f4451af1
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Wed Oct 23 18:02:53 2019 +0200

    added basic docstrings

commit bfa61e2e98
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Wed Oct 23 17:34:11 2019 +0200

    kill threads with subprocesses on exit

commit 565c43971c
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Wed Oct 23 17:33:32 2019 +0200

    added sigkill signal

commit ee8f09cf46
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Wed Oct 23 16:52:07 2019 +0200

    added error logs

commit 68fb2c2e27
Merge: 78592655 6fcc28ab
Author: Ondrej Samohel <ondrej@samohel.cz>
Date:   Wed Oct 23 16:49:48 2019 +0200

    Merge remote-tracking branch 'origin/develop' into feature/PYPE-523-validator-for-comparing-arbitrary-attributes

commit aad641f233
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Wed Oct 23 16:48:13 2019 +0200

    import fixes

commit ac290711b9
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Wed Oct 23 16:47:45 2019 +0200

    sort events by the date they were stored

commit 45370876e3
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Wed Oct 23 15:29:05 2019 +0200

    replaced args with options(kwargs) in custom db connector

commit 1ca674f33f
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Wed Oct 23 15:05:07 2019 +0200

    added atexit to custom db connector to run uninstall on exit

commit 19f810ff57
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Wed Oct 23 15:01:51 2019 +0200

    added check active table to custom db connector

commit 4290af23e7
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Wed Oct 23 14:57:53 2019 +0200

    moved event server files in hierarchy

commit f1c873fedc
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Wed Oct 23 14:38:32 2019 +0200

    minor fixes, occupied socket ports are skipped and logger is renamed

commit 4921783d1a
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Wed Oct 23 14:36:54 2019 +0200

    terminate signal is registered too, to be able to terminate subprocesses

commit d1bfa2412e
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Wed Oct 23 14:33:01 2019 +0200

    pymongo connection error causes the subprocess to end

commit b7c2954061
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Wed Oct 23 14:26:31 2019 +0200

    changed default ports

commit bcf8bf17a5
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Tue Oct 22 14:02:49 2019 +0200

    session instance validation also checks for the process session and raises an exception if it does not match the ftrack_api session

commit 28f2d14318
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Tue Oct 22 14:01:58 2019 +0200

    pype logging is not used in event_server_cli for cases when mongo is not accessible; removed the backup solution

commit 5c70f3eb13
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Tue Oct 22 12:12:54 2019 +0200

    modified event_server_cli to be able to run subprocesses and handle their liveness

commit 631dcb56ea
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Tue Oct 22 12:12:15 2019 +0200

    added modified ftrack sessions, one for storing, second for processing events

commit 1ee8ab0dab
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Tue Oct 22 12:11:46 2019 +0200

    added socket thread which is able to start subprocess with connection to specific port with sockets

commit 8b8427ca87
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Tue Oct 22 12:11:12 2019 +0200

    added scripts to be able to be run as subprocess, one for storing, second for processing events

commit f0e5f39342
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Tue Oct 22 12:02:22 2019 +0200

    run server in ftrack server gives the ability to use another session and not load plugins

commit 43f63fd260
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Tue Oct 22 12:01:43 2019 +0200

    print traceback when error during plugin registration happens

commit 11e2382dfd
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Tue Oct 22 12:00:04 2019 +0200

    session check is happening during initialization not before register of each plugin

commit a2ae970b65
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Mon Oct 21 13:42:14 2019 +0200

    added docstrings

commit ee143aaee0
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Mon Oct 21 13:37:09 2019 +0200

    added possibility to turn off auto regex without specifying an entity

commit 5ccc34abb4
Author: Jana Mizikova <mizikova.jana@gmail.com>
Date:   Fri Oct 18 15:09:47 2019 +0200

    validate unicode strings

    checking for unicode strings in environment variables that block extracting review

commit d1b294e4a2
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Fri Oct 18 11:21:36 2019 +0200

    implemented options in method args to have access to all optional args

commit e0827d0597
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Fri Oct 18 11:20:32 2019 +0200

    added __getattribute__ to have access to not-implemented methods of the pymongo db
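
The idea, sketched as a hypothetical wrapper (the commit uses
`__getattribute__`; plain `__getattr__` gives the same fallback for
attributes the wrapper does not implement):

    from pymongo import MongoClient

    class DbConnectorSketch:
        """Illustrative only; the real custom db connector does more."""

        def __init__(self, mongo_url, db_name, table_name):
            self._collection = MongoClient(mongo_url)[db_name][table_name]

        def insert_one(self, document):
            # explicitly wrapped method (room for checks, logging, retries)
            return self._collection.insert_one(document)

        def __getattr__(self, name):
            # anything not wrapped above falls through to pymongo, so
            # find(), aggregate(), distinct(), ... keep working unchanged
            return getattr(self._collection, name)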

commit 9233530df5
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Fri Oct 18 11:19:48 2019 +0200

    added default message to NotActiveTable

commit 339b623e89
Author: Jakub Trllo <jakub.trllo@gmail.com>
Date:   Fri Oct 18 00:27:01 2019 +0200

    minor changes in prefix and fullpath preparing

commit 70853e25e3
Author: Jakub Trllo <jakub.trllo@gmail.com>
Date:   Fri Oct 18 00:26:23 2019 +0200

    removed unused variables

commit 518885a39f
Author: Jakub Trllo <jakub.trllo@gmail.com>
Date:   Fri Oct 18 00:26:10 2019 +0200

    added docstrings

commit 7d623a2bb5
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Thu Oct 17 12:17:46 2019 +0200

    minor change in request class

commit dc14235e97
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Thu Oct 17 12:03:03 2019 +0200

    rest api no longer sends specific data to the callback via args but sends a request info object that contains all of it

commit cc9dd16d98
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Thu Oct 17 11:20:52 2019 +0200

    changed registering so route info is not stored on the callback

commit cdd6bd9c52
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Mon Oct 14 19:09:25 2019 +0200

    created first rest api for avalon module

commit 0ff704a733
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Mon Oct 14 19:08:20 2019 +0200

    enhancements of custom db connector

commit 1298bf73bf
Author: iLLiCiTiT <jakub.trllo@gmail.com>
Date:   Mon Oct 14 19:07:52 2019 +0200

    initial commit of rest api module with partially working rest api handling

commit 752af5e5cc
Author: Toke Jepsen <tokejepsen@gmail.com>
Date:   Wed Sep 18 10:26:51 2019 +0100

    Separate comments collection to plugin.

commit 096709abf7
Merge: 46c06ce2 07fc12bb
Author: Toke Jepsen <tokejepsen@gmail.com>
Date:   Wed Sep 18 09:38:35 2019 +0100

    Merge branch 'develop' into feature/nukestudio_comments

    # Conflicts:
    #	pype/plugins/nukestudio/publish/collect_shots.py

commit bb51000871
Merge: 6e294529 f2d1613b
Author: Milan Kolar <milan@pype.club>
Date:   Fri Sep 6 19:09:18 2019 +0200

    Merge branch 'release/2.2.0'

commit 6e294529b8
Merge: c20f4ba5 7dc7fbd3
Author: Milan Kolar <milan@orbi.tools>
Date:   Fri Sep 6 17:01:23 2019 +0000

    Merged in release/2.2.0 (pull request #294)

    Release/2.2.0

    Approved-by: Milan Kolar <milan@orbi.tools>

commit 049e9ba716
Author: Toke Jepsen <tokejepsen@gmail.com>
Date:   Wed Aug 21 16:43:54 2019 +0100

    Validate knobs to studio presets.

commit 46c06ce20d
Author: Toke Jepsen <tokejepsen@gmail.com>
Date:   Wed Aug 21 16:09:31 2019 +0100

    Publish comments from NukeStudio.

commit 3f3932cff1
Author: Toke Jepsen <tokejepsen@gmail.com>
Date:   Wed Aug 21 14:29:25 2019 +0100

    Enable "Force Copy" option.

    In cases where hardlinking is not an option (cloud syncing), copying is the only option.
Committed by: Ondrej Samohel, 2019-11-27 13:20:33 +01:00
parent b840fca4de
commit f9e3104042

786 changed files with 9876 additions and 168710 deletions

@@ -1,6 +1,6 @@
 MIT License

-Copyright (c) 2018 pype club
+Copyright (c) 2018 orbi tools s.r.o

 Permission is hereby granted, free of charge, to any person obtaining a copy


@@ -1,6 +1,47 @@
 # Pype changelog #

 Welcome to pype changelog

+## 2.3.0 ##
+_release date: 6 Oct 2019_
+
+**new**:
+- _(maya)_ support for yeti rigs and yeti caches
+- _(maya)_ validator for comparing arbitrary attributes against ftrack
+- _(pype)_ burnins can now show current date and time
+- _(muster)_ pools can now be set in render globals in maya
+- _(pype)_ Rest API has been implemented in beta stage
+- _(nuke)_ LUT loader has been added
+- _(pype)_ rudimentary user module has been added as preparation for user management
+- _(pype)_ a simple logging GUI has been added to pype tray
+- _(nuke)_ nuke can now bake input process into mov
+- _(maya)_ imported models now have selection handle displayed by default
+- _(avalon)_ it is now possible to load multiple assets at once using loader
+- _(maya)_ added ability to automatically connect yeti rig to a mesh upon loading
+
+**changed**:
+- _(ftrack)_ event server now runs two parallel processes and is able to keep a queue of events to process
+- _(nuke)_ task name is now added to all rendered subsets
+- _(pype)_ adding more families to standalone publisher
+- _(pype)_ standalone publisher now uses pyblish-lite
+- _(pype)_ standalone publisher can now create review quicktimes
+- _(ftrack)_ queries to ftrack were sped up
+- _(ftrack)_ multiple ftrack actions have been deprecated
+- _(avalon)_ avalon upstream has been updated to 5.5.0
+- _(nukestudio)_ published transforms can now be animated
+
+**fix**:
+- _(maya)_ fps popup button didn't work in some cases
+- _(maya)_ geometry instances and references in maya were losing shader assignments
+- _(muster)_ muster rendering templates were not working correctly
+- _(maya)_ arnold tx texture conversion wasn't respecting colorspace set by the artist
+- _(pype)_ problems with avalon db sync
+- _(maya)_ ftrack was rounding FPS, making it inconsistent
+- _(pype)_ wrong icon names in Creator
+- _(maya)_ scene inventory wasn't showing anything if a representation was removed from the database after it had been loaded into the scene
+- _(nukestudio)_ multiple bugs squashed
+- _(loader)_ loader was taking a long time to show all the loading actions when first launched in maya
+
 ## 2.2.0 ##
 _release date: 8 Sept 2019_


@@ -9,7 +9,7 @@ from pypeapp import config
 import logging
 log = logging.getLogger(__name__)

-__version__ = "2.1.0"
+__version__ = "2.3.0"

 PACKAGE_DIR = os.path.dirname(__file__)
 PLUGINS_DIR = os.path.join(PACKAGE_DIR, "plugins")


@@ -14,6 +14,11 @@ class AvalonApps:
         self.parent = parent
         self.app_launcher = None

+    def process_modules(self, modules):
+        if "RestApiServer" in modules:
+            from .rest_api import AvalonRestApi
+            self.rest_api_obj = AvalonRestApi()
+
     # Definition of Tray menu
     def tray_menu(self, parent_menu=None):
         # Actions


@@ -0,0 +1,86 @@
+import os
+import re
+import json
+
+import bson
+import bson.json_util
+
+from pype.services.rest_api import RestApi, abort, CallbackResult
+from pype.ftrack.lib.custom_db_connector import DbConnector
+
+
+class AvalonRestApi(RestApi):
+    dbcon = DbConnector(
+        os.environ["AVALON_MONGO"],
+        os.environ["AVALON_DB"]
+    )
+
+    def __init__(self, *args, **kwargs):
+        super().__init__(*args, **kwargs)
+        self.dbcon.install()
+
+    @RestApi.route("/projects/<project_name>", url_prefix="/avalon", methods="GET")
+    def get_project(self, request):
+        project_name = request.url_data["project_name"]
+        if not project_name:
+            output = {}
+            for project_name in self.dbcon.tables():
+                project = self.dbcon[project_name].find_one({"type": "project"})
+                output[project_name] = project
+            return CallbackResult(data=self.result_to_json(output))
+
+        project = self.dbcon[project_name].find_one({"type": "project"})
+        if project:
+            return CallbackResult(data=self.result_to_json(project))
+
+        abort(404, "Project \"{}\" was not found in database".format(
+            project_name
+        ))
+
+    @RestApi.route("/projects/<project_name>/assets/<asset>", url_prefix="/avalon", methods="GET")
+    def get_assets(self, request):
+        _project_name = request.url_data["project_name"]
+        _asset = request.url_data["asset"]
+
+        if not self.dbcon.exist_table(_project_name):
+            abort(404, "Project \"{}\" was not found in database".format(
+                project_name
+            ))
+
+        if not _asset:
+            assets = self.dbcon[_project_name].find({"type": "asset"})
+            output = self.result_to_json(assets)
+            return CallbackResult(data=output)
+
+        # identificator can be specified with url query (default is `name`)
+        identificator = request.query.get("identificator", "name")
+
+        asset = self.dbcon[_project_name].find_one({
+            "type": "asset",
+            identificator: _asset
+        })
+        if asset:
+            id = asset["_id"]
+            asset["_id"] = str(id)
+            return asset
+
+        abort(404, "Asset \"{}\" with {} was not found in project {}".format(
+            _asset, identificator, project_name
+        ))
+
+    def result_to_json(self, result):
+        """ Converts result of MongoDB query to dict without $oid (ObjectId)
+        keys with help of regex matching.
+
+        ..note:
+            This will convert object type entries similar to ObjectId.
+        """
+        bson_json = bson.json_util.dumps(result)
+
+        # Replace "{$oid: "{entity id}"}" with "{entity id}"
+        regex1 = '(?P<id>{\"\$oid\": \"[^\"]+\"})'
+        regex2 = '{\"\$oid\": (?P<id>\"[^\"]+\")}'
+        for value in re.findall(regex1, bson_json):
+            for substr in re.findall(regex2, value):
+                bson_json = bson_json.replace(value, substr)
+
+        return json.loads(bson_json)
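
For orientation, calling the routes registered above might look like this
(hypothetical host and port; only the paths and the `identificator` query
parameter come from the code itself):

    import requests

    base = "http://localhost:8021/avalon"  # assumed address of the server

    # one project document, ObjectIds already flattened by result_to_json
    project = requests.get(base + "/projects/MyProject").json()

    # asset looked up by key; `identificator` defaults to "name"
    asset = requests.get(
        base + "/projects/MyProject/assets/sh010",
        params={"identificator": "name"}
    ).json()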


@@ -3,7 +3,7 @@ import sys
 import argparse
 import logging
 import json
-from pype.vendor import ftrack_api
+import ftrack_api
 from pype.ftrack import BaseAction, MissingPermision
 from pype.clockify import ClockifyAPI


@@ -1,2 +1,2 @@
 from .lib import *
-from .ftrack_server import *
+from .ftrack_server import FtrackServer, check_ftrack_url


@@ -1,6 +1,6 @@
 import os

-from pype.vendor import ftrack_api
+import ftrack_api
 from pype.ftrack import BaseAction
 from pype.ftrack.lib.io_nonsingleton import DbConnector
@@ -281,7 +281,4 @@ class AttributesRemapper(BaseAction):

 def register(session, plugins_presets={}):
     '''Register plugin. Called when used as an plugin.'''
-    if not isinstance(session, ftrack_api.session.Session):
-        return
-
     AttributesRemapper(session, plugins_presets).register()


@@ -2,7 +2,7 @@ import sys
 import argparse
 import logging
-from pype.vendor import ftrack_api
+import ftrack_api
 from pype.ftrack import BaseAction
@@ -55,11 +55,8 @@ class ClientReviewSort(BaseAction):

 def register(session, plugins_presets={}):
     '''Register action. Called when used as an event plugin.'''
-    if not isinstance(session, ftrack_api.session.Session):
-        return
-
-    action_handler = ClientReviewSort(session, plugins_presets)
-    action_handler.register()
+    ClientReviewSort(session, plugins_presets).register()


 def main(arguments=None):


@@ -3,7 +3,7 @@ import sys
 import argparse
 import logging
 import subprocess
-from pype.vendor import ftrack_api
+import ftrack_api
 from pype.ftrack import BaseAction
@@ -68,12 +68,6 @@ class ComponentOpen(BaseAction):

 def register(session, plugins_presets={}):
     '''Register action. Called when used as an event plugin.'''
-    # Validate that session is an instance of ftrack_api.Session. If not,
-    # assume that register is being called from an old or incompatible API and
-    # return without doing anything.
-    if not isinstance(session, ftrack_api.session.Session):
-        return
-
     ComponentOpen(session, plugins_presets).register()


@ -4,7 +4,7 @@ import argparse
import json
import arrow
import logging
from pype.vendor import ftrack_api
import ftrack_api
from pype.ftrack import BaseAction, get_ca_mongoid
from pypeapp import config
from ftrack_api.exception import NoResultFoundError
@ -572,12 +572,6 @@ class CustomAttributes(BaseAction):
def register(session, plugins_presets={}):
'''Register plugin. Called when used as a plugin.'''
# Validate that session is an instance of ftrack_api.Session. If not,
# assume that register is being called from an old or incompatible API and
# return without doing anything.
if not isinstance(session, ftrack_api.session.Session):
return
CustomAttributes(session, plugins_presets).register()

View file

@ -4,7 +4,7 @@ import logging
import argparse
import re
from pype.vendor import ftrack_api
import ftrack_api
from pype.ftrack import BaseAction
from avalon import lib as avalonlib
from pype.ftrack.lib.io_nonsingleton import DbConnector
@ -327,9 +327,6 @@ class PartialDict(dict):
def register(session, plugins_presets={}):
'''Register plugin. Called when used as a plugin.'''
if not isinstance(session, ftrack_api.session.Session):
return
CreateFolders(session, plugins_presets).register()

View file

@ -4,7 +4,7 @@ import re
import argparse
import logging
from pype.vendor import ftrack_api
import ftrack_api
from pype.ftrack import BaseAction
from pypeapp import config
@ -198,9 +198,6 @@ class CreateProjectFolders(BaseAction):
def register(session, plugins_presets={}):
'''Register plugin. Called when used as a plugin.'''
if not isinstance(session, ftrack_api.session.Session):
return
CreateProjectFolders(session, plugins_presets).register()

View file

@ -4,12 +4,12 @@ import json
import argparse
import logging
from pype.vendor import ftrack_api
import ftrack_api
from pype.ftrack import BaseAction
class CustomAttributeDoctor(BaseAction):
ignore_me = True
#: Action identifier.
identifier = 'custom.attributes.doctor'
@ -294,9 +294,6 @@ class CustomAttributeDoctor(BaseAction):
def register(session, plugins_presets={}):
'''Register plugin. Called when used as a plugin.'''
if not isinstance(session, ftrack_api.session.Session):
return
CustomAttributeDoctor(session, plugins_presets).register()

View file

@ -3,7 +3,7 @@ import sys
import logging
from bson.objectid import ObjectId
import argparse
from pype.vendor import ftrack_api
import ftrack_api
from pype.ftrack import BaseAction
from pype.ftrack.lib.io_nonsingleton import DbConnector
@ -85,7 +85,7 @@ class DeleteAsset(BaseAction):
'type': 'asset',
'name': entity['name']
})
if av_entity is None:
return {
'success': False,
@ -277,10 +277,7 @@ class DeleteAsset(BaseAction):
'message': 'No entities to delete in avalon'
}
or_subquery = []
for id in all_ids:
or_subquery.append({'_id': id})
delete_query = {'$or': or_subquery}
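# The single `$in` query below is equivalent to the removed `$or` list of
# per-id subqueries: it matches every document whose _id is in all_ids.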
delete_query = {'_id': {'$in': all_ids}}
self.db.delete_many(delete_query)
return {
@ -314,12 +311,6 @@ class DeleteAsset(BaseAction):
def register(session, plugins_presets={}):
'''Register plugin. Called when used as a plugin.'''
# Validate that session is an instance of ftrack_api.Session. If not,
# assume that register is being called from an old or incompatible API and
# return without doing anything.
if not isinstance(session, ftrack_api.session.Session):
return
DeleteAsset(session, plugins_presets).register()

View file

@ -2,7 +2,7 @@ import os
import sys
import logging
import argparse
from pype.vendor import ftrack_api
import ftrack_api
from pype.ftrack import BaseAction
from pype.ftrack.lib.io_nonsingleton import DbConnector
@ -97,10 +97,7 @@ class AssetsRemover(BaseAction):
'message': 'None of assets'
}
or_subquery = []
for id in all_ids:
or_subquery.append({'_id': id})
delete_query = {'$or': or_subquery}
delete_query = {'_id': {'$in': all_ids}}
self.db.delete_many(delete_query)
self.db.uninstall()
@ -135,12 +132,6 @@ class AssetsRemover(BaseAction):
def register(session, plugins_presets={}):
'''Register plugin. Called when used as a plugin.'''
# Validate that session is an instance of ftrack_api.Session. If not,
# assume that register is being called from an old or incompatible API and
# return without doing anything.
if not isinstance(session, ftrack_api.session.Session):
return
AssetsRemover(session, plugins_presets).register()

View file

@ -4,7 +4,7 @@ import json
import logging
import subprocess
from operator import itemgetter
from pype.vendor import ftrack_api
import ftrack_api
from pype.ftrack import BaseAction
from pypeapp import Logger, config
@ -36,12 +36,13 @@ class DJVViewAction(BaseAction):
'file_ext', ["img", "mov", "exr"]
)
def register(self):
assert (self.djv_path is not None), (
'DJV View is not installed'
' or paths in presets are not set correctly'
)
super().register()
def preregister(self):
if self.djv_path is None:
return (
'DJV View is not installed'
' or paths in presets are not set correctly'
)
return True
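# Note (assumed BaseAction contract, not stated in this diff): `preregister`
# returning True allows registration to proceed, while returning a string is
# treated as the reason the action is skipped - replacing the old hard
# assert in `register`.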
def discover(self, session, entities, event):
"""Return available actions based on *event*. """
@ -220,8 +221,6 @@ class DJVViewAction(BaseAction):
def register(session, plugins_presets={}):
"""Register hooks."""
if not isinstance(session, ftrack_api.session.Session):
return
DJVViewAction(session, plugins_presets).register()

View file

@ -4,7 +4,7 @@ import argparse
import logging
import json
from pype.vendor import ftrack_api
import ftrack_api
from pype.ftrack import BaseAction
@ -101,13 +101,15 @@ class JobKiller(BaseAction):
# Update all the queried jobs, setting the status to failed.
for job in jobs:
try:
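# The original status is captured before it is overwritten, so the log
# line below reports the real transition (the removed line always logged
# 'failed -> failed').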
origin_status = job["status"]
job['status'] = 'failed'
session.commit()
self.log.debug((
'Changing Job ({}) status: {} -> failed'
).format(job['id'], job['status']))
).format(job['id'], origin_status))
except Exception:
self.log.warning.debug((
session.rollback()
self.log.warning((
'Changing Job ({}) has failed'
).format(job['id']))
@ -121,12 +123,6 @@ class JobKiller(BaseAction):
def register(session, plugins_presets={}):
'''Register plugin. Called when used as a plugin.'''
# Validate that session is an instance of ftrack_api.Session. If not,
# assume that register is being called from an old or incompatible API and
# return without doing anything.
if not isinstance(session, ftrack_api.session.Session):
return
JobKiller(session, plugins_presets).register()

View file

@ -2,7 +2,7 @@ import os
import sys
import argparse
import logging
from pype.vendor import ftrack_api
import ftrack_api
from pype.ftrack import BaseAction
@ -115,9 +115,6 @@ class MultipleNotes(BaseAction):
def register(session, plugins_presets={}):
'''Register plugin. Called when used as a plugin.'''
if not isinstance(session, ftrack_api.session.Session):
return
MultipleNotes(session, plugins_presets).register()

View file

@ -2,12 +2,12 @@ import os
import json
from ruamel import yaml
from pype.vendor import ftrack_api
import ftrack_api
from pype.ftrack import BaseAction
from pypeapp import config
from pype.ftrack.lib import get_avalon_attr
from pype.vendor.ftrack_api import session as fa_session
from ftrack_api import session as fa_session
class PrepareProject(BaseAction):
@ -372,7 +372,4 @@ class PrepareProject(BaseAction):
def register(session, plugins_presets={}):
'''Register plugin. Called when used as a plugin.'''
if not isinstance(session, ftrack_api.session.Session):
return
PrepareProject(session, plugins_presets).register()

View file

@ -7,7 +7,7 @@ import json
from pypeapp import Logger, config
from pype.ftrack import BaseAction
from pype.vendor import ftrack_api
import ftrack_api
from avalon import io, api
log = Logger().get_logger(__name__)
@ -15,6 +15,7 @@ log = Logger().get_logger(__name__)
class RVAction(BaseAction):
""" Launch RV action """
ignore_me = "rv" not in config.get_presets()
identifier = "rv.launch.action"
label = "rv"
description = "rv Launcher"
@ -42,8 +43,9 @@ class RVAction(BaseAction):
)
else:
# if not, fallback to config file location
self.config_data = config.get_presets()['rv']['config']
self.set_rv_path()
if "rv" in config.get_presets():
self.config_data = config.get_presets()['rv']['config']
self.set_rv_path()
if self.rv_path is None:
return
@ -59,12 +61,12 @@ class RVAction(BaseAction):
def set_rv_path(self):
self.rv_path = self.config_data.get("rv_path")
def register(self):
assert (self.rv_path is not None), (
'RV is not installed'
' or paths in presets are not set correctly'
)
super().register()
def preregister(self):
if self.rv_path is None:
return (
'RV is not installed or paths in presets are not set correctly'
)
return True
def get_components_from_entity(self, session, entity, components):
"""Get components from various entity types.
@ -328,8 +330,6 @@ class RVAction(BaseAction):
def register(session, plugins_presets={}):
"""Register hooks."""
if not isinstance(session, ftrack_api.session.Session):
return
RVAction(session, plugins_presets).register()

View file

@ -0,0 +1,347 @@
import os
from operator import itemgetter
from pype.ftrack import BaseAction
class SeedDebugProject(BaseAction):
'''Seed a debug project with assets, sequences and shots.'''
#: Action identifier.
identifier = "seed.debug.project"
#: Action label.
label = "SeedDebugProject"
#: Action description.
description = "Description"
#: priority
priority = 100
#: roles that are allowed to register this action
role_list = ["Pypeclub"]
icon = "{}/ftrack/action_icons/SeedProject.svg".format(
os.environ.get("PYPE_STATICS_SERVER", "")
)
# Asset names which will be created in `Assets` entity
assets = [
"Addax", "Alpaca", "Ant", "Antelope", "Aye", "Badger", "Bear", "Bee",
"Beetle", "Bluebird", "Bongo", "Bontebok", "Butterflie", "Caiman",
"Capuchin", "Capybara", "Cat", "Caterpillar", "Coyote", "Crocodile",
"Cuckoo", "Deer", "Dragonfly", "Duck", "Eagle", "Egret", "Elephant",
"Falcon", "Fossa", "Fox", "Gazelle", "Gecko", "Gerbil",
"GiantArmadillo", "Gibbon", "Giraffe", "Goose", "Gorilla",
"Grasshoper", "Hare", "Hawk", "Hedgehog", "Heron", "Hog",
"Hummingbird", "Hyena", "Chameleon", "Cheetah", "Iguana", "Jackal",
"Jaguar", "Kingfisher", "Kinglet", "Kite", "Komodo", "Lemur",
"Leopard", "Lion", "Lizard", "Macaw", "Malachite", "Mandrill",
"Mantis", "Marmoset", "Meadowlark", "Meerkat", "Mockingbird",
"Mongoose", "Monkey", "Nyal", "Ocelot", "Okapi", "Oribi", "Oriole",
"Otter", "Owl", "Panda", "Parrot", "Pelican", "Pig", "Porcupine",
"Reedbuck", "Rhinocero", "Sandpiper", "Servil", "Skink", "Sloth",
"Snake", "Spider", "Squirrel", "Sunbird", "Swallow", "Swift", "Tiger",
"Sylph", "Tanager", "Vulture", "Warthog", "Waterbuck", "Woodpecker",
"Zebra"
]
# Tasks which will be created for Assets
asset_tasks = [
"Modeling", "Lookdev", "Rigging"
]
# Tasks which will be created for Shots
shot_tasks = [
"Animation", "Lighting", "Compositing", "FX"
]
# Define how many sequences will be created
default_seq_count = 5
# Define how many shots will be created for each sequence
default_shots_count = 10
existing_projects = None
new_project_item = "< New Project >"
current_project_item = "< Current Project >"
def discover(self, session, entities, event):
''' Validation '''
return True
def interface(self, session, entities, event):
if event["data"].get("values", {}):
return
title = "Select Project where you want to create seed data"
items = []
item_splitter = {"type": "label", "value": "---"}
description_label = {
"type": "label",
"value": (
"WARNING: Action does NOT check if entities already exist !!!"
)
}
items.append(description_label)
all_projects = session.query("select full_name from Project").all()
self.existing_projects = [proj["full_name"] for proj in all_projects]
projects_items = [
{"label": proj, "value": proj} for proj in self.existing_projects
]
data_items = []
data_items.append({
"label": self.new_project_item,
"value": self.new_project_item
})
data_items.append({
"label": self.current_project_item,
"value": self.current_project_item
})
data_items.extend(sorted(
projects_items,
key=itemgetter("label"),
reverse=False
))
projects_item = {
"label": "Choose Project",
"type": "enumerator",
"name": "project_name",
"data": data_items,
"value": self.current_project_item
}
items.append(projects_item)
items.append(item_splitter)
items.append({
"label": "Number of assets",
"type": "number",
"name": "asset_count",
"value": len(self.assets)
})
items.append({
"label": "Number of sequences",
"type": "number",
"name": "seq_count",
"value": self.default_seq_count
})
items.append({
"label": "Number of shots",
"type": "number",
"name": "shots_count",
"value": self.default_shots_count
})
items.append(item_splitter)
note_label = {
"type": "label",
"value": (
"<p><i>NOTE: Enter project name and choose schema if you "
"chose `\"< New Project >\"`(code is optional)</i><p>"
)
}
items.append(note_label)
items.append({
"label": "Project name",
"name": "new_project_name",
"type": "text",
"value": ""
})
project_schemas = [
sch["name"] for sch in self.session.query("ProjectSchema").all()
]
schemas_item = {
"label": "Choose Schema",
"type": "enumerator",
"name": "new_schema_name",
"data": [
{"label": sch, "value": sch} for sch in project_schemas
],
"value": project_schemas[0]
}
items.append(schemas_item)
items.append({
"label": "*Project code",
"name": "new_project_code",
"type": "text",
"value": "",
"empty_text": "Optional..."
})
return {
"items": items,
"title": title
}
def launch(self, session, in_entities, event):
if "values" not in event["data"]:
return
# THIS IS THE PROJECT PART
values = event["data"]["values"]
selected_project = values["project_name"]
if selected_project == self.new_project_item:
project_name = values["new_project_name"]
if project_name in self.existing_projects:
msg = "Project \"{}\" already exist".format(project_name)
self.log.error(msg)
return {"success": False, "message": msg}
project_code = values["new_project_code"]
project_schema_name = values["new_schema_name"]
if not project_code:
project_code = project_name
project_code = project_code.lower().replace(" ", "_").strip()
_project = session.query(
"Project where name is \"{}\"".format(project_code)
).first()
if _project:
msg = "Project with code \"{}\" already exist".format(
project_code
)
self.log.error(msg)
return {"success": False, "message": msg}
project_schema = session.query(
"ProjectSchema where name is \"{}\"".format(
project_schema_name
)
).one()
# Create the project with the chosen schema.
self.log.debug((
"*** Creating Project: name <{}>, code <{}>, schema <{}>"
).format(project_name, project_code, project_schema_name))
project = session.create("Project", {
"name": project_code,
"full_name": project_name,
"project_schema": project_schema
})
session.commit()
elif selected_project == self.current_project_item:
entity = in_entities[0]
if entity.entity_type.lower() == "project":
project = entity
else:
if "project" in entity:
project = entity["project"]
else:
project = entity["parent"]["project"]
project_schema = project["project_schema"]
self.log.debug((
"*** Using Project: name <{}>, code <{}>, schema <{}>"
).format(
project["full_name"], project["name"], project_schema["name"]
))
else:
project = session.query("Project where full_name is \"{}\"".format(
selected_project
)).one()
project_schema = project["project_schema"]
self.log.debug((
"*** Using Project: name <{}>, code <{}>, schema <{}>"
).format(
project["full_name"], project["name"], project_schema["name"]
))
# THIS IS THE MAGIC PART
task_types = {}
for _type in project_schema["_task_type_schema"]["types"]:
if _type["name"] not in task_types:
task_types[_type["name"]] = _type
self.task_types = task_types
asset_count = values.get("asset_count") or len(self.assets)
seq_count = values.get("seq_count") or self.default_seq_count
shots_count = values.get("shots_count") or self.default_shots_count
self.create_assets(project, asset_count)
self.create_shots(project, seq_count, shots_count)
return True
def create_assets(self, project, asset_count):
self.log.debug("*** Creating assets:")
main_entity = self.session.create("Folder", {
"name": "Assets",
"parent": project
})
self.log.debug("- Assets")
available_assets = len(self.assets)
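# Ceiling division: how many numbered copies of each asset name are needed
# to reach asset_count (the boolean adds 1 when there is a remainder).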
repetitive_times = (
int(asset_count / available_assets) +
(asset_count % available_assets > 0)
)
created_assets = 0
for _asset_name in self.assets:
if created_assets >= asset_count:
break
for asset_num in range(1, repetitive_times + 1):
if created_assets >= asset_count:
break
asset_name = "%s_%02d" % (_asset_name, asset_num)
asset = self.session.create("AssetBuild", {
"name": asset_name,
"parent": main_entity
})
created_assets += 1
self.log.debug("- Assets/{}".format(asset_name))
for task_name in self.asset_tasks:
self.session.create("Task", {
"name": task_name,
"parent": asset,
"type": self.task_types[task_name]
})
self.log.debug("- Assets/{}/{}".format(
asset_name, task_name
))
self.log.debug("*** Commiting Assets")
self.session.commit()
def create_shots(self, project, seq_count, shots_count):
self.log.debug("*** Creating shots:")
main_entity = self.session.create("Folder", {
"name": "Shots",
"parent": project
})
self.log.debug("- Shots")
for seq_num in range(1, seq_count+1):
seq_name = "sq%03d" % seq_num
seq = self.session.create("Sequence", {
"name": seq_name,
"parent": main_entity
})
self.log.debug("- Shots/{}".format(seq_name))
for shot_num in range(1, shots_count+1):
shot_name = "%ssh%04d" % (seq_name, (shot_num*10))
shot = self.session.create("Shot", {
"name": shot_name,
"parent": seq
})
self.log.debug("- Shots/{}/{}".format(seq_name, shot_name))
for task_name in self.shot_tasks:
self.session.create("Task", {
"name": task_name,
"parent": shot,
"type": self.task_types[task_name]
})
self.log.debug("- Shots/{}/{}/{}".format(
seq_name, shot_name, task_name
))
self.log.debug("*** Commiting Shots")
self.session.commit()
def register(session, plugins_presets={}):
'''Register plugin. Called when used as a plugin.'''
SeedDebugProject(session, plugins_presets).register()

View file

@ -1,4 +1,4 @@
from pype.vendor import ftrack_api
import ftrack_api
from pype.ftrack import BaseAction
@ -26,7 +26,7 @@ class StartTimer(BaseAction):
user.start_timer(entity, force=True)
self.session.commit()
self.log.info(
"Starting Ftrack timer for task: {}".format(entity['name'])
)
@ -37,7 +37,4 @@ class StartTimer(BaseAction):
def register(session, plugins_presets={}):
'''Register plugin. Called when used as a plugin.'''
if not isinstance(session, ftrack_api.session.Session):
return
StartTimer(session, plugins_presets).register()

View file

@ -1,354 +0,0 @@
import os
import sys
import json
import argparse
import logging
import collections
from pype.vendor import ftrack_api
from pype.ftrack import BaseAction, lib
from pype.ftrack.lib.io_nonsingleton import DbConnector
from bson.objectid import ObjectId
class SyncHierarchicalAttrs(BaseAction):
db_con = DbConnector()
ca_mongoid = lib.get_ca_mongoid()
#: Action identifier.
identifier = 'sync.hierarchical.attrs.local'
#: Action label.
label = "Pype Admin"
variant = '- Sync Hier Attrs (Local)'
#: Action description.
description = 'Synchronize hierarchical attributes'
#: Icon
icon = '{}/ftrack/action_icons/PypeAdmin.svg'.format(
os.environ.get('PYPE_STATICS_SERVER', '')
)
#: roles that are allowed to register this action
role_list = ['Pypeclub', 'Administrator', 'Project Manager']
def discover(self, session, entities, event):
''' Validation '''
for entity in entities:
if (
entity.get('context_type', '').lower() in ('show', 'task') and
entity.entity_type.lower() != 'task'
):
return True
return False
def launch(self, session, entities, event):
self.interface_messages = {}
user = session.query(
'User where id is "{}"'.format(event['source']['user']['id'])
).one()
job = session.create('Job', {
'user': user,
'status': 'running',
'data': json.dumps({
'description': 'Sync Hierarchical attributes'
})
})
session.commit()
self.log.debug('Job with id "{}" created'.format(job['id']))
process_session = ftrack_api.Session(
server_url=session.server_url,
api_key=session.api_key,
api_user=session.api_user,
auto_connect_event_hub=True
)
try:
# Collect hierarchical attrs
self.log.debug('Collecting Hierarchical custom attributes started')
custom_attributes = {}
all_avalon_attr = process_session.query(
'CustomAttributeGroup where name is "avalon"'
).one()
error_key = (
'Hierarchical attributes with set "default" value (not allowed)'
)
for cust_attr in all_avalon_attr['custom_attribute_configurations']:
if 'avalon_' in cust_attr['key']:
continue
if not cust_attr['is_hierarchical']:
continue
if cust_attr['default']:
if error_key not in self.interface_messages:
self.interface_messages[error_key] = []
self.interface_messages[error_key].append(
cust_attr['label']
)
self.log.warning((
'Custom attribute "{}" has set default value.'
' This attribute can\'t be synchronized'
).format(cust_attr['label']))
continue
custom_attributes[cust_attr['key']] = cust_attr
self.log.debug(
'Collecting Hierarchical custom attributes has finished'
)
if not custom_attributes:
msg = 'No hierarchical attributes to sync.'
self.log.debug(msg)
return {
'success': True,
'message': msg
}
entity = entities[0]
if entity.entity_type.lower() == 'project':
project_name = entity['full_name']
else:
project_name = entity['project']['full_name']
self.db_con.install()
self.db_con.Session['AVALON_PROJECT'] = project_name
_entities = self._get_entities(event, process_session)
for entity in _entities:
self.log.debug(30*'-')
self.log.debug(
'Processing entity "{}"'.format(entity.get('name', entity))
)
ent_name = entity.get('name', entity)
if entity.entity_type.lower() == 'project':
ent_name = entity['full_name']
for key in custom_attributes:
self.log.debug(30*'*')
self.log.debug(
'Processing Custom attribute key "{}"'.format(key)
)
# check if entity has that attribute
if key not in entity['custom_attributes']:
error_key = 'Missing key on entities'
if error_key not in self.interface_messages:
self.interface_messages[error_key] = []
self.interface_messages[error_key].append(
'- key: "{}" - entity: "{}"'.format(key, ent_name)
)
self.log.error((
'- key "{}" not found on "{}"'
).format(key, ent_name))
continue
value = self.get_hierarchical_value(key, entity)
if value is None:
error_key = (
'Missing value for key on entity'
' and its parents (synchronization was skipped)'
)
if error_key not in self.interface_messages:
self.interface_messages[error_key] = []
self.interface_messages[error_key].append(
'- key: "{}" - entity: "{}"'.format(key, ent_name)
)
self.log.warning((
'- key "{}" not set on "{}" or its parents'
).format(key, ent_name))
continue
self.update_hierarchical_attribute(entity, key, value)
job['status'] = 'done'
session.commit()
except Exception:
self.log.error(
'Action "{}" failed'.format(self.label),
exc_info=True
)
finally:
self.db_con.uninstall()
if job['status'] in ('queued', 'running'):
job['status'] = 'failed'
session.commit()
if self.interface_messages:
title = "Errors during SyncHierarchicalAttrs"
self.show_interface_from_dict(
messages=self.interface_messages, title=title, event=event
)
return True
def get_hierarchical_value(self, key, entity):
value = entity['custom_attributes'][key]
if (
value is not None or
entity.entity_type.lower() == 'project'
):
return value
return self.get_hierarchical_value(key, entity['parent'])
def update_hierarchical_attribute(self, entity, key, value):
if (
entity['context_type'].lower() not in ('show', 'task') or
entity.entity_type.lower() == 'task'
):
return
ent_name = entity.get('name', entity)
if entity.entity_type.lower() == 'project':
ent_name = entity['full_name']
hierarchy = '/'.join(
[a['name'] for a in entity.get('ancestors', [])]
)
if hierarchy:
hierarchy = '/'.join(
[entity['project']['full_name'], hierarchy, entity['name']]
)
elif entity.entity_type.lower() == 'project':
hierarchy = entity['full_name']
else:
hierarchy = '/'.join(
[entity['project']['full_name'], entity['name']]
)
self.log.debug('- updating entity "{}"'.format(hierarchy))
# collect entity's custom attributes
custom_attributes = entity.get('custom_attributes')
if not custom_attributes:
return
mongoid = custom_attributes.get(self.ca_mongoid)
if not mongoid:
error_key = 'Missing MongoID on entities (try SyncToAvalon first)'
if error_key not in self.interface_messages:
self.interface_messages[error_key] = []
if ent_name not in self.interface_messages[error_key]:
self.interface_messages[error_key].append(ent_name)
self.log.warning(
'-- entity "{}" is not synchronized to avalon. Skipping'.format(
ent_name
)
)
return
try:
mongoid = ObjectId(mongoid)
except Exception:
error_key = 'Invalid MongoID on entities (try SyncToAvalon)'
if error_key not in self.interface_messages:
self.interface_messages[error_key] = []
if ent_name not in self.interface_messages[error_key]:
self.interface_messages[error_key].append(ent_name)
self.log.warning(
'-- entity "{}" has stored invalid MongoID. Skipping'.format(
ent_name
)
)
return
# Find entity in Mongo DB
mongo_entity = self.db_con.find_one({'_id': mongoid})
if not mongo_entity:
error_key = 'Entities not found in Avalon DB (try SyncToAvalon)'
if error_key not in self.interface_messages:
self.interface_messages[error_key] = []
if ent_name not in self.interface_messages[error_key]:
self.interface_messages[error_key].append(ent_name)
self.log.warning(
'-- entity "{}" was not found in DB by id "{}". Skipping'.format(
ent_name, str(mongoid)
)
)
return
# Change value if entity has set its own
entity_value = custom_attributes[key]
if entity_value is not None:
value = entity_value
data = mongo_entity.get('data') or {}
data[key] = value
self.db_con.update_many(
{'_id': mongoid},
{'$set': {'data': data}}
)
self.log.debug(
'-- stored value "{}"'.format(value)
)
for child in entity.get('children', []):
self.update_hierarchical_attribute(child, key, value)
def register(session, plugins_presets={}):
'''Register plugin. Called when used as a plugin.'''
if not isinstance(session, ftrack_api.session.Session):
return
SyncHierarchicalAttrs(session, plugins_presets).register()
def main(arguments=None):
'''Set up logging and register action.'''
if arguments is None:
arguments = []
parser = argparse.ArgumentParser()
# Allow setting of logging level from arguments.
loggingLevels = {}
for level in (
logging.NOTSET, logging.DEBUG, logging.INFO, logging.WARNING,
logging.ERROR, logging.CRITICAL
):
loggingLevels[logging.getLevelName(level).lower()] = level
parser.add_argument(
'-v', '--verbosity',
help='Set the logging output verbosity.',
choices=loggingLevels.keys(),
default='info'
)
namespace = parser.parse_args(arguments)
# Set up basic logging
logging.basicConfig(level=loggingLevels[namespace.verbosity])
session = ftrack_api.Session()
register(session)
# Wait for events
logging.info(
'Registered actions and listening for events. Use Ctrl-C to abort.'
)
session.event_hub.wait()
if __name__ == '__main__':
raise SystemExit(main(sys.argv[1:]))

File diff suppressed because it is too large.

View file

@ -1,273 +0,0 @@
import os
import sys
import time
import argparse
import logging
import json
import collections
from pype.vendor import ftrack_api
from pype.ftrack import BaseAction
from pype.ftrack.lib import avalon_sync as ftracklib
from pype.vendor.ftrack_api import session as fa_session
class SyncToAvalon(BaseAction):
'''
Synchronizing data action - from Ftrack to Avalon DB
Stores all information about an entity.
- Name(string) - Most important information = identifier of entity
- Parent(ObjectId) - Avalon Project Id, if entity is not the project itself
- Silo(string) - Last parent except project
- Data(dictionary):
- VisualParent(ObjectId) - Avalon Id of parent asset
- Parents(array of string) - All parent names except project
- Tasks(array of string) - Tasks on asset
- FtrackId(string)
- entityType(string) - entity's type on Ftrack
* All Custom attributes in group 'Avalon' whose name doesn't start with 'avalon_'
* This information is also stored for all parent and child entities.
The Avalon ID of an asset is stored to Ftrack -> Custom attribute 'avalon_mongo_id'.
- the action DOES NOT create this Custom attribute if it doesn't exist
- run the 'Create Custom Attributes' action or do it manually (not recommended)
If the Ftrack entity already has the Custom Attribute 'avalon_mongo_id' storing an ID:
- name, parents and silo are checked -> an error is shown if they are not exactly the same
- after sync it is not allowed to change names or move entities
If the ID in 'avalon_mongo_id' is an empty string or is not found in the DB:
- tries to find the entity by name
- found:
- an error is raised if ftrackId/visual parent/parents are not the same
- not found:
- the asset/project is created
'''
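# Illustrative shape of a synchronized Avalon document, derived from the
# docstring above (hypothetical values, not part of the original file):
# {
#     "name": "sh010",
#     "parent": ObjectId("..."),  # Avalon project id
#     "silo": "shots",
#     "data": {
#         "visualParent": ObjectId("..."),
#         "parents": ["shots", "sq001"],
#         "tasks": ["Animation", "Compositing"],
#         "ftrackId": "<ftrack entity id>",
#         "entityType": "Shot"
#     }
# }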
#: Action identifier.
identifier = 'sync.to.avalon.local'
#: Action label.
label = "Pype Admin"
variant = '- Sync To Avalon (Local)'
#: Action description.
description = 'Send data from Ftrack to Avalon'
#: Action icon.
icon = '{}/ftrack/action_icons/PypeAdmin.svg'.format(
os.environ.get('PYPE_STATICS_SERVER', '')
)
#: roles that are allowed to register this action
role_list = ['Pypeclub']
#: Action priority
priority = 200
project_query = (
"select full_name, name, custom_attributes"
", project_schema._task_type_schema.types.name"
" from Project where full_name is \"{}\""
)
entities_query = (
"select id, name, parent_id, link, custom_attributes"
" from TypedContext where project.full_name is \"{}\""
)
# Entity type names(lowered) that won't be synchronized with their children
ignore_entity_types = ["task", "milestone"]
def __init__(self, session, plugins_presets):
super(SyncToAvalon, self).__init__(session)
# reload utils on initialize (in case of server restart)
def discover(self, session, entities, event):
''' Validation '''
for entity in entities:
if entity.entity_type.lower() not in ['task', 'assetversion']:
return True
return False
def launch(self, session, entities, event):
time_start = time.time()
message = ""
# JOB SETTINGS
userId = event['source']['user']['id']
user = session.query('User where id is ' + userId).one()
job = session.create('Job', {
'user': user,
'status': 'running',
'data': json.dumps({
'description': 'Sync Ftrack to Avalon.'
})
})
session.commit()
try:
self.log.debug("Preparing entities for synchronization")
if entities[0].entity_type.lower() == "project":
ft_project_name = entities[0]["full_name"]
else:
ft_project_name = entities[0]["project"]["full_name"]
project_entities = session.query(
self.entities_query.format(ft_project_name)
).all()
ft_project = session.query(
self.project_query.format(ft_project_name)
).one()
entities_by_id = {}
entities_by_parent = collections.defaultdict(list)
entities_by_id[ft_project["id"]] = ft_project
for ent in project_entities:
entities_by_id[ent["id"]] = ent
entities_by_parent[ent["parent_id"]].append(ent)
importable = []
for ent_info in event["data"]["selection"]:
ent = entities_by_id[ent_info["entityId"]]
for link_ent_info in ent["link"]:
link_ent = entities_by_id[link_ent_info["id"]]
if (
ent.entity_type.lower() in self.ignore_entity_types or
link_ent in importable
):
continue
importable.append(link_ent)
def add_children(parent_id):
ents = entities_by_parent[parent_id]
for ent in ents:
if ent.entity_type.lower() in self.ignore_entity_types:
continue
if ent not in importable:
importable.append(ent)
add_children(ent["id"])
# add children of selection to importable
for ent_info in event["data"]["selection"]:
add_children(ent_info["entityId"])
# Check names: REGEX in schema/duplicates - raise error if found
all_names = []
duplicates = []
for entity in importable:
ftracklib.avalon_check_name(entity)
if entity.entity_type.lower() == "project":
continue
if entity['name'] in all_names:
duplicates.append("'{}'".format(entity['name']))
else:
all_names.append(entity['name'])
if len(duplicates) > 0:
# TODO Show information to user and return False
raise ValueError(
"Entity name duplication: {}".format(", ".join(duplicates))
)
# ----- PROJECT ------
avalon_project = ftracklib.get_avalon_project(ft_project)
custom_attributes = ftracklib.get_avalon_attr(session)
# Import all entities to Avalon DB
for entity in importable:
result = ftracklib.import_to_avalon(
session=session,
entity=entity,
ft_project=ft_project,
av_project=avalon_project,
custom_attributes=custom_attributes
)
# TODO better error handling
# maybe split into critical, warnings and messages?
if 'errors' in result and len(result['errors']) > 0:
job['status'] = 'failed'
session.commit()
ftracklib.show_errors(self, event, result['errors'])
return {
'success': False,
'message': "Sync to avalon FAILED"
}
if avalon_project is None:
if 'project' in result:
avalon_project = result['project']
job['status'] = 'done'
except ValueError as ve:
# TODO remove this part!!!!
job['status'] = 'failed'
message = str(ve)
self.log.error(
'Error during syncToAvalon: {}'.format(message),
exc_info=True
)
except Exception as e:
job['status'] = 'failed'
exc_type, exc_obj, exc_tb = sys.exc_info()
fname = os.path.split(exc_tb.tb_frame.f_code.co_filename)[1]
log_message = "{}/{}/Line: {}".format(
exc_type, fname, exc_tb.tb_lineno
)
self.log.error(
'Error during syncToAvalon: {}'.format(log_message),
exc_info=True
)
# TODO add traceback to message and show to user
message = (
'Unexpected Error'
' - Please check Log for more information'
)
finally:
if job['status'] in ['queued', 'running']:
job['status'] = 'failed'
session.commit()
time_end = time.time()
self.log.debug("Synchronization took \"{}\"".format(
str(time_end - time_start)
))
if job["status"] != "failed":
self.log.debug("Triggering Sync hierarchical attributes")
self.trigger_action("sync.hierarchical.attrs.local", event)
if len(message) > 0:
message = "Unable to sync: {}".format(message)
return {
'success': False,
'message': message
}
return {
'success': True,
'message': "Synchronization was successfull"
}
def register(session, plugins_presets={}):
'''Register plugin. Called when used as a plugin.'''
# Validate that session is an instance of ftrack_api.Session. If not,
# assume that register is being called from an old or incompatible API and
# return without doing anything.
if not isinstance(session, ftrack_api.session.Session):
return
SyncToAvalon(session, plugins_presets).register()

View file

@ -6,7 +6,7 @@ import collections
import json
import re
from pype.vendor import ftrack_api
import ftrack_api
from pype.ftrack import BaseAction
from avalon import io, inventory, schema
@ -43,9 +43,6 @@ class TestAction(BaseAction):
def register(session, plugins_presets={}):
'''Register plugin. Called when used as a plugin.'''
if not isinstance(session, ftrack_api.session.Session):
return
TestAction(session, plugins_presets).register()

View file

@ -4,7 +4,7 @@ import argparse
import logging
import json
from pype.vendor import ftrack_api
import ftrack_api
from pype.ftrack import BaseAction
@ -43,7 +43,7 @@ class ThumbToChildren(BaseAction):
'description': 'Push thumbnails to children'
})
})
session.commit()
try:
for entity in entities:
thumbid = entity['thumbnail_id']
@ -53,10 +53,11 @@ class ThumbToChildren(BaseAction):
# inform the user that the job is done
job['status'] = 'done'
except Exception:
except Exception as exc:
session.rollback()
# fail the job if something goes wrong
job['status'] = 'failed'
raise
raise exc
finally:
session.commit()
@ -68,8 +69,6 @@ class ThumbToChildren(BaseAction):
def register(session, plugins_presets={}):
'''Register action. Called when used as an event plugin.'''
if not isinstance(session, ftrack_api.session.Session):
return
ThumbToChildren(session, plugins_presets).register()

View file

@ -3,7 +3,7 @@ import sys
import argparse
import logging
import json
from pype.vendor import ftrack_api
import ftrack_api
from pype.ftrack import BaseAction
@ -40,9 +40,9 @@ class ThumbToParent(BaseAction):
'status': 'running',
'data': json.dumps({
'description': 'Push thumbnails to parents'
})
})
})
session.commit()
try:
for entity in entities:
parent = None
@ -74,10 +74,11 @@ class ThumbToParent(BaseAction):
# inform the user that the job is done
job['status'] = status or 'done'
except Exception as e:
except Exception as exc:
session.rollback()
# fail the job if something goes wrong
job['status'] = 'failed'
raise e
raise exc
finally:
session.commit()
@ -90,8 +91,6 @@ class ThumbToParent(BaseAction):
def register(session, plugins_presets={}):
'''Register action. Called when used as an event plugin.'''
if not isinstance(session, ftrack_api.session.Session):
return
ThumbToParent(session, plugins_presets).register()

View file

@ -6,7 +6,7 @@ import collections
import json
import re
from pype.vendor import ftrack_api
import ftrack_api
from pype.ftrack import BaseAction
from avalon import io, inventory, schema
from pype.ftrack.lib.io_nonsingleton import DbConnector

View file

@ -1,7 +1,7 @@
import os
from pype.vendor import ftrack_api
import ftrack_api
from pype.ftrack import BaseAction
from pype.vendor.ftrack_api import session as fa_session
from ftrack_api import session as fa_session
class ActionAskWhereIRun(BaseAction):
@ -40,7 +40,4 @@ class ActionAskWhereIRun(BaseAction):
def register(session, plugins_presets={}):
'''Register plugin. Called when used as a plugin.'''
if not isinstance(session, ftrack_api.session.Session):
return
ActionAskWhereIRun(session, plugins_presets).register()

View file

@ -1,7 +1,7 @@
import platform
import socket
import getpass
from pype.vendor import ftrack_api
import ftrack_api
from pype.ftrack import BaseAction
@ -80,7 +80,4 @@ class ActionShowWhereIRun(BaseAction):
def register(session, plugins_presets={}):
'''Register plugin. Called when used as a plugin.'''
if not isinstance(session, ftrack_api.session.Session):
return
ActionShowWhereIRun(session, plugins_presets).register()

View file

@ -1,386 +0,0 @@
import os
import sys
import json
import argparse
import logging
import collections
from pypeapp import config
from pype.vendor import ftrack_api
from pype.ftrack import BaseAction, lib
from pype.ftrack.lib.io_nonsingleton import DbConnector
from bson.objectid import ObjectId
class SyncHierarchicalAttrs(BaseAction):
db_con = DbConnector()
ca_mongoid = lib.get_ca_mongoid()
#: Action identifier.
identifier = 'sync.hierarchical.attrs'
#: Action label.
label = "Pype Admin"
variant = '- Sync Hier Attrs (Server)'
#: Action description.
description = 'Synchronize hierarchical attributes'
#: Icon
icon = '{}/ftrack/action_icons/PypeAdmin.svg'.format(
os.environ.get(
'PYPE_STATICS_SERVER',
'http://localhost:{}'.format(
config.get_presets().get('services', {}).get(
'statics_server', {}
).get('default_port', 8021)
)
)
)
def register(self):
self.session.event_hub.subscribe(
'topic=ftrack.action.discover',
self._discover
)
self.session.event_hub.subscribe(
'topic=ftrack.action.launch and data.actionIdentifier={}'.format(
self.identifier
),
self._launch
)
def discover(self, session, entities, event):
''' Validation '''
role_check = False
discover = False
role_list = ['Pypeclub', 'Administrator', 'Project Manager']
user = session.query(
'User where id is "{}"'.format(event['source']['user']['id'])
).one()
for role in user['user_security_roles']:
if role['security_role']['name'] in role_list:
role_check = True
break
if role_check is True:
for entity in entities:
context_type = entity.get('context_type', '').lower()
if (
context_type in ('show', 'task') and
entity.entity_type.lower() != 'task'
):
discover = True
break
return discover
def launch(self, session, entities, event):
self.interface_messages = {}
user = session.query(
'User where id is "{}"'.format(event['source']['user']['id'])
).one()
job = session.create('Job', {
'user': user,
'status': 'running',
'data': json.dumps({
'description': 'Sync Hierarchical attributes'
})
})
session.commit()
self.log.debug('Job with id "{}" created'.format(job['id']))
process_session = ftrack_api.Session(
server_url=session.server_url,
api_key=session.api_key,
api_user=session.api_user,
auto_connect_event_hub=True
)
try:
# Collect hierarchical attrs
self.log.debug('Collecting Hierarchical custom attributes started')
custom_attributes = {}
all_avalon_attr = process_session.query(
'CustomAttributeGroup where name is "avalon"'
).one()
error_key = (
'Hierarchical attributes with set "default" value (not allowed)'
)
for cust_attr in all_avalon_attr['custom_attribute_configurations']:
if 'avalon_' in cust_attr['key']:
continue
if not cust_attr['is_hierarchical']:
continue
if cust_attr['default']:
if error_key not in self.interface_messages:
self.interface_messages[error_key] = []
self.interface_messages[error_key].append(
cust_attr['label']
)
self.log.warning((
'Custom attribute "{}" has set default value.'
' This attribute can\'t be synchronized'
).format(cust_attr['label']))
continue
custom_attributes[cust_attr['key']] = cust_attr
self.log.debug(
'Collecting Hierarchical custom attributes has finished'
)
if not custom_attributes:
msg = 'No hierarchical attributes to sync.'
self.log.debug(msg)
return {
'success': True,
'message': msg
}
entity = entities[0]
if entity.entity_type.lower() == 'project':
project_name = entity['full_name']
else:
project_name = entity['project']['full_name']
self.db_con.install()
self.db_con.Session['AVALON_PROJECT'] = project_name
_entities = self._get_entities(event, process_session)
for entity in _entities:
self.log.debug(30*'-')
self.log.debug(
'Processing entity "{}"'.format(entity.get('name', entity))
)
ent_name = entity.get('name', entity)
if entity.entity_type.lower() == 'project':
ent_name = entity['full_name']
for key in custom_attributes:
self.log.debug(30*'*')
self.log.debug(
'Processing Custom attribute key "{}"'.format(key)
)
# check if entity has that attribute
if key not in entity['custom_attributes']:
error_key = 'Missing key on entities'
if error_key not in self.interface_messages:
self.interface_messages[error_key] = []
self.interface_messages[error_key].append(
'- key: "{}" - entity: "{}"'.format(key, ent_name)
)
self.log.error((
'- key "{}" not found on "{}"'
).format(key, entity.get('name', entity)))
continue
value = self.get_hierarchical_value(key, entity)
if value is None:
error_key = (
'Missing value for key on entity'
' and its parents (synchronization was skipped)'
)
if error_key not in self.interface_messages:
self.interface_messages[error_key] = []
self.interface_messages[error_key].append(
'- key: "{}" - entity: "{}"'.format(key, ent_name)
)
self.log.warning((
'- key "{}" not set on "{}" or its parents'
).format(key, ent_name))
continue
self.update_hierarchical_attribute(entity, key, value)
job['status'] = 'done'
session.commit()
except Exception:
self.log.error(
'Action "{}" failed'.format(self.label),
exc_info=True
)
finally:
self.db_con.uninstall()
if job['status'] in ('queued', 'running'):
job['status'] = 'failed'
session.commit()
if self.interface_messages:
self.show_interface_from_dict(
messages=self.interface_messages,
title="something went wrong",
event=event
)
return True
def get_hierarchical_value(self, key, entity):
value = entity['custom_attributes'][key]
if (
value is not None or
entity.entity_type.lower() == 'project'
):
return value
return self.get_hierarchical_value(key, entity['parent'])
def update_hierarchical_attribute(self, entity, key, value):
if (
entity['context_type'].lower() not in ('show', 'task') or
entity.entity_type.lower() == 'task'
):
return
ent_name = entity.get('name', entity)
if entity.entity_type.lower() == 'project':
ent_name = entity['full_name']
hierarchy = '/'.join(
[a['name'] for a in entity.get('ancestors', [])]
)
if hierarchy:
hierarchy = '/'.join(
[entity['project']['full_name'], hierarchy, entity['name']]
)
elif entity.entity_type.lower() == 'project':
hierarchy = entity['full_name']
else:
hierarchy = '/'.join(
[entity['project']['full_name'], entity['name']]
)
self.log.debug('- updating entity "{}"'.format(hierarchy))
# collect entity's custom attributes
custom_attributes = entity.get('custom_attributes')
if not custom_attributes:
return
mongoid = custom_attributes.get(self.ca_mongoid)
if not mongoid:
error_key = 'Missing MongoID on entities (try SyncToAvalon first)'
if error_key not in self.interface_messages:
self.interface_messages[error_key] = []
if ent_name not in self.interface_messages[error_key]:
self.interface_messages[error_key].append(ent_name)
self.log.warning(
'-- entity "{}" is not synchronized to avalon. Skipping'.format(
ent_name
)
)
return
try:
mongoid = ObjectId(mongoid)
except Exception:
error_key = 'Invalid MongoID on entities (try SyncToAvalon)'
if error_key not in self.interface_messages:
self.interface_messages[error_key] = []
if ent_name not in self.interface_messages[error_key]:
self.interface_messages[error_key].append(ent_name)
self.log.warning(
'-- entity "{}" has stored invalid MongoID. Skipping'.format(
ent_name
)
)
return
# Find entity in Mongo DB
mongo_entity = self.db_con.find_one({'_id': mongoid})
if not mongo_entity:
error_key = 'Entities not found in Avalon DB (try SyncToAvalon)'
if error_key not in self.interface_messages:
self.interface_messages[error_key] = []
if ent_name not in self.interface_messages[error_key]:
self.interface_messages[error_key].append(ent_name)
self.log.warning(
'-- entity "{}" was not found in DB by id "{}". Skipping'.format(
ent_name, str(mongoid)
)
)
return
# Change value if entity has set its own
entity_value = custom_attributes[key]
if entity_value is not None:
value = entity_value
data = mongo_entity.get('data') or {}
data[key] = value
self.db_con.update_many(
{'_id': mongoid},
{'$set': {'data': data}}
)
for child in entity.get('children', []):
self.update_hierarchical_attribute(child, key, value)
def register(session, plugins_presets):
'''Register plugin. Called when used as a plugin.'''
if not isinstance(session, ftrack_api.session.Session):
return
SyncHierarchicalAttrs(session, plugins_presets).register()
def main(arguments=None):
'''Set up logging and register action.'''
if arguments is None:
arguments = []
parser = argparse.ArgumentParser()
# Allow setting of logging level from arguments.
loggingLevels = {}
for level in (
logging.NOTSET, logging.DEBUG, logging.INFO, logging.WARNING,
logging.ERROR, logging.CRITICAL
):
loggingLevels[logging.getLevelName(level).lower()] = level
parser.add_argument(
'-v', '--verbosity',
help='Set the logging output verbosity.',
choices=loggingLevels.keys(),
default='info'
)
namespace = parser.parse_args(arguments)
# Set up basic logging
logging.basicConfig(level=loggingLevels[namespace.verbosity])
session = ftrack_api.Session()
register(session)
# Wait for events
logging.info(
'Registered actions and listening for events. Use Ctrl-C to abort.'
)
session.event_hub.wait()
if __name__ == '__main__':
raise SystemExit(main(sys.argv[1:]))

File diff suppressed because it is too large.

View file

@ -1,6 +1,6 @@
from pype.vendor import ftrack_api
import ftrack_api
from pype.ftrack import BaseEvent, get_ca_mongoid
from pype.ftrack.events.event_sync_to_avalon import Sync_to_Avalon
from pype.ftrack.events.event_sync_to_avalon import SyncToAvalon
class DelAvalonIdFromNew(BaseEvent):
@ -11,7 +11,7 @@ class DelAvalonIdFromNew(BaseEvent):
Priority of this event must be less than SyncToAvalon event
'''
priority = Sync_to_Avalon.priority - 1
priority = SyncToAvalon.priority - 1
def launch(self, session, event):
created = []
@ -53,7 +53,5 @@ class DelAvalonIdFromNew(BaseEvent):
def register(session, plugins_presets):
'''Register plugin. Called when used as a plugin.'''
if not isinstance(session, ftrack_api.session.Session):
return
DelAvalonIdFromNew(session, plugins_presets).register()

View file

@ -1,4 +1,4 @@
from pype.vendor import ftrack_api
import ftrack_api
from pype.ftrack import BaseEvent
import operator
@ -80,15 +80,13 @@ class NextTaskUpdate(BaseEvent):
'>>> [ {} ] updated to [ Ready ]'
).format(path))
except Exception as e:
session.rollback()
self.log.warning((
'!!! [ {} ] status couldn\'t be set: [ {} ]'
).format(path, e))
session.rollback()
).format(path, str(e)), exc_info=True)
def register(session, plugins_presets):
'''Register plugin. Called when used as a plugin.'''
if not isinstance(session, ftrack_api.session.Session):
return
NextTaskUpdate(session, plugins_presets).register()

View file

@ -1,8 +1,8 @@
from pype.vendor import ftrack_api
from pype.ftrack import BaseEvent
import ftrack_api
from pype.ftrack.lib import BaseEvent
class Radio_buttons(BaseEvent):
class RadioButtons(BaseEvent):
ignore_me = True
@ -36,7 +36,5 @@ class Radio_buttons(BaseEvent):
def register(session, plugins_presets):
'''Register plugin. Called when used as a plugin.'''
if not isinstance(session, ftrack_api.session.Session):
return
Radio_buttons(session, plugins_presets).register()
RadioButtons(session, plugins_presets).register()

View file

@ -3,7 +3,7 @@ import sys
from pype.ftrack.lib.io_nonsingleton import DbConnector
from pype.vendor import ftrack_api
import ftrack_api
from pype.ftrack import BaseEvent, lib
from bson.objectid import ObjectId
@ -209,7 +209,5 @@ class SyncHierarchicalAttrs(BaseEvent):
def register(session, plugins_presets):
'''Register plugin. Called when used as a plugin.'''
if not isinstance(session, ftrack_api.session.Session):
return
SyncHierarchicalAttrs(session, plugins_presets).register()

View file

@ -1,8 +1,8 @@
from pype.vendor import ftrack_api
import ftrack_api
from pype.ftrack import BaseEvent, lib
class Sync_to_Avalon(BaseEvent):
class SyncToAvalon(BaseEvent):
priority = 100
@ -31,6 +31,42 @@ class Sync_to_Avalon(BaseEvent):
ft_project = session.get(base_proj['type'], base_proj['id'])
break
for ent_info in event['data']['entities']:
# filter project
if ent_info.get("entityType") != "show":
continue
if ent_info.get("action") != "update":
continue
changes = ent_info.get("changes") or {}
if 'avalon_auto_sync' not in changes:
continue
auto_sync = changes['avalon_auto_sync']["new"]
if auto_sync == "1":
# Trigger sync to avalon action if auto sync was turned on
self.log.debug((
"Auto sync was turned on for project <{}>."
" Triggering syncToAvalon action."
).format(ft_project["full_name"]))
selection = [{
"entityId": ft_project["id"],
"entityType": "show"
}]
# Stop the event so the hierarchical sync won't be affected
# - other events should not be affected, since an auto-sync
# change always arrives as a single-data event
event.stop()
# Trigger action
self.trigger_action(
action_name="sync.to.avalon.server",
event=event,
selection=selection
)
# Exit for both cases
return True
# check if project is set to auto-sync
if (
ft_project is None or
@ -101,6 +137,9 @@ class Sync_to_Avalon(BaseEvent):
avalon_project = result['project']
except Exception as e:
# reset session to clear it
session.rollback()
message = str(e)
title = 'Hey You! Unknown Error has been raised! (*look below*)'
ftrack_message = (
@ -122,8 +161,4 @@ class Sync_to_Avalon(BaseEvent):
def register(session, plugins_presets):
'''Register plugin. Called when used as a plugin.'''
if not isinstance(session, ftrack_api.session.Session):
return
Sync_to_Avalon(session, plugins_presets).register()
SyncToAvalon(session, plugins_presets).register()

View file

@ -1,14 +1,14 @@
import os
import sys
import re
from pype.vendor import ftrack_api
import ftrack_api
from pype.ftrack import BaseEvent
class Test_Event(BaseEvent):
class TestEvent(BaseEvent):
ignore_me = True
priority = 10000
def launch(self, session, event):
@ -22,7 +22,5 @@ class Test_Event(BaseEvent):
def register(session, plugins_presets):
'''Register plugin. Called when used as a plugin.'''
if not isinstance(session, ftrack_api.session.Session):
return
Test_Event(session, plugins_presets).register()
TestEvent(session, plugins_presets).register()

View file

@ -1,4 +1,4 @@
from pype.vendor import ftrack_api
import ftrack_api
from pype.ftrack import BaseEvent
@ -20,7 +20,8 @@ class ThumbnailEvents(BaseEvent):
if parent.get('thumbnail') and not task.get('thumbnail'):
task['thumbnail'] = parent['thumbnail']
self.log.info('>>> Updated thumbnail on [ {}/{} ]'.format(
parent['name'], task['name']))
parent['name'], task['name']
))
# Update task thumbnail from published version
# if (entity['entityType'] == 'assetversion' and
@ -32,22 +33,29 @@ class ThumbnailEvents(BaseEvent):
version = session.get('AssetVersion', entity['entityId'])
thumbnail = version.get('thumbnail')
task = version['task']
if thumbnail:
task['thumbnail'] = thumbnail
task['parent']['thumbnail'] = thumbnail
self.log.info('>>> Updating thumbnail for task and shot\
[ {} ]'.format(task['name']))
parent = version['asset']['parent']
task = version['task']
parent['thumbnail_id'] = version['thumbnail_id']
if parent.entity_type.lower() == "project":
name = parent["full_name"]
else:
name = parent["name"]
msg = '>>> Updating thumbnail for shot [ {} ]'.format(name)
session.commit()
if task:
task['thumbnail_id'] = version['thumbnail_id']
msg += " and task [ {} ]".format(task["name"])
pass
self.log.info(msg)
try:
session.commit()
except Exception:
session.rollback()
def register(session, plugins_presets):
'''Register plugin. Called when used as a plugin.'''
if not isinstance(session, ftrack_api.session.Session):
return
ThumbnailEvents(session, plugins_presets).register()

View file

@ -1,4 +1,4 @@
from pype.vendor import ftrack_api
import ftrack_api
from pype.ftrack import BaseEvent, lib
from pype.ftrack.lib.io_nonsingleton import DbConnector
from bson.objectid import ObjectId
@ -233,7 +233,5 @@ def register(session, plugins_presets):
"""
Register plugin. Called when used as a plugin.
"""
if not isinstance(session, ftrack_api.session.Session):
return
UserAssigmentEvent(session, plugins_presets).register()

View file

@ -1,4 +1,4 @@
from pype.vendor import ftrack_api
import ftrack_api
from pype.ftrack import BaseEvent
@ -6,7 +6,6 @@ class VersionToTaskStatus(BaseEvent):
def launch(self, session, event):
'''Propagates status from version to task when changed'''
session.commit()
# start of event procedure ----------------------------------
for entity in event['data'].get('entities', []):
@ -62,8 +61,10 @@ class VersionToTaskStatus(BaseEvent):
task['status'] = task_status
session.commit()
except Exception as e:
session.rollback()
self.log.warning('!!! [ {} ] status couldn\'t be set:\
[ {} ]'.format(path, e))
session.rollback()
else:
self.log.info('>>> [ {} ] updated to [ {} ]'.format(
path, task_status['name']))
@ -71,7 +72,5 @@ class VersionToTaskStatus(BaseEvent):
def register(session, plugins_presets):
'''Register plugin. Called when used as a plugin.'''
if not isinstance(session, ftrack_api.session.Session):
return
VersionToTaskStatus(session, plugins_presets).register()

View file

@ -1,7 +1,2 @@
from .ftrack_server import FtrackServer
from . import event_server_cli
__all__ = [
'event_server_cli',
'FtrackServer'
]
from .lib import check_ftrack_url

View file

@ -1,48 +1,58 @@
import os
import sys
import signal
import datetime
import subprocess
import socket
import argparse
import requests
from pype.vendor import ftrack_api
from pype.ftrack import credentials
import atexit
import time
from urllib.parse import urlparse
import ftrack_api
from pype.ftrack.lib import credentials
from pype.ftrack.ftrack_server import FtrackServer
from pypeapp import Logger
log = Logger().get_logger('Ftrack event server', "ftrack-event-server-cli")
from pype.ftrack.ftrack_server.lib import (
ftrack_events_mongo_settings, check_ftrack_url
)
import socket_thread
def check_url(url):
if not url:
log.error('Ftrack URL is not set!')
return None
class MongoPermissionsError(Exception):
"""Is used when is created multiple objects of same RestApi class."""
def __init__(self, message=None):
if not message:
message = "Exiting because have issue with acces to MongoDB"
super().__init__(message)
url = url.strip('/ ')
if 'http' not in url:
if url.endswith('ftrackapp.com'):
url = 'https://' + url
else:
url = 'https://{0}.ftrackapp.com'.format(url)
def check_mongo_url(host, port, log_error=False):
"""Checks if mongo server is responding"""
sock = None
try:
result = requests.get(url, allow_redirects=False)
except requests.exceptions.RequestException:
log.error('Entered Ftrack URL is not accessible!')
return None
sock = socket.create_connection(
(host, port),
timeout=1
)
return True
except socket.error as err:
if log_error:
print("Can't connect to MongoDB at {}:{} because: {}".format(
host, port, err
))
return False
finally:
if sock is not None:
sock.close()
if (result.status_code != 200 or 'FTRACK_VERSION' not in result.headers):
log.error('Entered Ftrack URL is not accessible!')
return None
log.debug('Ftrack server {} is accessible.'.format(url))
return url
def validate_credentials(url, user, api):
first_validation = True
if not user:
log.error('Ftrack Username is not set! Exiting.')
print('- Ftrack Username is not set')
first_validation = False
if not api:
log.error('Ftrack API key is not set! Exiting.')
print('- Ftrack API key is not set')
first_validation = False
if not first_validation:
return False
@ -55,21 +65,21 @@ def validate_credentials(url, user, api):
)
session.close()
except Exception as e:
log.error(
'Can\'t log into Ftrack with used credentials:'
print(
'ERROR: Can\'t log into Ftrack with used credentials:'
' Ftrack server: "{}" // Username: {} // API key: {}'.format(
url, user, api
))
return False
log.debug('Credentials Username: "{}", API key: "{}" are valid.'.format(
print('DEBUG: Credentials Username: "{}", API key: "{}" are valid.'.format(
user, api
))
return True
def process_event_paths(event_paths):
log.debug('Processing event paths: {}.'.format(str(event_paths)))
print('DEBUG: Processing event paths: {}.'.format(str(event_paths)))
return_paths = []
not_found = []
if not event_paths:
@ -87,14 +97,250 @@ def process_event_paths(event_paths):
return os.pathsep.join(return_paths), not_found
def run_event_server(ftrack_url, username, api_key, event_paths):
os.environ['FTRACK_SERVER'] = ftrack_url
os.environ['FTRACK_API_USER'] = username
os.environ['FTRACK_API_KEY'] = api_key
os.environ['FTRACK_EVENTS_PATH'] = event_paths
def legacy_server(ftrack_url):
# Current file
file_path = os.path.dirname(os.path.realpath(__file__))
min_fail_seconds = 5
max_fail_count = 3
wait_time_after_max_fail = 10
subproc = None
subproc_path = "{}/sub_legacy_server.py".format(file_path)
subproc_last_failed = datetime.datetime.now()
subproc_failed_count = 0
ftrack_accessible = False
printed_ftrack_error = False
while True:
if not ftrack_accessible:
ftrack_accessible = check_ftrack_url(ftrack_url)
# Run the subprocess only if Ftrack is accessible
if not ftrack_accessible and not printed_ftrack_error:
print("Can't access Ftrack {} <{}>".format(
ftrack_url, str(datetime.datetime.now())
))
if subproc is not None:
if subproc.poll() is None:
subproc.terminate()
subproc = None
printed_ftrack_error = True
time.sleep(1)
continue
printed_ftrack_error = False
if subproc is None:
if subproc_failed_count < max_fail_count:
subproc = subprocess.Popen(
["python", subproc_path],
stdout=subprocess.PIPE
)
elif subproc_failed_count == max_fail_count:
print((
"Storer failed {}times I'll try to run again {}s later"
).format(str(max_fail_count), str(wait_time_after_max_fail)))
subproc_failed_count += 1
elif ((
datetime.datetime.now() - subproc_last_failed
).seconds > wait_time_after_max_fail):
subproc_failed_count = 0
# If thread failed test Ftrack and Mongo connection
elif subproc.poll() is not None:
subproc = None
ftrack_accessible = False
_subproc_last_failed = datetime.datetime.now()
delta_time = (_subproc_last_failed - subproc_last_failed).seconds
if delta_time < min_fail_seconds:
subproc_failed_count += 1
else:
subproc_failed_count = 0
subproc_last_failed = _subproc_last_failed
time.sleep(1)
def main_loop(ftrack_url):
""" This is main loop of event handling.
Loop is handling threads which handles subprocesses of event storer and
processor. When one of threads is stopped it is tested to connect to
ftrack and mongo server. Threads are not started when ftrack or mongo
server is not accessible. When threads are started it is checked for socket
signals as heartbeat. Heartbeat must become at least once per 30sec
otherwise thread will be killed.
"""
# Get mongo hostname and port for testing mongo connection
mongo_list = ftrack_events_mongo_settings()
mongo_hostname = mongo_list[0]
mongo_port = mongo_list[1]
# Current file
file_path = os.path.dirname(os.path.realpath(__file__))
min_fail_seconds = 5
max_fail_count = 3
wait_time_after_max_fail = 10
# Threads data
storer_name = "StorerThread"
storer_port = 10001
storer_path = "{}/sub_event_storer.py".format(file_path)
storer_thread = None
storer_last_failed = datetime.datetime.now()
storer_failed_count = 0
processor_name = "ProcessorThread"
processor_port = 10011
processor_path = "{}/sub_event_processor.py".format(file_path)
processor_thread = None
processor_last_failed = datetime.datetime.now()
processor_failed_count = 0
ftrack_accessible = False
mongo_accessible = False
printed_ftrack_error = False
printed_mongo_error = False
# stop threads on exit
# TODO check if works and args have thread objects!
def on_exit(processor_thread, storer_thread):
if processor_thread is not None:
processor_thread.stop()
processor_thread.join()
processor_thread = None
if storer_thread is not None:
storer_thread.stop()
storer_thread.join()
storer_thread = None
atexit.register(
on_exit, processor_thread=processor_thread, storer_thread=storer_thread
)
# Main loop
while True:
# Check if accessible Ftrack and Mongo url
if not ftrack_accessible:
ftrack_accessible = check_ftrack_url(ftrack_url)
if not mongo_accessible:
mongo_accessible = check_mongo_url(mongo_hostname, mongo_port)
# Run threads only if Ftrack is accessible
if not ftrack_accessible or not mongo_accessible:
if not mongo_accessible and not printed_mongo_error:
mongo_url = "{}:{}".format(mongo_hostname, mongo_port)
print("Can't access Mongo {}".format(mongo_url))
if not ftrack_accessible and not printed_ftrack_error:
print("Can't access Ftrack {}".format(ftrack_url))
if storer_thread is not None:
storer_thread.stop()
storer_thread.join()
storer_thread = None
if processor_thread is not None:
processor_thread.stop()
processor_thread.join()
processor_thread = None
printed_ftrack_error = True
printed_mongo_error = True
time.sleep(1)
continue
printed_ftrack_error = False
printed_mongo_error = False
# Run backup thread which does not require mongo to work
if storer_thread is None:
if storer_failed_count < max_fail_count:
storer_thread = socket_thread.SocketThread(
storer_name, storer_port, storer_path
)
storer_thread.start()
elif storer_failed_count == max_fail_count:
print((
"Storer failed {}times I'll try to run again {}s later"
).format(str(max_fail_count), str(wait_time_after_max_fail)))
storer_failed_count += 1
elif ((
datetime.datetime.now() - storer_last_failed
).seconds > wait_time_after_max_fail):
storer_failed_count = 0
# If thread failed test Ftrack and Mongo connection
elif not storer_thread.isAlive():
if storer_thread.mongo_error:
raise MongoPermissionsError()
storer_thread.join()
storer_thread = None
ftrack_accessible = False
mongo_accessible = False
_storer_last_failed = datetime.datetime.now()
delta_time = (_storer_last_failed - storer_last_failed).seconds
if delta_time < min_fail_seconds:
storer_failed_count += 1
else:
storer_failed_count = 0
storer_last_failed = _storer_last_failed
if processor_thread is None:
if processor_failed_count < max_fail_count:
processor_thread = socket_thread.SocketThread(
processor_name, processor_port, processor_path
)
processor_thread.start()
elif processor_failed_count == max_fail_count:
print((
"Processor failed {}times in row"
" I'll try to run again {}s later"
).format(str(max_fail_count), str(wait_time_after_max_fail)))
processor_failed_count += 1
elif ((
datetime.datetime.now() - processor_last_failed
).seconds > wait_time_after_max_fail):
processor_failed_count = 0
# If thread failed test Ftrack and Mongo connection
elif not processor_thread.isAlive():
if processor_thread.mongo_error:
raise MongoPermissionsError()
processor_thread.join()
processor_thread = None
ftrack_accessible = False
mongo_accessible = False
_processor_last_failed = datetime.datetime.now()
delta_time = (
_processor_last_failed - processor_last_failed
).seconds
if delta_time < min_fail_seconds:
processor_failed_count += 1
else:
processor_failed_count = 0
processor_last_failed = _processor_last_failed
time.sleep(1)
server = FtrackServer('event')
server.run_server()
def main(argv):
'''
@ -184,7 +430,11 @@ def main(argv):
help="Load creadentials from apps dir",
action="store_true"
)
parser.add_argument(
'-legacy',
help="Load creadentials from apps dir",
action="store_true"
)
ftrack_url = os.environ.get('FTRACK_SERVER')
username = os.environ.get('FTRACK_API_USER')
api_key = os.environ.get('FTRACK_API_KEY')
@ -209,33 +459,53 @@ def main(argv):
if kwargs.ftrackapikey:
api_key = kwargs.ftrackapikey
legacy = kwargs.legacy
# Check url regex and accessibility
ftrack_url = check_url(ftrack_url)
ftrack_url = check_ftrack_url(ftrack_url)
if not ftrack_url:
print('Exiting! < Please enter Ftrack server url >')
return 1
# Validate entered credentials
if not validate_credentials(ftrack_url, username, api_key):
print('Exiting! < Please enter valid credentials >')
return 1
# Process events path
event_paths, not_found = process_event_paths(event_paths)
if not_found:
log.warning(
'These paths were not found: {}'.format(str(not_found))
print(
'WARNING: These paths were not found: {}'.format(str(not_found))
)
if not event_paths:
if not_found:
log.error('None of the entered paths are valid or accessible.')
print('ERROR: None of the entered paths are valid or accessible.')
else:
log.error('Paths to events are not set. Exiting.')
print('ERROR: Paths to events are not set. Exiting.')
return 1
if kwargs.storecred:
credentials._save_credentials(username, api_key, True)
run_event_server(ftrack_url, username, api_key, event_paths)
# Set Ftrack environments
os.environ["FTRACK_SERVER"] = ftrack_url
os.environ["FTRACK_API_USER"] = username
os.environ["FTRACK_API_KEY"] = api_key
os.environ["FTRACK_EVENTS_PATH"] = event_paths
if legacy:
return legacy_server(ftrack_url)
return main_loop(ftrack_url)
if (__name__ == ('__main__')):
if __name__ == "__main__":
# Register interrupt signal
def signal_handler(sig, frame):
print("You pressed Ctrl+C. Process ended.")
sys.exit(0)
signal.signal(signal.SIGINT, signal_handler)
signal.signal(signal.SIGTERM, signal_handler)
sys.exit(main(sys.argv))
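# Launch sketch: credentials come from FTRACK_SERVER / FTRACK_API_USER /
# FTRACK_API_KEY (or the CLI arguments parsed above); "-legacy" switches to
# the single legacy subprocess instead of the storer/processor pair.
#   python event_server_cli.py -legacy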

View file

@ -2,7 +2,7 @@ import os
import sys
import types
import importlib
from pype.vendor import ftrack_api
import ftrack_api
import time
import logging
import inspect
@ -100,7 +100,10 @@ class FtrackServer:
log.warning(msg, exc_info=e)
if len(register_functions_dict) < 1:
raise Exception
raise Exception((
"No event handlers with a register function were found."
" Registered paths: \"{}\""
).format("| ".join(paths)))
# Load presets for setting plugins
key = "user"
@ -126,23 +129,27 @@ class FtrackServer:
msg = '"{}" - register was not successful ({})'.format(
function_dict['name'], str(exc)
)
log.warning(msg)
log.warning(msg, exc_info=True)
def run_server(self):
self.session = ftrack_api.Session(auto_connect_event_hub=True,)
def run_server(self, session=None, load_files=True):
if not session:
session = ftrack_api.Session(auto_connect_event_hub=True)
paths_str = os.environ.get(self.env_key)
if paths_str is None:
log.error((
"Env var \"{}\" is not set, \"{}\" server won\'t launch"
).format(self.env_key, self.server_type))
return
self.session = session
paths = paths_str.split(os.pathsep)
self.set_files(paths)
if load_files:
paths_str = os.environ.get(self.env_key)
if paths_str is None:
log.error((
"Env var \"{}\" is not set, \"{}\" server won\'t launch"
).format(self.env_key, self.server_type))
return
log.info(60*"*")
log.info('Registration of actions/events has finished!')
paths = paths_str.split(os.pathsep)
self.set_files(paths)
log.info(60*"*")
log.info('Registration of actions/events has finished!')
# keep event_hub on session running
self.session.event_hub.wait()

View file

@ -0,0 +1,99 @@
import os
import requests
try:
from urllib.parse import urlparse, parse_qs
except ImportError:
from urlparse import urlparse, parse_qs
def ftrack_events_mongo_settings():
host = None
port = None
username = None
password = None
collection = None
database = None
auth_db = ""
if os.environ.get('FTRACK_EVENTS_MONGO_URL'):
result = urlparse(os.environ['FTRACK_EVENTS_MONGO_URL'])
host = result.hostname
try:
port = result.port
except ValueError:
raise RuntimeError("invalid port specified")
username = result.username
password = result.password
try:
database = result.path.lstrip("/").split("/")[0]
collection = result.path.lstrip("/").split("/")[1]
except IndexError:
if not database:
raise RuntimeError("missing database name for logging")
try:
auth_db = parse_qs(result.query)['authSource'][0]
except KeyError:
# no auth db provided, mongo will use the one we are connecting to
pass
else:
host = os.environ.get('FTRACK_EVENTS_MONGO_HOST')
port = int(os.environ.get('FTRACK_EVENTS_MONGO_PORT', "0"))
database = os.environ.get('FTRACK_EVENTS_MONGO_DB')
username = os.environ.get('FTRACK_EVENTS_MONGO_USER')
password = os.environ.get('FTRACK_EVENTS_MONGO_PASSWORD')
collection = os.environ.get('FTRACK_EVENTS_MONGO_COL')
auth_db = os.environ.get('FTRACK_EVENTS_MONGO_AUTH_DB', 'avalon')
return host, port, database, username, password, collection, auth_db
def get_ftrack_event_mongo_info():
host, port, database, username, password, collection, auth_db = ftrack_events_mongo_settings()
user_pass = ""
if username and password:
user_pass = "{}:{}@".format(username, password)
socket_path = "{}:{}".format(host, port)
dab = ""
if database:
dab = "/{}".format(database)
auth = ""
if auth_db:
auth = "?authSource={}".format(auth_db)
url = "mongodb://{}{}{}{}".format(user_pass, socket_path, dab, auth)
return url, database, collection
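# Example of the URL this helper composes (hypothetical credentials):
#   FTRACK_EVENTS_MONGO_URL = "mongodb://user:pass@localhost:27017/pype/ftrack_events?authSource=avalon"
#   -> url:        "mongodb://user:pass@localhost:27017/pype?authSource=avalon"
#   -> database:   "pype"
#   -> collection: "ftrack_events"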
def check_ftrack_url(url, log_errors=True):
"""Checks if Ftrack server is responding"""
if not url:
print('ERROR: Ftrack URL is not set!')
return None
url = url.strip('/ ')
if 'http' not in url:
if url.endswith('ftrackapp.com'):
url = 'https://' + url
else:
url = 'https://{0}.ftrackapp.com'.format(url)
try:
result = requests.get(url, allow_redirects=False)
except requests.exceptions.RequestException:
if log_errors:
print('ERROR: Entered Ftrack URL is not accesible!')
return False
if (result.status_code != 200 or 'FTRACK_VERSION' not in result.headers):
if log_errors:
print('ERROR: Entered Ftrack URL is not accesible!')
return False
print('DEBUG: Ftrack server {} is accessible.'.format(url))
return url
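# Illustrative behavior ("mystudio" is a made-up subdomain; a truthy result
# also requires the server to answer 200 with an FTRACK_VERSION header):
#   check_ftrack_url("mystudio")   -> "https://mystudio.ftrackapp.com"
#   check_ftrack_url(None)         -> None
#   check_ftrack_url("bad-url")    -> False when the request fails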

View file

@ -0,0 +1,292 @@
import logging
import os
import sys
import getpass
import atexit
import datetime
import tempfile
import threading
import time
import requests
import queue
import pymongo
import ftrack_api
import ftrack_api.session
import ftrack_api.cache
import ftrack_api.operation
import ftrack_api._centralized_storage_scenario
import ftrack_api.event
from ftrack_api.logging import LazyLogMessage as L
from pype.ftrack.lib.custom_db_connector import DbConnector
from pype.ftrack.ftrack_server.lib import get_ftrack_event_mongo_info
from pypeapp import Logger
log = Logger().get_logger("Session processor")
class ProcessEventHub(ftrack_api.event.hub.EventHub):
url, database, table_name = get_ftrack_event_mongo_info()
is_table_created = False
def __init__(self, *args, **kwargs):
self.dbcon = DbConnector(
mongo_url=self.url,
database_name=self.database,
table_name=self.table_name
)
self.sock = kwargs.pop("sock")
super(ProcessEventHub, self).__init__(*args, **kwargs)
def prepare_dbcon(self):
try:
self.dbcon.install()
self.dbcon._database.collection_names()
except pymongo.errors.AutoReconnect:
log.error("Mongo server \"{}\" is not responding, exiting.".format(
os.environ["AVALON_MONGO"]
))
sys.exit(0)
except pymongo.errors.OperationFailure:
log.error((
"Error with Mongo access, probably permissions."
" Check if database \"{}\" exists"
" and contains collection \"{}\"."
).format(self.database, self.table_name))
self.sock.sendall(b"MongoError")
sys.exit(0)
def wait(self, duration=None):
"""Overriden wait
Event are loaded from Mongo DB when queue is empty. Handled event is
set as processed in Mongo DB.
"""
started = time.time()
self.prepare_dbcon()
while True:
try:
event = self._event_queue.get(timeout=0.1)
except queue.Empty:
if not self.load_events():
time.sleep(0.5)
else:
try:
self._handle(event)
self.dbcon.update_one(
{"id": event["id"]},
{"$set": {"pype_data.is_processed": True}}
)
except pymongo.errors.AutoReconnect:
log.error((
"Mongo server \"{}\" is not responding, exiting."
).format(os.environ["AVALON_MONGO"]))
sys.exit(0)
# Additional special processing of events.
if event['topic'] == 'ftrack.meta.disconnected':
break
if duration is not None:
if (time.time() - started) > duration:
break
def load_events(self):
"""Load not processed events sorted by stored date"""
ago_date = datetime.datetime.now() - datetime.timedelta(days=3)
result = self.dbcon.delete_many({
"pype_data.stored": {"$lte": ago_date},
"pype_data.is_processed": True
})
not_processed_events = self.dbcon.find(
{"pype_data.is_processed": False}
).sort(
[("pype_data.stored", pymongo.ASCENDING)]
)
found = False
for event_data in not_processed_events:
new_event_data = {
k: v for k, v in event_data.items()
if k not in ["_id", "pype_data"]
}
try:
event = ftrack_api.event.base.Event(**new_event_data)
except Exception:
self.logger.exception(L(
'Failed to convert payload into event: {0}',
event_data
))
continue
found = True
self._event_queue.put(event)
return found
def _handle_packet(self, code, packet_identifier, path, data):
"""Override `_handle_packet` which skip events and extend heartbeat"""
code_name = self._code_name_mapping[code]
if code_name == "event":
return
if code_name == "heartbeat":
self.sock.sendall(b"processor")
return self._send_packet(self._code_name_mapping["heartbeat"])
return super()._handle_packet(code, packet_identifier, path, data)
class ProcessSession(ftrack_api.session.Session):
'''An isolated session for interaction with an ftrack server.'''
def __init__(
self, server_url=None, api_key=None, api_user=None, auto_populate=True,
plugin_paths=None, cache=None, cache_key_maker=None,
auto_connect_event_hub=None, schema_cache_path=None,
plugin_arguments=None, sock=None
):
super(ftrack_api.session.Session, self).__init__()
self.logger = logging.getLogger(
__name__ + '.' + self.__class__.__name__
)
self._closed = False
if server_url is None:
server_url = os.environ.get('FTRACK_SERVER')
if not server_url:
raise TypeError(
'Required "server_url" not specified. Pass as argument or set '
'in environment variable FTRACK_SERVER.'
)
self._server_url = server_url
if api_key is None:
api_key = os.environ.get(
'FTRACK_API_KEY',
# Backwards compatibility
os.environ.get('FTRACK_APIKEY')
)
if not api_key:
raise TypeError(
'Required "api_key" not specified. Pass as argument or set in '
'environment variable FTRACK_API_KEY.'
)
self._api_key = api_key
if api_user is None:
api_user = os.environ.get('FTRACK_API_USER')
if not api_user:
try:
api_user = getpass.getuser()
except Exception:
pass
if not api_user:
raise TypeError(
'Required "api_user" not specified. Pass as argument, set in '
'environment variable FTRACK_API_USER or one of the standard '
'environment variables used by Python\'s getpass module.'
)
self._api_user = api_user
# Currently pending operations.
self.recorded_operations = ftrack_api.operation.Operations()
self.record_operations = True
self.cache_key_maker = cache_key_maker
if self.cache_key_maker is None:
self.cache_key_maker = ftrack_api.cache.StringKeyMaker()
# Enforce always having a memory cache at top level so that the same
# in-memory instance is returned from session.
self.cache = ftrack_api.cache.LayeredCache([
ftrack_api.cache.MemoryCache()
])
if cache is not None:
if callable(cache):
cache = cache(self)
if cache is not None:
self.cache.caches.append(cache)
self._managed_request = None
self._request = requests.Session()
self._request.auth = ftrack_api.session.SessionAuthentication(
self._api_key, self._api_user
)
self.auto_populate = auto_populate
# Fetch server information and in doing so also check credentials.
self._server_information = self._fetch_server_information()
# Now check compatibility of server based on retrieved information.
self.check_server_compatibility()
# Construct event hub and load plugins.
self._event_hub = ProcessEventHub(
self._server_url,
self._api_user,
self._api_key,
sock=sock
)
self._auto_connect_event_hub_thread = None
if auto_connect_event_hub in (None, True):
# Connect to event hub in background thread so as not to block main
# session usage waiting for event hub connection.
self._auto_connect_event_hub_thread = threading.Thread(
target=self._event_hub.connect
)
self._auto_connect_event_hub_thread.daemon = True
self._auto_connect_event_hub_thread.start()
# To help with migration from auto_connect_event_hub default changing
# from True to False.
self._event_hub._deprecation_warning_auto_connect = (
auto_connect_event_hub is None
)
# Register to auto-close session on exit.
atexit.register(self.close)
self._plugin_paths = plugin_paths
if self._plugin_paths is None:
self._plugin_paths = os.environ.get(
'FTRACK_EVENT_PLUGIN_PATH', ''
).split(os.pathsep)
self._discover_plugins(plugin_arguments=plugin_arguments)
# TODO: Make schemas read-only and non-mutable (or at least without
# rebuilding types)?
if schema_cache_path is not False:
if schema_cache_path is None:
schema_cache_path = os.environ.get(
'FTRACK_API_SCHEMA_CACHE_PATH', tempfile.gettempdir()
)
schema_cache_path = os.path.join(
schema_cache_path, 'ftrack_api_schema_cache.json'
)
self.schemas = self._load_schemas(schema_cache_path)
self.types = self._build_entity_type_classes(self.schemas)
ftrack_api._centralized_storage_scenario.register(self)
self._configure_locations()
self.event_hub.publish(
ftrack_api.event.base.Event(
topic='ftrack.api.session.ready',
data=dict(
session=self
)
),
synchronous=True
)
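# Usage sketch mirroring sub_event_processor.py further below: the extra
# `sock` argument is the heartbeat channel back to the supervising
# SocketThread (port number is illustrative).
#   import socket
#   sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
#   sock.connect(("localhost", 10011))
#   session = ProcessSession(auto_connect_event_hub=True, sock=sock)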

View file

@ -0,0 +1,257 @@
import logging
import os
import getpass
import atexit
import tempfile
import threading
import requests
import ftrack_api
import ftrack_api.session
import ftrack_api.cache
import ftrack_api.operation
import ftrack_api._centralized_storage_scenario
import ftrack_api.event
from ftrack_api.logging import LazyLogMessage as L
class StorerEventHub(ftrack_api.event.hub.EventHub):
def __init__(self, *args, **kwargs):
self.sock = kwargs.pop("sock")
super(StorerEventHub, self).__init__(*args, **kwargs)
def _handle_packet(self, code, packet_identifier, path, data):
"""Override `_handle_packet` which extend heartbeat"""
if self._code_name_mapping[code] == "heartbeat":
# Reply with heartbeat.
self.sock.sendall(b"storer")
return self._send_packet(self._code_name_mapping['heartbeat'])
return super(StorerEventHub, self)._handle_packet(
code, packet_identifier, path, data
)
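# Heartbeat sketch: on every server heartbeat packet the hub writes its
# identifying token to the supervision socket ("storer" here, "processor"
# in ProcessEventHub); SocketThread treats any received data as proof of
# life and resets its timeout counter.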
class StorerSession(ftrack_api.session.Session):
'''An isolated session for interaction with an ftrack server.'''
def __init__(
self, server_url=None, api_key=None, api_user=None, auto_populate=True,
plugin_paths=None, cache=None, cache_key_maker=None,
auto_connect_event_hub=None, schema_cache_path=None,
plugin_arguments=None, sock=None
):
'''Initialise session.
*server_url* should be the URL of the ftrack server to connect to
including any port number. If not specified attempt to look up from
:envvar:`FTRACK_SERVER`.
*api_key* should be the API key to use for authentication whilst
*api_user* should be the username of the user in ftrack to record
operations against. If not specified, *api_key* should be retrieved
from :envvar:`FTRACK_API_KEY` and *api_user* from
:envvar:`FTRACK_API_USER`.
If *auto_populate* is True (the default), then accessing entity
attributes will cause them to be automatically fetched from the server
if they are not already. This flag can be changed on the session
directly at any time.
*plugin_paths* should be a list of paths to search for plugins. If not
specified, default to looking up :envvar:`FTRACK_EVENT_PLUGIN_PATH`.
*cache* should be an instance of a cache that fulfils the
:class:`ftrack_api.cache.Cache` interface and will be used as the cache
for the session. It can also be a callable that will be called with the
session instance as sole argument. The callable should return ``None``
if a suitable cache could not be configured, but session instantiation
can continue safely.
.. note::
The session will add the specified cache to a pre-configured layered
cache that specifies the top level cache as a
:class:`ftrack_api.cache.MemoryCache`. Therefore, it is unnecessary
to construct a separate memory cache for typical behaviour. Working
around this behaviour or removing the memory cache can lead to
unexpected behaviour.
*cache_key_maker* should be an instance of a key maker that fulfils the
:class:`ftrack_api.cache.KeyMaker` interface and will be used to
generate keys for objects being stored in the *cache*. If not specified,
a :class:`~ftrack_api.cache.StringKeyMaker` will be used.
If *auto_connect_event_hub* is True then embedded event hub will be
automatically connected to the event server and allow for publishing and
subscribing to **non-local** events. If False, then only publishing and
subscribing to **local** events will be possible until the hub is
manually connected using :meth:`EventHub.connect
<ftrack_api.event.hub.EventHub.connect>`.
.. note::
The event hub connection is performed in a background thread to
improve session startup time. If a registered plugin requires a
connected event hub then it should check the event hub connection
status explicitly. Subscribing to events does *not* require a
connected event hub.
Enable schema caching by setting *schema_cache_path* to a folder path.
If not set, :envvar:`FTRACK_API_SCHEMA_CACHE_PATH` will be used to
determine the path to store cache in. If the environment variable is
also not specified then a temporary directory will be used. Set to
`False` to disable schema caching entirely.
*plugin_arguments* should be an optional mapping (dict) of keyword
arguments to pass to plugin register functions upon discovery. If a
discovered plugin has a signature that is incompatible with the passed
arguments, the discovery mechanism will attempt to reduce the passed
arguments to only those that the plugin accepts. Note that a warning
will be logged in this case.
'''
super(ftrack_api.session.Session, self).__init__()
self.logger = logging.getLogger(
__name__ + '.' + self.__class__.__name__
)
self._closed = False
if server_url is None:
server_url = os.environ.get('FTRACK_SERVER')
if not server_url:
raise TypeError(
'Required "server_url" not specified. Pass as argument or set '
'in environment variable FTRACK_SERVER.'
)
self._server_url = server_url
if api_key is None:
api_key = os.environ.get(
'FTRACK_API_KEY',
# Backwards compatibility
os.environ.get('FTRACK_APIKEY')
)
if not api_key:
raise TypeError(
'Required "api_key" not specified. Pass as argument or set in '
'environment variable FTRACK_API_KEY.'
)
self._api_key = api_key
if api_user is None:
api_user = os.environ.get('FTRACK_API_USER')
if not api_user:
try:
api_user = getpass.getuser()
except Exception:
pass
if not api_user:
raise TypeError(
'Required "api_user" not specified. Pass as argument, set in '
'environment variable FTRACK_API_USER or one of the standard '
'environment variables used by Python\'s getpass module.'
)
self._api_user = api_user
# Currently pending operations.
self.recorded_operations = ftrack_api.operation.Operations()
self.record_operations = True
self.cache_key_maker = cache_key_maker
if self.cache_key_maker is None:
self.cache_key_maker = ftrack_api.cache.StringKeyMaker()
# Enforce always having a memory cache at top level so that the same
# in-memory instance is returned from session.
self.cache = ftrack_api.cache.LayeredCache([
ftrack_api.cache.MemoryCache()
])
if cache is not None:
if callable(cache):
cache = cache(self)
if cache is not None:
self.cache.caches.append(cache)
self._managed_request = None
self._request = requests.Session()
self._request.auth = ftrack_api.session.SessionAuthentication(
self._api_key, self._api_user
)
self.auto_populate = auto_populate
# Fetch server information and in doing so also check credentials.
self._server_information = self._fetch_server_information()
# Now check compatibility of server based on retrieved information.
self.check_server_compatibility()
# Construct event hub and load plugins.
self._event_hub = StorerEventHub(
self._server_url,
self._api_user,
self._api_key,
sock=sock
)
self._auto_connect_event_hub_thread = None
if auto_connect_event_hub in (None, True):
# Connect to event hub in background thread so as not to block main
# session usage waiting for event hub connection.
self._auto_connect_event_hub_thread = threading.Thread(
target=self._event_hub.connect
)
self._auto_connect_event_hub_thread.daemon = True
self._auto_connect_event_hub_thread.start()
# To help with migration from auto_connect_event_hub default changing
# from True to False.
self._event_hub._deprecation_warning_auto_connect = (
auto_connect_event_hub is None
)
# Register to auto-close session on exit.
atexit.register(self.close)
self._plugin_paths = plugin_paths
if self._plugin_paths is None:
self._plugin_paths = os.environ.get(
'FTRACK_EVENT_PLUGIN_PATH', ''
).split(os.pathsep)
self._discover_plugins(plugin_arguments=plugin_arguments)
# TODO: Make schemas read-only and non-mutable (or at least without
# rebuilding types)?
if schema_cache_path is not False:
if schema_cache_path is None:
schema_cache_path = os.environ.get(
'FTRACK_API_SCHEMA_CACHE_PATH', tempfile.gettempdir()
)
schema_cache_path = os.path.join(
schema_cache_path, 'ftrack_api_schema_cache.json'
)
self.schemas = self._load_schemas(schema_cache_path)
self.types = self._build_entity_type_classes(self.schemas)
ftrack_api._centralized_storage_scenario.register(self)
self._configure_locations()
self.event_hub.publish(
ftrack_api.event.base.Event(
topic='ftrack.api.session.ready',
data=dict(
session=self
)
),
synchronous=True
)

View file

@ -0,0 +1,123 @@
import os
import sys
import time
import signal
import socket
import threading
import subprocess
from pypeapp import Logger
class SocketThread(threading.Thread):
"""Thread that checks suprocess of storer of processor of events"""
MAX_TIMEOUT = 35
def __init__(self, name, port, filepath):
super(SocketThread, self).__init__()
self.log = Logger().get_logger("SocketThread", "Event Thread")
self.setName(name)
self.name = name
self.port = port
self.filepath = filepath
self.sock = None
self.subproc = None
self.connection = None
self._is_running = False
self.finished = False
self.mongo_error = False
def stop(self):
self._is_running = False
def run(self):
self._is_running = True
time_socket = time.time()
# Create a TCP/IP socket
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
self.sock = sock
# Bind the socket to the port - skip already used ports
while True:
try:
server_address = ("localhost", self.port)
sock.bind(server_address)
break
except OSError:
self.port += 1
self.log.debug(
"Running Socked thread on {}:{}".format(*server_address)
)
self.subproc = subprocess.Popen(
["python", self.filepath, "-port", str(self.port)],
stdout=subprocess.PIPE
)
# Listen for incoming connections
sock.listen(1)
sock.settimeout(1.0)
while True:
if not self._is_running:
break
try:
connection, client_address = sock.accept()
time_socket = time.time()
connection.settimeout(1.0)
self.connection = connection
except socket.timeout:
if (time.time() - time_socket) > self.MAX_TIMEOUT:
self.log.error("Connection timeout passed. Terminating.")
self._is_running = False
self.subproc.terminate()
break
continue
try:
time_con = time.time()
# Receive the data in small chunks and retransmit it
while True:
try:
if not self._is_running:
break
try:
data = connection.recv(16)
time_con = time.time()
except socket.timeout:
if (time.time() - time_con) > self.MAX_TIMEOUT:
self.log.error(
"Connection timeout passed. Terminating."
)
self._is_running = False
self.subproc.terminate()
break
continue
except ConnectionResetError:
self._is_running = False
break
if data:
if data == b"MongoError":
self.mongo_error = True
connection.sendall(data)
except Exception as exc:
self.log.error(
"Event server process failed", exc_info=True
)
finally:
# Clean up the connection
connection.close()
if self.subproc.poll() is None:
self.subproc.terminate()
lines = self.subproc.stdout.readlines()
if lines:
print("*** Socked Thread stdout ***")
for line in lines:
os.write(1, line)
self.finished = True
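# Minimal supervision sketch (hypothetical script path; mirrors how
# main_loop() in event_server_cli.py drives this class):
#   thread = SocketThread("DemoThread", 10099, "/path/to/sub_event_storer.py")
#   thread.start()
#   while thread.isAlive():
#       time.sleep(1)  # heartbeats are handled inside run()
#   if thread.mongo_error:
#       print("Subprocess reported a MongoDB permission problem")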

View file

@ -0,0 +1,50 @@
import os
import sys
import datetime
import signal
import socket
import pymongo
from ftrack_server import FtrackServer
from pype.ftrack.ftrack_server.session_processor import ProcessSession
from pypeapp import Logger
log = Logger().get_logger("Event processor")
def main(args):
port = int(args[-1])
# Create a TCP/IP socket
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
# Connect the socket to the port where the server is listening
server_address = ("localhost", port)
log.debug("Processor connected to {} port {}".format(*server_address))
sock.connect(server_address)
sock.sendall(b"CreatedProcess")
try:
session = ProcessSession(auto_connect_event_hub=True, sock=sock)
server = FtrackServer('event')
log.debug("Launched Ftrack Event processor")
server.run_server(session)
except Exception as exc:
log.error("Event server crashed. See traceback below", exc_info=True)
finally:
log.debug("First closing socket")
sock.close()
return 1
if __name__ == "__main__":
# Register interrupt signal
def signal_handler(sig, frame):
print("You pressed Ctrl+C. Process ended.")
sys.exit(0)
signal.signal(signal.SIGINT, signal_handler)
signal.signal(signal.SIGTERM, signal_handler)
sys.exit(main(sys.argv))

View file

@ -0,0 +1,116 @@
import os
import sys
import datetime
import signal
import socket
import pymongo
from ftrack_server import FtrackServer
from pype.ftrack.ftrack_server.lib import get_ftrack_event_mongo_info
from pype.ftrack.lib.custom_db_connector import DbConnector
from session_storer import StorerSession
from pypeapp import Logger
log = Logger().get_logger("Event storer")
url, database, table_name = get_ftrack_event_mongo_info()
dbcon = DbConnector(
mongo_url=url,
database_name=database,
table_name=table_name
)
# ignore_topics = ["ftrack.meta.connected"]
ignore_topics = []
def install_db():
try:
dbcon.install()
dbcon._database.collection_names()
except pymongo.errors.AutoReconnect:
log.error("Mongo server \"{}\" is not responding, exiting.".format(
os.environ["AVALON_MONGO"]
))
sys.exit(0)
def launch(event):
if event.get("topic") in ignore_topics:
return
event_data = event._data
event_id = event["id"]
event_data["pype_data"] = {
"stored": datetime.datetime.utcnow(),
"is_processed": False
}
try:
# dbcon.insert_one(event_data)
dbcon.update({"id": event_id}, event_data, upsert=True)
log.debug("Event: {} stored".format(event_id))
except pymongo.errors.AutoReconnect:
log.error("Mongo server \"{}\" is not responding, exiting.".format(
os.environ["AVALON_MONGO"]
))
sys.exit(0)
except Exception as exc:
log.error(
"Event: {} failed to store".format(event_id),
exc_info=True
)
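# Sketch of the stored document shape produced by launch() above (field
# values are illustrative):
#   {
#       "id": "a1b2c3...",          # ftrack event id
#       "topic": "ftrack.update",
#       ...remaining event payload...,
#       "pype_data": {
#           "stored": <utc datetime>,
#           "is_processed": False
#       }
#   }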
def register(session):
'''Register subscription to all topics so every event is stored.'''
install_db()
session.event_hub.subscribe("topic=*", launch)
def main(args):
port = int(args[-1])
# Create a TCP/IP socket
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
# Connect the socket to the port where the server is listening
server_address = ("localhost", port)
log.debug("Storer connected to {} port {}".format(*server_address))
sock.connect(server_address)
sock.sendall(b"CreatedStore")
try:
session = StorerSession(auto_connect_event_hub=True, sock=sock)
register(session)
server = FtrackServer("event")
log.debug("Launched Ftrack Event storer")
server.run_server(session, load_files=False)
except pymongo.errors.OperationFailure:
log.error((
"Error with Mongo access, probably permissions."
" Check if database \"{}\" exists"
" and contains collection \"{}\"."
).format(database, table_name))
sock.sendall(b"MongoError")
finally:
log.debug("First closing socket")
sock.close()
return 1
if __name__ == "__main__":
# Register interrupt signal
def signal_handler(sig, frame):
print("You pressed Ctrl+C. Process ended.")
sys.exit(0)
signal.signal(signal.SIGINT, signal_handler)
signal.signal(signal.SIGTERM, signal_handler)
sys.exit(main(sys.argv))

View file

@ -0,0 +1,98 @@
import os
import sys
import time
import datetime
import signal
import threading
from ftrack_server import FtrackServer
import ftrack_api
from ftrack_api.event.hub import EventHub
from pypeapp import Logger
log = Logger().get_logger("Event Server Legacy")
class TimerChecker(threading.Thread):
max_time_out = 35
def __init__(self, server, session):
self.server = server
self.session = session
self.is_running = False
self.failed = False
super().__init__()
def stop(self):
self.is_running = False
def run(self):
start = datetime.datetime.now()
self.is_running = True
connected = False
while True:
if not self.is_running:
break
if not self.session.event_hub.connected:
if not connected:
if (datetime.datetime.now() - start).seconds > self.max_time_out:
log.error((
"Exiting event server. Session was not connected"
" to ftrack server in {} seconds."
).format(self.max_time_out))
self.failed = True
break
else:
log.error(
"Exiting event server. Event Hub is not connected."
)
self.server.stop_session()
self.failed = True
break
else:
if not connected:
connected = True
time.sleep(1)
def main(args):
check_thread = None
try:
server = FtrackServer('event')
session = ftrack_api.Session(auto_connect_event_hub=True)
check_thread = TimerChecker(server, session)
check_thread.start()
log.debug("Launching Ftrack Event Legacy Server")
server.run_server(session)
except Exception as exc:
import traceback
traceback.print_tb(exc.__traceback__)
finally:
log_info = True
if check_thread is not None:
check_thread.stop()
check_thread.join()
if check_thread.failed:
log_info = False
if log_info:
log.info("Exiting Event server subprocess")
return 1
if __name__ == "__main__":
# Register interrupt signal
def signal_handler(sig, frame):
print("You pressed Ctrl+C. Process ended.")
sys.exit(0)
signal.signal(signal.SIGINT, signal_handler)
signal.signal(signal.SIGTERM, signal_handler)
sys.exit(main(sys.argv))

View file

@ -1,6 +1,6 @@
import os
import json
from pype.vendor import ftrack_api
import ftrack_api
import appdirs

View file

@ -13,6 +13,7 @@ import logging
import tempfile
import functools
import contextlib
import atexit
import requests
@ -20,6 +21,14 @@ import requests
import pymongo
from pymongo.client_session import ClientSession
class NotActiveTable(Exception):
def __init__(self, *args, **kwargs):
msg = "Active table is not set. (This is bug)"
if not (args or kwargs):
args = (default_message,)
super().__init__(*args, **kwargs)
def auto_reconnect(func):
"""Handling auto reconnect in 3 retry times"""
@functools.wraps(func)
@ -33,16 +42,35 @@ def auto_reconnect(func):
time.sleep(0.1)
else:
raise
return decorated
def check_active_table(func):
"""Check if DbConnector has active table before db method is called"""
@functools.wraps(func)
def decorated(obj, *args, **kwargs):
if not obj.active_table:
raise NotActiveTable()
return func(obj, *args, **kwargs)
return decorated
class DbConnector:
log = logging.getLogger(__name__)
timeout = 1000
def __init__(self, mongo_url, database_name, table_name):
def __init__(self, mongo_url, database_name, table_name=None):
self._mongo_client = None
self._sentry_client = None
self._sentry_logging_handler = None
@ -53,11 +81,25 @@ class DbConnector:
self.active_table = table_name
def __getitem__(self, key):
# gives direct access to collection without setting `active_table`
return self._database[key]
def __getattribute__(self, attr):
# not all methods of the PyMongo collection are implemented; with this
# it is possible to use them too
try:
return super(DbConnector, self).__getattribute__(attr)
except AttributeError:
if self.active_table is None:
raise NotActiveTable()
return self._database[self.active_table].__getattribute__(attr)
def install(self):
"""Establish a persistent connection to the database"""
if self._is_installed:
return
atexit.register(self.uninstall)
logging.basicConfig()
self._mongo_client = pymongo.MongoClient(
@ -99,6 +141,25 @@ class DbConnector:
self._mongo_client = None
self._database = None
self._is_installed = False
atexit.unregister(self.uninstall)
def create_table(self, name, **options):
if self.exist_table(name):
return
return self._database.create_collection(name, **options)
def exist_table(self, table_name):
return table_name in self.tables()
def tables(self):
"""List available tables
@ -115,93 +176,87 @@ class DbConnector:
def collections(self):
return self._database.collection_names()
@check_active_table
@auto_reconnect
def insert_one(self, item, session=None):
def insert_one(self, item, **options):
assert isinstance(item, dict), "item must be of type <dict>"
return self._database[self.active_table].insert_one(
item,
session=session
)
return self._database[self.active_table].insert_one(item, **options)
@check_active_table
@auto_reconnect
def insert_many(self, items, ordered=True, session=None):
def insert_many(self, items, ordered=True, **options):
# check if all items are valid
assert isinstance(items, list), "`items` must be of type <list>"
for item in items:
assert isinstance(item, dict), "`item` must be of type <dict>"
return self._database[self.active_table].insert_many(
items,
ordered=ordered,
session=session
)
options["ordered"] = ordered
return self._database[self.active_table].insert_many(items, **options)
@check_active_table
@auto_reconnect
def find(self, filter, projection=None, sort=None, session=None):
def find(self, filter, projection=None, sort=None, **options):
options["sort"] = sort
return self._database[self.active_table].find(
filter=filter,
projection=projection,
sort=sort,
session=session
filter, projection, **options
)
@check_active_table
@auto_reconnect
def find_one(self, filter, projection=None, sort=None, session=None):
def find_one(self, filter, projection=None, sort=None, **options):
assert isinstance(filter, dict), "filter must be <dict>"
options["sort"] = sort
return self._database[self.active_table].find_one(
filter=filter,
projection=projection,
sort=sort,
session=session
filter,
projection,
**options
)
@check_active_table
@auto_reconnect
def replace_one(self, filter, replacement, session=None):
def replace_one(self, filter, replacement, **options):
return self._database[self.active_table].replace_one(
filter, replacement,
session=session
filter, replacement, **options
)
@check_active_table
@auto_reconnect
def update_one(self, filter, update, session=None):
def update_one(self, filter, update, **options):
return self._database[self.active_table].update_one(
filter, update,
session=session
filter, update, **options
)
@check_active_table
@auto_reconnect
def update_many(self, filter, update, session=None):
def update_many(self, filter, update, **options):
return self._database[self.active_table].update_many(
filter, update,
session=session
filter, update, **options
)
@check_active_table
@auto_reconnect
def distinct(self, *args, **kwargs):
return self._database[self.active_table].distinct(
*args, **kwargs
)
def distinct(self, *args, **options):
return self._database[self.active_table].distinct(*args, **options)
@check_active_table
@auto_reconnect
def drop_collection(self, name_or_collection, session=None):
def drop_collection(self, name_or_collection, **options):
return self._database.drop_collection(
name_or_collection,
session=session
name_or_collection, **options
)
@check_active_table
@auto_reconnect
def delete_one(filter, collation=None, session=None):
def delete_one(self, filter, collation=None, **options):
options["collation"] = collation
return self._database[self.active_table].delete_one(
filter,
collation=collation,
session=session
filter, **options
)
@check_active_table
@auto_reconnect
def delete_many(filter, collation=None, session=None):
def delete_many(self, filter, collation=None, **options):
options["collation"] = collation
return self._database[self.active_table].delete_many(
filter,
collation=collation,
session=session
filter, **options
)
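# Usage sketch (hypothetical connection values):
#   dbcon = DbConnector(
#       mongo_url="mongodb://localhost:27017",
#       database_name="pype",
#       table_name="ftrack_events"
#   )
#   dbcon.install()
#   doc = dbcon.find_one({"id": "some-event-id"})
#   dbcon.uninstall()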

View file

@ -345,25 +345,44 @@ class AppAction(BaseHandler):
statuses = presets['status_update']
actual_status = entity['status']['name'].lower()
next_status_name = None
for key, value in statuses.items():
if actual_status in value or '_any_' in value:
if key != '_ignore_':
next_status_name = key
already_tested = []
ent_path = "/".join(
[ent["name"] for ent in entity['link']]
)
while True:
next_status_name = None
for key, value in statuses.items():
if key in already_tested:
continue
if actual_status in value or '_any_' in value:
if key != '_ignore_':
next_status_name = key
already_tested.append(key)
break
already_tested.append(key)
if next_status_name is None:
break
if next_status_name is not None:
try:
query = 'Status where name is "{}"'.format(
next_status_name
)
status = session.query(query).one()
entity['status'] = status
session.commit()
self.log.debug("Changing status to \"{}\" <{}>".format(
next_status_name, ent_path
))
break
except Exception:
session.rollback()
msg = (
'Status "{}" in presets wasn\'t found on Ftrack'
).format(next_status_name)
'Status "{}" in presets wasn\'t found'
' on Ftrack entity type "{}"'
).format(next_status_name, entity.entity_type)
self.log.warning(msg)
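# Example `status_update` preset shape consumed by the loop above (status
# names are illustrative):
#   {
#       "In Progress": ["not started", "ready"],
#       "Ready": ["_any_"],
#       "_ignore_": ["omitted", "on hold"]
#   }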
# Set origin avalon environments

View file

@ -1,8 +1,9 @@
import functools
import time
from pypeapp import Logger
from pype.vendor import ftrack_api
from pype.vendor.ftrack_api import session as fa_session
import ftrack_api
from ftrack_api import session as fa_session
from pype.ftrack.ftrack_server import session_processor
class MissingPermision(Exception):
@ -12,6 +13,13 @@ class MissingPermision(Exception):
super().__init__(message)
class PreregisterException(Exception):
def __init__(self, message=None):
if not message:
message = "Pre-registration conditions were not met"
super().__init__(message)
class BaseHandler(object):
'''Custom Action base class
@ -31,8 +39,21 @@ class BaseHandler(object):
def __init__(self, session, plugins_presets={}):
'''Expects a ftrack_api.Session instance'''
self._session = session
self.log = Logger().get_logger(self.__class__.__name__)
if not(
isinstance(session, ftrack_api.session.Session) or
isinstance(session, session_processor.ProcessSession)
):
raise Exception((
"Session object entered with args is instance of \"{}\""
" but expected instances are \"{}\" and \"{}\""
).format(
str(type(session)),
str(ftrack_api.session.Session),
str(session_processor.ProcessSession)
))
self._session = session
# Using decorator
self.register = self.register_decorator(self.register)
@ -75,15 +96,17 @@ class BaseHandler(object):
'!{} "{}" - You\'re missing required {} permissions'
).format(self.type, label, str(MPE)))
except AssertionError as ae:
self.log.info((
self.log.warning((
'!{} "{}" - {}'
).format(self.type, label, str(ae)))
except NotImplementedError:
self.log.error((
'{} "{}" - Register method is not implemented'
).format(
self.type, label)
)
).format(self.type, label))
except PreregisterException as exc:
self.log.warning((
'{} "{}" - {}'
).format(self.type, label, str(exc)))
except Exception as e:
self.log.error('{} "{}" - Registration failed ({})'.format(
self.type, label, str(e))
@ -105,6 +128,7 @@ class BaseHandler(object):
try:
return func(*args, **kwargs)
except Exception as exc:
self.session.rollback()
msg = '{} "{}": Failed ({})'.format(self.type, label, str(exc))
self.log.error(msg, exc_info=True)
return {
@ -149,10 +173,10 @@ class BaseHandler(object):
if result is True:
return
msg = "Pre-register conditions were not met"
msg = None
if isinstance(result, str):
msg = result
raise Exception(msg)
raise PreregisterException(msg)
def preregister(self):
'''
@ -579,3 +603,24 @@ class BaseHandler(object):
self.log.debug(
"Action \"{}\" Triggered successfully".format(action_name)
)
def trigger_event(
self, topic, event_data=None, session=None, source=None,
event=None, on_error="ignore"
):
if event_data is None:
event_data = {}
if session is None:
session = self.session
if not source and event:
source = event.get("source")
# Create and trigger event
event = fa_session.ftrack_api.event.base.Event(
topic=topic,
data=event_data,
source=source
)
session.event_hub.publish(event, on_error=on_error)
self.log.debug((
"Publishing event: {}"
).format(str(event.__dict__)))
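# Usage sketch from a handler subclass (topic and payload are illustrative):
#   self.trigger_event(
#       "pype.custom.topic",
#       event_data={"project_name": "demo"},
#       on_error="ignore"
#   )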

View file

@ -26,6 +26,7 @@ class BaseEvent(BaseHandler):
try:
func(*args, **kwargs)
except Exception as exc:
self.session.rollback()
self.log.error(
'Event "{}" Failed: {}'.format(
self.__class__.__name__, str(exc)

View file

@ -50,6 +50,19 @@ class DbConnector(object):
self._database = None
self._is_installed = False
def __getitem__(self, key):
# gives direct access to collection without setting `active_table`
return self._database[key]
def __getattribute__(self, attr):
# not all methods of the PyMongo collection are implemented; with this
# it is possible to use them too
try:
return super(DbConnector, self).__getattribute__(attr)
except AttributeError:
cur_proj = self.Session["AVALON_PROJECT"]
return self._database[cur_proj].__getattribute__(attr)
def install(self):
"""Establish a persistent connection to the database"""
if self._is_installed:

View file

@ -4,9 +4,9 @@ import threading
import time
from Qt import QtCore, QtGui, QtWidgets
from pype.vendor import ftrack_api
import ftrack_api
from pypeapp import style
from pype.ftrack import FtrackServer, credentials
from pype.ftrack import FtrackServer, check_ftrack_url, credentials
from . import login_dialog
from pype import api as pype
@ -24,7 +24,8 @@ class FtrackModule:
self.thread_timer = None
self.bool_logged = False
self.bool_action_server = False
self.bool_action_server_running = False
self.bool_action_thread_running = False
self.bool_timer_event = False
def show_login_widget(self):
@ -74,28 +75,50 @@ class FtrackModule:
# Actions part
def start_action_server(self):
self.bool_action_thread_running = True
self.set_menu_visibility()
if (
self.thread_action_server is not None and
self.bool_action_thread_running is False
):
self.stop_action_server()
if self.thread_action_server is None:
self.thread_action_server = threading.Thread(
target=self.set_action_server
)
self.thread_action_server.daemon = True
self.thread_action_server.start()
log.info("Ftrack action server launched")
self.bool_action_server = True
self.set_menu_visibility()
def set_action_server(self):
try:
self.action_server.run_server()
except Exception as exc:
log.error(
"Ftrack Action server crashed! Please try to start again.",
exc_info=True
first_check = True
while self.bool_action_thread_running is True:
if not check_ftrack_url(os.environ['FTRACK_SERVER']):
if first_check:
log.warning(
"Could not connect to Ftrack server"
)
first_check = False
time.sleep(1)
continue
log.info(
"Connected to Ftrack server. Running actions session"
)
# TODO show message to user
self.bool_action_server = False
try:
self.bool_action_server_running = True
self.set_menu_visibility()
self.action_server.run_server()
if self.bool_action_thread_running:
log.debug("Ftrack action server has stopped")
except Exception:
log.warning(
"Ftrack Action server crashed. Trying to connect again",
exc_info=True
)
self.bool_action_server_running = False
self.set_menu_visibility()
first_check = True
self.bool_action_thread_running = False
def reset_action_server(self):
self.stop_action_server()
@ -103,16 +126,21 @@ class FtrackModule:
def stop_action_server(self):
try:
self.bool_action_thread_running = False
self.action_server.stop_session()
if self.thread_action_server is not None:
self.thread_action_server.join()
self.thread_action_server = None
log.info("Ftrack action server stopped")
self.bool_action_server = False
log.info("Ftrack action server was forced to stop")
self.bool_action_server_running = False
self.set_menu_visibility()
except Exception as e:
log.error("During Killing action server: {0}".format(e))
except Exception:
log.warning(
"Error has happened during Killing action server",
exc_info=True
)
# Definition of Tray menu
def tray_menu(self, parent_menu):
@ -158,6 +186,9 @@ class FtrackModule:
def tray_start(self):
self.validate()
def tray_exit(self):
self.stop_action_server()
# Definition of visibility of each menu actions
def set_menu_visibility(self):
@ -170,9 +201,9 @@ class FtrackModule:
self.stop_timer_thread()
return
self.aRunActionS.setVisible(not self.bool_action_server)
self.aResetActionS.setVisible(self.bool_action_server)
self.aStopActionS.setVisible(self.bool_action_server)
self.aRunActionS.setVisible(not self.bool_action_thread_running)
self.aResetActionS.setVisible(self.bool_action_thread_running)
self.aStopActionS.setVisible(self.bool_action_thread_running)
if self.bool_timer_event is False:
self.start_timer_thread()

View file

@ -7,8 +7,6 @@ import contextlib
import subprocess
import inspect
from .vendor import pather
from .vendor.pather.error import ParseError
import avalon.io as io
import avalon.api
@ -562,7 +560,7 @@ def get_subsets(asset_name,
find_dict = {"type": "representation",
"parent": version_sel["_id"]}
filter_repr = {"$or": [{"name": repr} for repr in representations]}
filter_repr = {"name": {"$in": representations}}
find_dict.update(filter_repr)
repres_out = [i for i in io.find(find_dict)]
@ -572,3 +570,43 @@ def get_subsets(asset_name,
"representaions": repres_out}
return output_dict
class CustomNone:
"""Created object can be used as custom None (not equal to None).
WARNING: Multiple created objects are not equal either.
Example:
>>> a = CustomNone()
>>> a == None
False
>>> b = CustomNone()
>>> a == b
False
>>> a == a
True
"""
def __init__(self):
"""Create uuid as identifier for custom None."""
import uuid
self.identifier = str(uuid.uuid4())
def __bool__(self):
"""Return False (like default None)."""
return False
def __eq__(self, other):
"""Equality is compared by identifier value."""
if type(other) == type(self):
if other.identifier == self.identifier:
return True
return False
def __str__(self):
"""Return value of identifier when converted to string."""
return self.identifier
def __repr__(self):
"""Representation of custom None."""
return "<CustomNone-{}>".format(str(self.identifier))

View file

@ -3,6 +3,7 @@
import re
import os
import uuid
import math
import bson
import json
@ -1774,6 +1775,12 @@ def set_scene_fps(fps, update=True):
'48000': '48000fps'}
# pull from mapping
# convert floats with a zero fractional part to int,
# so 25.0 is converted to 25, but 23.98 stays a float.
dec, ipart = math.modf(fps)
if dec == 0.0:
fps = int(ipart)
unit = fps_mapping.get(str(fps), None)
if unit is None:
raise ValueError("Unsupported FPS value: `%s`" % fps)
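# Worked example of the conversion above:
#   math.modf(25.0)  -> (0.0, 25.0)   -> fps becomes int 25
#   math.modf(23.98) -> (~0.98, 23.0) -> fps stays 23.98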
@ -1856,6 +1863,7 @@ def set_context_settings():
# Set project fps
fps = asset_data.get("fps", project_data.get("fps", 25))
api.Session["AVALON_FPS"] = fps
set_scene_fps(fps)
# Set project resolution

View file

@ -43,8 +43,10 @@ class MusterModule:
self.aShowLogin.trigger()
if "RestApiServer" in modules:
def api_show_login():
self.aShowLogin.trigger()
modules["RestApiServer"].register_callback(
"muster/show_login", api_callback, "post"
"/show_login", api_show_login, "muster", "post"
)
# Definition of Tray menu
@ -93,7 +95,7 @@ class MusterModule:
'password': password
}
api_entry = '/api/login'
response = requests.post(
response = self._requests_post(
MUSTER_REST_URL + api_entry, params=params)
if response.status_code != 200:
self.log.error(
@ -125,3 +127,17 @@ class MusterModule:
Show dialog to enter credentials
"""
self.widget_login.show()
def _requests_post(self, *args, **kwargs):
""" Wrapper for requests, disabling SSL certificate validation if
PYPE_DONT_VERIFY_SSL environment variable is found. This is useful when
Deadline or Muster server are running with self-signed certificates
and their certificate is not added to trusted certificates on
client machines.
WARNING: disabling SSL certificate validation is defeating one line
of defense SSL is providing and it is not recommended.
"""
if 'verify' not in kwargs:
kwargs['verify'] = not os.getenv("PYPE_DONT_VERIFY_SSL")
return requests.post(*args, **kwargs)
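# Effect sketch: with PYPE_DONT_VERIFY_SSL set to any non-empty value the
# wrapper forwards verify=False to requests; otherwise certificates are
# verified as usual.
#   os.environ["PYPE_DONT_VERIFY_SSL"] = "1"
#   self._requests_post(MUSTER_REST_URL + "/api/login", params=params)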

View file

@ -80,17 +80,21 @@ def reload_config():
for module in (
"{}.api".format(AVALON_CONFIG),
"{}.nuke.actions".format(AVALON_CONFIG),
"{}.nuke.templates".format(AVALON_CONFIG),
"{}.nuke.presets".format(AVALON_CONFIG),
"{}.nuke.menu".format(AVALON_CONFIG),
"{}.nuke.plugin".format(AVALON_CONFIG),
"{}.nuke.lib".format(AVALON_CONFIG),
):
log.info("Reloading module: {}...".format(module))
module = importlib.import_module(module)
try:
module = importlib.import_module(module)
reload(module)
except Exception as e:
log.warning("Cannot reload module: {}".format(e))
importlib.reload(module)
except AttributeError as e:
log.warning("Cannot reload module: {}".format(e))
reload(module)
def install():

View file

@ -1,20 +1,23 @@
import os
import re
import sys
import getpass
from collections import OrderedDict
from pprint import pprint
from avalon import api, io, lib
import avalon.nuke
import pype.api as pype
import nuke
from .templates import (
from .presets import (
get_colorspace_preset,
get_node_dataflow_preset,
get_node_colorspace_preset
)
from .templates import (
from .presets import (
get_anatomy
)
# TODO: remove get_anatomy and import directly Anatomy() here
@ -55,7 +58,8 @@ def checkInventoryVersions():
if container:
node = container["_node"]
avalon_knob_data = avalon.nuke.get_avalon_knob_data(node)
avalon_knob_data = avalon.nuke.get_avalon_knob_data(
node, ['avalon:', 'ak:'])
# get representation from io
representation = io.find_one({
@ -101,7 +105,8 @@ def writes_version_sync():
for each in nuke.allNodes():
if each.Class() == 'Write':
avalon_knob_data = avalon.nuke.get_avalon_knob_data(each)
avalon_knob_data = avalon.nuke.get_avalon_knob_data(
each, ['avalon:', 'ak:'])
try:
if avalon_knob_data['families'] not in ["render"]:
@ -134,7 +139,8 @@ def get_render_path(node):
''' Generate Render path from presets regarding avalon knob data
'''
data = dict()
data['avalon'] = avalon.nuke.get_avalon_knob_data(node)
data['avalon'] = avalon.nuke.get_avalon_knob_data(
node, ['avalon:', 'ak:'])
data_preset = {
"class": data['avalon']['family'],
@ -169,8 +175,13 @@ def format_anatomy(data):
anatomy = get_anatomy()
log.debug("__ anatomy.templates: {}".format(anatomy.templates))
# TODO: perhaps this should be wrapped in a try block
padding = int(anatomy.templates['render']['padding'])
try:
padding = int(anatomy.templates['render']['padding'])
except KeyError as e:
log.error("`padding` key is not in `render` "
"Anatomy template. Please add it there and restart "
"the pipeline (padding: \"4\"): `{}`".format(e))
version = data.get("version", None)
if not version:
file = script_name()
@ -207,12 +218,13 @@ def add_button_write_to_read(node):
node.addKnob(k)
def create_write_node(name, data, prenodes=None):
def create_write_node(name, data, input=None, prenodes=None):
''' Create a write node wrapped in a group node
Arguments:
name (str): name of node
data (dict): data to be imprinted
input (node): selected node to connect to
prenodes (list, optional): list of lists, definitions for nodes
to be created before write
@ -224,9 +236,9 @@ def create_write_node(name, data, prenodes=None):
("knobName", "knobValue"),
("knobName", "knobValue")
),
( # list inputs
"firstPrevNodeName",
"secondPrevNodeName"
( # list outputs
"firstPostNodeName",
"secondPostNodeName"
)
)
]
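A hypothetical prenodes definition matching the documented format (node name and knob values are illustrative only, and `data` is assumed to be prepared by the caller):

    prenodes = [
        (
            "reformat01",           # node name
            "Reformat",             # node class
            [("resize", "none")],   # (knobName, knobValue) pairs
            None,                   # outputs; None chains to the next node
        ),
    ]
    write_group = create_write_node("WriteRender", data, prenodes=prenodes)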
@ -268,28 +280,44 @@ def create_write_node(name, data, prenodes=None):
})
# adding dataflow template
log.debug("nuke_dataflow_writes: `{}`".format(nuke_dataflow_writes))
{_data.update({k: v})
for k, v in nuke_dataflow_writes.items()
if k not in ["_id", "_previous"]}
# adding dataflow template
# adding colorspace template
log.debug("nuke_colorspace_writes: `{}`".format(nuke_colorspace_writes))
{_data.update({k: v})
for k, v in nuke_colorspace_writes.items()}
_data = avalon.nuke.lib.fix_data_for_node_create(_data)
log.debug(_data)
log.debug("_data: `{}`".format(_data))
_data["frame_range"] = data.get("frame_range", None)
if "frame_range" in data.keys():
_data["frame_range"] = data.get("frame_range", None)
log.debug("_data[frame_range]: `{}`".format(_data["frame_range"]))
# TODO: change this to the new way
GN = nuke.createNode("Group", "name {}".format(name))
prev_node = None
with GN:
connections = list()
if input:
# if connected input node was defined
connections.append({
"node": input,
"inputName": input.name()})
prev_node = nuke.createNode(
"Input", "name {}".format(input.name()))
else:
# generic input node connected to nothing
prev_node = nuke.createNode(
"Input", "name {}".format("rgba"))
# creating pre-write nodes `prenodes`
if prenodes:
for name, klass, properties, set_input_to in prenodes:
for name, klass, properties, set_output_to in prenodes:
# create node
now_node = nuke.createNode(klass, "name {}".format(name))
@ -299,34 +327,41 @@ def create_write_node(name, data, prenodes=None):
now_node[k].setValue(str(v))
# connect to previous node
if set_input_to:
if isinstance(set_input_to, (tuple or list)):
for i, node_name in enumerate(set_input_to):
input_node = nuke.toNode(node_name)
if set_output_to:
if isinstance(set_output_to, (tuple, list)):
for i, node_name in enumerate(set_output_to):
input_node = nuke.createNode(
"Input", "name {}".format(node_name))
connections.append({
"node": nuke.toNode(node_name),
"inputName": node_name})
now_node.setInput(1, input_node)
elif isinstance(set_input_to, str):
input_node = nuke.toNode(set_input_to)
elif isinstance(set_output_to, str):
input_node = nuke.createNode(
"Input", "name {}".format(set_output_to))
connections.append({
"node": nuke.toNode(set_output_to),
"inputName": set_output_to})
now_node.setInput(0, input_node)
else:
now_node.setInput(0, prev_node)
# switch actual node to previous
prev_node = now_node
else:
prev_node = nuke.createNode("Input", "name rgba")
# creating write node
now_node = avalon.nuke.lib.add_write_node("inside_{}".format(name),
**_data
)
write_node = now_node
write_node = now_node = avalon.nuke.lib.add_write_node(
"inside_{}".format(name),
**_data
)
# connect to previous node
now_node.setInput(0, prev_node)
# switch actual node to previous
prev_node = now_node
now_node = nuke.createNode("Output", "name write")
now_node = nuke.createNode("Output", "name Output1")
# connect to previous node
now_node.setInput(0, prev_node)
@ -379,6 +414,10 @@ def add_rendering_knobs(node):
knob = nuke.Boolean_Knob("render_farm", "Render on Farm")
knob.setValue(False)
node.addKnob(knob)
if "review" not in node.knobs():
knob = nuke.Boolean_Knob("review", "Review")
knob.setValue(True)
node.addKnob(knob)
return node
@ -389,6 +428,14 @@ def add_deadline_tab(node):
knob.setValue(1)
node.addKnob(knob)
knob = nuke.Int_Knob("deadlinePriority", "Priority")
knob.setValue(50)
node.addKnob(knob)
def get_deadline_knob_names():
return ["Deadline", "deadlineChunkSize", "deadlinePriority"]
def create_backdrop(label="", color=None, layer=0,
nodes=None):
@ -543,17 +590,34 @@ class WorkfileSettings(object):
assert isinstance(root_dict, dict), log.error(
"set_root_colorspace(): argument should be dictionary")
log.debug(">> root_dict: {}".format(root_dict))
# first set OCIO
if self._root_node["colorManagement"].value() \
not in str(root_dict["colorManagement"]):
self._root_node["colorManagement"].setValue(
str(root_dict["colorManagement"]))
log.debug("nuke.root()['{0}'] changed to: {1}".format(
"colorManagement", root_dict["colorManagement"]))
root_dict.pop("colorManagement")
# second set ocio version
if self._root_node["OCIO_config"].value() \
not in str(root_dict["OCIO_config"]):
self._root_node["OCIO_config"].setValue(
str(root_dict["OCIO_config"]))
log.debug("nuke.root()['{0}'] changed to: {1}".format(
"OCIO_config", root_dict["OCIO_config"]))
root_dict.pop("OCIO_config")
# third set ocio custom path
if root_dict.get("customOCIOConfigPath"):
self._root_node["customOCIOConfigPath"].setValue(
str(root_dict["customOCIOConfigPath"]).format(**os.environ)
)
log.debug("nuke.root()['{}'] changed to: {}".format(
"customOCIOConfigPath", root_dict["customOCIOConfigPath"]))
root_dict.pop("customOCIOConfigPath")
# then set the rest
for knob, value in root_dict.items():
@ -798,10 +862,12 @@ def get_write_node_template_attr(node):
'''
# get avalon data from node
data = dict()
data['avalon'] = avalon.nuke.get_avalon_knob_data(node)
data['avalon'] = avalon.nuke.get_avalon_knob_data(
node, ['avalon:', 'ak:'])
data_preset = {
"class": data['avalon']['family'],
"preset": data['avalon']['families']
"families": data['avalon']['families'],
"preset": data['avalon']['families'] # omit < 2.0.0v
}
# get template data
@ -884,7 +950,7 @@ class BuildWorkfile(WorkfileSettings):
**kwargs)
self.to_script = to_script
# collect data for formating
data = {
self.data_tmp = {
"root": root_path or api.Session["AVALON_PROJECTS"],
"project": {"name": self._project["name"],
"code": self._project["data"].get("code", '')},
@ -899,7 +965,7 @@ class BuildWorkfile(WorkfileSettings):
# get presets from anatomy
anatomy = get_anatomy()
# format anatomy
anatomy_filled = anatomy.format(data)
anatomy_filled = anatomy.format(self.data_tmp)
# get dir and file for workfile
self.work_dir = anatomy_filled["avalon"]["work"]
@ -927,7 +993,7 @@ class BuildWorkfile(WorkfileSettings):
def process(self,
regex_filter=None,
version=None,
representations=["exr", "dpx", "lutJson"]):
representations=["exr", "dpx", "lutJson", "mov", "preview"]):
"""
Load the latest versions of matching subsets into the workfile.
@ -984,6 +1050,8 @@ class BuildWorkfile(WorkfileSettings):
version=version,
representations=representations)
log.info("__ subsets: `{}`".format(subsets))
nodes_backdrop = list()
for name, subset in subsets.items():
@ -1073,6 +1141,10 @@ class BuildWorkfile(WorkfileSettings):
representation (dict): avalon db entity
"""
task = self.data_tmp["task"]
sanitized_task = re.sub('[^0-9a-zA-Z]+', '', task)
subset_name = "render{}Main".format(
sanitized_task.capitalize())
Create_name = "CreateWriteRender"
@ -1084,7 +1156,7 @@ class BuildWorkfile(WorkfileSettings):
creator_plugin = Creator
# return api.create()
return creator_plugin("render_writeMain", self._asset).process()
return creator_plugin(subset_name, self._asset).process()
def create_backdrop(self, label="", color=None, layer=0,
nodes=None):
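A worked example of the subset naming introduced above (hypothetical task name, standard library only):

    import re

    task = "comp - main"
    sanitized = re.sub('[^0-9a-zA-Z]+', '', task)  # "compmain"
    subset_name = "render{}Main".format(sanitized.capitalize())
    # -> "renderCompmainMain"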

pype/nuke/plugin.py (new file)

@ -0,0 +1,14 @@
import re
import avalon.api
import avalon.nuke
from pype import api as pype
from pypeapp import config
class PypeCreator(avalon.nuke.pipeline.Creator):
"""Pype Nuke Creator class wrapper
"""
def __init__(self, *args, **kwargs):
super(PypeCreator, self).__init__(*args, **kwargs)
self.presets = config.get_presets()['plugins']["nuke"]["create"].get(
self.__class__.__name__, {}
)


@ -25,16 +25,23 @@ def get_node_dataflow_preset(**kwarg):
log.info(kwarg)
host = kwarg.get("host", "nuke")
cls = kwarg.get("class", None)
preset = kwarg.get("preset", None)
assert any([host, cls]), log.error("nuke.templates.get_node_dataflow_preset(): \
Missing mandatory kwargs `host`, `cls`")
families = kwarg.get("families", [])
preset = kwarg.get("preset", None) # omit for versions < 2.0.0
assert any([host, cls]), log.error(
"`{}`: Missing mandatory kwargs `host`, `cls`".format(__file__))
nuke_dataflow = get_dataflow_preset().get(str(host), None)
nuke_dataflow_nodes = nuke_dataflow.get('nodes', None)
nuke_dataflow_node = nuke_dataflow_nodes.get(str(cls), None)
if preset:
if preset: # omit for versions < 2.0.0
nuke_dataflow_node = nuke_dataflow_node.get(str(preset), None)
# omit for versions < 2.0.0
if families:
for family in families:
nuke_dataflow_node = nuke_dataflow_node.get(str(family), None)
log.info("Dataflow: {}".format(nuke_dataflow_node))
return nuke_dataflow_node
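The lookup above walks a nested preset dictionary by host, node class and, newly, family. A hypothetical preset shape it would accept (keys and values are illustrative, not studio defaults):

    dataflow = {
        "nuke": {
            "nodes": {
                "CreateWriteRender": {
                    "render": {"file_type": "exr", "datatype": "16 bit half"},
                },
            },
        },
    }
    node_preset = dataflow["nuke"]["nodes"]["CreateWriteRender"]["render"]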
@ -46,14 +53,22 @@ def get_node_colorspace_preset(**kwarg):
log.info(kwarg)
host = kwarg.get("host", "nuke")
cls = kwarg.get("class", None)
preset = kwarg.get("preset", None)
assert any([host, cls]), log.error("nuke.templates.get_node_colorspace_preset(): \
Missing mandatory kwargs `host`, `cls`")
families = kwarg.get("families", [])
preset = kwarg.get("preset", None) # omit for versions < 2.0.0
assert any([host, cls]), log.error(
"`{}`: Missing mandatory kwargs `host`, `cls`".format(__file__))
nuke_colorspace = get_colorspace_preset().get(str(host), None)
nuke_colorspace_node = nuke_colorspace.get(str(cls), None)
if preset:
if preset: # omit for versions < 2.0.0
nuke_colorspace_node = nuke_colorspace_node.get(str(preset), None)
# omit for versions < 2.0.0
if families:
for family in families:
nuke_colorspace_node = nuke_colorspace_node.get(str(family), None)
log.info("Colorspace: {}".format(nuke_colorspace_node))
return nuke_colorspace_node


@ -46,7 +46,7 @@ class ContextPlugin(pyblish.api.ContextPlugin):
class InstancePlugin(pyblish.api.InstancePlugin):
def process(cls, *args, **kwargs):
imprint_attributes(cls)
super(ContextPlugin, cls).process(cls, *args, **kwargs)
super(InstancePlugin, cls).process(cls, *args, **kwargs)
class Extractor(InstancePlugin):


@ -1,5 +1,6 @@
import os
import sys
import six
import pyblish.api
import clique
@ -125,6 +126,12 @@ class IntegrateFtrackApi(pyblish.api.InstancePlugin):
metadata=asset_metadata
)
)
try:
session.commit()
except Exception:
tp, value, tb = sys.exc_info()
session.rollback()
six.reraise(tp, value, tb)
# Adding metadata
existing_asset_metadata = asset_entity["metadata"]
@ -137,8 +144,6 @@ class IntegrateFtrackApi(pyblish.api.InstancePlugin):
"version": 0,
"asset": asset_entity,
}
if task:
assetversion_data['task'] = task
assetversion_data.update(data.get("assetversion_data", {}))
@ -150,6 +155,9 @@ class IntegrateFtrackApi(pyblish.api.InstancePlugin):
# due to a ftrack_api bug where you can't add metadata on creation.
assetversion_metadata = assetversion_data.pop("metadata", {})
if task:
assetversion_data['task'] = task
# Create a new entity if none exits.
if not assetversion_entity:
assetversion_entity = session.create(
@ -162,6 +170,12 @@ class IntegrateFtrackApi(pyblish.api.InstancePlugin):
metadata=assetversion_metadata
)
)
try:
session.commit()
except Exception:
tp, value, tb = sys.exc_info()
session.rollback()
six.reraise(tp, value, tb)
# Adding metadata
existing_assetversion_metadata = assetversion_entity["metadata"]
@ -170,7 +184,12 @@ class IntegrateFtrackApi(pyblish.api.InstancePlugin):
# Have to commit the version and asset, because location can't
# determine the final location without.
session.commit()
try:
session.commit()
except Exception:
tp, value, tb = sys.exc_info()
session.rollback()
six.reraise(tp, value, tb)
# Component
# Get existing entity.
@ -209,7 +228,12 @@ class IntegrateFtrackApi(pyblish.api.InstancePlugin):
session.delete(member)
del(member)
session.commit()
try:
session.commit()
except Exception:
tp, value, tb = sys.exc_info()
session.rollback()
six.reraise(tp, value, tb)
# Reset members in memory
if "members" in component_entity.keys():
@ -320,4 +344,9 @@ class IntegrateFtrackApi(pyblish.api.InstancePlugin):
)
else:
# Commit changes.
session.commit()
try:
session.commit()
except Exception:
tp, value, tb = sys.exc_info()
session.rollback()
six.reraise(tp, value, tb)
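The try/commit/rollback/reraise block now appears five times in this plugin alone, and again in IntegrateHierarchyToFtrack further down. A small helper could factor it out; a hedged sketch, not part of the commit:

    import sys
    import six

    def commit_or_rollback(session):
        # Commit the ftrack session; on failure roll back and re-raise
        # with the original traceback preserved.
        try:
            session.commit()
        except Exception:
            tp, value, tb = sys.exc_info()
            session.rollback()
            six.reraise(tp, value, tb)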


@ -0,0 +1,31 @@
import sys
import pyblish.api
import six
class IntegrateFtrackComments(pyblish.api.InstancePlugin):
"""Create comments in Ftrack."""
order = pyblish.api.IntegratorOrder
label = "Integrate Comments to Ftrack."
families = ["shot"]
def process(self, instance):
session = instance.context.data["ftrackSession"]
entity = session.query(
"Shot where name is \"{}\"".format(instance.data["item"].name())
).one()
notes = []
for comment in instance.data["comments"]:
notes.append(session.create("Note", {"content": comment}))
entity["notes"].extend(notes)
try:
session.commit()
except Exception:
tp, value, tb = sys.exc_info()
session.rollback()
six.reraise(tp, value, tb)


@ -37,6 +37,8 @@ class IntegrateFtrackInstance(pyblish.api.InstancePlugin):
if instance.data.get('version'):
version_number = int(instance.data.get('version'))
else:
raise ValueError("Instance version not set")
family = instance.data['family'].lower()


@ -1,3 +1,6 @@
import sys
import six
import pyblish.api
from avalon import io
@ -66,9 +69,10 @@ class IntegrateHierarchyToFtrack(pyblish.api.ContextPlugin):
# try to find if entity already exists
else:
query = '{} where name is "{}" and parent_id is "{}"'.format(
entity_type, entity_name, parent['id']
)
query = (
'TypedContext where name is "{0}" and '
'project_id is "{1}"'
).format(entity_name, self.ft_project["id"])
try:
entity = self.session.query(query).one()
except Exception:
@ -98,7 +102,12 @@ class IntegrateHierarchyToFtrack(pyblish.api.ContextPlugin):
for instance in instances:
instance.data['ftrackEntity'] = entity
self.session.commit()
try:
self.session.commit()
except Exception:
tp, value, tb = sys.exc_info()
self.session.rollback()
six.reraise(tp, value, tb)
# TASKS
tasks = entity_data.get('tasks', [])
@ -121,11 +130,21 @@ class IntegrateHierarchyToFtrack(pyblish.api.ContextPlugin):
task_type=task,
parent=entity
)
self.session.commit()
try:
self.session.commit()
except Exception:
tp, value, tb = sys.exc_info()
self.session.rollback()
six.reraise(tp, value, tb)
# Incoming links.
self.create_links(entity_data, entity)
self.session.commit()
try:
self.session.commit()
except Exception:
tp, value, tb = sys.exc_info()
self.session.rollback()
six.reraise(tp, value, tb)
if 'childs' in entity_data:
self.import_to_ftrack(
@ -135,7 +154,12 @@ class IntegrateHierarchyToFtrack(pyblish.api.ContextPlugin):
# Clear existing links.
for link in entity.get("incoming_links", []):
self.session.delete(link)
self.session.commit()
try:
self.session.commit()
except Exception:
tp, value, tb = sys.exc_info()
self.session.rollback()
six.reraise(tp, value, tb)
# Create new links.
for input in entity_data.get("inputs", []):
@ -171,7 +195,12 @@ class IntegrateHierarchyToFtrack(pyblish.api.ContextPlugin):
self.log.info(self.task_types)
task['type'] = self.task_types[task_type]
self.session.commit()
try:
self.session.commit()
except Exception:
tp, value, tb = sys.exc_info()
self.session.rollback()
six.reraise(tp, value, tb)
return task
@ -180,6 +209,11 @@ class IntegrateHierarchyToFtrack(pyblish.api.ContextPlugin):
'name': name,
'parent': parent
})
self.session.commit()
try:
self.session.commit()
except Exception:
tp, value, tb = sys.exc_info()
self.session.rollback()
six.reraise(tp, value, tb)
return entity


@ -3,7 +3,7 @@ import json
import re
import pyblish.api
from pype.vendor import clique
import clique
class CollectJSON(pyblish.api.ContextPlugin):


@ -4,7 +4,7 @@ import datetime
import time
import pyblish.api
from pype.vendor import clique
import clique
class ExtractJSON(pyblish.api.ContextPlugin):


@ -1,7 +1,7 @@
import os
import pyblish.api
import subprocess
from pype.vendor import clique
import clique
class ExtractQuicktimeEXR(pyblish.api.InstancePlugin):


@ -40,6 +40,15 @@ class CleanUp(pyblish.api.InstancePlugin):
active = True
def process(self, instance):
# Get the errored instances
failed = []
for result in instance.context.data["results"]:
if (result["error"] is not None and result["instance"] is not None
and result["instance"] not in failed):
failed.append(result["instance"])
assert instance not in failed, ("Result of '{}' instance "
"was not successful".format(instance.data["name"]))
if [ef for ef in self.exclude_families
if instance.data["family"] in ef]:
return


@ -24,4 +24,4 @@ class CollectSceneVersion(pyblish.api.ContextPlugin):
rootVersion = pype.get_version_from_path(filename)
context.data['version'] = rootVersion
self.log.info('Scene Version: %s' % context.data('version'))
self.log.info('Scene Version: %s' % context.data.get('version'))


@ -72,13 +72,7 @@ class ExtractHierarchyToAvalon(pyblish.api.ContextPlugin):
entity = io.find_one({"type": "asset", "name": name})
# Create entity if it doesn't exist
if entity is None:
if self.project["_id"] == parent["_id"]:
silo = None
elif parent["silo"] is None:
silo = parent["name"]
else:
silo = parent["silo"]
entity = self.create_avalon_asset(name, silo, data)
entity = self.create_avalon_asset(name, data)
# Update entity data with input data
io.update_many({"_id": entity["_id"]}, {"$set": {"data": data}})
@ -86,11 +80,10 @@ class ExtractHierarchyToAvalon(pyblish.api.ContextPlugin):
if "childs" in entity_data:
self.import_to_avalon(entity_data["childs"], entity)
def create_avalon_asset(self, name, silo, data):
def create_avalon_asset(self, name, data):
item = {
"schema": "avalon-core:asset-3.0",
"name": name,
"silo": silo,
"parent": self.project["_id"],
"type": "asset",
"data": data


@ -1,7 +1,7 @@
import os
import pyblish.api
from pype.vendor import clique
import clique
import pype.api
@ -48,7 +48,8 @@ class ExtractJpegEXR(pyblish.api.InstancePlugin):
profile = config_data.get(proj_name, config_data['__default__'])
jpeg_items = []
jpeg_items.append("ffmpeg")
jpeg_items.append(
os.path.join(os.environ.get("FFMPEG_PATH", ""), "ffmpeg"))
# override file if already exists
jpeg_items.append("-y")
# use same input args like with mov


@ -1,7 +1,7 @@
import os
import pyblish.api
from pype.vendor import clique
import clique
import pype.api
from pypeapp import config
@ -163,7 +163,10 @@ class ExtractReview(pyblish.api.InstancePlugin):
# output filename
output_args.append(full_output_path)
mov_args = [
"ffmpeg",
os.path.join(
os.environ.get(
"FFMPEG_PATH",
""), "ffmpeg"),
" ".join(input_args),
" ".join(output_args)
]


@ -66,7 +66,7 @@ class IntegrateAssumedDestination(pyblish.api.InstancePlugin):
"""Create a filepath based on the current data available
Example template:
{root}/{project}/{silo}/{asset}/publish/{subset}/v{version:0>3}/
{root}/{project}/{asset}/publish/{subset}/v{version:0>3}/
{subset}.{representation}
Args:
instance: the instance to publish
@ -95,7 +95,6 @@ class IntegrateAssumedDestination(pyblish.api.InstancePlugin):
assert asset, ("No asset found by the name '{}' "
"in project '{}'".format(asset_name, project_name))
silo = asset['silo']
subset = io.find_one({"type": "subset",
"name": subset_name,
@ -126,7 +125,6 @@ class IntegrateAssumedDestination(pyblish.api.InstancePlugin):
template_data = {"root": api.Session["AVALON_PROJECTS"],
"project": {"name": project_name,
"code": project['data']['code']},
"silo": silo,
"family": instance.data['family'],
"asset": asset_name,
"subset": subset_name,


@ -1,18 +1,23 @@
import os
from os.path import getsize
import logging
import speedcopy
import sys
import clique
import errno
import pyblish.api
from avalon import api, io
from avalon.vendor import filelink
# this is needed until speedcopy for linux is fixed
if sys.platform == "win32":
from speedcopy import copyfile
else:
from shutil import copyfile
log = logging.getLogger(__name__)
class IntegrateAssetNew(pyblish.api.InstancePlugin):
"""Resolve any dependency issius
"""Resolve any dependency issues
This plug-in resolves any paths which, if not updated might break
the published file.
@ -57,7 +62,6 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin):
"render",
"imagesequence",
"review",
"render",
"rendersetup",
"rig",
"plate",
@ -268,7 +272,9 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin):
template = os.path.normpath(
anatomy.templates[template_name]["path"])
if isinstance(files, list):
sequence_repre = isinstance(files, list)
if sequence_repre:
src_collections, remainder = clique.assemble(files)
self.log.debug(
"src_tail_collections: {}".format(str(src_collections)))
@ -305,14 +311,21 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin):
dst_tail = dst_collection.format("{tail}")
index_frame_start = None
if repre.get("frameStart"):
frame_start_padding = len(str(
repre.get("frameEnd")))
index_frame_start = int(repre.get("frameStart"))
dst_padding_exp = src_padding_exp
dst_start_frame = None
for i in src_collection.indexes:
src_padding = src_padding_exp % i
# for adding first frame into db
if not dst_start_frame:
dst_start_frame = src_padding
src_file_name = "{0}{1}{2}".format(
src_head, src_padding, src_tail)
@ -323,19 +336,22 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin):
dst_padding = dst_padding_exp % index_frame_start
index_frame_start += 1
dst = "{0}{1}{2}".format(dst_head, dst_padding, dst_tail).replace("..", ".")
dst = "{0}{1}{2}".format(
dst_head,
dst_padding,
dst_tail).replace("..", ".")
self.log.debug("destination: `{}`".format(dst))
src = os.path.join(stagingdir, src_file_name)
self.log.debug("source: {}".format(src))
instance.data["transfers"].append([src, dst])
repre['published_path'] = "{0}{1}{2}".format(dst_head,
dst_padding_exp,
dst_tail)
# for imagesequence version data
hashes = '#' * len(dst_padding)
dst = os.path.normpath("{0}{1}{2}".format(
dst_head, hashes, dst_tail))
dst = "{0}{1}{2}".format(
dst_head,
dst_start_frame,
dst_tail).replace("..", ".")
repre['published_path'] = dst
else:
# Single file
@ -391,6 +407,10 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin):
"representation": repre['ext']
}
}
if sequence_repre and repre.get("frameStart"):
representation['context']['frame'] = repre.get("frameStart")
self.log.debug("__ representation: {}".format(representation))
destination_list.append(dst)
self.log.debug("__ destination_list: {}".format(destination_list))
@ -459,7 +479,7 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin):
# copy the file and verify that source and destination sizes match
while True:
speedcopy.copyfile(src, dst)
copyfile(src, dst)
if getsize(src) == getsize(dst):
break
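A bounded variant of that retry loop, sketched only to illustrate the verify-after-copy idea (the commit itself keeps the while loop):

    from os.path import getsize
    from shutil import copyfile

    def copy_verified(src, dst, attempts=3):
        # Retry a few times instead of spinning forever on a mismatch.
        for _ in range(attempts):
            copyfile(src, dst)
            if getsize(src) == getsize(dst):
                return
        raise IOError("size mismatch copying {} -> {}".format(src, dst))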
@ -477,7 +497,6 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin):
filelink.create(src, dst, filelink.HARDLINK)
def get_subset(self, asset, instance):
subset = io.find_one({"type": "subset",
"parent": asset["_id"],
"name": instance.data["subset"]})
@ -485,18 +504,33 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin):
if subset is None:
subset_name = instance.data["subset"]
self.log.info("Subset '%s' not found, creating.." % subset_name)
self.log.debug("families. %s" % instance.data.get('families'))
self.log.debug(
"families. %s" % type(instance.data.get('families')))
_id = io.insert_one({
"schema": "pype:subset-3.0",
"type": "subset",
"name": subset_name,
"families": instance.data.get('families'),
"data": {},
"data": {
"families": instance.data.get('families')
},
"parent": asset["_id"]
}).inserted_id
subset = io.find_one({"_id": _id})
# add group if available
if instance.data.get("subsetGroup"):
subset["data"].update(
{"subsetGroup": instance.data.get("subsetGroup")}
)
io.update_many({
'type': 'subset',
'_id': io.ObjectId(subset["_id"])
}, {'$set': subset["data"]}
)
return subset
def create_version(self, subset, version_number, locations, data=None):
@ -546,6 +580,8 @@ class IntegrateAssetNew(pyblish.api.InstancePlugin):
source = instance.data['source']
except KeyError:
source = context.data["currentFile"]
source = source.replace(os.getenv("PYPE_STUDIO_PROJECTS_MOUNT"),
api.registered_root())
relative_path = os.path.relpath(source, api.registered_root())
source = os.path.join("{root}", relative_path).replace("\\", "/")


@ -152,7 +152,7 @@ class IntegrateFrames(pyblish.api.InstancePlugin):
template_data = {"root": root,
"project": {"name": PROJECT,
"code": project['data']['code']},
"silo": asset['silo'],
"silo": asset.get('silo'),
"task": api.Session["AVALON_TASK"],
"asset": ASSET,
"family": instance.data['family'],


@ -174,7 +174,8 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin):
"JobDependency0": job["_id"],
"UserName": job["Props"]["User"],
"Comment": instance.context.data.get("comment", ""),
"InitialStatus": state
"InitialStatus": state,
"Priority": job["Props"]["Pri"]
},
"PluginInfo": {
"Version": "3.6",
@ -275,6 +276,9 @@ class ProcessSubmittedJobOnFarm(pyblish.api.InstancePlugin):
except KeyError:
source = context.data["currentFile"]
source = source.replace(os.getenv("PYPE_STUDIO_PROJECTS_MOUNT"),
api.registered_root())
relative_path = os.path.relpath(source, api.registered_root())
source = os.path.join("{root}", relative_path).replace("\\", "/")


@ -0,0 +1,28 @@
import pyblish.api
import pype.lib
from avalon.tools import cbsceneinventory
class ShowInventory(pyblish.api.Action):
label = "Show Inventory"
icon = "briefcase"
on = "failed"
def process(self, context, plugin):
cbsceneinventory.show()
class ValidateContainers(pyblish.api.ContextPlugin):
"""Containers are must be updated to latest version on publish."""
label = "Validate Containers"
order = pyblish.api.ValidatorOrder
hosts = ["maya", "houdini", "nuke"]
optional = True
actions = [ShowInventory]
def process(self, context):
if pype.lib.any_outdated():
raise ValueError("There are outdated containers in the scene.")


@ -0,0 +1,191 @@
import pyblish.api
import pype.api
class ValidateFtrackAttributes(pyblish.api.InstancePlugin):
"""
This will validate attributes in ftrack against data in scene.
Attributes to be validated are specified in:
`$PYPE_CONFIG/presets/<host>/ftrack_attributes.json`
This is an array (list) of checks in the format:
[
[<attribute>, <operator>, <expression>]
]
Where <attribute> is the name of an ftrack attribute and <operator> is one of:
"is", "is_not", "greater_than", "less_than", "contains", "not_contains",
"starts_with", "ends_with"
<expression> is Python code that is evaluated by the validator. This allows
you to fetch whatever value from the scene you want, for example in Maya:
[
"fps", "is",
"from maya import mel; out = mel.eval('currentTimeUnitToFPS()')"
]
will test whether the ftrack fps attribute on the current Task parent is
the same as the fps we get from Maya. Store the value you need to compare
in the variable `out` in your expression.
"""
label = "Validate Custom Ftrack Attributes"
order = pype.api.ValidateContentsOrder
families = ["ftrack"]
optional = True
def process(self, instance):
context = instance.context
task = context.data.get('ftrackTask', False)
if not task:
self._raise(AttributeError,
"Missing FTrack Task entity in context")
host = pyblish.api.current_host()
to_check = context.data["presets"].get(
host, {}).get("ftrack_attributes")
if not to_check:
self.log.warning("ftrack_attributes preset not found")
return
self.log.info("getting attributes from ftrack ...")
# get parent of task
custom_attributes = {}
try:
parent = task["parent"]
custom_attributes = parent["custom_attributes"].items()
except KeyError:
self._raise(KeyError, "missing `parent` or `attributes`")
custom_attributes = dict(custom_attributes)
# get list of hierarchical attributes from ftrack
session = context.data["ftrackSession"]
custom_hier_attributes = self._get_custom_hier_attrs(session)
custom_attributes = {}
_nonhier = {}
custom_hier_attributes = {k: None for k in custom_hier_attributes}
for key, value in dict(parent["custom_attributes"]).items():
if key in custom_hier_attributes:
custom_hier_attributes[key] = value
else:
_nonhier[key] = value
custom_hier_values = self._get_hierarchical_values(
custom_hier_attributes, parent)
custom_hier_values.update(_nonhier)
errors = []
attribs = custom_hier_values
for check in to_check:
ev = {}
# WARNING(Ondrej Samohel): This is really not secure as we are
# basically executing user code. But there's no other way to make
# it flexible enough for users to get values out of the scene.
exec(str(check[2]), {}, ev)
if not ev.get("out"):
errors.append("{} code doesn't return 'out': '{}'".format(
check[0], check[2]))
continue
if check[0] in attribs:
if check[1] == "is":
if attribs[check[0]] != ev["out"]:
errors.append("{}: {} is not {}".format(
check[0], attribs[check[0]], ev["out"]))
elif check[1] == "is_not":
if attribs[check[0]] == ev["out"]:
errors.append("{}: {} is {}".format(
check[0], attribs[check[0]], ev["out"]))
elif check[1] == "less_than":
if attribs[check[0]] < ev["out"]:
errors.append("{}: {} is greater {}".format(
check[0], attribs[check[0]], ev["out"]))
elif check[1] == "greater_than":
if attribs[check[0]] < ev["out"]:
errors.append("{}: {} is less {}".format(
check[0], attribs[check[0]], ev["out"]))
elif check[1] == "contains":
if attribs[check[0]] in ev["out"]:
errors.append("{}: {} does not contain {}".format(
check[0], attribs[check[0]], ev["out"]))
elif check[1] == "not_contains":
if attribs[check[0]] not in ev["out"]:
errors.append("{}: {} contains {}".format(
check[0], attribs[check[0]], ev["out"]))
elif check[1] == "starts_with":
if attribs[check[0]].startswith(ev["out"]):
errors.append("{}: {} does not starts with {}".format(
check[0], attribs[check[0]], ev["out"]))
elif check[1] == "ends_with":
if attribs[check[0]].endswith(ev["out"]):
errors.append("{}: {} does not end with {}".format(
check[0], attribs[check[0]], ev["out"]))
if errors:
self.log.error('There are invalid values for attributes:')
for e in errors:
self.log.error(e)
raise ValueError("ftrack attributes doesn't match")
def _get_custom_hier_attrs(self, session):
hier_custom_attributes = []
cust_attrs_query = (
"select id, entity_type, object_type_id, is_hierarchical"
" from CustomAttributeConfiguration"
)
all_avalon_attr = session.query(cust_attrs_query).all()
for cust_attr in all_avalon_attr:
if cust_attr["is_hierarchical"]:
hier_custom_attributes.append(cust_attr["key"])
return hier_custom_attributes
def _get_hierarchical_values(self, keys_dict, entity):
# check values already set
_set_keys = []
for key, value in keys_dict.items():
if value is not None:
_set_keys.append(key)
# pop set values from keys_dict
set_keys = {}
for key in _set_keys:
set_keys[key] = keys_dict.pop(key)
# find if entity has set values and pop them out
keys_to_pop = []
for key in keys_dict.keys():
_val = entity["custom_attributes"][key]
if _val:
keys_to_pop.append(key)
set_keys[key] = _val
for key in keys_to_pop:
keys_dict.pop(key)
# if there are no keys left to resolve, return what was found
if not keys_dict:
return set_keys
# end recursion if entity is project
if entity.entity_type.lower() == "project":
for key, value in keys_dict.items():
set_keys[key] = value
else:
result = self._get_hierarchical_values(keys_dict, entity["parent"])
for key, value in result.items():
set_keys[key] = value
return set_keys
def _raise(self, exc, msg):
self.log.error(msg)
raise exc(msg)
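The operator dispatch above could also be table-driven. A hypothetical sketch, where each predicate returns True when the check passes:

    import operator

    PREDICATES = {
        "is": operator.eq,
        "is_not": operator.ne,
        "less_than": operator.lt,
        "greater_than": operator.gt,
        "contains": lambda a, b: b in a,
        "not_contains": lambda a, b: b not in a,
        "starts_with": lambda a, b: a.startswith(b),
        "ends_with": lambda a, b: a.endswith(b),
    }

    def check_failed(op, attr_value, expected):
        return not PREDICATES[op](attr_value, expected)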


@ -1,6 +1,10 @@
import pyblish.api
import os
import subprocess
try:
import os.errno as errno
except ImportError:
import errno
class ValidateFfmpegInstallef(pyblish.api.Validator):
@ -18,11 +22,13 @@ class ValidateFfmpegInstallef(pyblish.api.Validator):
[name], stdout=devnull, stderr=devnull
).communicate()
except OSError as e:
if e.errno == os.errno.ENOENT:
if e.errno == errno.ENOENT:
return False
return True
def process(self, instance):
if self.is_tool('ffmpeg') is False:
if self.is_tool(
os.path.join(
os.environ.get("FFMPEG_PATH", ""), "ffmpeg")) is False:
self.log.error("ffmpeg not found in PATH")
raise RuntimeError('ffmpeg not installed.')


@ -1,8 +1,9 @@
import pyblish.api
import os
class ValidateTemplates(pyblish.api.ContextPlugin):
"""Check if all templates were filed"""
"""Check if all templates were filled"""
label = "Validate Templates"
order = pyblish.api.ValidatorOrder - 0.1
@ -18,12 +19,12 @@ class ValidateTemplates(pyblish.api.ContextPlugin):
"root": os.environ["PYPE_STUDIO_PROJECTS_PATH"],
"project": {"name": "D001_projectsx",
"code": "prjX"},
"ext": "exr",
"version": 3,
"task": "animation",
"asset": "sh001",
"hierarchy": "ep101/sq01/sh010"}
"ext": "exr",
"version": 3,
"task": "animation",
"asset": "sh001",
"app": "maya",
"hierarchy": "ep101/sq01/sh010"}
anatomy_filled = anatomy.format(data)
self.log.info(anatomy_filled)
@ -31,11 +32,12 @@ class ValidateTemplates(pyblish.api.ContextPlugin):
data = {"root": os.environ["PYPE_STUDIO_PROJECTS_PATH"],
"project": {"name": "D001_projectsy",
"code": "prjY"},
"ext": "abc",
"version": 1,
"task": "lookdev",
"asset": "bob",
"hierarchy": "ep101/sq01/bob"}
"ext": "abc",
"version": 1,
"task": "lookdev",
"asset": "bob",
"app": "maya",
"hierarchy": "ep101/sq01/bob"}
anatomy_filled = context.data["anatomy"].format(data)
self.log.info(anatomy_filled["work"]["folder"])


@ -18,3 +18,6 @@ class CreateLook(avalon.maya.Creator):
# Whether to automatically convert the textures to .tx upon publish.
self.data["maketx"] = True
# Enable users to force a copy.
self.data["forceCopy"] = False


@ -38,7 +38,7 @@ class CreateRenderGlobals(avalon.maya.Creator):
self.log.warning("Deadline REST API url not found.")
else:
argument = "{}/api/pools?NamesOnly=true".format(deadline_url)
response = requests.get(argument)
response = self._requests_get(argument)
if not response.ok:
self.log.warning("No pools retrieved")
else:
@ -135,7 +135,7 @@ class CreateRenderGlobals(avalon.maya.Creator):
'authToken': self._token
}
api_entry = '/api/pools/list'
response = requests.get(
response = self._requests_get(
self.MUSTER_REST_URL + api_entry, params=params)
if response.status_code != 200:
if response.status_code == 401:
@ -160,7 +160,35 @@ class CreateRenderGlobals(avalon.maya.Creator):
api_url = "{}/muster/show_login".format(
os.environ["PYPE_REST_API_URL"])
self.log.debug(api_url)
login_response = requests.post(api_url, timeout=1)
login_response = self._requests_post(api_url, timeout=1)
if login_response.status_code != 200:
self.log.error('Cannot show login form to Muster')
raise Exception('Cannot show login form to Muster')
def _requests_post(self, *args, **kwargs):
""" Wrapper for requests, disabling SSL certificate validation if
DONT_VERIFY_SSL environment variable is found. This is useful when
Deadline or Muster server are running with self-signed certificates
and their certificate is not added to trusted certificates on
client machines.
WARNING: disabling SSL certificate validation is defeating one line
of defense SSL is providing and it is not recommended.
"""
if 'verify' not in kwargs:
kwargs['verify'] = False if os.getenv("PYPE_DONT_VERIFY_SSL", True) else True # noqa
return requests.post(*args, **kwargs)
def _requests_get(self, *args, **kwargs):
""" Wrapper for requests, disabling SSL certificate validation if
DONT_VERIFY_SSL environment variable is found. This is useful when
Deadline or Muster server are running with self-signed certificates
and their certificate is not added to trusted certificates on
client machines.
WARNING: disabling SSL certificate validation is defeating one line
of defense SSL is providing and it is not recommended.
"""
if 'verify' not in kwargs:
kwargs['verify'] = False if os.getenv("PYPE_DONT_VERIFY_SSL", True) else True # noqa
return requests.get(*args, **kwargs)


@ -116,6 +116,7 @@ class ImportMayaLoader(api.Loader):
with maya.maintained_selection():
cmds.file(self.fname,
i=True,
preserveReferences=True,
namespace=namespace,
returnNewNodes=True,
groupReference=True,


@ -1,7 +1,6 @@
from avalon import api
import pype.maya.plugin
import os
import pymel.core as pm
from pypeapp import config
@ -70,6 +69,7 @@ class AssProxyLoader(pype.maya.plugin.ReferenceLoader):
import os
from maya import cmds
import pymel.core as pm
node = container["objectName"]


@ -2,7 +2,6 @@ from avalon import api
import pype.maya.plugin
import os
from pypeapp import config
import pymel.core as pm
reload(config)


@ -1,5 +1,3 @@
import pymel.core as pc
from avalon import api
from Qt import QtWidgets
@ -14,6 +12,8 @@ class ImagePlaneLoader(api.Loader):
color = "orange"
def load(self, context, name, namespace, data):
import pymel.core as pc
new_nodes = []
image_plane_depth = 1000


@ -1,10 +1,9 @@
import pype.maya.plugin
import os
from pypeapp import config
import pymel.core as pm
reload(config)
import pype.maya.plugin
reload(pype.maya.plugin)
class ReferenceLoader(pype.maya.plugin.ReferenceLoader):
"""Load the model"""
@ -19,9 +18,10 @@ class ReferenceLoader(pype.maya.plugin.ReferenceLoader):
color = "orange"
def process_reference(self, context, name, namespace, data):
import maya.cmds as cmds
from avalon import maya
import pymel.core as pm
try:
family = context["representation"]["context"]["family"]
@ -42,11 +42,17 @@ class ReferenceLoader(pype.maya.plugin.ReferenceLoader):
namespace = cmds.referenceQuery(nodes[0], namespace=True)
shapes = cmds.ls(nodes, shapes=True, long=True)
print(shapes)
newNodes = (list(set(nodes) - set(shapes)))
print(newNodes)
groupNode = pm.PyNode(groupName)
roots = set()
print(nodes)
for node in nodes:
for node in newNodes:
try:
roots.add(pm.PyNode(node).getAllParents()[-2])
except:
@ -59,7 +65,6 @@ class ReferenceLoader(pype.maya.plugin.ReferenceLoader):
root.setParent(groupNode)
cmds.setAttr(groupName + ".displayHandle", 1)
groupNode
presets = config.get_presets(project=os.environ['AVALON_PROJECT'])
colors = presets['plugins']['maya']['load']['colors']
@ -68,7 +73,7 @@ class ReferenceLoader(pype.maya.plugin.ReferenceLoader):
groupNode.useOutlinerColor.set(1)
groupNode.outlinerColor.set(c[0], c[1], c[2])
self[:] = nodes
self[:] = newNodes
cmds.setAttr(groupName + ".displayHandle", 1)
# get bounding box
@ -88,7 +93,7 @@ class ReferenceLoader(pype.maya.plugin.ReferenceLoader):
cmds.setAttr(groupName + ".selectHandleY", cy)
cmds.setAttr(groupName + ".selectHandleZ", cz)
return nodes
return newNodes
def switch(self, container, representation):
self.update(container, representation)


@ -47,12 +47,18 @@ class RigLoader(pype.maya.plugin.ReferenceLoader):
cmds.setAttr(groupName + ".outlinerColor",
c[0], c[1], c[2])
shapes = cmds.ls(nodes, shapes=True, long=True)
print(shapes)
newNodes = (list(set(nodes) - set(shapes)))
print(newNodes)
# Store for post-process
self[:] = nodes
self[:] = newNodes
if data.get("post_process", True):
self._post_process(name, namespace, context, data)
return nodes
return newNodes
def _post_process(self, name, namespace, context, data):


@ -1,9 +1,17 @@
import pype.maya.plugin
import os
from collections import defaultdict
from pypeapp import config
import pype.maya.plugin
from pype.maya import lib
class YetiRigLoader(pype.maya.plugin.ReferenceLoader):
"""
This loader loads a Yeti rig. You can select something in the scene and,
if it has the same ID as a mesh published with the rig, their shapes will
be linked together.
"""
families = ["yetiRig"]
representations = ["ma"]
@ -18,6 +26,32 @@ class YetiRigLoader(pype.maya.plugin.ReferenceLoader):
import maya.cmds as cmds
from avalon import maya
# get roots of selected hierarchies
selected_roots = []
for sel in cmds.ls(sl=True, long=True):
selected_roots.append(sel.split("|")[1])
# get all objects under those roots
selected_hierarchy = []
for root in selected_roots:
selected_hierarchy.append(cmds.listRelatives(
root,
allDescendents=True) or [])
# flatten the list and filter only shapes
shapes_flat = []
for root in selected_hierarchy:
shapes = cmds.ls(root, long=True, type="mesh") or []
for shape in shapes:
shapes_flat.append(shape)
# create dictionary of cbId and shape nodes
scene_lookup = defaultdict(list)
for node in shapes_flat:
cb_id = lib.get_id(node)
scene_lookup[cb_id] = node
# load rig
with maya.maintained_selection():
nodes = cmds.file(self.fname,
namespace=namespace,
@ -26,6 +60,20 @@ class YetiRigLoader(pype.maya.plugin.ReferenceLoader):
groupReference=True,
groupName="{}:{}".format(namespace, name))
# for every shape node we've just loaded, find the matching shape by its
# cbId in the selection. If found, the outMesh of the scene shape is
# connected to the inMesh of the loaded shape.
for destination_node in nodes:
source_node = scene_lookup[lib.get_id(destination_node)]
if source_node:
self.log.info("found: {}".format(source_node))
self.log.info(
"creating connection to {}".format(destination_node))
cmds.connectAttr("{}.outMesh".format(source_node),
"{}.inMesh".format(destination_node),
force=True)
groupName = "{}:{}".format(namespace, name)
presets = config.get_presets(project=os.environ['AVALON_PROJECT'])
@ -38,6 +86,4 @@ class YetiRigLoader(pype.maya.plugin.ReferenceLoader):
c[0], c[1], c[2])
self[:] = nodes
self.log.info("Yeti Rig Connection Manager will be available soon")
return nodes


@ -93,6 +93,9 @@ class CollectInstances(pyblish.api.ContextPlugin):
parents = self.get_all_parents(members)
members_hierarchy = list(set(members + children + parents))
if 'families' not in data:
data['families'] = [data.get('family')]
# Create the instance
instance = context.create_instance(objset)
instance[:] = members_hierarchy
@ -100,6 +103,7 @@ class CollectInstances(pyblish.api.ContextPlugin):
# Store the exact members of the object set
instance.data["setMembers"] = members
# Define nice label
name = cmds.ls(objset, long=False)[0] # use short name
label = "{0} ({1})".format(name,
@ -117,6 +121,8 @@ class CollectInstances(pyblish.api.ContextPlugin):
# Produce diagnostic message for any graphical
# user interface interested in visualising it.
self.log.info("Found: \"%s\" " % instance.data["name"])
self.log.debug("DATA: \"%s\" " % instance.data)
def sort_by_family(instance):
"""Sort by family"""


@ -37,6 +37,7 @@ class CollectMayaScene(pyblish.api.ContextPlugin):
"label": subset,
"publish": False,
"family": 'workfile',
"families": ['workfile'],
"setMembers": [current_file]
})


@ -206,6 +206,11 @@ class ExtractLook(pype.api.Extractor):
destination = self.resource_destination(
instance, source, do_maketx
)
# Force copy if specified.
if instance.data.get("forceCopy", False):
mode = COPY
if mode == COPY:
transfers.append((source, destination))
elif mode == HARDLINK:

Some files were not shown because too many files have changed in this diff.