
Using AppKit and Python to hide the mouse cursor on OS X

I'm trying to script hiding the mouse cursor on OS X 10.9. I have Chrome starting and going full screen for a kiosk, and I'd like to periodically run a script to hide the cursor.

AppleScript no longer directly supports "call method" to call an Objective-C method, so I thought the simplest approach would be to use AppKit from the system-provided Python.

It crashes:

$ python
Python 2.7.5 (default, Mar  9 2014, 22:15:05) 
[GCC 4.2.1 Compatible Apple LLVM 5.0 (clang-500.0.68)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> import AppKit
>>> AppKit.NSCursor.hide()
Assertion failed: (CGAtomicGet(&is_initialized)), function CGSConnectionByID, file Services/Connection/CGSConnection.c, line 123.
Abort trap: 6

I suspect there is a pre-requisite call I need to make to initialize something, but I haven't found anything yet while digging through the docs and Google.

What am I missing?
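One candidate for that pre-requisite, as an untested sketch: the assertion fires inside CGSConnection, which suggests the process has no connection to the window server yet, and creating the shared NSApplication object is the usual way to establish one (whether that alone is enough for NSCursor.hide() in a plain script is an assumption):

import AppKit

# Assumption: instantiating the shared application object connects this
# process to the window server, which NSCursor.hide() appears to require.
AppKit.NSApplication.sharedApplication()
AppKit.NSCursor.hide()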

I've had luck with the Quartz bindings included with PyObjC:

import Quartz

# Hide the cursor on the main display (CGDisplayShowCursor undoes this).
Quartz.CGDisplayHideCursor(Quartz.CGMainDisplayID())
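For the kiosk case, here is a minimal sketch of a long-running hider; the interval and the choice to keep the process alive are my assumptions (the hidden state belongs to this process's window-server connection, so it is typically restored when the script exits):

import time
import Quartz

def hide_cursor_forever(interval=5.0):
    # Hypothetical helper: re-hide periodically in case something else
    # shows the cursor again. interval is an arbitrary choice in seconds.
    display = Quartz.CGMainDisplayID()
    while True:
        Quartz.CGDisplayHideCursor(display)
        time.sleep(interval)

if __name__ == "__main__":
    hide_cursor_forever()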
