I am using the following script to 'maximize' the current application window on my Mac.
It works fine when the frontmost application window is on the main (larger, external) display, or when my laptop is not connected to an external display.
However, when I am connected to an external display and the window I am trying to maximize is on my laptop's display (not the external, primary display), the script enlarges the window to the dimensions of the larger display instead.
try
    tell application "Finder" to set b to bounds of window of desktop
    try
        tell application (path to frontmost application as text)
            set bounds of window 1 to {item 1 of b, 22, item 3 of b, item 4 of b}
        end tell
    on error
        tell application "System Events" to tell window 1 of (process 1 where it is frontmost)
            try
                set position to {0, 22}
                set size to {item 3 of b, (item 4 of b) - 22}
            on error
                click (button 1 of window 1 where subrole is "AXZoomButton")
            end try
        end tell
    end try
end try
I believe the issue is the way I am getting the desired bounds, b:

bounds of window of desktop
This seems to always return the bounds of the primary display.
What I need is a way to get the bounds of the display that contains the frontmost application window (the one with keyboard and mouse focus).
NB: I am totally fine if the solution to this is in Swift rather than AppleScript.
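For reference, one possible Swift sketch of the detection step (names are mine, and it assumes the calling process has been granted Accessibility permission): read the focused window's position via the Accessibility API, then hit-test it against NSScreen.screens.

```swift
import AppKit

// Sketch: find the NSScreen that contains the frontmost app's focused window.
// Assumes Accessibility permission has been granted to the calling process.
func screenOfFrontmostWindow() -> NSScreen? {
    guard let app = NSWorkspace.shared.frontmostApplication else { return nil }
    let axApp = AXUIElementCreateApplication(app.processIdentifier)

    var winRef: CFTypeRef?
    guard AXUIElementCopyAttributeValue(axApp, kAXFocusedWindowAttribute as CFString, &winRef) == .success
    else { return nil }
    let window = winRef as! AXUIElement

    var posRef: CFTypeRef?
    guard AXUIElementCopyAttributeValue(window, kAXPositionAttribute as CFString, &posRef) == .success
    else { return nil }
    var topLeft = CGPoint.zero
    AXValueGetValue(posRef as! AXValue, .cgPoint, &topLeft)

    // The Accessibility API uses a top-left origin; NSScreen frames use a
    // bottom-left origin, so flip y against the primary screen's height
    // before hit-testing.
    let primaryHeight = NSScreen.screens.first?.frame.height ?? 0
    let flipped = CGPoint(x: topLeft.x, y: primaryHeight - topLeft.y)
    return NSScreen.screens.first { $0.frame.contains(flipped) }
}
```

The returned screen's visibleFrame (which excludes the menu bar and Dock) would then be the maximize target, rather than the Finder desktop bounds.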
Have you considered this:
tell application (path to frontmost application as text)
    set zoomed of window 1 to true
end tell
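Note that zoomed is only available for apps whose scripting dictionary defines it. A hedged Swift alternative (again assuming Accessibility permission; the function name is mine) is to set the focused window's frame to the visibleFrame of the screen it is on:

```swift
import AppKit

// Sketch: "maximize" the frontmost window by setting its AX position and
// size to the visible frame of the screen that currently contains it.
// Assumes Accessibility permission has been granted to the calling process.
func maximizeFrontmostWindow() {
    guard let app = NSWorkspace.shared.frontmostApplication else { return }
    let axApp = AXUIElementCreateApplication(app.processIdentifier)

    var winRef: CFTypeRef?
    guard AXUIElementCopyAttributeValue(axApp, kAXFocusedWindowAttribute as CFString, &winRef) == .success
    else { return }
    let window = winRef as! AXUIElement

    // Read the window's current top-left corner (AX: top-left origin).
    var posRef: CFTypeRef?
    var origin = CGPoint.zero
    if AXUIElementCopyAttributeValue(window, kAXPositionAttribute as CFString, &posRef) == .success {
        AXValueGetValue(posRef as! AXValue, .cgPoint, &origin)
    }

    // Find the containing screen, flipping y into AppKit's bottom-left origin.
    let primaryHeight = NSScreen.screens.first?.frame.height ?? 0
    let flipped = CGPoint(x: origin.x, y: primaryHeight - origin.y)
    guard let screen = NSScreen.screens.first(where: { $0.frame.contains(flipped) }) ?? NSScreen.main
    else { return }

    // Convert the screen's visibleFrame back into AX (top-left) coordinates.
    let vf = screen.visibleFrame
    var newOrigin = CGPoint(x: vf.minX, y: primaryHeight - vf.maxY)
    var newSize = CGSize(width: vf.width, height: vf.height)

    if let posValue = AXValueCreate(.cgPoint, &newOrigin),
       let sizeValue = AXValueCreate(.cgSize, &newSize) {
        AXUIElementSetAttributeValue(window, kAXPositionAttribute as CFString, posValue)
        AXUIElementSetAttributeValue(window, kAXSizeAttribute as CFString, sizeValue)
    }
}
```

Because the frame is derived from the window's own screen, this behaves the same whether the window is on the laptop display or the external one.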