
Why is a buffer used in Win32 API syscall cast to [1<<20]<type> array?

I'm writing a Go application which interacts with Windows services using the windows/svc package.

While looking through the package source code to see how the syscalls are made, I noticed an interesting cast construct:

name := syscall.UTF16ToString((*[1 << 20]uint16)(unsafe.Pointer(s.ServiceName))[:])

Extracted from mgr.go

This is a common pattern when dealing with the Win32 API: one passes a pre-allocated buffer, usually an array or a structure, to receive a value from a Win32 API function.

I understand that the Win32 API returns a Unicode string via a pointer, and that in this case the pointer is passed to syscall.UTF16ToString(s []uint16) to convert it to a Go string.

What confuses me is the part where the unsafe pointer is cast to a pointer to a 1M-element array, *[1<<20]uint16.

Why is the size 1M ([1<<20])?

The buffer for the value is allocated dynamically, not with a fixed size of 1M.

An array type in Go must have a constant size, so 1<<20 is chosen to be large enough to cover any reasonable buffer returned by the call. The conversion itself allocates nothing; it only reinterprets the pointer as a type that can be sliced.

There is nothing special about this size. Sometimes you'll see 1<<31 - 1, since that is the largest array length on 32-bit platforms, or 1<<30, since it looks nicer. It really doesn't matter as long as the array type is large enough to contain the returned data.
