While adding signal handling support to my tiny OS, I noticed that the definition of `sigset_t` used for signal handling looks odd:

```
// include/alltypes.h.in
TYPEDEF struct __sigset_t { unsigned long __bits[128/sizeof(long)]; } sigset_t;
```

This reserves 128 bytes for the signal mask (16 longs on a 64-bit platform), even though 128 bits (2 longs) would be enough. Why? Is it for some kind of compatibility? In my opinion, `unsigned long __bits[128/sizeof(long)/8]` would be more reasonable.
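
To make the arithmetic concrete, here is a minimal sketch comparing the two layouts (the type names `musl_sigset` and `small_sigset` are made up for illustration; only the array-size expressions come from the question):

```
#include <stdio.h>

/* Mirrors the musl definition: 128/sizeof(long) elements.
 * The total is always 128 bytes (1024 bits), whatever sizeof(long) is. */
typedef struct {
    unsigned long __bits[128 / sizeof(long)];
} musl_sigset;

/* Uses the expression proposed above: 128/sizeof(long)/8 elements,
 * i.e. 16 bytes (128 bits) in total. */
typedef struct {
    unsigned long __bits[128 / sizeof(long) / 8];
} small_sigset;

int main(void)
{
    printf("musl-style set: %zu bytes = %zu bits\n",
           sizeof(musl_sigset), sizeof(musl_sigset) * 8);
    printf("128-bit set:    %zu bytes = %zu bits\n",
           sizeof(small_sigset), sizeof(small_sigset) * 8);
    return 0;
}
```

Note that both expressions are independent of the platform's `long` width: the first always yields 128 bytes (1024 bits), the second always 16 bytes (128 bits); only the element counts (16 vs. 2 longs on a 64-bit platform) vary.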