I support raising the limit. My suggestion would be 1'000 instead of 100. This is based on a statistical evaluation we performed.
I'm able to share with you the distribution of domains on our servers at cyon. We serve 256852 unique domain names. Note: We count example.com and www.example.com as two separate domain names.
The first column is the number of domains per VirtualHost. The second column is how often that domain count occurs across VirtualHosts.
We identified 20 VirtualHosts with more than 100 domains. These 20 VirtualHosts account for 1.6% of the total number of domain names.
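For clarity, the distribution above can be computed like this (the `vhosts` data and its shape are a hypothetical illustration, not our actual tooling or numbers):

```python
from collections import Counter

# Hypothetical example data: each VirtualHost with its domain names
# (ServerName plus aliases; note the automatic 'www.' twins).
vhosts = {
    "vhost1": ["example.com", "www.example.com"],
    "vhost2": ["a.ch", "www.a.ch", "b.ch", "www.b.ch"],
    "vhost3": ["c.com", "www.c.com"],
}

# First column: number of domains in a VirtualHost.
# Second column: how many VirtualHosts have exactly that domain count.
distribution = Counter(len(domains) for domains in vhosts.values())

for domain_count, frequency in sorted(distribution.items()):
    print(domain_count, frequency)
```

With the toy data this prints `2 2` and `4 1`, i.e. two VirtualHosts with 2 domains and one with 4.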
An upper limit of 1000 SANs per certificate works in Firefox and Chrome. Tested with .
Our hosting control panel does not allow splitting a bucket of domain names into smaller chunks. In addition, managing domain name groups split into smaller chunks is currently not an option due to the current beta rate limits, namely 'Certificates/Domain'.
What are the use cases of so many domain names in one VirtualHost?
We see the following scenarios used on our servers:
- Multidomain/Multisite setup. One CMS that serves different content based on the domain
- Company/organisation with multiple domain names pointing to the main website
- mycorp.com, mycorp.net
- english-name-of-organisation.com, german-name-of-organisation.com
- This is especially the case in Switzerland with four official languages
- productname1.com, productname2.com
- Company that bought domains and is selling them
Note: As mentioned, for every domain someone installs on our system we automatically add a second domain name with a 'www.' prefix. This drives up the number of SANs quite fast.
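To make the doubling effect concrete, here is a minimal sketch (the `sans_for` helper is hypothetical, not our actual provisioning code):

```python
# Hypothetical helper: for every installed domain we also add a 'www.' variant,
# so the SAN count per certificate is twice the number of installed domains.
def sans_for(domains):
    sans = []
    for domain in domains:
        sans.append(domain)
        sans.append("www." + domain)
    return sans

print(sans_for(["example.com", "example.net"]))
```

So a VirtualHost with 50 installed domains already needs 100 SANs, which is exactly the current limit.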
I think we are not the only ones reaching this limit, now or in the future.
PS: I currently don't know how our ACME system behaves if we have to complete challenges for up to 500 domains in one batch. It might take some time.