Dynamic on an Athlon X2 server box. :) Serious optimisation, and I learned a lot from the posts here by Markus Frind (I think). Not all of those URLs are worth including in the sitemaps because many domain names are one-hit wonders that are only registered for a year and never re-registered. A popularity ranking was used for some of the sitemaps, with only domain names registered across TLDs being included. The launch of the new gTLDs in 2012 mushroomed the number of gTLDs from about 16 to over 1,200. It was possible to extend the algorithm, but most registrations in the early phase of these gTLD launches were speculative and many targeted the popular names. The URLs have the hosting history of each domain name (which hoster provided DNS for it, and whether that hoster was a PPC parker, a sales site, a brand protection hoster or just an ordinary hoster). For someone researching a domain name, it is possible to see if it was registered previously and had any potentially iffy history.
Google tried to kill off web directories about ten years ago. One of the things I am working on is the breakdown of the gTLDs by country and by web hosting provider. I have it at 99.36% resolution for all gTLD websites. These are the stats for gTLDs in some countries for December:
Region - country - cc - providers - websites - identified - resolution - unidentified
It would be possible to build a directory of hosting providers from that. It would be different to the domain name data though it could be made searchable by provider.
AP - Australia - AU - 1,149 - 1,837,220 - 1,837,105 - 99.99% - 115
NA - Canada - CA - 301 - 6,142,334 - 6,106,987 - 99.42% - 35,347
EUR - Germany - DE - 4,886 - 17,601,604 - 17,512,947 - 99.50% - 88,657
EUR - Ireland - IE - 328 - 408,134 - 408,134 - 100.00% - 0
EUR - United Kingdom - UK - 2,219 - 2,013,181 - 1,950,560 - 96.89% - 62,621
NA - United States - US - 2,280 - 130,230,631 - 129,875,598 - 99.73% - 355,033
(had to re-edit as the WW software ignores tabs)
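For anyone puzzling over the columns, here is a minimal sketch (hypothetical function name, not from my actual code) of how the last three columns in the table above relate: "resolution" is the percentage of a country's websites whose hosting provider was identified, and "unidentified" is the remainder.

```python
def resolution_stats(websites: int, identified: int) -> tuple[float, int]:
    """Return (resolution %, unidentified count) for one country row."""
    resolution = round(identified / websites * 100, 2)
    unidentified = websites - identified
    return resolution, unidentified

# Example: the Australia (AU) row from the table above.
print(resolution_stats(1_837_220, 1_837_105))  # -> (99.99, 115)
```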
The data is updated monthly. There are about 1M hosters (DNS providers), but around 600K can be excluded as they only host a single domain name. (Auto-configuration on large registrars creates DNS, MX and website records automatically for some new registrations.) There are millions of transactions (new/deleted/transferred) each month, and that current data might be the most interesting for users and SEs. The most interesting part for users is actually the deletions. That directory idea is probably doable for both users and SEs and would cut the size of the current sitemaps considerably.
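The single-domain-hoster exclusion is just a threshold on a per-hoster domain count. A rough sketch (toy data and made-up names, not my pipeline):

```python
from collections import defaultdict

def significant_hosters(domain_to_hoster: dict[str, str],
                        min_domains: int = 2) -> dict[str, int]:
    """Return hosters with at least `min_domains` hosted domains."""
    counts: dict[str, int] = defaultdict(int)
    for hoster in domain_to_hoster.values():
        counts[hoster] += 1
    return {h: n for h, n in counts.items() if n >= min_domains}

# Toy example: the single-domain hoster drops out.
domains = {
    "example.com": "bigdns.example",
    "example.net": "bigdns.example",
    "one-hit.org": "auto-registrar.example",  # auto-configured one-off
}
print(significant_hosters(domains))  # -> {'bigdns.example': 2}
```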
It really needs a new sitemap strategy to optimise things and focus on quality rather than quantity.
Regards...jmcc