Don’t forget to deploy all that code to the edge. Your $5 VPS in NY running both nginx and MySQL is going to be a huge bottleneck when somebody in Texas tries to view your simple website and it takes 0.10 seconds longer to load. You’re going to want that contact form code on at least 1,000 servers worldwide for maximum benefits and conversions.
Also, now that you are on the edge and your form submissions are web scale, you will likely need to model eventual consistency for the processing, so you will need a highly available queue running somewhere.
Joking aside, as someone who runs their entire SaaS off a single bare metal server in Las Vegas, I have been toying with the idea of pushing some stuff to the edge. The reason being, sometimes those average load times are not giving you the real picture. If my average load time is 560ms per request because 90% of requests are 400ms and 10% are 2 full seconds, that might be a problem. Fixing those 10% of requests and bringing them down in line with the rest would only bring the average down by around 160ms, even though it eliminates the 2-second worst case.
However, I'm not yet sure a complicated edge deployment would actually fix the 10% problem.
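Rough sketch of what I mean about averages hiding the tail, using the same made-up numbers from above (plain TypeScript, nothing edge-specific):

```ts
// 90% of requests at 400ms, 10% at 2000ms -- hypothetical distribution from above.
const latencies: number[] = [
  ...Array.from({ length: 90 }, () => 400),
  ...Array.from({ length: 10 }, () => 2000),
];

// The mean looks respectable on a dashboard.
const mean = latencies.reduce((a, b) => a + b, 0) / latencies.length;

// A high percentile shows what the slowest users actually experience.
const sorted = [...latencies].sort((a, b) => a - b);
const p95 = sorted[Math.floor(0.95 * sorted.length)];

console.log(`mean: ${mean}ms`); // 560ms -- "fine"
console.log(`p95:  ${p95}ms`);  // 2000ms -- the real problem
```

The mean only moves by ~160ms if you fix the tail, but the p95 drops from 2000ms to 400ms, which is the number your slowest users actually feel.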
I like the idea of a "Progressively Enhanced Monolith". Start by building the core of the application in something like Laravel, which can do anything and be developed quickly, then stick it behind a proxy/CDN like Cloudflare. Then, if you find endpoints or pages that are accessed frequently and would be better off on their own, use a small serverless worker to either respond directly from the edge or forward the request on to a microservice.
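Something like this, as a rough Cloudflare Workers sketch of that "respond from the edge or fall through to the monolith" idea. The `/api/popular-posts` route and the 60-second TTL are just placeholders, not anything from a real app:

```ts
// Assumes a Worker (module syntax) sitting in front of a Laravel origin.
export default {
  async fetch(request: Request, env: unknown, ctx: ExecutionContext): Promise<Response> {
    const url = new URL(request.url);

    // Hypothetical hot endpoint: try the edge cache first.
    if (url.pathname === "/api/popular-posts") {
      const cache = caches.default;
      const cached = await cache.match(request);
      if (cached) return cached;

      // Miss: forward to the origin (the monolith) and cache the answer briefly.
      const fresh = await fetch(request);
      const response = new Response(fresh.body, fresh);
      response.headers.set("Cache-Control", "public, max-age=60");
      ctx.waitUntil(cache.put(request, response.clone()));
      return response;
    }

    // Everything else falls straight through to the monolith unchanged.
    return fetch(request);
  },
};
```

The nice part is that nothing about the Laravel app has to change; the worker is an optimization you bolt on per-route once measurements show a route is worth it.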
Jumping into the conversation: my portfolio website is on Cloudflare Pages with Astro. It costs me nothing to host and is quick to develop with. I highly recommend it.