A good litmus test is: how many new use cases would require a recompilation cycle?
There are many foundational technologies, but two that are near and dear to me are the internet and the web.
Both of these are foundational because they enable unanticipated applications to be built on top of them without asking for permission: they were designed to be content / domain agnostic, which enabled apps like gmail.com, youtube.com and BitTorrent to be invented without having to re-deploy / re-model the foundation at each and every step.
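To make that content-agnosticism concrete, here is a minimal, self-contained Python sketch (a loopback echo server, not any real internet service; all names and payloads are illustrative): the same TCP transport happily carries web-shaped bytes, mail-shaped bytes, and arbitrary binary, with no changes to the foundation for each new "application".

```python
import socket
import threading

def recv_all(conn):
    """Read until the peer closes its sending side."""
    chunks = []
    while True:
        chunk = conn.recv(4096)
        if not chunk:
            break
        chunks.append(chunk)
    return b"".join(chunks)

def echo_once(server_sock):
    """Accept one connection and echo its bytes back unchanged."""
    conn, _ = server_sock.accept()
    with conn:
        conn.sendall(recv_all(conn))

# A throwaway local TCP server; the OS picks a free port.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]

# Three very different "applications" riding the same transport:
payloads = [
    b"GET / HTTP/1.1\r\nHost: example.com\r\n\r\n",  # web-shaped bytes
    b"MAIL FROM:<alice@example.com>\r\n",            # mail-shaped bytes
    bytes(range(256)),                               # arbitrary binary
]

echoed = []
for payload in payloads:
    t = threading.Thread(target=echo_once, args=(server,))
    t.start()
    with socket.create_connection(("127.0.0.1", port)) as client:
        client.sendall(payload)
        client.shutdown(socket.SHUT_WR)  # signal end-of-request
        echoed.append(recv_all(client))
    t.join()
server.close()

# The transport delivered every payload byte-for-byte, with no opinion
# about whether it was a web page, an email, or random binary.
print(echoed == payloads)
```

The transport never inspects or re-models the payload; inventing a new application means inventing new bytes, not recompiling the foundation.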
Foundations are un-opinionated about the substance of discourse (e.g. TCP/IP transmits packets and HTML describes GUIs). They use attribution (e.g. IP addresses and domain names) to create a clear line of accountability between content and authorship. They launch, switch and notify.
While foundations don't make editorial judgements (e.g. what goes on top of them), they make plenty of structural judgements (e.g. what's the maximum weight they can support). In software, foundations typically opine on structural elements like performance, security, privacy and isolation.