Since 2015, Googlebot has been able to execute JavaScript files and read the DOM. When Googlebot crawls a collection page, it executes each JS file and builds the full, rendered page, which it can then read and index. Therefore, there is no difference between a static page and a dynamically loaded page: Googlebot will index them equally.
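To picture what "dynamically loaded" means here, consider a minimal sketch of a collection page whose product list does not exist in the initial HTML and is only inserted into the DOM by JavaScript (the `/api/products` endpoint and the markup are hypothetical, not taken from any specific store):

```html
<!-- The list starts empty; its items are created entirely in the browser. -->
<ul id="collection"></ul>
<script>
  // Fetch product data (the /api/products endpoint is an assumed placeholder)
  fetch('/api/products')
    .then(function (response) { return response.json(); })
    .then(function (products) {
      var list = document.getElementById('collection');
      products.forEach(function (product) {
        // Each product only appears in the DOM after this script runs
        var item = document.createElement('li');
        item.textContent = product.title;
        list.appendChild(item);
      });
    });
</script>
```

A crawler that only reads the raw HTML would see an empty `<ul>`; a crawler that renders JavaScript, as Googlebot does, sees the finished list.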
A few tests have been done to check Googlebot's crawling ability; here are the most important ones:
- Test the search engine’s ability to account for dynamically inserted text when the text is within the HTML source of the page.
- Test the search engine’s ability to account for dynamically inserted text when the text is outside the HTML source of the page (in an external JavaScript file).
**Result:** In both cases, the text was crawled and indexed, and the page ranked for the content.
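To make the two test setups concrete, here is a minimal sketch of each case (the element IDs, the file name `/js/insert-text.js`, and the sample sentences are placeholders, not the actual pages used in the tests):

```html
<!-- Case 1: text inserted dynamically by a script inside the page's own HTML source -->
<div id="inline-target"></div>
<script>
  document.getElementById('inline-target').textContent =
    'This sentence only exists in the DOM after JavaScript runs.';
</script>

<!-- Case 2: the same insertion, performed by an external JavaScript file -->
<div id="external-target"></div>
<script src="/js/insert-text.js"></script>
```

```js
// /js/insert-text.js (hypothetical file name)
document.getElementById('external-target').textContent =
  'This sentence is inserted by an external script.';
```

In neither case does the text appear in the raw HTML response; it shows up only in the rendered DOM, which is what the tests confirmed Googlebot can crawl and index.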