giffgaff.com and community.giffgaff.com do not have robots.txt files, which ought to mean that search engines are free to crawl and index any pages on those sites if they wish.
But for a search engine to index a page, it first needs to discover it -- and that usually means other pages must link to it, with links that aren't marked (with rel="nofollow", for instance) to forbid search engines from following them.
There's a sitemap.xml for the community, and it includes the relevant index pages such as https://community.giffgaff.com/t/announcements and all the other main tags -- which in turn means that search engines would be able to find their way to the index page of each tag. But search engines can't be relied on to find the sitemap.xml by themselves -- usually there would be an entry in robots.txt pointing to the sitemap, and giffgaff doesn't have robots.txt files.
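For illustration, the entries in a sitemap like that follow the standard sitemaps.org format -- something along these lines (only the /t/announcements URL is taken from the real file; the rest is just the generic structure):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://community.giffgaff.com/t/announcements</loc>
  </url>
  <!-- ...one <url> entry per tag index page... -->
</urlset>
```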
That's not necessarily conclusive, because there are other ways of submitting a sitemap -- to Google at least, through Search Console -- though other search engines might not offer the same routes.
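One thing anyone can check from the outside is what the sitemap actually offers to crawlers. A rough sketch in Python, assuming the sitemap sits at the conventional /sitemap.xml path:

```python
# Fetch the community sitemap and list the URLs a crawler would be offered.
# The /sitemap.xml location is an assumption -- adjust if it lives elsewhere.
from urllib.request import urlopen
from xml.etree import ElementTree

SITEMAP_URL = "https://community.giffgaff.com/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urlopen(SITEMAP_URL) as response:
    tree = ElementTree.parse(response)

locs = [loc.text for loc in tree.findall(".//sm:loc", NS)]
print(f"{len(locs)} URLs listed; first few:")
for url in locs[:5]:
    print(" ", url)
```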
For Google, search results do suggest that the site is not being fully indexed -- which is disappointing, as search engines always did a good job of finding pages in the Lithium forums.
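The quickest way to spot-check this is a site: query, which asks Google to show only the pages it has indexed from a given host:

```
site:community.giffgaff.com
```

Comparing what comes back against the number of threads the community actually holds gives a rough idea of how much is being missed.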
I wonder, too, how much a search engine would see from those index pages even if it reached them. It would immediately see the newest 20 threads in each index, but I'm not certain it could follow the "more" prompt to reach the next 20. In Lithium there were always clickable links to the next and previous pages; the "more" button in Flarum is clickable, but doesn't contain an actual link for a crawler to follow to the rest of the threads. And the same possibly applies within the threads themselves -- a search engine might initially see the newest posts in a thread, but would it know how to simulate scrolling down to reach the rest of the posts?
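It's possible to simulate a non-JavaScript crawler and see for ourselves: fetch an index page and list the anchor links present in the raw HTML. If Flarum's "more" button loads further threads only via JavaScript, no href for the next batch will appear. A sketch, using the /t/announcements URL from the sitemap (this also shows any rel="nofollow" markers mentioned earlier):

```python
# List the <a href> links a crawler would find in the raw (pre-JavaScript)
# HTML of a tag index page, noting any that are marked nofollow.
from html.parser import HTMLParser
from urllib.request import urlopen

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            attrs = dict(attrs)
            if attrs.get("href"):
                self.links.append((attrs["href"], attrs.get("rel") or ""))

with urlopen("https://community.giffgaff.com/t/announcements") as response:
    page = response.read().decode("utf-8", errors="replace")

collector = LinkCollector()
collector.feed(page)
for href, rel in collector.links:
    print(href, "(nofollow)" if "nofollow" in rel else "")
```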
But a first step might be to create a robots.txt that includes a pointer to the sitemap, and then see whether that improves anything.
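Something as minimal as this would do -- the standard fully permissive robots.txt plus a Sitemap directive (again assuming the sitemap really is at /sitemap.xml):

```
User-agent: *
Disallow:

Sitemap: https://community.giffgaff.com/sitemap.xml
```

The empty Disallow line keeps everything crawlable, exactly as having no robots.txt does now, while the Sitemap line tells crawlers where to look.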