Forums › Search & Filter Pro › Indexing latency with large data set
Tagged: cache index, efficiency, speed
Anonymous (Private) October 29, 2015 at 4:32 pm #28559
Hi,
We are trying to use this plugin with a dataset of 37,052 records. We’ve observed that cache indexing on this dataset is very slow, and when the full 37,000 records are indexed on our dev AWS server the indexing job abends (terminates abnormally) and the server freezes.
All of which means we cannot use this plugin in production, even though we would like to. We cannot use it with a dataset of this size until the latency and efficiency issues have been addressed.
Have you tested with datasets of this size? Please advise,
Reese, for PhillipB
Dominion Enterprises Inc
Ross Moderator (Private) October 30, 2015 at 10:10 am #28604
Hey Phillip
I’m working on a fix for this as we speak – once I’ve updated it, the plugin should work fine with this kind of data set. The only feature that will not work (because it requires some heavy queries) is the auto count feature.
If you’re happy to wait until next week I’ll send you the update.
Re the indexing – it has been intentionally throttled so that other setups don’t fall over. However, if you’re on a dev server that’s locked down behind a password, S&F has to use Ajax to build the cache, which makes the process slower still.
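Just to illustrate the general idea (this is not the plugin’s actual code), a batched, throttled index build could look something like the sketch below. It assumes WordPress is loaded, and search_filter_index_post() is a hypothetical stand-in for whatever writes the cache entry for a single post; the batch size and pause are only illustrative values.

<?php
// Sketch only: rebuild a search cache in small batches, pausing between
// batches so a 37k-record data set doesn't exhaust the server in one go.
function sf_sketch_build_cache_in_batches( $batch_size = 200, $pause_microseconds = 250000 ) {
	$paged = 1;

	do {
		// Fetch only post IDs so each query stays light.
		$ids = get_posts( array(
			'post_type'      => 'any',
			'post_status'    => 'publish',
			'fields'         => 'ids',
			'posts_per_page' => $batch_size,
			'paged'          => $paged,
			'orderby'        => 'ID',
			'order'          => 'ASC',
		) );

		foreach ( $ids as $post_id ) {
			// Hypothetical per-post indexing step; the real plugin's
			// internals will differ.
			if ( function_exists( 'search_filter_index_post' ) ) {
				search_filter_index_post( $post_id );
			}
		}

		// Throttle between batches so other setups don't fall over.
		usleep( $pause_microseconds );
		$paged++;
	} while ( count( $ids ) === $batch_size );
}

Run from WP-Cron or WP-CLI rather than an Ajax request, a loop like this avoids browser timeouts on large data sets.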
Thanks
Anonymous (Private) October 30, 2015 at 12:36 pm #28623
We can wait until next week, sure.
I’m not sure what controls were put in place on the AWS dev server; I didn’t set it up. I can find out if it is important. Running on my local machine, the indexing job did finish, but it was interminably slow – it took well over an hour.
Reese, for Phillip