opt-SNE is a variant of the t-SNE algorithm that features several improvements over the traditional Barnes-Hut implementation of t-SNE, including the ability to monitor the rate of improvement of the KL divergence (KLD — functionally, a measure of how well the low-dimensional projection represents the high-dimensional data) and automatically stop the algorithm when that metric begins to suffer from diminishing returns. The other optimizations present in opt-SNE, however, can also be applied to other versions of t-SNE, including Fast Fourier Transform-accelerated Interpolation-based t-SNE (FIt-SNE).

To apply the same optimized hyperparameter strategy from opt-SNE to FIt-SNE, use the following settings:

**Max Iterations: 700**

**Stop Early Exaggeration: 150**

**Learning Rate: LR = (# of events)/36**

For comparison, FIt-SNE's default hyperparameters in OMIQ are:

Max Iterations: 1000

Stop Early Exaggeration: 250

Learning Rate: 5000

If you have 10 million events, you would change the settings to the following:

Max Iterations: 700

Stop Early Exaggeration: 150

Learning Rate: 277,777
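The calculation above can be sketched as a small helper. This is an illustrative sketch, not part of OMIQ or FIt-SNE; the function name and the dictionary keys are hypothetical, and only the values follow the opt-SNE-style recipe described above:

```python
# Hypothetical helper; the name and keys are illustrative, not an OMIQ API.
def optimized_fitsne_settings(n_events):
    """Return opt-SNE-style FIt-SNE hyperparameters for n_events data points."""
    return {
        "max_iterations": 700,
        "stop_early_exaggeration": 150,
        # Learning rate scales with dataset size: LR = (# of events) / 36
        "learning_rate": n_events // 36,
    }

# For 10 million events, the learning rate works out to 277,777
print(optimized_fitsne_settings(10_000_000))
```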