
Here, we use the fit_highd_model() function to construct the model in both the 2D embedding space and the high-dimensional data space, using the provided training data (s_curve_noise_training) and the precomputed scaled UMAP embedding (s_curve_noise_umap_scaled). The function takes several parameters that configure the model construction: hexagonal binning parameters such as bin1 (the number of bins along the x-axis) and r2 (the ratio of the ranges of the embedding's second and first components), together with options controlling whether bin centroids are used, whether low-density hexagons are removed, and the prefix of the high-dimensional column names (is_bin_centroid, is_rm_lwd_hex, col_start_highd). We first compute r2 directly from the UMAP coordinates:

r2 <- diff(range(s_curve_noise_umap$UMAP2))/diff(range(s_curve_noise_umap$UMAP1))

model <- fit_highd_model(training_data = s_curve_noise_training, 
                         emb_df = s_curve_noise_umap_scaled, bin1 = 6,
                         r2 = r2, is_bin_centroid = TRUE, 
                         is_rm_lwd_hex = FALSE, col_start_highd = "x")
## high-D model
glimpse(model$df_bin)
#> Rows: 24
#> Columns: 8
#> $ hb_id <int> 2, 7, 8, 9, 10, 14, 15, 16, 17, 18, 22, 23, 26, 27, 28, 29, 31, …
#> $ x1    <dbl> 0.60502895, 0.89506510, 0.15057032, -0.61992321, -0.96031558, 0.…
#> $ x2    <dbl> 1.7979353, 1.2803295, 1.2721767, 1.1659372, 1.5043088, 0.3120879…
#> $ x3    <dbl> 1.783629951, 1.379896396, 1.953298180, 1.757914643, 1.044983514,…
#> $ x4    <dbl> -3.639940e-03, -3.839172e-04, -7.412989e-04, -4.615018e-05, 6.22…
#> $ x5    <dbl> 2.986351e-04, -8.150889e-04, 1.088423e-03, 4.499042e-04, 5.26555…
#> $ x6    <dbl> -0.0104338345, -0.0018955210, 0.0011397559, -0.0004884167, 0.002…
#> $ x7    <dbl> 2.827596e-04, -9.927984e-05, 6.435293e-06, 3.529844e-04, 6.46460…
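
As a quick sanity check (a minimal sketch, assuming the training data's high-dimensional columns are named x1, ..., x7, consistent with col_start_highd = "x"), the bin-wise averages in df_bin should lie within the range of the corresponding raw training variable:

## Hedged check: compare the range of the bin-level x1 values against the
## range of x1 in the raw training data (column names assumed here)
range(model$df_bin$x1)
range(s_curve_noise_training$x1)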

## 2D model
glimpse(model$df_bin_centroids)
#> Rows: 24
#> Columns: 6
#> $ hexID      <int> 2, 7, 8, 9, 10, 14, 15, 16, 17, 18, 22, 23, 26, 27, 28, 29,…
#> $ c_x        <dbl> 0.117873476, 0.008936738, 0.226810215, 0.444683691, 0.66255…
#> $ c_y        <dbl> -0.09382762, 0.09485635, 0.09485635, 0.09485635, 0.09485635…
#> $ bin_counts <int> 49, 203, 288, 182, 198, 161, 90, 215, 216, 153, 36, 140, 15…
#> $ std_counts <dbl> 0.16896552, 0.70000000, 0.99310345, 0.62758621, 0.68275862,…
#> $ drop_empty <lgl> FALSE, FALSE, FALSE, FALSE, FALSE, FALSE, FALSE, FALSE, FAL…
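
The two pieces of the model can be linked through the hexagon identifiers shown above (hexID in df_bin_centroids, hb_id in df_bin, which contain the same values). The following is a minimal sketch, not part of the package API, that builds a single lookup table mapping each 2D bin centroid to its high-dimensional representative using dplyr:

library(dplyr)

## Join the 2D bin centroids to the high-D bin averages on the hexagon id
## (hexID in df_bin_centroids corresponds to hb_id in df_bin)
model_lookup <- inner_join(model$df_bin_centroids, model$df_bin,
                           by = c("hexID" = "hb_id"))
glimpse(model_lookup)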