
Here, we use the fit_highd_model() function to construct the model in both 2D and high-dimensional space, using the provided training data (s_curve_noise_training) and the precomputed scaled UMAP embedding (extracted from s_curve_obj and stored as umap_scaled below). The function takes several parameters to configure the model construction, such as the hexagonal binning parameters (bin1, s1, s2, r2) and options for binning and hexagon visualization.

# Limits of the scaled UMAP embedding; r2 is the ratio of the
# embedding's vertical range to its horizontal range
lim1 <- s_curve_obj$s_curve_umap_scaled_obj$lim1
lim2 <- s_curve_obj$s_curve_umap_scaled_obj$lim2
r2 <- diff(lim2)/diff(lim1)

# Scaled 2D UMAP embedding used as the NLDR data
umap_scaled <- s_curve_obj$s_curve_umap_scaled_obj$scaled_nldr

# Construct the model in 2D and high-dimensional space from the
# training data and the scaled UMAP embedding
model <- fit_highd_model(highd_data = s_curve_noise_training,
                         nldr_data = umap_scaled, bin1 = 15,
                         r2 = r2, is_bin_centroid = TRUE)
## High-dimensional model
glimpse(model$df_bin)
#> Rows: 125
#> Columns: 8
#> $ hb_id <int> 18, 19, 20, 34, 35, 36, 37, 38, 39, 40, 41, 48, 49, 50, 51, 52, 
#> $ x1    <dbl> 0.99930247, 0.94287531, 0.70062586, 0.96707217, 0.81517204, 0.51…
#> $ x2    <dbl> 0.06589512, 0.09959561, 0.05541615, 0.35401749, 0.31058901, 0.26…
#> $ x3    <dbl> 1.0309217, 1.3117090, 1.7012223, 1.2250527, 1.5684607, 1.8482762…
#> $ x4    <dbl> -0.0033356106, 0.0066771925, 0.0012777401, -0.0011772351, -0.000…
#> $ x5    <dbl> 1.224449e-03, -2.129098e-03, -6.202391e-04, 1.234268e-03, -3.318…
#> $ x6    <dbl> 0.015545625, 0.033324342, -0.014773297, -0.001446298, -0.0049625…
#> $ x7    <dbl> -1.373899e-03, -1.270457e-03, -2.645191e-03, -6.119639e-04, 1.79…

## 2D model
glimpse(model$df_bin_centroids)
#> Rows: 125
#> Columns: 6
#> $ hexID      <int> 18, 19, 20, 34, 35, 36, 37, 38, 39, 40, 41, 48, 49, 50, 51,
#> $ c_x        <dbl> 0.1065886, 0.1892240, 0.2718594, 0.1479063, 0.2305417, 0.31…
#> $ c_y        <dbl> -0.01693250, -0.01693250, -0.01693250, 0.05463188, 0.054631…
#> $ bin_counts <int> 5, 13, 11, 41, 38, 36, 39, 21, 24, 19, 10, 13, 29, 34, 35, 
#> $ std_counts <dbl> 0.078125, 0.203125, 0.171875, 0.640625, 0.593750, 0.562500,
#> $ drop_empty <lgl> FALSE, FALSE, FALSE, FALSE, FALSE, FALSE, FALSE, FALSE, FAL…
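
To inspect both representations of each hexagon side by side, the two data frames can be joined on the bin identifier. The following is a minimal sketch rather than package output; it assumes that hexID in df_bin_centroids and hb_id in df_bin refer to the same hexagon IDs (consistent with the matching values printed above), and model_combined is just an illustrative name.

library(dplyr)

# Join the 2D centroids with their high-dimensional bin coordinates
# (assumes hexID and hb_id index the same hexagons)
model_combined <- model$df_bin_centroids |>
  left_join(model$df_bin, by = c("hexID" = "hb_id"))

glimpse(model_combined)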