---
title: "Resources for Building Production-Quality Shiny Applications"
format:
  html:
    page-layout: full
    link-external-newwindow: true
execute:
  echo: false
  message: false
  warning: false
---
```{r}
#| label: setup
library(reactable)
library(htmltools)
library(dplyr)

# gen_art_metadata() returns a named list with one variable dictionary per dataset
source(file.path(here::here(), "R", "gen_art_metadata.R"))
df <- gen_art_metadata()
df_names <- names(df)

# Tag each dictionary with its dataset name and combine into a single table
df <- purrr::map2(df, df_names, ~{
  subdf <- mutate(.x, dataset = .y)
  tibble::as_tibble(subdf)
})

df_all <- dplyr::bind_rows(df) %>%
  distinct() %>%
  arrange(dataset, variable)
```

## Important Links
* RStudio Cloud [Workspace]({{< var rstudio_cloud_workspace_url >}}): Click to automatically join the RStudio Cloud project for this workshop
* RStudio Connect: [rsc.training.rstudio.com](https://rsc.training.rstudio.com)
* GitHub Discussion Board: [github.com/rstudio-conf-2022/shiny-prod-apps/discussions](https://github.com/rstudio-conf-2022/shiny-prod-apps/discussions)
* Collaborative Google Doc: [bit.ly/shinynotes](https://bit.ly/shinynotes)
* Discord channel: `#building-production-quality-shiny-applications`

## Workshop Data Guide
Throughout this workshop, the exercises and live-coding sessions use data from the [Metropolitan Museum of Art](https://www.metmuseum.org), otherwise known as the MET, located in New York, United States. The museum publishes data sets describing the art and objects in its collection via a public [API](https://metmuseum.github.io). Additional metadata for each image of an art piece or object was generated with the [Google Vision API](https://cloud.google.com/vision), in particular the following methods (a small example of pulling a record from the MET API is sketched after this list):

* [Label detection](https://cloud.google.com/vision/docs/labels): Detect and extract information about entities in an image across a broad group of categories.
* [Object annotation](https://cloud.google.com/vision/docs/object-localizer): Detect and extract information about multiple objects in an image.
* [Image properties](https://cloud.google.com/vision/docs/detecting-properties): Detect color attributes of an image.
* [Crop hints](https://cloud.google.com/vision/docs/detecting-crop-hints): Obtain vertices of a cropped region for an image.
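
For reference, the sketch below (not evaluated on this page) shows one way to request a single object record straight from the MET's public collection API with `jsonlite`. The object ID is only an illustrative example, and the field names printed at the end follow the API documentation linked above.

```{r}
#| label: met-api-example
#| eval: false
# Minimal sketch of querying the MET collection API directly (not run here).
# The object ID is illustrative; field names follow the public API docs.
library(jsonlite)

met_object <- function(object_id) {
  url <- paste0(
    "https://collectionapi.metmuseum.org/public/collection/v1/objects/",
    object_id
  )
  fromJSON(url)
}

obj <- met_object(436535)
obj$title
obj$artistDisplayName
obj$primaryImage
```

The table below describes the variables available in the workshop data sets built from these sources; use the dataset filter to narrow the list to a single data set.
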
```{r}
#| label: art-metadata
#| eval: true
tagList(
  # Drop-down that drives the JavaScript column filter on the hidden
  # `dataset` column of the reactable below
  div(
    div(tags$label("Dataset Filter", `for` = "art-type-filter")),
    tags$select(
      id = "art-type-filter",
      onchange = "Reactable.setFilter('art-filter-table', 'dataset', this.value)",
      tags$option("All", value = ""),
      lapply(unique(df_all$dataset), tags$option)
    )
  ),
  tags$hr("aria-hidden" = "false"),
  reactable(
    df_all,
    columns = list(
      variable = colDef(
        name = "Variable",
        minWidth = 200,
        # Show the variable name with its type underneath in smaller text
        cell = function(value, index) {
          type <- df_all$type[index]
          div(
            div(style = list(fontWeight = 500), value),
            div(style = list(fontSize = "0.75rem"), type)
          )
        }
      ),
      type = colDef(show = FALSE),
      description = colDef(minWidth = 400),
      dataset = colDef(show = FALSE),
      example = colDef(minWidth = 300)
    ),
    defaultPageSize = 10,
    elementId = "art-filter-table"
  )
)
```
```{r}
#| label: test
#| eval: false
# Standalone test of the same select-driven filter pattern, using the
# built-in MASS::Cars93 data
library(htmltools)
library(reactable)

data <- MASS::Cars93[1:15, c("Manufacturer", "Model", "Type", "Price")]

tagList(
  div(
    div(tags$label("Filter Type", `for` = "cars-type-filter")),
    tags$select(
      id = "cars-type-filter",
      onchange = "Reactable.setFilter('cars-filter-table', 'Type', this.value)",
      tags$option("All", value = ""),
      lapply(unique(data$Type), tags$option)
    )
  ),
  tags$hr("aria-hidden" = "false"),
  reactable(data, defaultPageSize = 5, elementId = "cars-filter-table")
)
```