diff --git a/.nojekyll b/.nojekyll
index e70cbde..f0c44e6 100644
(build hash updated from 65a7a637 to c8176aa1)
diff --git a/LICENSE.html b/LICENSE.html
diff --git a/index.html b/index.html
(generated HTML metadata updates only)
diff --git a/listings.json b/listings.json
(day-1 listing reordered: load testing becomes /units/d1-04-loadtesting.html, deploy-admin becomes /units/d1-05-deploy-admin.html, and the d1-03-performance listing gains codealong-2.html)
diff --git a/materials/d1-01-welcome/index.html b/materials/d1-01-welcome/index.html
Shiny in Production: Tools & Techniques
posit::conf(2023)
TODO: Link to Meet the Team Section on Home Page
-Introduce yourself to your neighbor(s)
-What is your most memorable Shiny application in production experience?
-02:00
-posit-conf-2023.github.io/shiny-r-prod/#workshop-instructors
Follow Setup Procedure to connect with the workshop resources:
+Complete any remaining Setup Procedures
Introduce yourself to your neighbor(s)
+What is your most memorable Shiny application in production experience?
+05:00
+Imagine your application is working great!
-ggplot2
version 0.9.3
ggplot2
version 1.0.0
{renv}
--Create reproducible environments for your R projects.
-
{packrat}
Upon initializing a project:
-.Rprofile to activate custom package library on startup
renv.lock to describe state of project library
renv/library to hold private project library
renv/activate.R performs activation
Sticking with {renv} will pay off (trust me)
Change the example code below to match LEGO data. Delete this note when finished.
-id: String to use for namespace
mod_picker.R
+set_picker_server <- function(id, sets_rv) {
- moduleServer(
- id,
- function(input, output, session) {
- set_choices <- reactive({
- # do something with sets_rv
- })
-
- observeEvent(set_choices(), {
- req(set_choices())
- updateSelectInput(
-          inputId = "set_num",
- choices = set_choices()
- )
- })
- }
- )
-}
Minimal changes necessary
-mod_picker.R
+set_picker_server <- function(id, sets_rv) {
+ moduleServer(
+ id,
+ function(input, output, session) {
+ set_choices <- reactive({
+ # do something with sets_rv
+ })
+
+ observeEvent(set_choices(), {
+ req(set_choices())
+ updateSelectInput(
+          inputId = "set_num",
+ choices = set_choices()
+ )
+ })
+ }
+ )
+}
Minimal changes necessary
set_picker_server <- function(id, sets_rv) {
- moduleServer(
- id,
- function(input, output, session) {
- set_choices <- reactive({
- # do something with sets_rv
- })
-
- observeEvent(set_choices(), {
- req(set_choices())
- updateSelectInput(
-          inputId = "set_num",
- choices = set_choices()
- )
- })
- }
- )
-}
set_picker_server <- function(id, sets_rv) {
+ moduleServer(
+ id,
+ function(input, output, session) {
+ set_choices <- reactive({
+ # do something with sets_rv
+ })
+
+ observeEvent(set_choices(), {
+ req(set_choices())
+ updateSelectInput(
+          inputId = "set_num",
+ choices = set_choices()
+ )
+ })
+ }
+ )
+}
🤔 id
moduleServer(): Encapsulate server-side logic with namespace applied.
mod_picker.R
+tagList() of inputs, output placeholders, and other UI elements
set_picker_server <- function(id, sets_rv) {
- moduleServer(
- id,
- function(input, output, session) {
- set_choices <- reactive({
- # do something with sets_rv
- })
-
- observeEvent(set_choices(), {
- req(set_choices())
- updateSelectInput(
-          inputId = "set_num",
- choices = set_choices()
- )
- })
- }
- )
-}
mod_picker.R
+set_picker_server <- function(id, sets_rv) {
+ moduleServer(
+ id,
+ function(input, output, session) {
+ set_choices <- reactive({
+ # do something with sets_rv
+ })
+
+ observeEvent(set_choices(), {
+ req(set_choices())
+ updateSelectInput(
+          inputId = "set_num",
+ choices = set_choices()
+ )
+ })
+ }
+ )
+}
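For reference, here is a minimal sketch of what the matching UI half of the module could look like, following the NS()/tagList() pattern noted above. The function name and label are assumptions; the "set_num" input ID matches the server code.

set_picker_ui <- function(id) {
  ns <- NS(id)
  tagList(
    selectInput(
      inputId = ns("set_num"),   # namespaced so the server-side updateSelectInput() can find it
      label = "LEGO set",
      choices = NULL             # populated by set_picker_server()
    )
  )
}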
Input & return values can be a mix of static and reactive objects
+set_picker_server <- function(id, sets_rv) {
- moduleServer(
- id,
- function(input, output, session) {
- # ...
-
- set_selection <- reactive({
- input$set_num
- })
-
- set_selection
- }
- )
-}
sets_rv
sets_rv()
set_selection
set_selection()
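A sketch of how the calling server could wire the module up and consume its return value, matching the naming pattern above; lego_sets is a hypothetical data object.

# in app_server.R (sketch)
sets_rv <- reactiveVal(lego_sets)                      # passed in as sets_rv, read inside as sets_rv()
set_selection <- set_picker_server("picker", sets_rv)  # returns the set_selection reactive

output$selected_set <- renderText({
  paste("Selected set:", set_selection())              # caller reads it as set_selection()
})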
Code-Along 1: Add a new Shiny module to pick themes
+Add a new Shiny module to pick LEGO set themes
+Create a new Shiny module with LEGO data metrics!
+10:00
+Imagine your application is working great!
+ggplot2
version 0.9.3
ggplot2
version 1.0.0
{renv}
++Create reproducible environments for your R projects.
+
{packrat}
Upon initializing a project:
+.Rprofile to activate custom package library on startup
renv.lock to describe state of project library
renv/library to hold private project library
renv/activate.R performs activation
Sticking with {renv} will pay off (trust me)
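A quick sketch of the core {renv} workflow that creates and uses the files listed above:

renv::init()      # writes .Rprofile, renv.lock, renv/library, renv/activate.R
renv::snapshot()  # records the current package versions in renv.lock
renv::restore()   # rebuilds the project library from renv.lock on another machine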
posit-conf-2023.github.io/shiny-r-prod
@@ -955,6 +988,7 @@ .parquet files in Shiny
The current version of our Shiny application contains a module for generating predictions of the number of LEGO parts in a set using the number of unique colors and number of unique part categories. The API is executed and processed using the {httr2} package. Here is the function wrapping the API execution:
#' @importFrom httr2 request req_body_json req_perform resp_body_json
-run_prediction <- function(df, endpoint_url, back_transform = TRUE, round_result = TRUE) {
- # create request object
- req <- request(endpoint_url)
-
- # perform request
- resp <- req |>
- req_body_json(df) |>
- req_perform()
-
- # extract predictions from response
- pred_values <- resp_body_json(resp)$.pred |> unlist()
-
- # back-transform log10 value of predicted number of parts if requested
- if (back_transform) {
- pred_values <- 10 ^ pred_values
- }
-
- # round result up to nearest integer if requested
- if (round_result) pred_values <- ceiling(pred_values)
-
- # append predictions to supplied data frame
- dplyr::mutate(df, predicted_num_parts = pred_values)
-}
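A hypothetical call to illustrate the shape of the inputs and output; the column names and endpoint URL are assumptions, not the workshop's actual values.

new_sets <- data.frame(num_colors = c(10, 4), num_part_categories = c(12, 6))
preds <- run_prediction(new_sets, endpoint_url = "https://example.com/predict")
preds$predicted_num_parts  # predictions appended as a new column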
Unfortunately, the prediction API call takes a bit of time to execute due to some extremely sophisticated processing 😅. As a result, any interactions within the application will not be processed until the prediction call completes. Our goal is to convert the prediction processing from synchronous to asynchronous using {crew}.
The current version of our Shiny application performs additional data processing to generate part summaries that are utilized by reactive data frames. The custom function is called gen_part_metaset() and is located in the R/fct_data_processing.R script. For the purposes of this exercise, we are not going to try to optimize this specific function (you are certainly welcome to try after the workshop); instead, we are going to see if we can more efficiently utilize the results of the function inside our Shiny application.
observeEvent
to push the prediction task to the controller, ensuring the key objects and required packages are passed on to the controller.First we create the following reactiveVal
objects to keep track of the prediction state:
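A minimal sketch of those objects. The names pred_data_rv, pred_status, and pred_poll come from the polling code shown later; the initial values are assumptions.

pred_data_rv <- reactiveValues(data = NULL)    # holds the returned prediction data
pred_status <- reactiveVal("No prediction")    # user-facing status message
pred_poll <- reactiveVal(FALSE)                # TRUE while a {crew} task is in flight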
Next we set up a new controller:
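A sketch of a local {crew} controller created near the top of the server logic; the controller name, worker count, and idle timeout are illustrative.

library(crew)

controller <- crew_controller_local(
  name = "lego_predictions",
  workers = 1,
  seconds_idle = 10
)
controller$start()

# shut the workers down when the app stops
onStop(function() controller$terminate())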
Inside the observeEvent for the user clicking the prediction button, we update the logic to push the prediction task to the controller:
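A sketch of that updated observer. input$predict, pred_input_df(), and the endpoint URL are assumptions; the notification id "pred_message" matches the removeNotification() call in the polling code.

observeEvent(input$predict, {
  # read reactive values before handing off, since the worker has no reactive context
  df_now <- pred_input_df()

  controller$push(
    command = run_prediction(df, endpoint_url = url),
    data = list(
      df = df_now,
      url = "https://example.com/predict",
      run_prediction = run_prediction
    ),
    packages = c("httr2", "dplyr")
  )

  # flag the in-flight task so the polling observer activates
  pred_status("Prediction running")
  pred_poll(TRUE)
  showNotification("Running prediction...", id = "pred_message", duration = NULL)
})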
Upon closer inspection, we see that the calls to gen_part_metaset() do not take any dynamic parameters when used in the application. In addition, the function is called multiple times inside a set of reactive expressions. A first attempt at removing the bottleneck would be to move this function call to the beginning of the app_server.R logic and feed the resulting object directly into the reactives that consume it.
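A sketch of that first attempt; parts_summary is a hypothetical reactive standing in for the reactives that previously called the function themselves.

# app_server.R (sketch)
app_server <- function(input, output, session) {
  part_meta <- gen_part_metaset()   # run once per server function, not inside each reactive

  parts_summary <- reactive({
    # ... filter/summarise part_meta here instead of calling gen_part_metaset() again ...
  })
}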
Knowing that the processing function is not leveraging any dynamic parameters, we can do even better. In our mission to ensure the Shiny application performs only the data processing that it absolutely needs to do, we can instead run this function outside of the application and save the result of the processing as a .parquet file inside the inst/extdata directory using the {arrow} package.
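A sketch of that one-off pre-processing step, run outside of the application; the file name is an assumption.

part_meta <- gen_part_metaset()
arrow::write_parquet(part_meta, sink = "inst/extdata/part_metaset.parquet")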
Lastly, we create a new observe block that periodically checks whether the running {crew} tasks have completed, ensuring that this is only executed when a prediction has been launched:
With the processed data set available in the app infrastructure, we can utilize it inside the application with a read step like the one sketched below the polling code:
observe({
- req(pred_poll())
-
- invalidateLater(millis = 100)
- result <- controller$pop()$result
-
- if (!is.null(result)) {
- pred_data_rv$data <- result[[1]]
- print(controller$summary())
- }
-
- if (isFALSE(controller$nonempty())) {
- pred_status("Prediction Complete")
- pred_poll(controller$nonempty())
- removeNotification(id = "pred_message")
- }
-})
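A sketch of the read step referred to above; the file name is an assumption, and app_sys() is the {golem} helper around system.file().

part_meta <- arrow::read_parquet(
  app_sys("extdata", "part_metaset.parquet"),
  as_data_frame = FALSE   # keep the data as an Arrow Table rather than pulling it into memory
)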
Why do we set the parameter as_data_frame to FALSE in the call above? This ensures the contents of the data file are not read into R's memory right away, and we can perform data processing on this file in a tidyverse-like pipeline and collect() the results at the end of the pipeline to minimize overhead.
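For example, a lazy pipeline on the Arrow table might look like the following; the column names are assumptions.

library(dplyr)

theme_counts <- part_meta |>
  filter(theme_name == "Star Wars") |>
  group_by(part_category) |>
  summarise(n_parts = n()) |>
  collect()   # computation happens here; only the summarised result lands in memory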
We could add this read step to the top of our app_server.R logic, which would already lead to decreased processing time. For an application that is used very infrequently, that might be good enough. But if the application is going to be used concurrently by multiple users, we may be able to increase performance further by ensuring this data file is read into R once for each process that serves the application, instead of once for each user's Shiny session. More to come later in the workshop on how we can accomplish this with {golem}!
The current version of our Shiny application contains a module for generating predictions of the number of LEGO parts in a set using the number of unique colors and number of unique part categories. The API is executed and processed using the {httr2} package. Here is the function wrapping the API execution:
#' @importFrom httr2 request req_body_json req_perform resp_body_json
+run_prediction <- function(df, endpoint_url, back_transform = TRUE, round_result = TRUE) {
+ # create request object
+ req <- request(endpoint_url)
+
+ # perform request
+ resp <- req |>
+ req_body_json(df) |>
+ req_perform()
+
+ # extract predictions from response
+ pred_values <- resp_body_json(resp)$.pred |> unlist()
+
+ # back-transform log10 value of predicted number of parts if requested
+ if (back_transform) {
+ pred_values <- 10 ^ pred_values
+ }
+
+ # round result up to nearest integer if requested
+ if (round_result) pred_values <- ceiling(pred_values)
+
+ # append predictions to supplied data frame
+ dplyr::mutate(df, predicted_num_parts = pred_values)
+}
Unfortunately, the prediction API call takes a bit of time to execute due to some extremely sophisticated processing 😅. As a result, any interactions within the application will not be processed until the prediction call completes. Our goal is to convert the prediction processing from synchronous to asynchronous using {crew}.
observeEvent
to push the prediction task to the controller, ensuring the key objects and required packages are passed on to the controller.First we create the following reactiveVal
objects to keep track of the prediction state:
Next we set up a new controller:
Inside the observeEvent for the user clicking the prediction button, we update the logic to push the prediction task to the controller:
Lastly, we create a new observe block that periodically checks whether the running {crew} tasks have completed, ensuring that this is only executed when a prediction has been launched:
observe({
+ req(pred_poll())
+
+ invalidateLater(millis = 100)
+ result <- controller$pop()$result
+
+ if (!is.null(result)) {
+ pred_data_rv$data <- result[[1]]
+ print(controller$summary())
+ }
+
+ if (isFALSE(controller$nonempty())) {
+ pred_status("Prediction Complete")
+ pred_poll(controller$nonempty())
+ removeNotification(id = "pred_message")
+ }
+})
The project used for this particular exercise is hosted on Posit Cloud in this space and is called performance-exercise1.
+Using what you just learned about {profvis}, your task is to run the profiler for the LEGO Bricks application contained in this project. Recall that to run the profiler in a Shiny app created with {golem}, change the run_app() call at the end of the dev/run_dev.R script to the following:
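A sketch of that change in dev/run_dev.R; wrapping the call in print() ensures the returned shiny.appobj actually launches inside the profiled expression.

profvis::profvis({
  print(run_app())
})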
Once the application is open, try changing a couple of the inputs contained in the sidebar. After the outputs in the app refresh, close the profiler by stopping the R process in the R console. You should now see the {profvis} report appear as a new tab in your web browser.
Profile the LEGO Bricks app!
+05:00
+Using .parquet in the LEGO Bricks Shiny app
If you are the only user for a quick and efficient app: Likely not
-TODO: Find a way to center the sentence vertically in the slide
-Code-Along 1: Asynchronous calls of a web API
+Asynchronous calls of a model prediction API.
+posit-conf-2023.github.io/shiny-r-prod
@@ -872,6 +891,44 @@
TODO: Find pictures demonstrating the “App & Packages Only” (Posit Connect, Shinyapps.io) versus “The whole environment” (Containers)
-Push-Button Deployment to Posit Connect
-