Key Takeaways
- Package Architecture: Transform functional Shiny apps into professional R packages with standardized structure, dependency management, and enterprise deployment capabilities
- Development Workflow: Master professional development processes including version control integration, automated testing, and continuous integration for statistical applications
- Golem Advantages: Leverage industry-standard framework that provides testing infrastructure, documentation automation, and deployment readiness out-of-the-box
- Enterprise Standards: Implement package structure that meets regulatory requirements for validation, audit trails, and professional software engineering practices
- Practical Transformation: Apply Golem framework to convert the sophisticated t-test application into a production-ready package structure
Introduction
The Golem framework represents the industry standard for professional Shiny application development, transforming individual scripts into enterprise-grade R packages that meet rigorous software engineering standards. For biostatisticians and data scientists working in regulated environments, Golem provides the structural foundation necessary for validation, compliance, and professional deployment.
This tutorial guides you through the complete transformation of our sophisticated independent samples t-test application into a professional package structure using Golem. You’ll learn to implement development workflows that support testing, documentation, version control, and deployment automation - skills essential for statistical applications in pharmaceutical, clinical research, and healthcare environments.
Unlike basic Shiny development that treats applications as standalone scripts, Golem enforces professional software engineering practices that ensure maintainability, scalability, and regulatory compliance. This transformation is crucial for career advancement in enterprise environments where ad-hoc development approaches are insufficient for production deployment.
Understanding the Golem Framework
Why Golem is Essential for Enterprise Development
The Golem framework addresses fundamental limitations of traditional Shiny development by providing structured approaches to challenges that inevitably arise in enterprise environments.
Traditional Shiny Development Challenges
Traditional Shiny application development often leads to maintenance difficulties and deployment challenges:

flowchart TD
A[Traditional Shiny Development] --> A1[Single script files]
A --> A2[Manual dependency management]
A --> A3[Ad-hoc testing approaches]
A --> A4[Informal documentation]
A --> A5[Complex deployment procedures]
A1 --> C1[Difficult maintenance]
A2 --> C2[Version conflicts]
A3 --> C3[Quality assurance gaps]
A4 --> C4[Poor enterprise adoption]
A5 --> C5[Deployment complexity]
style A fill:#ffebee
style C1 fill:#ffcccb
style C2 fill:#ffcccb
style C3 fill:#ffcccb
style C4 fill:#ffcccb
style C5 fill:#ffcccb
How Golem Solves These Challenges
The Golem framework addresses these limitations through structured package architecture:
flowchart TD
B[Golem Framework Benefits] --> B1[Structured R package architecture]
B --> B2[Automated dependency management]
B --> B3[Integrated testing framework]
B --> B4[Professional documentation standards]
B --> B5[Streamlined deployment workflows]
B1 --> D1[Scalable maintenance]
B2 --> D2[Reproducible environments]
B3 --> D3[Comprehensive quality assurance]
B4 --> D4[Enterprise-ready documentation]
B5 --> D5[Professional deployment]
style B fill:#e8f5e8
style D1 fill:#90ee90
style D2 fill:#90ee90
style D3 fill:#90ee90
style D4 fill:#90ee90
style D5 fill:#90ee90
Enterprise Benefits of Golem:
- Regulatory Compliance: Package structure supports validation documentation, change control, and audit trails required in regulated industries
- Professional Development: Enforces software engineering best practices including testing, documentation, and version control integration
- Deployment Readiness: Provides containerization support, dependency management, and production deployment workflows
- Team Collaboration: Standardized structure enables multiple developers to contribute effectively with clear conventions and protocols
Golem Framework Architecture
Golem organizes Shiny applications as R packages with predefined structure and conventions:
golem_package/
├── DESCRIPTION # Package metadata and dependencies
├── NAMESPACE # Function exports and imports
├── LICENSE # Professional licensing
├── NEWS.md # Change log documentation
├── README.md # Package overview and installation
├── renv.lock # Reproducible environment
├── .Rbuildignore # Build configuration
├── .gitignore # Version control configuration
├── app.R # Application launcher
├── R/ # All R functions and logic
│ ├── app_config.R # Configuration management
│ ├── app_server.R # Server logic organization
│ ├── app_ui.R # UI structure and layout
│ ├── golem_utils_*.R # Utility functions
│ ├── mod_*.R # Shiny modules
│ ├── fct_*.R # Business logic functions
│ └── utils_*.R # Helper functions
├── inst/ # Installed package resources
│ ├── app/ # Application-specific resources
│ │ └── www/ # Web assets (CSS, JS, images)
│ └── golem-config.yml # Environment configuration
├── man/ # Generated documentation
├── tests/ # Testing framework
│ ├── testthat/ # Unit and integration tests
│ └── testthat.R # Test configuration
├── vignettes/ # Long-form documentation
├── data-raw/ # Data processing scripts
└── docker/ # Containerization support
├── Dockerfile # Container definition
└── docker-compose.yml # Multi-service configuration
This structure provides several enterprise advantages:
- Standardized Organization: Consistent structure across all applications enables team collaboration and maintenance
- Dependency Management: DESCRIPTION file and renv integration ensure reproducible environments
- Testing Integration: Built-in testing framework supports quality assurance and validation requirements
- Documentation Automation: Roxygen2 integration generates professional documentation automatically
- Deployment Support: Docker integration and configuration management support enterprise deployment
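To see how these pieces fit together at runtime, it helps to look at the thin launcher Golem generates. The following is a simplified sketch of the generated app.R and R/run_app.R (exact contents vary slightly between golem versions):

# File: app.R (generated launcher, simplified)
pkgload::load_all(export_all = FALSE, helpers = FALSE, attach_testthat = FALSE)
options("golem.app.prod" = TRUE)
run_app()

# File: R/run_app.R (simplified)
run_app <- function(...) {
  with_golem_options(
    app = shinyApp(ui = app_ui, server = app_server),
    golem_opts = list(...)
  )
}

Because everything in R/ is ordinary package code, deployment reduces to installing the package and calling run_app().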
Installing and Configuring Golem
Installation Requirements
Before beginning the transformation, ensure your development environment meets enterprise standards:
Install the latest versions of required software:
# Check R version (4.1.0 or higher recommended)
R.version.string
# Required packages for Golem development
install.packages(c(
"golem", # Framework core
"devtools", # Development tools
"usethis", # Project setup utilities
"testthat", # Testing framework
"roxygen2", # Documentation generation
"pkgdown", # Website generation
"renv", # Environment management
"here", # Path management
"config", # Configuration management
"attachment", # Dependency management
"dockerfiler" # Docker integration
))

Verify RStudio configuration for package development:
# Enable package development features
usethis::use_git_config(
user.name = "Your Name",
user.email = "your.email@company.com"
)
# Configure development options
options(
usethis.protocol = "https",
usethis.full_name = "Your Name",
usethis.description = list(
"Authors@R" = utils::person(
"Your", "Name",
email = "your.email@company.com",
role = c("aut", "cre")
),
License = "MIT + file LICENSE",
Version = "0.0.0.9000"
)
)

Configure enterprise development settings:
# Corporate network configuration (if applicable)
# Set proxy settings for package installation
Sys.setenv(
http_proxy = "http://proxy.company.com:8080",
https_proxy = "http://proxy.company.com:8080"
)
# Configure enterprise repositories
options(repos = c(
CRAN = "https://cran.rstudio.com/",
COMPANY = "https://internal.company.com/R/"
))
# Set up renv for reproducible environments
renv::init()

Establish corporate development standards:
# Corporate coding standards
usethis::use_code_of_conduct(contact = "your.email@company.com")
usethis::use_mit_license()
# Enterprise-specific gitignore
usethis::use_git_ignore(c(
"*.log",
"sensitive_data/",
".Renviron",
"config/production.yml"
))

For regulated industries, establish validation-ready development:
# Create validation documentation structure
dir.create("validation", recursive = TRUE)
dir.create("validation/protocols")
dir.create("validation/reports")
dir.create("validation/evidence")
# Initialize validation tracking
writeLines(
c("# Validation Documentation",
"This directory contains validation protocols and evidence.",
"## Structure:",
"- protocols/: Validation protocols and procedures",
"- reports/: Validation execution reports",
"- evidence/: Supporting documentation and evidence"),
"validation/README.md"
)

Golem Project Creation
Create a new Golem project for our t-test application transformation:
# Create enterprise Golem project
golem::create_golem(
  path = "ttestEnterprise",
  package_name = "ttestEnterprise",
  open = TRUE
)
# Author metadata is filled in afterwards via golem::fill_desc()
# in dev/01_start.R (see below).

This command creates the complete package structure with:
- Professional DESCRIPTION file with metadata
- Golem configuration files and utilities
- Testing framework integration
- Documentation templates
- Version control initialization
- Docker containerization support
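golem also scaffolds a dev/ directory with numbered scripts (dev/01_start.R, dev/02_dev.R, dev/03_deploy.R) that walk through the development lifecycle. For example, dev/01_start.R contains a golem::fill_desc() call for package metadata; a sketch adapted to this project (field values are placeholders):

golem::fill_desc(
  pkg_name = "ttestEnterprise",
  pkg_title = "Enterprise Independent Samples t-Test Application",
  pkg_description = "Professional independent samples t-test analysis with enterprise features.",
  author_first_name = "Your",
  author_last_name = "Name",
  author_email = "your.email@company.com"
)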
Configure the new project for enterprise development:
# Navigate to the new project directory
setwd("ttestEnterprise")
# Initialize version control
usethis::use_git()
# Set up enterprise-specific configurations
golem::use_recommended_deps()
golem::use_recommended_tests()
# Configure for corporate environment
usethis::use_build_ignore(c("^.*\\.Rproj$", "^\\.Rproj\\.user$"), escape = FALSE)
usethis::use_git_ignore(c("*.log", ".DS_Store", "Thumbs.db"))

Transforming the t-Test Application
Analyzing Current Application Structure
Before transformation, let’s analyze our sophisticated t-test application structure:
# Current application components
current_components <- list(
ui = "independentTTestUI function",
server = "independentTTestServer function",
utilities = c(
"parse_group_input",
"parse_response_input",
"f_levene_test",
"sample_datasets"
),
features = c(
"Multiple data input methods",
"Statistical assumption testing",
"Professional visualizations",
"APA-style reporting",
"Download capabilities"
)
)
# Golem transformation mapping
transformation_plan <- list(
"independentTTestUI" = "mod_ttest_ui",
"independentTTestServer" = "mod_ttest_server",
"utility functions" = "fct_statistical_utils.R",
"data processing" = "fct_data_processing.R",
"visualization" = "fct_visualizations.R",
"reporting" = "fct_reporting.R"
)

Step-by-Step Transformation Process
Step 1: Configure Package Metadata
Update the DESCRIPTION file with enterprise-appropriate metadata:
# Edit DESCRIPTION file
desc::desc_set(
Title = "Enterprise Independent Samples t-Test Application",
Description = "Professional statistical application for independent samples t-test analysis with enterprise features including validation, reporting, and regulatory compliance support.",
Version = "0.1.0",
"Authors@R" = 'person("Your", "Name",
  email = "your.email@company.com",
  role = c("aut", "cre"))',
License = "MIT + file LICENSE",
URL = "https://github.com/yourcompany/ttestEnterprise",
BugReports = "https://github.com/yourcompany/ttestEnterprise/issues"
)
# Add dependencies (recorded in the DESCRIPTION Imports field)
for (pkg in c("shiny", "bslib", "ggplot2", "dplyr",
              "bsicons", "shinyjs", "DT", "plotly")) {
  usethis::use_package(pkg)
}

Step 2: Create Modular Structure
Transform the monolithic application into modular components:
# Create the main t-test module
golem::add_module(
name = "ttest",
with_test = TRUE
)
# Create supporting modules for complex functionality
golem::add_module(name = "data_input", with_test = TRUE)
golem::add_module(name = "analysis_options", with_test = TRUE)
golem::add_module(name = "results_display", with_test = TRUE)
golem::add_module(name = "assumptions_check", with_test = TRUE)
golem::add_module(name = "report_generation", with_test = TRUE)
golem::add_module(name = "quick_guide", with_test = TRUE)
golem::add_module(name = "help_learning", with_test = TRUE)
# Create function files for business logic
golem::add_fct("statistical_calculations", with_test = TRUE)
golem::add_fct("data_validation", with_test = TRUE)
golem::add_fct("visualization_helpers", with_test = TRUE)
golem::add_fct("report_formatting", with_test = TRUE)
# Create utility files for helper functions
golem::add_utils("input_processing", with_test = TRUE)
golem::add_utils("error_handling", with_test = TRUE)
golem::add_utils("configuration", with_test = TRUE)Step 3: Implement Main Module Structure
Create the main t-test module by transforming the original UI and server functions:
# File: R/mod_ttest.R
#' Independent Samples t-Test Module
#'
#' @description A shiny Module for conducting independent samples t-tests
#' with comprehensive statistical analysis and professional reporting.
#'
#' @param id character. Module identifier
#'
#' @rdname mod_ttest
#'
#' @keywords internal
#' @export
#' @import shiny
#' @import bslib
#' @import ggplot2
#' @import dplyr
mod_ttest_ui <- function(id) {
ns <- NS(id)
page_sidebar(
title = "Independent Samples t-Test Calculator",
sidebar = sidebar(
width = 425,
# Data input section
card(
card_header("Data Input"),
mod_data_input_ui(ns("data_input"))
),
# Analysis options
card(
card_header("Analysis Options"),
mod_analysis_options_ui(ns("analysis_options"))
),
# Quick guide
card(
card_header("Quick Guide"),
mod_quick_guide_ui(ns("quick_guide"))
)
),
# Main content area
navset_card_tab(
height = 800,
nav_panel(
"Results",
mod_results_display_ui(ns("results"))
),
nav_panel(
"Help & Learning",
mod_help_learning_ui(ns("help"))
)
)
)
}
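# In a Golem app this module is mounted exactly once in the package-level UI
# and server functions (R/app_ui.R and R/app_server.R); a minimal sketch:
#
#   app_ui     <- function(request) mod_ttest_ui("ttest_1")
#   app_server <- function(input, output, session) mod_ttest_server("ttest_1")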
#' Independent Samples t-Test Module Server
#'
#' @rdname mod_ttest
#' @export
#' @keywords internal
mod_ttest_server <- function(id) {
moduleServer(id, function(input, output, session) {
# Initialize reactive values
values <- reactiveValues(
data = NULL,
test_result = NULL,
validation_results = NULL
)
# Data input module
data_input_results <- mod_data_input_server("data_input")
# Update values when data changes
observe({
values$data <- data_input_results$data()
values$validation_results <- validate_input_data(values$data)
})
# Analysis options module
analysis_options <- mod_analysis_options_server("analysis_options")
# Statistical calculations
observe({
req(values$data, analysis_options$run_analysis())
values$test_result <- calculate_ttest_comprehensive(
data = values$data,
options = analysis_options$options()
)
})
# Results display module
mod_results_display_server("results",
reactive(values$test_result),
reactive(values$validation_results))
# Help and learning module
mod_help_learning_server("help")
return(values)
})
}

Step 4: Implement Statistical Functions
Create professional statistical calculation functions:
# File: R/fct_statistical_calculations.R
#' Comprehensive Independent Samples t-Test Analysis
#'
#' @description Performs complete independent samples t-test analysis with
#' assumption checking, effect size calculation, and professional reporting.
#'
#' @param data data.frame containing group and response variables
#' @param options list of analysis options including confidence level,
#' alternative hypothesis, and variance assumptions
#'
#' @return list containing test results, assumption tests, effect sizes,
#' and diagnostic information
#'
#' @export
#' @examples
#' \dontrun{
#' data <- data.frame(
#' group = rep(c("A", "B"), each = 10),
#' response = c(rnorm(10, 5), rnorm(10, 6))
#' )
#' results <- calculate_ttest_comprehensive(data, list(conf_level = 0.95))
#' }
calculate_ttest_comprehensive <- function(data, options = list()) {
# Validate input data
validation_result <- validate_ttest_data(data)
if (!validation_result$valid) {
stop(validation_result$message)
}
# Extract analysis options with defaults
conf_level <- options$conf_level %||% 0.95
alternative <- options$alternative %||% "two.sided"
var_equal <- options$var_equal %||% FALSE
auto_method <- options$auto_method %||% TRUE
# Prepare data
groups <- unique(data$group)
group1_data <- data$response[data$group == groups[1]]
group2_data <- data$response[data$group == groups[2]]
# Assumption testing
assumptions <- check_ttest_assumptions(group1_data, group2_data)
# Automatic method selection if requested
if (auto_method) {
var_equal <- assumptions$levene_result$p_value >= 0.05
}
# Perform t-test
test_result <- t.test(
response ~ group,
data = data,
alternative = alternative,
var.equal = var_equal,
conf.level = conf_level
)
# Calculate effect size
effect_size <- calculate_cohens_d(group1_data, group2_data, var_equal)
# Compile comprehensive results
list(
test_result = test_result,
effect_size = effect_size,
assumptions = assumptions,
descriptives = calculate_descriptives(group1_data, group2_data, groups),
method_used = if (var_equal) "Student's t-test" else "Welch's t-test",
validation = validation_result,
options = options
)
}
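# Note: `%||%` above is the null-default operator (re-exported by rlang and
# part of base R from 4.4); golem projects typically import it.
# The helpers referenced above (validate_ttest_data, perform_levene_test,
# calculate_descriptives, generate_assumption_recommendations) live in the
# other fct_/utils_ files. As one illustrative sketch (not the package's
# actual code), the effect-size helper could look like:

#' Cohen's d for Two Independent Samples
#'
#' @param group1_data,group2_data numeric vectors of observations
#' @param var_equal logical; retained for signature compatibility (the pooled
#'   SD shown here is the conventional Cohen's d denominator)
#'
#' @return list with the d estimate and a conventional magnitude label
#' @export
calculate_cohens_d <- function(group1_data, group2_data, var_equal = TRUE) {
  n1 <- length(group1_data)
  n2 <- length(group2_data)
  # Pooled standard deviation: the classical Cohen's d denominator
  pooled_sd <- sqrt(((n1 - 1) * stats::var(group1_data) +
                       (n2 - 1) * stats::var(group2_data)) / (n1 + n2 - 2))
  d <- (mean(group1_data) - mean(group2_data)) / pooled_sd
  list(
    cohens_d = d,
    magnitude = dplyr::case_when(
      abs(d) < 0.2 ~ "negligible",
      abs(d) < 0.5 ~ "small",
      abs(d) < 0.8 ~ "medium",
      TRUE ~ "large"
    )
  )
}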
#' Check t-Test Statistical Assumptions
#'
#' @description Performs comprehensive assumption checking for independent
#' samples t-test including normality and homogeneity of variance.
#'
#' @param group1_data numeric vector of group 1 observations
#' @param group2_data numeric vector of group 2 observations
#'
#' @return list containing assumption test results and interpretations
#'
#' @export
check_ttest_assumptions <- function(group1_data, group2_data) {
# Normality testing
shapiro1 <- shapiro.test(group1_data)
shapiro2 <- shapiro.test(group2_data)
# Homogeneity of variance (Levene's test)
combined_data <- data.frame(
value = c(group1_data, group2_data),
group = factor(rep(c("Group1", "Group2"),
c(length(group1_data), length(group2_data))))
)
levene_result <- perform_levene_test(combined_data$value, combined_data$group)
# Interpretation
normality_ok <- shapiro1$p.value >= 0.05 && shapiro2$p.value >= 0.05
variance_ok <- levene_result$p_value >= 0.05
list(
shapiro_group1 = shapiro1,
shapiro_group2 = shapiro2,
levene_result = levene_result,
normality_assumption = normality_ok,
variance_assumption = variance_ok,
overall_assumptions = normality_ok && variance_ok,
recommendations = generate_assumption_recommendations(normality_ok, variance_ok)
)
}

Step 5: Configure Application Settings
Set up enterprise configuration management:
# File: inst/golem-config.yml
# The config package requires a default profile; environment sections override it.
default:
  golem_name: "ttestEnterprise"
  golem_version: "0.1.0"
  app_prod: false

production:
  app_prod: true
  database:
    driver: "PostgreSQL"
    host: "prod-db.company.com"
    port: 5432
    name: "statistical_apps"
  logging:
    level: "INFO"
    file: "/var/log/ttest_enterprise.log"
  security:
    session_timeout: 1800
    max_file_size: "50MB"
    allowed_extensions: ["csv", "txt", "xlsx"]

development:
  database:
    driver: "SQLite"
    file: "dev_database.sqlite"
  logging:
    level: "DEBUG"
    console: true
  security:
    session_timeout: 3600
    max_file_size: "10MB"
    allowed_extensions: ["csv", "txt", "xlsx", "rds"]

testing:
  database:
    driver: "SQLite"
    file: ":memory:"
  logging:
    level: "ERROR"
    console: false
  security:
    session_timeout: 600
    max_file_size: "1MB"
    allowed_extensions: ["csv", "txt"]

Configure the main application file:
# File: R/app_config.R
#' Access Configuration Values
#'
#' @description Retrieve configuration values based on the current environment.
#' Supports production, development, and testing configurations.
#'
#' @param value character. Configuration key to retrieve
#' @param config character. Configuration environment (optional)
#' @param use_parent logical. Whether to use parent configuration values
#'
#' @return Configuration value or NULL if not found
#'
#' @export
get_golem_config <- function(value, config = Sys.getenv("GOLEM_CONFIG_ACTIVE", "development"), use_parent = TRUE) {
config::get(
value = value,
config = config,
file = app_sys("golem-config.yml"),
use_parent = use_parent
)
}
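# Example usage, assuming the YAML above ships with the package:
#   Sys.setenv(GOLEM_CONFIG_ACTIVE = "production")
#   get_golem_config("logging")$level             # "INFO"
#   get_golem_config("security")$session_timeout  # 1800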
#' Configure Application Environment
#'
#' @description Set up application environment including logging,
#' database connections, and security settings.
#'
#' @param config character. Configuration environment to use
#'
#' @return invisibly TRUE if successful
#'
#' @export
configure_app_environment <- function(
    config = Sys.getenv("GOLEM_CONFIG_ACTIVE", "development")) {
  # Set up logging
  logging_config <- get_golem_config("logging", config = config)
  if (!is.null(logging_config)) {
    setup_application_logging(logging_config)
  }
  # Configure security settings
  security_config <- get_golem_config("security", config = config)
  if (!is.null(security_config)) {
    configure_security_settings(security_config)
  }
  # Database connection setup
  db_config <- get_golem_config("database", config = config)
  if (!is.null(db_config)) {
    setup_database_connection(db_config)
  }
  invisible(TRUE)
}

Testing Framework Integration
Create comprehensive testing structure for enterprise reliability:
# File: tests/testthat/test-statistical_calculations.R
test_that("comprehensive t-test calculation works correctly", {
# Create test data
test_data <- data.frame(
group = rep(c("Control", "Treatment"), each = 10),
response = c(rnorm(10, 5, 1), rnorm(10, 6, 1))
)
# Test basic functionality
result <- calculate_ttest_comprehensive(test_data)
expect_type(result, "list")
expect_true("test_result" %in% names(result))
expect_true("effect_size" %in% names(result))
expect_true("assumptions" %in% names(result))
# Test that p-value is numeric and in valid range
expect_type(result$test_result$p.value, "double")
expect_gte(result$test_result$p.value, 0)
expect_lte(result$test_result$p.value, 1)
# Test effect size calculation
expect_type(result$effect_size$cohens_d, "double")
expect_false(is.na(result$effect_size$cohens_d))
})
test_that("assumption checking provides comprehensive results", {
# Normal data
normal_data1 <- rnorm(30, 5, 1)
normal_data2 <- rnorm(30, 6, 1)
assumptions <- check_ttest_assumptions(normal_data1, normal_data2)
expect_type(assumptions, "list")
expect_true("shapiro_group1" %in% names(assumptions))
expect_true("shapiro_group2" %in% names(assumptions))
expect_true("levene_result" %in% names(assumptions))
expect_type(assumptions$normality_assumption, "logical")
expect_type(assumptions$variance_assumption, "logical")
})
test_that("error handling works for invalid data", {
# Test with insufficient data
invalid_data <- data.frame(
group = c("A", "B"),
response = c(1, 2)
)
expect_error(
calculate_ttest_comprehensive(invalid_data),
"Insufficient data"
)
# Test with non-numeric response
invalid_data2 <- data.frame(
group = rep(c("A", "B"), each = 5),
response = rep(c("low", "high"), each = 5)
)
expect_error(
calculate_ttest_comprehensive(invalid_data2),
"Response variable must be numeric"
)
})

Create integration tests for module functionality:
# File: tests/testthat/test-mod_ttest.R
test_that("t-test module UI generates proper structure", {
ui_output <- mod_ttest_ui("test")
expect_s3_class(ui_output, "shiny.tag")
# Test that main components are present
ui_string <- as.character(ui_output)
expect_true(grepl("Independent Samples t-Test", ui_string))
expect_true(grepl("data_input", ui_string))
expect_true(grepl("analysis_options", ui_string))
})
test_that("t-test module server handles data processing", {
# Test server logic with mock data
testServer(mod_ttest_server, {
# Simulate data input
session$setInputs(
"data_input-group_input" = "A\nA\nB\nB",
"data_input-response_input" = "1\n2\n3\n4"
)
# Simulate analysis trigger
session$setInputs("analysis_options-run_test" = 1)
# The module's reactiveValues store should be initialized
expect_true(is.reactivevalues(values))
# Test error handling
session$setInputs(
"data_input-response_input" = "invalid\ndata"
)
expect_null(values$test_result)
})
})

Development Workflow and Best Practices
Version Control Integration
Establish professional version control workflows for enterprise development:
# Initialize Git repository with enterprise standards
usethis::use_git()
# Create professional README
usethis::use_readme_md()
# Set up GitHub integration (or corporate Git server)
usethis::use_github(
organisation = "your-company",
private = TRUE
)
# Configure enterprise-appropriate .gitignore
usethis::use_git_ignore(c(
"*.log",
".Renviron",
"config/production.yml",
"sensitive_data/",
"*.rds",
".DS_Store",
"Thumbs.db"
))
# Set up branch protection and workflows
usethis::use_github_action("check-standard")
usethis::use_github_action("test-coverage")
usethis::use_github_action("pkgdown")Create professional commit message standards:
# Professional commit message format
git commit -m "feat: add comprehensive assumption testing module
- Implement Shapiro-Wilk tests for normality checking
- Add Levene's test for homogeneity of variance
- Create assumption interpretation functions
- Add comprehensive test coverage for statistical validation
Resolves: #123
Reviewed-by: Senior-Statistician"

Continuous Integration Setup
Configure automated testing and quality assurance:
# File: .github/workflows/R-CMD-check.yml
name: R-CMD-check

on:
  push:
    branches: [main, develop]
  pull_request:
    branches: [main]

jobs:
  R-CMD-check:
    runs-on: ${{ matrix.config.os }}
    name: ${{ matrix.config.os }} (${{ matrix.config.r }})
    strategy:
      fail-fast: false
      matrix:
        config:
          - {os: ubuntu-latest,  r: 'release'}
          - {os: ubuntu-latest,  r: 'devel'}
          - {os: macOS-latest,   r: 'release'}
          - {os: windows-latest, r: 'release'}
    env:
      GITHUB_PAT: ${{ secrets.GITHUB_TOKEN }}
      R_KEEP_PKG_SOURCE: yes
    steps:
      - uses: actions/checkout@v4
      - uses: r-lib/actions/setup-r@v2
        with:
          r-version: ${{ matrix.config.r }}
      - uses: r-lib/actions/setup-pandoc@v2
      - name: Install system dependencies
        if: runner.os == 'Linux'
        run: |
          sudo apt-get update
          sudo apt-get install -y libcurl4-openssl-dev
      - uses: r-lib/actions/setup-r-dependencies@v2
        with:
          extra-packages: any::rcmdcheck
          needs: check
      - uses: r-lib/actions/check-r-package@v2

Documentation Automation
Set up professional documentation generation:
# Configure pkgdown for package website
usethis::use_pkgdown()
# Create comprehensive package documentation
usethis::use_vignette("getting-started", title = "Getting Started with ttestEnterprise")
usethis::use_vignette("statistical-methods", title = "Statistical Methods and Validation")
usethis::use_vignette("enterprise-deployment", title = "Enterprise Deployment Guide")
# Configure documentation theme for professional appearance
pkgdown_config <- list(
url = "https://your-company.github.io/ttestEnterprise",
title = "ttestEnterprise",
description = "Professional Independent Samples t-Test Application",
authors = list(
"Your Name" = list(
href = "https://github.com/yourusername",
html = "Your Name"
)
),
template = list(
bootstrap = 5,
theme = "arrow-light",
bslib = list(
primary = "#0054AD",
border_radius = 0.5,
font_scale = NULL,
bootswatch = "flatly"
)
),
navbar = list(
structure = list(
left = c("intro", "reference", "articles", "tutorials"),
right = c("search", "github")
)
)
)
# Write configuration
yaml::write_yaml(pkgdown_config, "_pkgdown.yml")

Performance Monitoring and Optimization
Implement performance monitoring for enterprise applications:
# File: R/utils_performance.R
#' Application Performance Monitoring
#'
#' @description Monitor application performance including memory usage,
#' execution time, and user session metrics for enterprise environments.
#'
#' @param session shiny session object
#' @param action character string describing the action being monitored
#'
#' @return invisibly returns performance metrics list
#'
#' @export
monitor_performance <- function(session, action = "general") {
start_time <- Sys.time()
start_memory <- pryr::mem_used()
# Create performance tracking function
log_performance <- function() {
end_time <- Sys.time()
end_memory <- pryr::mem_used()
metrics <- list(
timestamp = end_time,
action = action,
execution_time = as.numeric(end_time - start_time, units = "secs"),
memory_usage = as.numeric(end_memory),
memory_change = as.numeric(end_memory - start_memory),
session_id = session$token
)
# Log to enterprise monitoring system
log_to_monitoring_system(metrics)
invisible(metrics)
}
# Return logging function for deferred execution
return(log_performance)
}
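# Example usage inside a server function (sketch):
#   log_perf <- monitor_performance(session, action = "run_ttest")
#   result <- calculate_ttest_comprehensive(values$data, options)
#   log_perf()  # records elapsed time, memory delta, and session id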
#' Enterprise Logging System Integration
#'
#' @description Log performance metrics and application events to enterprise
#' monitoring and logging infrastructure.
#'
#' @param metrics list containing performance and event data
#' @param level character logging level (DEBUG, INFO, WARN, ERROR)
#'
#' @return invisibly TRUE if successful
#'
#' @export
log_to_monitoring_system <- function(metrics, level = "INFO") {
# Configure based on environment
logging_config <- get_golem_config("logging")
if (get_golem_config("app_prod")) {
# Production logging to enterprise systems
tryCatch({
# Format for enterprise log aggregation (e.g., ELK stack)
log_entry <- list(
timestamp = format(Sys.time(), "%Y-%m-%dT%H:%M:%S%z"),
application = "ttestEnterprise",
level = level,
metrics = metrics,
environment = "production"
)
# Send to centralized logging
send_to_log_aggregator(log_entry)
}, error = function(e) {
# Fallback to local logging
cat(paste(Sys.time(), "- ERROR:", e$message), "\n",
file = logging_config$file, append = TRUE)
})
} else {
# Development logging
if (logging_config$console) {
cat(paste(Sys.time(), "-", level, ":",
jsonlite::toJSON(metrics, auto_unbox = TRUE)), "\n")
}
}
invisible(TRUE)
}

Deployment Preparation
Docker Integration
Configure containerization for enterprise deployment:
# Generate Dockerfile using golem utilities
golem::add_dockerfile()
# Customize for enterprise requirements
dockerfile_content <- readLines("Dockerfile")
# Add enterprise-specific configurations
enterprise_additions <- c(
"# Enterprise security configurations",
"RUN groupadd -r shinyuser && useradd -r -g shinyuser shinyuser",
"RUN mkdir -p /var/log/shiny-server && chown shinyuser:shinyuser /var/log/shiny-server",
"",
"# Health check for enterprise monitoring",
"HEALTHCHECK --interval=30s --timeout=3s --start-period=5s --retries=3 \\",
" CMD curl -f http://localhost:3838/ || exit 1",
"",
"# Security: run as non-root user",
"USER shinyuser"
)
# Write enhanced Dockerfile
writeLines(c(dockerfile_content, enterprise_additions), "Dockerfile")

Create Docker Compose configuration for multi-service deployment:
# File: docker-compose.yml
version: '3.8'

services:
  ttest-app:
    build: .
    container_name: ttest-enterprise
    ports:
      - "3838:3838"
    environment:
      - GOLEM_CONFIG_ACTIVE=production
      - DATABASE_URL=postgresql://user:password@db:5432/statistical_apps
    depends_on:
      - db
      - redis
    volumes:
      - ./logs:/var/log/shiny-server
      - ./config:/app/config
    restart: unless-stopped
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:3838/"]
      interval: 30s
      timeout: 10s
      retries: 3
      start_period: 40s

  db:
    image: postgres:13
    container_name: ttest-postgres
    environment:
      POSTGRES_DB: statistical_apps
      POSTGRES_USER: appuser
      POSTGRES_PASSWORD: ${DB_PASSWORD}
    volumes:
      - postgres_data:/var/lib/postgresql/data
      - ./init-db:/docker-entrypoint-initdb.d
    restart: unless-stopped

  redis:
    image: redis:6-alpine
    container_name: ttest-redis
    command: redis-server --appendonly yes
    volumes:
      - redis_data:/data
    restart: unless-stopped

  nginx:
    image: nginx:alpine
    container_name: ttest-nginx
    ports:
      - "80:80"
      - "443:443"
    volumes:
      - ./nginx.conf:/etc/nginx/nginx.conf
      - ./ssl:/etc/nginx/ssl
    depends_on:
      - ttest-app
    restart: unless-stopped

volumes:
  postgres_data:
  redis_data:

Environment Configuration Management
Create comprehensive environment configuration:
# File: R/app_prod.R
#' Production Application Launcher
#'
#' @description Launch the application in production mode with enterprise
#' configurations including security, monitoring, and performance optimization.
#'
#' @param ... Additional arguments passed to shiny::runApp
#'
#' @export
run_prod <- function(...) {
# Set production environment
Sys.setenv(GOLEM_CONFIG_ACTIVE = "production")
# Configure application environment
configure_app_environment("production")
# Initialize enterprise features
initialize_enterprise_features()
# Launch with production settings
options(
shiny.port = 3838,
shiny.host = "0.0.0.0",
shiny.autoreload = FALSE,
shiny.sanitize.errors = TRUE,
shiny.error = function() {
log_to_monitoring_system(
list(error = "Application error occurred"),
level = "ERROR"
)
}
)
# Launch through golem's standard launcher, which serves app_ui/app_server
run_app(...)
}
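# A container entry point can then invoke this launcher; for example, a
# Dockerfile might end with (illustrative):
#   CMD ["R", "-e", "ttestEnterprise::run_prod()"]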
#' Initialize Enterprise Features
#'
#' @description Set up enterprise-specific features including security,
#' monitoring, caching, and database connections.
#'
#' @return invisibly TRUE if successful
#'
#' @export
initialize_enterprise_features <- function() {
# Set up database connection pool
setup_database_pool()
# Initialize caching system
setup_redis_cache()
# Configure security middleware
setup_security_middleware()
# Initialize monitoring
setup_application_monitoring()
# Set up backup and recovery
setup_backup_procedures()
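# NOTE: the setup_*() helpers above are organization-specific stubs; implement
# them against your own database, cache, and monitoring infrastructure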
invisible(TRUE)
}

Common Questions About Golem Framework
What does Golem provide that ad-hoc Shiny development lacks?
Golem provides critical enterprise infrastructure that traditional Shiny development lacks: standardized package structure for team collaboration, integrated testing frameworks for quality assurance, automated documentation generation, dependency management for reproducible deployments, and containerization support for enterprise infrastructure. For regulated industries, Golem's structured approach supports the validation documentation and audit trails required for compliance.

How does Golem improve team collaboration?
Golem enforces consistent project structure, naming conventions, and development practices that enable multiple developers to contribute effectively. The package-based approach provides clear boundaries between modules, standardized testing procedures, and automated documentation that reduces onboarding time for new team members. Version control integration and continuous integration support ensure code quality and prevent integration conflicts.

Does the added structure slow applications down?
Golem actually improves performance for enterprise applications through better code organization, modular architecture that enables selective loading, dependency management that prevents conflicts, and built-in support for caching and optimization. The package structure allows for more efficient deployment and scaling strategies. Initial development may be slower because of the more structured approach, but this overhead pays dividends in the maintenance and scaling phases.

How does Golem support regulatory compliance?
Golem's package structure inherently supports regulatory requirements through comprehensive documentation (required for validation), integrated testing frameworks (essential for qualification), version control integration (needed for change control), and standardized development processes (required for quality systems). The framework facilitates creation of the validation protocols, testing evidence, and audit trails necessary for 21 CFR Part 11 compliance and other regulatory standards.

Can an existing, complex application be migrated to Golem?
Yes, complex applications like our t-test example are excellent candidates for Golem migration. The main challenges include refactoring monolithic code into a modular structure, adapting existing reactive patterns to the module architecture, migrating custom dependencies and configurations, and creating comprehensive tests for existing functionality. However, well-structured applications with clear separation of concerns (like our example) typically migrate smoothly, with significant long-term benefits.
Test Your Understanding
You’re transforming a Shiny application with the following components: UI function (150 lines), server function (300 lines), 5 utility functions, and 3 datasets. Using Golem best practices, how would you organize this into the package structure?
- A) Single module with all functionality combined
- B) One module for UI, one for server, utility files for functions
- C) Multiple focused modules (data input, analysis, results) with supporting function files
- D) Separate modules for each utility function
Hints:
- Consider the principle of single responsibility for modules
- Think about logical groupings of functionality
- Remember that modules should represent cohesive features
- Utility functions should be organized by purpose, not quantity
Answer: C) Multiple focused modules (data input, analysis, results) with supporting function files
Golem Organization Strategy:
Modules (cohesive features):
- mod_data_input - handles all data input methods and validation
- mod_analysis_engine - statistical calculations and method selection
- mod_results_display - output formatting and visualization
- mod_reporting - export and download functionality

Function files (supporting logic):
- fct_statistical_calculations.R - core statistical algorithms
- fct_data_validation.R - data checking and cleaning
- fct_visualization_helpers.R - plot generation utilities
- utils_input_processing.R - input parsing and formatting
- utils_configuration.R - settings and environment management

Data:
- data-raw/ - source datasets and processing scripts
- R/sysdata.rda - internal datasets for package use
This approach creates maintainable, testable modules while organizing supporting functions by logical purpose rather than arbitrary splits.
Your pharmaceutical company requires different configurations for development, testing, and production environments. Which Golem configuration approach best supports regulatory validation requirements?
- A) Hard-code different settings in conditional statements within the application
- B) Use environment variables exclusively for all configuration management
- C) Implement golem-config.yml with environment-specific sections and validation documentation
- D) Create separate application versions for each environment
Hints:
- Consider regulatory requirements for configuration control and documentation
- Think about the need for audit trails and change management
- Environment-specific settings need to be clearly documented and validated
- Configuration changes should be traceable and reproducible
Answer: C) Implement golem-config.yml with environment-specific sections and validation documentation
Regulatory Compliance Benefits:
Structured Configuration Management:
production:
  database:
    driver: "PostgreSQL"
    validation_required: true
  logging:
    level: "INFO"
    audit_trail: true
  security:
    encryption: "AES-256"
    session_timeout: 1800

development:
  database:
    driver: "SQLite"
    validation_required: false
  logging:
    level: "DEBUG"
    audit_trail: false

Validation Support:
- Documented configurations for each environment in version control
- Change tracking through Git history for audit trails
- Environment validation through automated testing
- Configuration testing to ensure consistency across environments
Regulatory Advantages:
- 21 CFR Part 11 compliance through documented configuration control
- Audit trail creation via version control integration
- Validation evidence through automated configuration testing
- Change control documentation for all configuration modifications
This approach provides the structure and documentation necessary for regulatory validation while maintaining flexibility for different operational environments.
You’re setting up a Golem project for a team of 5 biostatisticians with varying R experience levels. Which development workflow setup would maximize productivity while maintaining enterprise quality standards?
- A) Full GitHub Actions CI/CD with comprehensive automated testing and documentation
- B) Basic Git repository with manual testing and documentation procedures
- C) Local development only with periodic manual integration
- D) Advanced containerized development with complex orchestration
Hints:
- Consider the team's varying experience levels
- Think about balancing automation with learning opportunities
- Enterprise quality standards require certain automated processes
- Team productivity depends on appropriate tooling complexity
Answer: A) Full GitHub Actions CI/CD with comprehensive automated testing and documentation
Strategic Reasoning for Mixed-Experience Teams:
Automated Quality Gates:
- Consistent standards regardless of individual experience levels
- Immediate feedback helps less experienced developers learn best practices
- Automated testing prevents quality regressions from any team member
- Documentation generation ensures comprehensive coverage without manual overhead
Team Development Benefits:
- Standardized workflows reduce confusion and inconsistency
- Automated checks teach best practices through enforcement
- Shared responsibility for quality through automated systems
- Reduced review burden on senior team members
Enterprise Requirements:
- Audit trails through automated Git workflows and testing records
- Quality assurance through comprehensive automated testing
- Documentation compliance through automated generation and validation
- Deployment readiness through container integration and testing
Implementation Strategy:
# .github/workflows/comprehensive-qa.yml
- R CMD check across multiple platforms
- Test coverage reporting and enforcement
- Automated documentation building and deployment
- Security vulnerability scanning
- Code quality metrics and reporting

This approach provides the enterprise-grade infrastructure while accelerating team learning through automated feedback and standardized processes.
Conclusion
The Golem framework transformation represents a fundamental shift from ad-hoc Shiny development to professional software engineering practices that meet enterprise standards for quality, maintainability, and regulatory compliance. Through systematic restructuring of our sophisticated t-test application, you’ve learned to implement package architecture that supports testing, documentation, version control, and deployment automation.
This transformation creates a foundation that scales from individual applications to enterprise statistical software platforms. The modular structure, comprehensive testing, and professional documentation standards you’ve implemented directly support career advancement by demonstrating software engineering capabilities that pharmaceutical and clinical research organizations require for production applications.
The investment in Golem framework mastery pays long-term dividends through reduced maintenance overhead, improved team collaboration, enhanced deployment capabilities, and regulatory compliance support that positions your applications for successful enterprise adoption.
Next Steps
Based on your Golem framework foundation, here are the recommended paths for continuing your enterprise development journey:
Immediate Next Steps (Complete These First)
- Professional UI Design Enhancement - Apply enterprise UI/UX standards to your Golem-structured application
- Data Validation and Security Implementation - Build bulletproof validation systems using the package structure
- Practice Exercise: Complete the transformation of your t-test application to full Golem structure with testing and documentation
Building on Your Golem Foundation (Choose Your Path)
For Statistical Excellence Focus:
For Quality Assurance Mastery:
For Production Deployment:
Long-term Goals (2-3 Months)
- Master the complete enterprise development series using your Golem foundation
- Build a portfolio of enterprise-ready statistical packages
- Establish team development workflows and standards using Golem
- Pursue advanced containerization and orchestration for large-scale deployment
Explore More Enterprise Development
Ready to enhance your Golem-structured application with enterprise features? Continue with the transformation series.
Citation
@online{kassambara2025,
author = {Kassambara, Alboukadel},
title = {Golem {Framework} {Setup:} {Transform} {Shiny} {Apps} into
{Professional} {Packages}},
date = {2025-05-23},
url = {https://www.datanovia.com/learn/tools/shiny-apps/enterprise-development/golem-framework-setup.html},
langid = {en}
}
