
Caching Strategies for Ultra-High Performance in Ruby on Rails, Part 2

In part 1 of this series, we covered the fundamentals of caching, like fragment and Russian Doll caching; before continuing, read that post here. This time, we’ll look at Rails cache stores at a deeper level, so let’s jump in!

Before we start part 2, let’s note that another way to make sure your Rails apps are ultra-fast is with Scout’s integrated monitoring and logging, so you can spot problems before your users do.

Understanding Rails Cache Stores

Caching in Rails is built on a flexible foundation that allows different storage mechanisms to work seamlessly with your application. At its core is the ActiveSupport::Cache::Store class, which provides a consistent interface regardless of where your cached data actually lives. This abstraction means you can switch cache stores without changing your application code, making it easier to evolve your caching strategy as your application grows.
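To make that abstraction concrete, here's a toy, plain-Ruby sketch of the read-through fetch contract every store honors (TinyStore is hypothetical, not a Rails class — the real stores add expiry, compression, and more):

```ruby
# Toy sketch of the Rails.cache.fetch contract: whatever the
# backend, a store returns the cached value if present, or
# computes it with the block, caches it, and returns it.
class TinyStore
  def initialize
    @data = {}
  end

  def fetch(key)
    return @data[key] if @data.key?(key)
    @data[key] = yield
  end
end

store = TinyStore.new
calls = 0
slow_query = -> { calls += 1; "expensive result" }

store.fetch("report") { slow_query.call }  # computes and caches
store.fetch("report") { slow_query.call }  # served from cache
puts calls  # => 1
```

Because application code only ever talks to this interface, swapping `:memory_store` for `:redis_cache_store` in configuration requires no changes to the call sites.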

Cache Store Configuration

Before we explore specific cache stores, it's important to understand how Rails manages cache configuration. Think of cache configuration as setting up the rules of engagement between your application and its cache system. These rules determine everything from how long data stays cached to how to handle errors when something goes wrong.

The configuration typically lives in environment-specific files, allowing different caching strategies for development, testing, and production:

# config/environments/production.rb
config.cache_store = :redis_cache_store, {
  url: ENV['REDIS_URL'],
  error_handler: -> (method:, returning:, exception:) {
    Raven.capture_exception(exception, level: 'warning',
      tags: { method: method, returning: returning }
    )
  }
}

Here, we're not just telling Rails to use Redis – we're also setting up error handling to ensure we're notified when caching issues occur. This monitoring is crucial in production environments where cache failures can significantly impact performance.

Memory Store

The Memory Store keeps cached items readily available but needs to be managed carefully to avoid consuming too much memory. Let's look at how to configure a Memory Store with some advanced options:

config.cache_store = :memory_store, {
  size: 64.megabytes,
  compress: true,
  compress_threshold: 32.kilobytes,
  expires_in: 1.hour
}

This configuration tells Rails several important things:

  • The cache can use up to 64MB of memory
  • Items larger than 32KB will be compressed automatically
  • Cached items will expire one hour after they're written
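To see what that compression buys you, here's a rough stdlib illustration; Rails uses Zlib by default for entries above the threshold:

```ruby
require "zlib"

# A repetitive ~50 KB payload, well above the 32 KB threshold
payload = "user data " * 5_000
compressed = Zlib::Deflate.deflate(payload)

puts payload.bytesize                         # => 50000
puts compressed.bytesize < payload.bytesize   # => true

# Reads inflate the entry back transparently
puts Zlib::Inflate.inflate(compressed) == payload  # => true
```

Highly repetitive data like rendered HTML compresses especially well, which is why the threshold defaults matter more than they might appear to.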

Under the hood, the Memory Store uses Least Recently Used (LRU) pruning. The real Rails implementation also instruments the prune and synchronizes across threads; here's a simplified sketch of the idea:

class MemoryStore < Store
  private
  
  # Simplified: Rails tracks each key's last access time in
  # @key_access and the total payload size in @cache_size
  def prune(target_size)
    return if @cache_size <= target_size
    
    # Walk keys from least to most recently accessed,
    # evicting until we're back under the target size
    keys = @key_access.keys.sort_by { |key| @key_access[key] }
    keys.each do |key|
      delete_entry(key, **options)
      break if @cache_size <= target_size
    end
  end
end
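To see the eviction order concretely, here's a minimal, hypothetical LRU cache in plain Ruby. It relies on the fact that Ruby hashes preserve insertion order, so re-inserting a key on every read moves it to the "most recent" end:

```ruby
# Minimal LRU cache (illustration only, not Rails code)
class LRU
  def initialize(max_entries)
    @max = max_entries
    @data = {}
  end

  def write(key, value)
    @data.delete(key)
    @data[key] = value
    # Evict from the oldest end until we're within budget
    @data.delete(@data.first[0]) while @data.size > @max
  end

  def read(key)
    return nil unless @data.key?(key)
    value = @data.delete(key)  # re-insert to mark as recently used
    @data[key] = value
  end

  def keys
    @data.keys
  end
end

lru = LRU.new(2)
lru.write(:a, 1)
lru.write(:b, 2)
lru.read(:a)       # :a is now the most recently used
lru.write(:c, 3)   # evicts :b, the least recently used
p lru.keys         # => [:a, :c]
```

Note that without the `read` touch, `:a` would have been evicted instead — which is exactly the difference between LRU and plain FIFO eviction.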

When using Memory Store in a production environment with multiple server processes (like Puma workers), it's critical to understand that each process maintains its own separate cache: an item cached in one worker's memory isn't visible to the others.

Redis Cache Store

Redis takes caching to another level by providing not just a place to store data but a rich set of data structures and operations:

config.cache_store = :redis_cache_store, {
  url: ENV['REDIS_URL'],
  connect_timeout: 30,  # Seconds
  read_timeout: 0.2,   # Seconds
  write_timeout: 0.2,  # Seconds
  reconnect_attempts: 1,
  error_handler: -> (method:, returning:, exception:) {
    Raven.capture_exception(
      exception,
      tags: { cache_operation: method }
    )
  },
  compress: true,
  compress_threshold: 1.kilobyte,
  expires_in: 1.hour,
  race_condition_ttl: 5.seconds,
  pool: { 
    size: 5,
    timeout: 5
  }
}

This configuration might look intimidating, but each option serves an important purpose:

  • The timeouts prevent slow cache operations from bringing down your application
  • The connection pool manages how many simultaneous connections your app can make to Redis
  • Compression helps save memory by automatically compressing larger items
  • The race condition TTL helps prevent a thundering herd, where many processes regenerate the same expired entry at once
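The race_condition_ttl behavior is worth seeing in miniature. In this hypothetical, deliberately not production-grade sketch, the first caller to hit a stale entry extends its expiry before recomputing, so concurrent callers serve the stale value instead of piling onto the recomputation:

```ruby
Entry = Struct.new(:value, :expires_at)

# Toy cache illustrating the race_condition_ttl idea; a bare
# Hash shared across threads is not safe for real use.
class StampedeSafeCache
  def initialize(race_ttl:)
    @race_ttl = race_ttl
    @data = {}
  end

  def fetch(key, expires_in:)
    now = Time.now
    entry = @data[key]
    if entry
      return entry.value if now < entry.expires_at
      if now < entry.expires_at + @race_ttl
        # Stale, but inside the race window: extend the expiry
        # so other callers keep serving the stale value while
        # we recompute below.
        entry.expires_at = now + @race_ttl
      end
    end
    value = yield
    @data[key] = Entry.new(value, Time.now + expires_in)
    value
  end
end

cache = StampedeSafeCache.new(race_ttl: 2)
cache.fetch(:page, expires_in: 0.05) { "stale page" }
sleep 0.1  # let the entry expire

results = []
slow = Thread.new do
  results << cache.fetch(:page, expires_in: 10) { sleep 0.3; "fresh page" }
end
sleep 0.1  # while the slow recomputation is still running...
results << cache.fetch(:page, expires_in: 10) { "never runs" }
slow.join
p results  # => ["stale page", "fresh page"]
```

Only one caller pays the recomputation cost; everyone else gets a slightly stale answer, which is almost always the right trade for hot keys.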

One of Redis's most powerful features is its support for different data structures. Here's how you might use them:

# Counting unique visitors
Rails.cache.redis.sadd('todays_visitors', user_id)
visitor_count = Rails.cache.redis.scard('todays_visitors')

# Managing a leaderboard
Rails.cache.redis.zadd('high_scores', 100, player_id)
top_players = Rails.cache.redis.zrevrange('high_scores', 0, 9, withscores: true)

# Implementing a job queue
Rails.cache.redis.lpush('background_jobs', job_data)
next_job = Rails.cache.redis.rpop('background_jobs')

These operations go well beyond simple key-value storage. The set operations (sadd) are perfect for tracking unique items, sorted sets (zadd) make implementing leaderboards trivial, and lists (lpush/rpop) can serve as lightweight job queues. Keep in mind that Rails.cache.redis drops down to the raw Redis connection, bypassing the cache store's namespacing, compression, and error handling.

Choosing the Right Cache Store

Selecting the appropriate cache store depends on your specific needs. Here's a framework for making this decision:

  • For Development Environments: The Memory Store or File Store are typically best for development. They're simple to set up and don't require external services. The Memory Store is particularly useful because it lets you simulate caching behavior without the complexity of a production cache setup.
  • For Small to Medium Production Applications: If you're running on a single server, the Memory Store can be perfectly adequate. It's fast and simple, though you must be mindful of memory usage. The File Store can be a good option if you need persistence between server restarts and don't mind the disk I/O overhead.
  • For Large Production Applications: Redis or Memcached become essential at scale. Redis is particularly valuable when you need complex data structures (like sets for unique counts or sorted sets for leaderboards), atomic operations for counters or locks, pub/sub capabilities for real-time features, or data persistence options.

The choice of cache store can evolve with your application. Many successful applications start with the Memory Store and graduate to Redis as their needs grow. The key is understanding the trade-offs each store offers and aligning them with your application's requirements.

Making Caching Work for Your Application

Caching is not a set-and-forget solution. It requires ongoing monitoring, maintenance, and adjustment as your application grows and usage patterns change. Scout’s performance monitoring is a great place to start your caching journey!

The key to successful caching is not implementing every available strategy but choosing the right approaches for your needs. Identify your application's performance bottlenecks: 

  • Are your database queries slow? 
  • Are view renderings taking too long?
  • Is external API communication creating delays? 

Once you understand where performance suffers, you can select the appropriate caching strategy — whether that's using fragment caching for slow-rendering partials, Russian Doll caching for complex view hierarchies, or low-level caching for expensive computations.

When implemented thoughtfully, caching can transform your Rails application's performance, providing your users with a faster, more responsive experience while keeping your server resources in check.

Ready to Optimize Your App?

Join engineering teams who trust Scout Monitoring for hassle-free performance monitoring. With our 3-step setup, powerful tooling, and responsive support, you can quickly identify and fix performance issues before they impact your users.