
# CCache
CCache is an LRU cache, written in Go, focused on supporting high concurrency.

Lock contention on the list is reduced by:

* Introducing a window which limits the frequency that an item can get promoted
* Using a buffered channel to queue promotions for a single worker
* Garbage collecting within the same thread as the worker

## Setup

First, download the project:

    go get github.com/karlseguin/ccache

## Configuration
Next, import and create a `Cache` instance:

```go
import (
  "github.com/karlseguin/ccache"
)

var cache = ccache.New(ccache.Configure())
```

`Configure` exposes a chainable API:

```go
var cache = ccache.New(ccache.Configure().MaxSize(1000).ItemsToPrune(100))
```

The configuration options you're most likely to tweak are:

* `MaxSize(int)` - the maximum size of the cache (default: 5000)
* `GetsPerPromote(int)` - the number of times an item is fetched before we promote it. For large caches with long TTLs, it normally isn't necessary to promote an item on every fetch (default: 3)
* `ItemsToPrune(int)` - the number of items to prune when we hit `MaxSize`. Freeing up more than 1 slot at a time improves performance (default: 500)

Configuration options that change the internals of the cache, and which are less likely to need tweaking:

* `Buckets` - ccache shards its internal map to provide greater concurrency. Must be a power of 2 (default: 16)
* `PromoteBuffer(int)` - the size of the buffer used to queue promotions (default: 1024)
* `DeleteBuffer(int)` - the size of the buffer used to queue deletions (default: 1024)

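The power-of-2 requirement on `Buckets` lets a key's shard be selected with a cheap bitwise mask of its hash instead of a modulo. The following is a self-contained sketch of this common sharding scheme, not ccache's actual internals (the `bucketFor` helper and the use of FNV-1a here are illustrative assumptions):

```go
package main

import (
	"fmt"
	"hash/fnv"
)

// bucketFor maps a key to one of `buckets` shards. Because buckets is a
// power of 2, buckets-1 is an all-ones bitmask, so hash & (buckets-1)
// gives the same result as hash % buckets but is cheaper to compute.
func bucketFor(key string, buckets uint32) uint32 {
	h := fnv.New32a()
	h.Write([]byte(key))
	return h.Sum32() & (buckets - 1)
}

func main() {
	for _, key := range []string{"user:1", "user:2", "user:3"} {
		fmt.Printf("%s -> bucket %d\n", key, bucketFor(key, 16))
	}
}
```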
## Usage

Once the cache is set up, you can `Get`, `Set` and `Delete` items from it. A `Get` returns an `*Item`:

### Get
```go
item := cache.Get("user:4")
if item == nil {
  // handle the miss
} else {
  user := item.Value().(*User)
}
```
The returned `*Item` exposes a number of methods:

* `Value() interface{}` - the value cached
* `Expired() bool` - whether the item is expired or not
* `TTL() time.Duration` - the duration before the item expires (will be a negative value for expired items)
* `Expires() time.Time` - the time the item will expire

By returning expired items, CCache lets you decide if you want to serve stale content or not. For example, you might decide to serve up slightly stale content (< 30 seconds old) while re-fetching newer data in the background. You might also decide to serve up infinitely stale content if you're unable to get new data from your source.

### Set
`Set` expects the key, value and ttl:

```go
cache.Set("user:4", user, time.Minute * 10)
```

### Fetch
There's also a `Fetch` which mixes a `Get` and a `Set`:

```go
item, err := cache.Fetch("user:4", time.Minute * 10, func() (interface{}, error) {
  // code to fetch the data in case of a miss
  // should return the data to cache and the error, if any
})
```

### Delete
`Delete` expects the key to delete. It's ok to call `Delete` on a non-existent key:

```go
cache.Delete("user:4")
```

### Extend
The life of an item can be changed via the `Extend` method. This will change the expiry of the item by the specified duration relative to the current time.

### Replace
The value of an item can be updated to a new value without renewing the item's TTL or its position in the LRU:

```go
cache.Replace("user:4", user)
```

`Replace` returns true if the item existed (and thus was replaced). In the case where the key was not in the cache, the value *is not* inserted and false is returned.

### Stop
The cache's background worker can be stopped by calling `Stop`. Once `Stop` is called, the cache should not be used (calls are likely to panic). `Stop` must be called in order to allow the garbage collector to reap the cache.

## Tracking
CCache supports a special tracking mode which is meant to be used in conjunction with other pieces of your code that maintain a long-lived reference to data.

When you configure your cache with `Track()`:

```go
cache = ccache.New(ccache.Configure().Track())
```

The items retrieved via `TrackingGet` will not be eligible for purge until `Release` is called on them:

```go
item := cache.TrackingGet("user:4")
user := item.Value()   // will be nil if "user:4" didn't exist in the cache
item.Release()  // can be called even if item.Value() returned nil
```

In practice, `Release` wouldn't be called until later, at some other place in your code.

There are a couple of reasons to use the tracking mode if other parts of your code also hold references to objects. First, if you're already going to hold a reference to these objects, there's really no reason not to have them in the cache - the memory is used up anyway.

More importantly, it helps ensure that your code returns consistent data. Without tracking, "user:4" might be purged, and a subsequent `Fetch` would reload the data. This can result in different versions of "user:4" being returned by different parts of your system.

## LayeredCache

CCache's `LayeredCache` stores and retrieves values by both a primary and secondary key. Deletion can happen against either the primary and secondary key together, or against the primary key only (removing all values that share the same primary key).

`LayeredCache` is useful for HTTP caching when you want to purge all variations of a request.

`LayeredCache` takes the same configuration object as the main cache and exposes the same optional tracking capabilities, but with a slightly different API:

```go
cache := ccache.Layered(ccache.Configure())

cache.Set("/users/goku", "type:json", "{value_to_cache}", time.Minute * 5)
cache.Set("/users/goku", "type:xml", "<value_to_cache>", time.Minute * 5)

json := cache.Get("/users/goku", "type:json")
xml := cache.Get("/users/goku", "type:xml")

cache.Delete("/users/goku", "type:json")
cache.Delete("/users/goku", "type:xml")
// OR
cache.DeleteAll("/users/goku")
```

## Size
By default, items added to a cache have a size of 1. This means that if you configure `MaxSize(10000)`, you'll be able to store 10000 items in the cache.

However, if the values you set into the cache have a method `Size() int64`, this size will be used instead. Note that ccache has an overhead of ~350 bytes per entry, which isn't taken into account. In other words, given a filled-up cache with `MaxSize(4096000)` and items that return a `Size() int64` of 2048, we can expect to find 2000 items (4096000 / 2048) taking a total space of 4796000 bytes (2000 × (2048 + 350)).

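As a sketch of what such a value might look like, the only requirement is a `Size() int64` method (the `CachedResponse` type below is a hypothetical example, not part of ccache):

```go
package main

import "fmt"

// CachedResponse is a hypothetical value type. Because it implements
// Size() int64, ccache would count len(Body) against MaxSize instead
// of the default size of 1.
type CachedResponse struct {
	Body []byte
}

func (r *CachedResponse) Size() int64 {
	return int64(len(r.Body))
}

func main() {
	r := &CachedResponse{Body: make([]byte, 2048)}
	fmt.Println(r.Size()) // 2048
}
```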
## Want Something Simpler?
For a simpler cache, check out [rcache](https://github.com/karlseguin/rcache)