Why memcache
● Improve Application Performance
● Reduce Application Cost
● Caching for Read-Heavy Operations
● Caching In Front of Datastore
● Semi-Durable Shared State Across App Instances
Memcache - operation
MemcacheService cache = MemcacheServiceFactory.getMemcacheService();
cache.setErrorHandler(
    ErrorHandlers.getConsistentLogAndContinue(Level.INFO));
// read from cache
byte[] value = (byte[]) cache.get(key);
// not cached yet: fetch/compute the value, then save it to the cache
if (value == null) {
    // ........
    cache.put(key, value);
}
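The memcache snippets on this and the next slides assume imports along these lines (all from the App Engine SDK, plus java.util.logging for the error-handler level):
import java.util.logging.Level;
import com.google.appengine.api.memcache.AsyncMemcacheService;
import com.google.appengine.api.memcache.ErrorHandlers;
import com.google.appengine.api.memcache.MemcacheService;
import com.google.appengine.api.memcache.MemcacheServiceFactory;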
Async memcache - get
// Using the asynchronous cache
AsyncMemcacheService asyncCache =
MemcacheServiceFactory.getAsyncMemcacheService();
asyncCache.setErrorHandler(
ErrorHandlers.getConsistentLogAndContinue(Level.INFO));
// read from cache
Future<Object> futureValue = asyncCache.get(key);
// ... do other work in parallel to cache retrieval
byte[] value = (byte[]) futureValue.get(); // blocks here if the lookup has not finished yet
Async memcache - put
if (value == null) {
// get value from other source
// ........
// asynchronously populate the cache
// Returns a Future<Void> which can be used to block until completion
asyncCache.put(key, value);
}
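If the caller needs to be sure the write finished before responding, the Future returned by put() can be blocked on. A minimal sketch (putFuture is an illustrative name; exception handling simplified):
// requires java.util.concurrent.Future and java.util.concurrent.ExecutionException
Future<Void> putFuture = asyncCache.put(key, value);
// ... do other work ...
try {
    putFuture.get(); // block until the memcache write completes
} catch (InterruptedException | ExecutionException e) {
    // a failed or interrupted cache write is usually non-fatal: log and continue
}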
Using jcache - get & put
String key; // ...
byte[] value; // ...
// Put the value into the cache.
cache.put(key, value);
// Get the value from the cache.
value = (byte[]) cache.get(key);
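The cache object above is a javax.cache.Cache; on App Engine it is typically obtained through the JCache factory, roughly like this (a sketch, exception handling simplified):
import java.util.Collections;
import javax.cache.Cache;
import javax.cache.CacheException;
import javax.cache.CacheFactory;
import javax.cache.CacheManager;

Cache cache = null;
try {
    CacheFactory cacheFactory = CacheManager.getInstance().getCacheFactory();
    cache = cacheFactory.createCache(Collections.emptyMap());
} catch (CacheException e) {
    // cache unavailable; fall back to reading from the primary store
}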
Testing with ab benchmark
ab -c 50 -n 50 http://[app-address]/[servlet_path]
(-c: number of concurrent clients, -n: total number of requests)
or
ab -c 50 -t 50 http://[app-address]/[servlet_path]
(-t: time limit in seconds instead of a fixed request count)
Other Operations
● getAll(), putAll(), deleteAll()
A single read or write operation for multiple memcache entries
● increment(key, delta), incrementAll(...)
Provide atomic increment of numeric value(s)
● getIdentifiable(), putIfUntouched()
A mechanism to update a value consistently under concurrent requests (see the sketch below)
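A rough sketch of the atomic operations above ("hit-counter", initialValue, and computeUpdate are illustrative, app-specific names, not part of the API):
MemcacheService cache = MemcacheServiceFactory.getMemcacheService();

// atomic increment of a numeric counter (delta 1)
cache.increment("hit-counter", 1);

// optimistic update: re-read and retry if another request modified the entry
MemcacheService.IdentifiableValue old = cache.getIdentifiable(key);
if (old == null) {
    cache.put(key, initialValue);
} else {
    Object updated = computeUpdate(old.getValue()); // app-specific merge
    boolean stored = cache.putIfUntouched(key, old, updated);
    // stored == false means the entry changed concurrently; re-read and retry
}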
Why Task Queue
● Frontend requests are limited to 60 seconds
● Need longer-running tasks
● Run work asynchronously, outside the frontend request
● Best practice for:
○ Email
○ Datastore writes
○ Web crawler
○ Backend jobs
Task Queue Types
● By name
○ Default Queue (Push Queue)
○ Named Queue
● By type
○ Push queues
■ tasks execute automatically
■ only for the App Engine app itself
○ Pull queues
■ tasks wait to be leased by a worker
■ available to workers outside the app
■ tasks can be batched (see the sketch below)
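A minimal sketch of both queue types (queue names, URLs, and payloads are illustrative; the named queues need matching entries in queue.xml, and a pull queue must be declared with <mode>pull</mode>):
import java.util.List;
import java.util.concurrent.TimeUnit;
import com.google.appengine.api.taskqueue.Queue;
import com.google.appengine.api.taskqueue.QueueFactory;
import com.google.appengine.api.taskqueue.TaskHandle;
import com.google.appengine.api.taskqueue.TaskOptions;

// push queue: App Engine POSTs the task to /worker automatically
Queue pushQueue = QueueFactory.getDefaultQueue();
pushQueue.add(TaskOptions.Builder.withUrl("/worker").param("key", "some-key"));

// pull queue: a worker leases up to 100 tasks for one hour, then deletes them
Queue pullQueue = QueueFactory.getQueue("pull-queue");
pullQueue.add(TaskOptions.Builder.withMethod(TaskOptions.Method.PULL)
        .payload("some-payload"));
List<TaskHandle> tasks = pullQueue.leaseTasks(3600, TimeUnit.SECONDS, 100);
for (TaskHandle task : tasks) {
    // ... process task.getPayload() ...
    pullQueue.deleteTask(task);
}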
queue.xml parameters
● rate - the usual processing rate
● bucket-size - cap on peak demand
● max-concurrent-requests - cap on tasks running at the same time
● Retry options (retry-parameters)
○ task-retry-limit - number of retry attempts
○ task-age-limit - how long to keep retrying a failed task
(if both task-retry-limit and task-age-limit are set, the task is retried until both limits are reached, then deleted)
○ min-backoff-seconds - min delay between retries
○ max-backoff-seconds - max delay between retries
○ max-doublings - max number of times to double the delay
● more parameters: see DOC (an example queue.xml appears below)
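An illustrative queue.xml combining the parameters above (the queue name and values are made up for the example):
<?xml version="1.0" encoding="UTF-8"?>
<queue-entries>
  <queue>
    <name>mail-queue</name>
    <rate>5/s</rate>
    <bucket-size>10</bucket-size>
    <max-concurrent-requests>20</max-concurrent-requests>
    <retry-parameters>
      <task-retry-limit>7</task-retry-limit>
      <task-age-limit>2d</task-age-limit>
      <min-backoff-seconds>10</min-backoff-seconds>
      <max-backoff-seconds>200</max-backoff-seconds>
      <max-doublings>2</max-doublings>
    </retry-parameters>
  </queue>
</queue-entries>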
Cron Schedule Format
● every 12 hours
● every 5 minutes from 10:00 to 14:00
● 2nd,third mon,wed,thu of march 17:00
● every monday 09:00
● 1st monday of sep,oct,nov 17:00
● every day 00:00
● every N (hours|mins|minutes) ["from" (time) "to" (time)]
● every 2 hours synchronized
(an example cron.xml using these schedules appears below)
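These schedule strings go into cron.xml; an illustrative example (URLs and descriptions are made up):
<?xml version="1.0" encoding="UTF-8"?>
<cronentries>
  <cron>
    <url>/tasks/daily-summary</url>
    <description>daily summary report</description>
    <schedule>every day 00:00</schedule>
  </cron>
  <cron>
    <url>/tasks/poll-feed</url>
    <description>poll during business hours</description>
    <schedule>every 5 minutes from 10:00 to 14:00</schedule>
    <timezone>America/New_York</timezone>
  </cron>
</cronentries>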