1. Create an HTTP Request
GET /1/statuses/sample.json HTTP/1.1
Authorization: OAuth realm="",oauth_no...
Host: stream.twitter.com
Accept: */*
Sign it with OAuth
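The request above can be assembled by hand as a raw HTTP/1.1 message. A minimal sketch follows; the Authorization value here is a placeholder, since producing a real OAuth signature takes a signing library (e.g. oauthlib) and your consumer/token credentials:

```python
def build_request(path, host, auth_header):
    """Build a raw HTTP/1.1 GET request string, header by header."""
    lines = [
        "GET %s HTTP/1.1" % path,
        "Authorization: %s" % auth_header,
        "Host: %s" % host,
        "Accept: */*",
        "",  # blank line terminates the header block
        "",
    ]
    return "\r\n".join(lines)

# Placeholder Authorization header -- not a valid OAuth signature.
req = build_request("/1/statuses/sample.json", "stream.twitter.com",
                    'OAuth realm="",oauth_signature="..."')
print(req.splitlines()[0])
```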
2. Handle the response
HTTP/1.1 200 OK
Content-Type: application/json
Transfer-Encoding: chunked
Server: Jetty(6.1.25)
{"in_reply_to_status_id_str":null,"geo":n
ull,"text":"Hey fat boyyyyy!!...
What’s missing? There is no Content-Length header,
so HTTP libraries won’t know when to stop consuming
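With Transfer-Encoding: chunked, the body arrives as length-prefixed chunks instead of a single sized payload. A minimal sketch of the decoding loop, over an in-memory stream rather than a real socket:

```python
import io

def read_chunked(stream):
    """Decode an HTTP chunked body.

    Each chunk is '<hex length>\r\n<data>\r\n'; a zero-length
    chunk marks the end of the body. On the streaming API that
    terminator never arrives -- chunks just keep coming.
    """
    body = b""
    while True:
        size = int(stream.readline().strip(), 16)
        if size == 0:
            break
        body += stream.read(size)
        stream.read(2)  # consume the \r\n trailing the chunk data
    return body

raw = io.BytesIO(b"5\r\nhello\r\n6\r\n world\r\n0\r\n\r\n")
print(read_chunked(raw))
```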
3. Keep handling it
• Twitter will keep sending you JSON chunks until you kill the connection
• Normal HTTP libraries make this awkward
• Try non-blocking / event-based I/O
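One way to structure the consuming side is a generator that yields one parsed status per JSON line, which a real client would drive from non-blocking or event-based I/O (e.g. asyncio) so a slow or silent connection never stalls the program. A sketch over a simulated stream:

```python
import json

def iter_statuses(line_source):
    """Yield parsed statuses from an endless sequence of JSON lines."""
    for line in line_source:
        line = line.strip()
        if not line:
            continue  # blank keep-alive lines arrive on idle streams
        yield json.loads(line)

# Simulated stream: in practice these lines keep arriving
# until you kill the connection.
fake_stream = ['{"text": "first"}', "", '{"text": "second"}']
for status in iter_statuses(fake_stream):
    print(status["text"])
```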
The Streaming API lets you do cool stuff
• Sample the firehose: the free sample is roughly 1% of all tweets
• Get interesting subsets of all tweets
• By filtering, you can get all tweets on a topic
• Interact in near-realtime
• Process more efficiently
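The topic-subset idea can be illustrated client-side with a keyword match over parsed statuses. This is only a sketch: the actual Streaming API does the filtering server-side, so you receive the matching tweets directly rather than discarding the rest yourself:

```python
def on_topic(statuses, keyword):
    """Keep only statuses whose text mentions the keyword
    (case-insensitive). Client-side illustration of the subset
    the API's filtering would deliver."""
    kw = keyword.lower()
    return [s for s in statuses if kw in s.get("text", "").lower()]

sample = [
    {"text": "Watching the World Cup final"},
    {"text": "Hey fat boyyyyy!!"},
    {"text": "world cup penalties again"},
]
for status in on_topic(sample, "world cup"):
    print(status["text"])
```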