Bring your own code.
Simple resource model: only one thing needs to be configured, memory. CPU and network are allocated proportionally, which means that a 256 MB function gets twice the CPU and network of a 128 MB one.
Flexible use: trigger or invoke synchronously or asynchronously. Hook up with many other AWS services.
Uses IAM roles under the hood, so you get very fine-grained security: for example, you can say that your Lambda function may access only one particular S3 bucket. VPC integration gives you even more control over what your Lambda function can and cannot do.
Build your function the same way you would in your standard environment (threads, ...).
Deploy using existing tools and plugins, CLI tools, and frameworks (demo).
Lambda functions are stateless, so you need to use S3, ElastiCache, or DynamoDB to persist state and exchange data between functions.
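A minimal sketch of that pattern: the handler itself holds nothing between invocations and pushes all state into an external store. The `store` argument here is a stand-in abstraction (in a real deployment it would be a DynamoDB table accessed via boto3 `get_item`/`put_item`, an S3 object, or an ElastiCache entry); the event field name `stateKey` is a hypothetical example.

```python
# Sketch: a stateless Lambda-style handler that keeps its counter in an
# external key-value store instead of in local variables.

def handler(event, context, store):
    """Increment a per-key counter kept in an external store.

    `store` abstracts the persistence layer (e.g. a DynamoDB table);
    the function itself carries no state between invocations.
    """
    key = event["stateKey"]
    count = store.get(key, 0) + 1
    store[key] = count
    return {"stateKey": key, "count": count}
```

Because the store is injected, two concurrent function instances sharing the same backing table see each other's updates, which is exactly what in-memory state cannot give you.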
Use Amazon Cloudwatch for monitoring
Also a classic in Data Lake designs.
Amazon S3 is a simple key-based object store whose scalability and low cost make it ideal for storing large datasets.
S3 provides excellent performance for storing and retrieving objects based on a known key.
Take advantage of AWS Lambda event-driven triggers from S3.
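As a sketch, a Lambda handler wired to an S3 trigger receives a notification payload listing the affected objects; the handler below just extracts the bucket/key pairs (the real processing step is left as a placeholder, and `context` is unused as it often is in simple handlers).

```python
# Sketch: parse the S3 event-notification payload that an S3-triggered
# Lambda function receives, returning the objects to process.

def handle_s3_event(event, context=None):
    """Return (bucket, key) pairs for every record in an S3 event."""
    objects = []
    for record in event.get("Records", []):
        s3 = record["s3"]
        objects.append((s3["bucket"]["name"], s3["object"]["key"]))
    return objects
```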
S3 storage tier – lifecycle policies
Modern businesses typically collect data from internal and external sources at various frequencies throughout the day. These data sources could be franchise stores, subsidiaries, or new systems integrated as a result of mergers and acquisitions.
For example, a retail chain might collect point-of-sale (POS) data from all franchise stores three times a day to get insights into sales as well as to identify the right number of staff at a given time in any given store. As each franchise functions as an independent business, the format and structure of the data might not be consistent across the board. Depending on the geographical region, each franchise would provide data at a different frequency, and the analysis of these datasets must wait until all the required data has been provided (event-driven) by the individual franchises. In most cases, the individual data volumes received from each franchise are small, but the velocity of the data being generated and the collective volume can be challenging to manage.
The basics are pretty simple, but the service has deep functionality.
You can send the service a simple string of text, and it will generate lifelike speech in your choice of 47 different voices.
But it’s not naive about the context of the text. For example, the text here contains ‘WA’ and ‘degree F’, which would sound strange if spoken out loud verbatim.
Instead, Polly will automatically expand the text strings ‘WA’ and ‘degree F’ to ‘Washington’ and ‘degrees Fahrenheit’, to create more lifelike speech. The developer doesn’t have to do anything: just send the text, and get lifelike speech back.
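A sketch of what "just send the text" looks like in code. The helper below only assembles the request parameters; in a real program the dict would be passed to `boto3.client("polly").synthesize_speech(**params)`. The voice name "Joanna" and the mp3 output format are illustrative defaults, not requirements.

```python
# Sketch: build the parameters for a Polly SynthesizeSpeech request.
# Plain text is enough; Polly normalizes strings like 'WA' and
# 'degree F' on its own.

def build_speech_request(text, voice="Joanna", fmt="mp3"):
    return {
        "Text": text,          # plain text input
        "VoiceId": voice,      # one of Polly's available voices
        "OutputFormat": fmt,   # audio format of the returned stream
    }
```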
Amazon Rekognition currently supports the JPEG and PNG image formats. You can submit images either as an S3 object or as a byte array. Amazon Rekognition supports image file sizes up to 15 MB when passed as an S3 object, and up to 5 MB when submitted as an image byte array. Amazon Rekognition is currently available in the US East (Northern Virginia), US West (Oregon), and EU (Ireland) regions.
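The two submission paths have the different size limits stated above, so a small pre-check is worth doing before calling the API. The helper below is a local sketch encoding just those limits; the return strings "S3Object" and "Bytes" mirror the field names a Rekognition image parameter uses.

```python
# Sketch: decide how (or whether) an image can be submitted to
# Rekognition, based on the documented limits: 15 MB via S3 object,
# 5 MB via byte array.

MB = 1024 * 1024

def choose_submission(size_bytes, in_s3):
    """Return the valid submission mode for an image, or None."""
    if in_s3 and size_bytes <= 15 * MB:
        return "S3Object"
    if not in_s3 and size_bytes <= 5 * MB:
        return "Bytes"
    return None
```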
MXNet deep convolutional neural networks (CNNs)
Sometimes, though, you may want to analyze the anomalies or at least be notified of their presence.
One such algorithm: calculate a moving average and standard deviation of the time-series data.
Data is sent from our sensor to AWS IoT, where it is routed to AWS Lambda through the AWS IoT Rules Engine. Lambda executes the anomaly-detection logic and, because the algorithm requires knowledge of previous measurements, uses Amazon DynamoDB as a key-value store. The Lambda function republishes the message along with parameters extracted from the PEWMA algorithm. The results can be viewed in your browser through a WebSocket connection to AWS IoT on your local machine.
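The PEWMA (probabilistic exponentially weighted moving average) step can be sketched in pure Python. This is an illustrative local implementation, not the Lambda code itself: the `alpha`, `beta`, 3-sigma threshold, and 30-sample training window are assumed parameters, and the running moments `s1`/`s2` are exactly the "previous measurements" the text says get stored in DynamoDB between invocations.

```python
import math

class PEWMA:
    """Probabilistic EWMA anomaly detector (sketch).

    Keeps running moments s1 (mean) and s2 (mean of squares); the
    weighting of a new sample shrinks when it looks probable under the
    current estimate, so the model adapts slowly to outliers.
    """

    def __init__(self, alpha=0.95, beta=0.5, threshold=3.0, training=30):
        self.alpha = alpha          # base smoothing factor
        self.beta = beta            # probability-weighting strength
        self.threshold = threshold  # flag samples beyond this many sigmas
        self.training = training    # warm-up samples before flagging
        self.t = 0
        self.s1 = 0.0
        self.s2 = 0.0

    def update(self, x):
        """Ingest one measurement; return True if it is anomalous."""
        self.t += 1
        if self.t == 1:
            self.s1, self.s2 = float(x), float(x) * x
            return False
        sigma = math.sqrt(max(self.s2 - self.s1 ** 2, 1e-12))
        z = (x - self.s1) / sigma
        anomaly = self.t > self.training and abs(z) > self.threshold
        # Probability density of x under the current N(s1, sigma) estimate.
        p = math.exp(-z * z / 2) / math.sqrt(2 * math.pi)
        if self.t <= self.training:
            a = 1.0 - 1.0 / self.t          # plain running average while training
        else:
            a = self.alpha * (1.0 - self.beta * p)
        self.s1 = a * self.s1 + (1 - a) * x
        self.s2 = a * self.s2 + (1 - a) * x * x
        return anomaly
```

In the architecture above, `s1`, `s2`, and `t` would be read from and written back to DynamoDB on each invocation, since the Lambda function itself is stateless.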
To build that, no genius was involved... but like Edison, we stitched innovations together to create a wow experience for our customers (secure, scalable, reliable...).
And this is exactly what the cloud is about!
Just a few years ago, it would have taken months or even years to build a scalable, reliable, and secure application like that from scratch.