
How to do it...

  1. Create the project from the following template:
$ sls create --template-url https://github.com/danteinc/js-cloud-native-cookbook/tree/master/ch2/data-lake-s3 --path cncb-data-lake-s3
  2. Navigate to the cncb-data-lake-s3 directory with cd cncb-data-lake-s3.
  3. Review the file named serverless.yml with the following content:
service: cncb-data-lake-s3

provider:
  name: aws
  runtime: nodejs8.10

functions:
  transformer:
    handler: handler.transform
    timeout: 120

resources:
  Resources:
    Bucket:
      Type: AWS::S3::Bucket
      DeletionPolicy: Retain
    DeliveryStream:
      Type: AWS::KinesisFirehose::DeliveryStream
      Properties:
        DeliveryStreamType: KinesisStreamAsSource
        KinesisStreamSourceConfiguration:
          KinesisStreamARN: ${cf:cncb-event-stream-${opt:stage}.streamArn}
          ...
        ExtendedS3DestinationConfiguration:
          BucketARN:
            Fn::GetAtt: [ Bucket, Arn ]
          Prefix: ${cf:cncb-event-stream-${opt:stage}.streamName}/
          ...

  Outputs:
    DataLakeBucketName:
      Value:
        Ref: Bucket
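Firehose delivers objects under the configured Prefix, partitioned by UTC arrival time as YYYY/MM/DD/HH. The following sketch builds a key in that layout so you know what to expect in the bucket; the stream name and trailing object name are illustrative, not taken from the stack.

```javascript
// Sketch of the S3 key layout Firehose produces under the configured Prefix.
// By default, objects are partitioned by UTC arrival time as YYYY/MM/DD/HH;
// the prefix and trailing object name below are made-up examples.
const buildKey = (prefix, arrival) => {
  const pad = (n) => String(n).padStart(2, '0');
  return [
    prefix + arrival.getUTCFullYear(),
    pad(arrival.getUTCMonth() + 1),
    pad(arrival.getUTCDate()),
    pad(arrival.getUTCHours()),
    'delivery-stream-objects', // Firehose appends its own object name here
  ].join('/');
};

console.log(buildKey('cncb-event-stream-john-s1/', new Date(Date.UTC(2018, 6, 4, 12))));
// → cncb-event-stream-john-s1/2018/07/04/12/delivery-stream-objects
```

Partitioning by arrival time keeps objects from a busy stream grouped into manageable, time-ordered batches for downstream processing.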
  4. Review the file named handler.js with the following content:
exports.transform = (event, context, callback) => {
  const output = event.records.map((record, i) => {
    // store all available data
    const uow = {
      event: JSON.parse((Buffer.from(record.data, 'base64')).toString('utf8')),
      kinesisRecordMetadata: record.kinesisRecordMetadata,
      firehoseRecordMetadata: {
        deliveryStreamArn: event.deliveryStreamArn,
        region: event.region,
        invocationId: event.invocationId,
        recordId: record.recordId,
        approximateArrivalTimestamp: record.approximateArrivalTimestamp,
      }
    };

    return {
      recordId: record.recordId,
      result: 'Ok',
      data: Buffer.from(JSON.stringify(uow) + '\n', 'utf-8').toString('base64'),
    };
  });

  callback(null, { records: output });
};
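The transformer can be exercised locally with a hand-built invocation event. This is a minimal smoke-test sketch: the transform logic is reproduced inline so it is self-contained, and the event shape mirrors what Firehose passes to a transformation function, but the ARN, IDs, and timestamp are made up.

```javascript
// Transform logic reproduced inline from handler.js so the sketch is self-contained.
const transform = (event, context, callback) => {
  const output = event.records.map((record) => {
    const uow = {
      event: JSON.parse(Buffer.from(record.data, 'base64').toString('utf8')),
      kinesisRecordMetadata: record.kinesisRecordMetadata,
      firehoseRecordMetadata: {
        deliveryStreamArn: event.deliveryStreamArn,
        region: event.region,
        invocationId: event.invocationId,
        recordId: record.recordId,
        approximateArrivalTimestamp: record.approximateArrivalTimestamp,
      },
    };
    return {
      recordId: record.recordId,
      result: 'Ok',
      data: Buffer.from(JSON.stringify(uow) + '\n', 'utf-8').toString('base64'),
    };
  });
  callback(null, { records: output });
};

// A hypothetical invocation event with a single base64-encoded record;
// all identifiers below are stand-ins, not real stack values.
const mockEvent = {
  deliveryStreamArn: 'arn:aws:firehose:us-east-1:123456789012:deliverystream/mock',
  region: 'us-east-1',
  invocationId: 'invocation-1',
  records: [{
    recordId: 'record-1',
    data: Buffer.from(JSON.stringify({ type: 'thing-created' }), 'utf8').toString('base64'),
    kinesisRecordMetadata: { partitionKey: 'partition-1' },
    approximateArrivalTimestamp: 1530000000000,
  }],
};

// The handler invokes the callback synchronously, so the result is
// available immediately after the call.
let transformed;
transform(mockEvent, null, (err, result) => { transformed = result; });

const decoded = JSON.parse(
  Buffer.from(transformed.records[0].data, 'base64').toString('utf8'));
console.log(transformed.records[0].result, decoded.event.type);
// → Ok thing-created
```

Note that the handler marks every record 'Ok' and wraps the original event with the Kinesis and Firehose metadata, so nothing is lost on the way into the lake.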
  5. Install the dependencies with npm install.
  6. Run the tests with npm test -- -s $MY_STAGE.
  7. Review the contents generated in the .serverless directory.
  8. Deploy the stack:
$ npm run dp:lcl -- -s $MY_STAGE

> cncb-data-lake-s3@1.0.0 dp:lcl <path-to-your-workspace>/cncb-data-lake-s3
> sls deploy -v -r us-east-1 "-s" "john"

Serverless: Packaging service...
...
Serverless: Stack update finished...
...
Stack Outputs
DataLakeBucketName: cncb-data-lake-s3-john-bucket-1851i1c16lnha
...
  9. Review the stack, data lake bucket, and Firehose delivery stream in the AWS Console.
  10. Publish an event from a separate Terminal with the following commands:
$ cd <path-to-your-workspace>/cncb-event-stream
$ sls invoke -r us-east-1 -f publish -s $MY_STAGE -d '{"type":"thing-created"}'
{
"ShardId": "shardId-000000000000",
"SequenceNumber": "49582906351415672136958521360120605392824155736450793474"
}
  11. Allow time for the Firehose buffer to flush, and then review the data lake contents created in the S3 bucket.
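Because the transformer appends a newline to each record, the objects Firehose delivers are newline-delimited JSON. The following sketch parses such an object body back into events; the body here is an illustrative stand-in for what you would download from the data lake bucket.

```javascript
// Illustrative object body: one JSON document per line, as produced by
// the transformer (real bodies come from the data lake bucket).
const objectBody =
  JSON.stringify({ event: { type: 'thing-created' } }) + '\n' +
  JSON.stringify({ event: { type: 'thing-updated' } }) + '\n';

// Split on newlines, drop the trailing empty line, and parse each document.
const uows = objectBody
  .split('\n')
  .filter((line) => line.length > 0)
  .map((line) => JSON.parse(line));

console.log(uows.map((uow) => uow.event.type));
// → [ 'thing-created', 'thing-updated' ]
```

This one-document-per-line layout is what lets tools that read the lake process each event independently, without needing the whole object to be a single JSON array.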
  12. Remove the stack once you have finished with npm run rm:lcl -- -s $MY_STAGE.
Remove the data lake stack only after you have worked through all the other recipes. This will let you watch the data lake accumulate the events from every other recipe.